Tree Boosting with XGBoost: Why Does XGBoost Win "Every" Machine Learning Competition? D Nielsen. NTNU, 2016. | 190* | 2016 |
Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam. ME Khan, D Nielsen, V Tangkaratt, W Lin, Y Gal, A Srivastava. International Conference on Machine Learning, 2611-2620, 2018. | 103 | 2018 |
SLANG: Fast Structured Covariance Approximations for Bayesian Deep Learning with Natural Gradient. A Mishkin, F Kunstner, D Nielsen, M Schmidt, ME Khan. arXiv preprint arXiv:1811.04504, 2018. | 27 | 2018 |
Fast yet Simple Natural-Gradient Descent for Variational Inference in Complex Models. ME Khan, D Nielsen. 2018 International Symposium on Information Theory and Its Applications …, 2018. | 20 | 2018 |
Variational Adaptive-Newton Method for Explorative Learning. ME Khan, W Lin, V Tangkaratt, Z Liu, D Nielsen. arXiv preprint arXiv:1711.05560, 2017. | 7 | 2017 |
SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows. D Nielsen, P Jaini, E Hoogeboom, O Winther, M Welling. Advances in Neural Information Processing Systems 33, 2020. | 5 | 2020 |
Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow. D Nielsen, O Winther. arXiv preprint arXiv:2002.02547, 2020. | 1 | 2020 |
Natural-Gradient Stochastic Variational Inference for Non-Conjugate Structured Variational Autoencoder. W Lin, ME Khan, N Hubacher, D Nielsen. | 1 | 2017 |
Argmax Flows and Multinomial Diffusion: Towards Non-Autoregressive Language Models. E Hoogeboom, D Nielsen, P Jaini, P Forré, M Welling. arXiv preprint arXiv:2102.05379, 2021. | | 2021 |
Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC. P Jaini, D Nielsen, M Welling. arXiv preprint arXiv:2102.02374, 2021. | | 2021 |
Argmax Flows: Learning Categorical Distributions with Normalizing Flows. E Hoogeboom, D Nielsen, P Jaini, P Forré, M Welling. | | |
PixelCNN as a Single-Layer Flow. D Nielsen, O Winther. | | |
The Variational Adaptive-Newton Method. ME Khan, W Lin, V Tangkaratt, Z Liu, D Nielsen. | | |