Didrik Nielsen
Title
Cited by
Year
Tree boosting with XGBoost: why does XGBoost win "every" machine learning competition?
D Nielsen
NTNU, 2016
Cited by 187*, 2016
Fast and scalable Bayesian deep learning by weight-perturbation in Adam
M Khan, D Nielsen, V Tangkaratt, W Lin, Y Gal, A Srivastava
International Conference on Machine Learning, 2611-2620, 2018
Cited by 102, 2018
SLANG: Fast structured covariance approximations for Bayesian deep learning with natural gradient
A Mishkin, F Kunstner, D Nielsen, M Schmidt, ME Khan
arXiv preprint arXiv:1811.04504, 2018
Cited by 27, 2018
Fast yet simple natural-gradient descent for variational inference in complex models
ME Khan, D Nielsen
2018 International Symposium on Information Theory and Its Applications …, 2018
Cited by 20, 2018
Variational adaptive-Newton method for explorative learning
ME Khan, W Lin, V Tangkaratt, Z Liu, D Nielsen
arXiv preprint arXiv:1711.05560, 2017
Cited by 7, 2017
SurVAE Flows: Surjections to bridge the gap between VAEs and flows
D Nielsen, P Jaini, E Hoogeboom, O Winther, M Welling
arXiv preprint arXiv:2007.02731, 2020
Cited by 5, 2020
Closing the dequantization gap: PixelCNN as a single-layer flow
D Nielsen, O Winther
arXiv preprint arXiv:2002.02547, 2020
Cited by 1, 2020
Natural-Gradient Stochastic Variational Inference for Non-Conjugate Structured Variational Autoencoder
W Lin, ME Khan, N Hubacher, D Nielsen
Cited by 1, 2017
Argmax Flows and Multinomial Diffusion: Towards Non-Autoregressive Language Models
E Hoogeboom, D Nielsen, P Jaini, P Forré, M Welling
arXiv preprint arXiv:2102.05379, 2021
2021
Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC
P Jaini, D Nielsen, M Welling
arXiv preprint arXiv:2102.02374, 2021
2021
Argmax Flows: Learning Categorical Distributions with Normalizing Flows
E Hoogeboom, D Nielsen, P Jaini, P Forré, M Welling
PixelCNN as a Single-Layer Flow
D Nielsen, O Winther
The Variational Adaptive-Newton Method
ME Khan, W Lin, V Tangkaratt, Z Liu, D Nielsen