Erik Daxberger
Verified email at apple.com
Title · Cited by · Year
Laplace Redux – Effortless Bayesian Deep Learning
E Daxberger*, A Kristiadi*, A Immer*, R Eschenhagen*, M Bauer, ...
NeurIPS 2021, 2021
Cited by: 207 · 2021
Sample-Efficient Optimization in the Latent Space of Deep Generative Models via Weighted Retraining
A Tripp*, E Daxberger*, JM Hernández-Lobato
NeurIPS 2020, 2020
Cited by: 118 · 2020
Embedding Models for Episodic Knowledge Graphs
Y Ma, V Tresp, EA Daxberger
Journal of Web Semantics, 2018
Cited by: 97 · 2018
Bayesian Deep Learning via Subnetwork Inference
E Daxberger, E Nalisnick, JU Allingham, J Antorán, ...
ICML 2021, 2021
Cited by: 81* · 2021
Distributed Batch Gaussian Process Optimization
EA Daxberger, BKH Low
ICML 2017, 2017
Cited by: 55 · 2017
Bayesian Variational Autoencoders for Unsupervised Out-of-Distribution Detection
E Daxberger, JM Hernández-Lobato
Bayesian Deep Learning Workshop, NeurIPS 2019, 2019
Cited by: 53 · 2019
Mixed-Variable Bayesian Optimization
E Daxberger*, A Makarova*, M Turchetta, A Krause
IJCAI 2020, 2020
Cited by: 43 · 2020
Adapting the Linearised Laplace Model Evidence for Modern Deep Learning
J Antorán, D Janz, JU Allingham, E Daxberger, R Barbano, E Nalisnick, ...
ICML 2022, 2022
Cited by: 25* · 2022
Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning
R Eschenhagen, E Daxberger, P Hennig, A Kristiadi
Bayesian Deep Learning Workshop, NeurIPS 2021, 2021
Cited by: 19 · 2021
Mobile V-MoEs: Scaling Down Vision Transformers via Sparse Mixture-of-Experts
E Daxberger, F Weers, B Zhang, T Gunter, R Pang, M Eichner, ...
arXiv 2023, 2023
Cited by: 2 · 2023
Improving Continual Learning by Accurate Gradient Reconstructions of the Past
E Daxberger, S Swaroop, K Osawa, R Yokota, RE Turner, ...
TMLR 2023, 2023
Cited by: 1 · 2023
Advances in Probabilistic Deep Learning and Their Applications
EA Daxberger
University of Cambridge, 2023
2023