Dominik Stöger
KU Eichstätt-Ingolstadt
Verified email at - Homepage
Small random initialization is akin to spectral learning: Optimization and generalization guarantees for overparameterized low-rank matrix reconstruction
D Stöger, M Soltanolkotabi
Advances in Neural Information Processing Systems 34, 2021
Blind demixing and deconvolution at near-optimal rate
P Jung, F Krahmer, D Stöger
IEEE Transactions on Information Theory 64 (2), 704-727, 2017
Understanding overparameterization in generative adversarial networks
Y Balaji, M Sajedi, NM Kalibhat, M Ding, D Stöger, M Soltanolkotabi, ...
International Conference on Learning Representations 1, 2021
Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate
C Kümmerle, C Mayrink Verdun, D Stöger
Advances in Neural Information Processing Systems 34, 2873-2886, 2021
On the convex geometry of blind deconvolution and matrix completion
F Krahmer, D Stöger
Communications on Pure and Applied Mathematics 74 (4), 790-832, 2021
Implicit balancing and regularization: Generalization and convergence guarantees for overparameterized asymmetric matrix sensing
M Soltanolkotabi, D Stöger, C Xie
The Thirty Sixth Annual Conference on Learning Theory, 5140-5142, 2023
Complex phase retrieval from subgaussian measurements
F Krahmer, D Stöger
Journal of Fourier Analysis and Applications 26 (6), 89, 2020
Rigidity for perimeter inequality under spherical symmetrisation
F Cagnetti, M Perugini, D Stöger
Calculus of Variations and Partial Differential Equations 59, 1-53, 2020
Blind deconvolution and compressed sensing
D Stöger, P Jung, F Krahmer
2016 4th International Workshop on Compressed Sensing Theory and its …, 2016
Sparse Power Factorization: Balancing peakiness and sample complexity
J Geppert, F Krahmer, D Stöger
Advances in Computational Mathematics 45 (3), 1711-1728, 2019
Proof methods for robust low-rank matrix recovery
T Fuchs, D Gross, P Jung, F Krahmer, R Kueng, D Stöger
Compressed Sensing in Information Processing, 37-75, 2022
Refined performance guarantees for sparse power factorization
JA Geppert, F Krahmer, D Stöger
2017 International Conference on Sampling Theory and Applications (SampTA …, 2017
Randomly initialized alternating least squares: Fast convergence for matrix sensing
K Lee, D Stöger
SIAM Journal on Mathematics of Data Science 5 (3), 774-799, 2023
Blind demixing and deconvolution with noisy data: Near-optimal rate
D Stöger, P Jung, F Krahmer
WSA 2017; 21st International ITG Workshop on Smart Antennas, 1-5, 2017
Blind Deconvolution: Convex Geometry and Noise Robustness
F Krahmer, D Stöger
2018 52nd Asilomar Conference on Signals, Systems, and Computers, 643-646, 2018
Robust Recovery of Low-Rank Matrices and Low-Tubal-Rank Tensors from Noisy Sketches
A Ma, D Stöger, Y Zhu
SIAM Journal on Matrix Analysis and Applications 44 (4), 1566-1588, 2023
How to induce regularization in generalized linear models: A guide to reparametrizing gradient flow
HH Chou, J Maly, D Stöger
arXiv preprint arXiv:2308.04921, 2023
Sparse power factorization with refined peakiness conditions
D Stöger, J Geppert, F Krahmer
2018 IEEE Statistical Signal Processing Workshop (SSP), 816-820, 2018
On the Lipschitz constant of random neural networks
P Geuchen, T Heindl, D Stöger, F Voigtlaender
arXiv preprint arXiv:2311.01356, 2023