Dominik Stöger
Assistant Professor, KU Eichstätt-Ingolstadt
Verified email at ku.de - Homepage
Title · Cited by · Year
Small random initialization is akin to spectral learning: Optimization and generalization guarantees for overparameterized low-rank matrix reconstruction
D Stöger, M Soltanolkotabi
Advances in Neural Information Processing Systems 34, 2021
Cited by 96 · 2021
Blind demixing and deconvolution at near-optimal rate
P Jung, F Krahmer, D Stöger
IEEE Transactions on Information Theory 64 (2), 704-727, 2017
Cited by 54 · 2017
Understanding overparameterization in generative adversarial networks
Y Balaji, M Sajedi, NM Kalibhat, M Ding, D Stöger, M Soltanolkotabi, ...
International Conference on Learning Representations 1, 2021
Cited by 35 · 2021
Implicit balancing and regularization: Generalization and convergence guarantees for overparameterized asymmetric matrix sensing
M Soltanolkotabi, D Stöger, C Xie
The Thirty Sixth Annual Conference on Learning Theory, 5140-5142, 2023
Cited by 23 · 2023
Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate
C Kümmerle, C Mayrink Verdun, D Stöger
Advances in Neural Information Processing Systems 34, 2873-2886, 2021
Cited by 23* · 2021
On the convex geometry of blind deconvolution and matrix completion
F Krahmer, D Stöger
Communications on Pure and Applied Mathematics 74 (4), 790-832, 2021
Cited by 18 · 2021
Complex phase retrieval from subgaussian measurements
F Krahmer, D Stöger
Journal of Fourier Analysis and Applications 26 (6), 89, 2020
Cited by 17 · 2020
Rigidity for perimeter inequality under spherical symmetrisation
F Cagnetti, M Perugini, D Stöger
Calculus of Variations and Partial Differential Equations 59, 1-53, 2020
Cited by 17 · 2020
Randomly initialized alternating least squares: Fast convergence for matrix sensing
K Lee, D Stöger
SIAM Journal on Mathematics of Data Science 5 (3), 774-799, 2023
Cited by 13 · 2023
Blind deconvolution and compressed sensing
D Stöger, P Jung, F Krahmer
2016 4th International Workshop on Compressed Sensing Theory and its …, 2016
Cited by 12 · 2016
Sparse Power Factorization: Balancing peakiness and sample complexity
J Geppert, F Krahmer, D Stöger
Advances in Computational Mathematics 45 (3), 1711-1728, 2019
Cited by 10 · 2019
Proof methods for robust low-rank matrix recovery
T Fuchs, D Gross, P Jung, F Krahmer, R Kueng, D Stöger
Compressed Sensing in Information Processing, 37-75, 2022
Cited by 9 · 2022
Refined performance guarantees for sparse power factorization
JA Geppert, F Krahmer, D Stöger
2017 International Conference on Sampling Theory and Applications (SampTA …, 2017
Cited by 7 · 2017
Blind Demixing and Deconvolution with Noisy Data: Near-optimal Rate
P Jung, F Krahmer, D Stoeger
WSA 2017; 21th International ITG Workshop on Smart Antennas; Proceedings of, 1-5, 2017
Cited by 5* · 2017
How to induce regularization in generalized linear models: A guide to reparametrizing gradient flow
HH Chou, J Maly, D Stöger
arXiv preprint arXiv:2308.04921, 2023
Cited by 3 · 2023
Non-convex matrix sensing: Breaking the quadratic rank barrier in the sample complexity
D Stöger, Y Zhu
arXiv preprint arXiv:2408.13276, 2024
Cited by 2 · 2024
Blind Deconvolution: Convex Geometry and Noise Robustness
F Krahmer, D Stöger
2018 52nd Asilomar Conference on Signals, Systems, and Computers, 643-646, 2018
Cited by 2 · 2018
Robust Recovery of Low-Rank Matrices and Low-Tubal-Rank Tensors from Noisy Sketches
A Ma, D Stöger, Y Zhu
SIAM Journal on Matrix Analysis and Applications 44 (4), 1566-1588, 2023
Cited by 1 · 2023
Sparse power factorization with refined peakiness conditions
D Stöger, J Geppert, F Krahmer
2018 IEEE Statistical Signal Processing Workshop (SSP), 816-820, 2018
Cited by 1 · 2018
Linear Convergence of Iteratively Reweighted Least Squares for Nuclear Norm Minimization
C Kümmerle, D Stöger
2024 IEEE 13rd Sensor Array and Multichannel Signal Processing Workshop (SAM …, 2024
2024
Articles 1–20