Hongzhou Lin
Verified email at mit.edu - Homepage
Title · Cited by · Year
A universal catalyst for first-order optimization
H Lin, J Mairal, Z Harchaoui
Advances in Neural Information Processing Systems, 3384-3392, 2015
349 · 2015
Resnet with one-neuron hidden layers is a universal approximator
H Lin, S Jegelka
Advances in Neural Information Processing Systems, 6169-6178, 2018
76 · 2018
Catalyst for gradient-based nonconvex optimization
C Paquette, H Lin, D Drusvyatskiy, J Mairal, Z Harchaoui
66* · 2018
Catalyst acceleration for first-order convex optimization: from theory to practice
H Lin, J Mairal, Z Harchaoui
The Journal of Machine Learning Research 18 (1), 7854-7907, 2017
55 · 2017
An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
H Lin, J Mairal, Z Harchaoui
SIAM Journal on Optimization 29 (2), 1408-1443, 2019
16* · 2019
On Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions
J Zhang, H Lin, S Sra, A Jadbabaie
arXiv preprint arXiv:2002.04130, 2020
5 · 2020
Ideal: Inexact decentralized accelerated augmented lagrangian method
Y Arjevani, J Bruna, B Can, M Gurbuzbalaban, S Jegelka, H Lin
Advances in Neural Information Processing Systems 33, 2020
2 · 2020
Complexity of Finding Stationary Points of Nonconvex Nonsmooth Functions
J Zhang, H Lin, S Jegelka, S Sra, A Jadbabaie
International Conference on Machine Learning, 11173-11182, 2020
2020
Stochastic Optimization with Non-stationary Noise
J Zhang, H Lin, S Das, S Sra, A Jadbabaie
arXiv preprint arXiv:2006.04429, 2020
2020
On the Complexity of Minimizing Convex Finite Sums Without Using the Indices of the Individual Functions
Y Arjevani, A Daniely, S Jegelka, H Lin
arXiv preprint arXiv:2002.03273, 2020
2020
Perceptual Regularization: Visualizing and Learning Generalizable Representations
H Lin, J Robinson, S Jegelka
2019
Algorithmes d'accélération générique pour les méthodes d'optimisation en apprentissage statistique [Generic acceleration algorithms for optimization methods in statistical learning]
H Lin
Grenoble Alpes, 2017
2017
Articles 1–12