Peter Richtarik
Professor, KAUST
Verified email at kaust.edu.sa - Homepage
Title / Cited by / Year
Federated learning: Strategies for improving communication efficiency
J Konečný, HB McMahan, FX Yu, P Richtárik, AT Suresh, D Bacon
arXiv preprint arXiv:1610.05492 [NIPS Private Multi-Party Machine Learning …, 2016
Cited by 2524 · 2016
Federated optimization: Distributed machine learning for on-device intelligence
J Konečný, HB McMahan, D Ramage, P Richtárik
arXiv preprint arXiv:1610.02527, 2016
Cited by 1034 · 2016
Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
P Richtarik, M Takáč
Mathematical Programming 144 (2), 1-38, 2014
Cited by 736 · 2014
Generalized power method for sparse principal component analysis
M Journee, Y Nesterov, P Richtárik, R Sepulchre
Journal of Machine Learning Research 11, 517-553, 2010
Cited by 617 · 2010
Parallel coordinate descent methods for big data optimization
P Richtárik, M Takáč
Mathematical Programming 156 (1), 433-484, 2016
Cited by 492 · 2016
Accelerated, parallel and proximal coordinate descent
O Fercoq, P Richtárik
SIAM Journal on Optimization 25 (4), 1997-2023, 2015
Cited by 360 · 2015
Mini-batch semi-stochastic gradient descent in the proximal setting
J Konečný, J Liu, P Richtárik, M Takáč
IEEE Journal of Selected Topics in Signal Processing 10 (2), 242-255, 2016
Cited by 265 · 2016
Semi-stochastic gradient descent methods
J Konečný, P Richtárik
Frontiers in Applied Mathematics and Statistics 3:9, 2017
Cited by 236* · 2017
Randomized iterative methods for linear systems
RM Gower, P Richtárik
SIAM Journal on Matrix Analysis and Applications 36 (4), 1660-1690, 2015
Cited by 226 · 2015
Distributed coordinate descent method for learning with big data
P Richtárik, M Takáč
Journal of Machine Learning Research 17 (75), 1-25, 2016
Cited by 223 · 2016
SGD: General Analysis and Improved Rates
RM Gower, N Loizou, X Qian, A Sailanbayev, E Shulgin, P Richtarik
ICML 2019, 2019
Cited by 206 · 2019
Mini-batch primal and dual methods for SVMs
M Takáč, A Bijral, P Richtárik, N Srebro
Proceedings of the 30th Int. Conf. on Machine Learning, PMLR 28 (3), 1022-1030, 2013
Cited by 191* · 2013
Tighter theory for local SGD on identical and heterogeneous data
A Khaled, K Mishchenko, P Richtárik
The 23rd International Conference on Artificial Intelligence and Statistics, 2020
Cited by 182 · 2020
Adding vs. averaging in distributed primal-dual optimization
C Ma, V Smith, M Jaggi, MI Jordan, P Richtárik, M Takáč
Proceedings of the 32nd Int. Conf. on Machine Learning, PMLR 37, 1973-1982, 2015
Cited by 175 · 2015
Even faster accelerated coordinate descent using non-uniform sampling
Z Allen-Zhu, Z Qu, P Richtarik, Y Yuan
Proceedings of The 33rd Int. Conf. on Machine Learning, PMLR 48, 1110-1119, 2016
Cited by 166 · 2016
Scaling distributed machine learning with in-network aggregation
A Sapio, M Canini, CY Ho, J Nelson, P Kalnis, C Kim, A Krishnamurthy, ...
arXiv preprint arXiv:1903.06701, 2019
Cited by 162 · 2019
Distributed optimization with arbitrary local solvers
C Ma, J Konečný, M Jaggi, V Smith, MI Jordan, P Richtárik, M Takáč
Optimization Methods and Software 32 (4), 813-848, 2017
Cited by 150 · 2017
Federated learning of a mixture of global and local models
F Hanzely, P Richtárik
arXiv preprint arXiv:2002.05516, 2020
Cited by 146 · 2020
Stochastic block BFGS: squeezing more curvature out of data
RM Gower, D Goldfarb, P Richtárik
Proceedings of The 33rd Int. Conf. on Machine Learning, PMLR 48, 1869-1878, 2016
Cited by 146 · 2016
Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
N Loizou, P Richtárik
Computational Optimization and Applications 77 (3), 653-710, 2020
Cited by 139 · 2020
Articles 1–20