Chenxin Ma
Verified email at lehigh.edu - Homepage
Title
Cited by
Year
CoCoA: A general framework for communication-efficient distributed optimization
V Smith, S Forte, M Chenxin, M Takáč, MI Jordan, M Jaggi
Journal of Machine Learning Research 18, 230, 2018
Cited by 249 · 2018
Adding vs. averaging in distributed primal-dual optimization
C Ma, M Jaggi, MI Jordan, P Richtárik, M Takáč
Cited by 189 · 2015
Distributed optimization with arbitrary local solvers
C Ma, J Konečný, M Jaggi, V Smith, MI Jordan, P Richtárik, M Takáč
Optimization Methods and Software 32 (4), 813-848, 2017
Cited by 184 · 2017
Efficient distributed hessian free algorithm for large-scale empirical risk minimization via accumulating sample strategy
M Jahani, X He, C Ma, A Mokhtari, D Mudigere, A Ribeiro, M Takáč
International Conference on Artificial Intelligence and Statistics, 2634-2644, 2020
Cited by 26 · 2020
Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
C Ma, R Tappenden, M Takáč
The Journal of Machine Learning Research 17 (1), 8138-8161, 2016
Cited by 18 · 2016
An accelerated communication-efficient primal-dual optimization framework for structured machine learning
C Ma, M Jaggi, FE Curtis, N Srebro, M Takáč
Optimization Methods and Software 36 (1), 20-44, 2021
Cited by 12 · 2021
Partitioning data on features or samples in communication-efficient distributed optimization?
C Ma, M Takáč
arXiv preprint arXiv:1510.06688, 2015
Cited by 12 · 2015
Underestimate sequences via quadratic averaging
C Ma, NVC Gudapati, M Jahani, R Tappenden, M Takáč
arXiv preprint arXiv:1710.03695, 2017
Cited by 11 · 2017
Distributed inexact damped newton method: Data partitioning and load-balancing
C Ma, M Takáč
arXiv preprint arXiv:1603.05191, 2016
Cited by 10 · 2016
Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
M Jahani, NVC Gudapati, C Ma, R Tappenden, M Takáč
Computational Optimization and Applications 79, 369-404, 2021
Cited by 3 · 2021
Distributed inexact damped newton method: Data partitioning and work-balancing
C Ma, M Takáč
Workshops at the Thirty-First AAAI Conference on Artificial Intelligence, 2017
Cited by 3 · 2017
Distributed Methods for Composite Optimization: Communication Efficiency, Load-Balancing and Local Solvers
C Ma
Ph.D. thesis, Lehigh University, 2018
Cited by 1 · 2018
Grow Your Samples and Optimize Better via Distributed Newton CG and Accumulating Strategy
M Jahani, X He, C Ma, A Mokhtari, D Mudigere, A Ribeiro, M Takáč
Distributed Restarting NewtonCG Method for Large-Scale Empirical Risk Minimization
M Jahani, X He, C Ma, D Mudigere, A Mokhtari, A Ribeiro, M Takáč
CoCoA+: Adding vs. Averaging in Distributed Optimization
M Takáč, C Ma, V Smith, M Jaggi, MI Jordan, P Richtárik
Articles 1–15