Anton Rodomanov
CISPA Helmholtz Center for Information Security
Verified email at cispa.de · Homepage
Title · Cited by · Year
Greedy quasi-Newton methods with explicit superlinear convergence
A Rodomanov, Y Nesterov
SIAM Journal on Optimization 31 (1), 785-811, 2021
Cited by 66 · 2021
Rates of superlinear convergence for classical quasi-Newton methods
A Rodomanov, Y Nesterov
Mathematical Programming, 1-32, 2022
Cited by 58 · 2022
Putting MRFs on a tensor train
A Novikov, A Rodomanov, A Osokin, D Vetrov
International Conference on Machine Learning, 811-819, 2014
Cited by 56 · 2014
New Results on Superlinear Convergence of Classical Quasi-Newton Methods
A Rodomanov, Y Nesterov
Journal of Optimization Theory and Applications 188, 744-769, 2021
Cited by 54 · 2021
A superlinearly-convergent proximal Newton-type method for the optimization of finite sums
A Rodomanov, D Kropotov
International Conference on Machine Learning, 2597-2605, 2016
Cited by 48 · 2016
Primal-dual method for searching equilibrium in hierarchical congestion population games
P Dvurechensky, A Gasnikov, E Gasnikova, S Matsievsky, A Rodomanov, ...
arXiv preprint arXiv:1606.08988, 2016
Cited by 37 · 2016
A randomized coordinate descent method with volume sampling
A Rodomanov, D Kropotov
SIAM Journal on Optimization 30 (3), 1878-1904, 2020
Cited by 14 · 2020
Smoothness parameter of power of Euclidean norm
A Rodomanov, Y Nesterov
Journal of Optimization Theory and Applications 185, 303-326, 2020
Cited by 11 · 2020
Subgradient ellipsoid method for nonsmooth convex problems
A Rodomanov, Y Nesterov
Mathematical Programming 199 (1), 305-341, 2023
Cited by 6 · 2023
Quasi-Newton methods with provable efficiency guarantees
A Rodomanov
PhD thesis, UCL-Université Catholique de Louvain, 2022
Cited by 3 · 2022
Universal Gradient Methods for Stochastic Convex Optimization
A Rodomanov, A Kavis, Y Wu, K Antonakopoulos, V Cevher
arXiv preprint arXiv:2402.03210, 2024
Cited by 2 · 2024
Stabilized proximal-point methods for federated optimization
X Jiang, A Rodomanov, SU Stich
arXiv preprint arXiv:2407.07084, 2024
Cited by 1 · 2024
Universality of AdaGrad Stepsizes for Stochastic Optimization: Inexact Oracle, Acceleration and Variance Reduction
A Rodomanov, X Jiang, S Stich
arXiv preprint arXiv:2406.06398, 2024
Cited by 1 · 2024
Global Complexity Analysis of BFGS
A Rodomanov
arXiv preprint arXiv:2404.15051, 2024
Cited by 1 · 2024
Federated Optimization with Doubly Regularized Drift Correction
X Jiang, A Rodomanov, SU Stich
arXiv preprint arXiv:2404.08447, 2024
Cited by 1 · 2024
Non-Convex Stochastic Composite Optimization with Polyak Momentum
Y Gao, A Rodomanov, SU Stich
arXiv preprint arXiv:2403.02967, 2024
Cited by 1 · 2024
Polynomial preconditioning for gradient methods
N Doikov, A Rodomanov
International Conference on Machine Learning, 8162-8187, 2023
Cited by 1 · 2023
Optimizing (L0, L1)-Smooth Functions by Gradient Methods
D Vankov, A Rodomanov, A Nedich, L Sankar, SU Stich
arXiv preprint arXiv:2410.10800, 2024
2024
Gradient Methods for Stochastic Optimization in Relative Scale
Y Nesterov, A Rodomanov
arXiv preprint arXiv:2301.08352, 2023
2023
Articles 1–19