Greedy quasi-Newton methods with explicit superlinear convergence. A Rodomanov, Y Nesterov. SIAM Journal on Optimization 31 (1), 785-811, 2021. Cited by 65.
Rates of superlinear convergence for classical quasi-Newton methods. A Rodomanov, Y Nesterov. Mathematical Programming, 1-32, 2022. Cited by 58.
Putting MRFs on a tensor train. A Novikov, A Rodomanov, A Osokin, D Vetrov. International Conference on Machine Learning, 811-819, 2014. Cited by 56.
New Results on Superlinear Convergence of Classical Quasi-Newton Methods. A Rodomanov, Y Nesterov. Journal of Optimization Theory and Applications 188, 744-769, 2021. Cited by 54.
A superlinearly-convergent proximal Newton-type method for the optimization of finite sums. A Rodomanov, D Kropotov. International Conference on Machine Learning, 2597-2605, 2016. Cited by 48.
Primal-dual method for searching equilibrium in hierarchical congestion population games. P Dvurechensky, A Gasnikov, E Gasnikova, S Matsievsky, A Rodomanov, et al. arXiv preprint arXiv:1606.08988, 2016. Cited by 37.
A randomized coordinate descent method with volume sampling. A Rodomanov, D Kropotov. SIAM Journal on Optimization 30 (3), 1878-1904, 2020. Cited by 14.
Smoothness parameter of power of Euclidean norm. A Rodomanov, Y Nesterov. Journal of Optimization Theory and Applications 185, 303-326, 2020. Cited by 11.
Subgradient ellipsoid method for nonsmooth convex problems. A Rodomanov, Y Nesterov. Mathematical Programming 199 (1), 305-341, 2023. Cited by 6.
Federated Optimization with Doubly Regularized Drift Correction. X Jiang, A Rodomanov, SU Stich. arXiv preprint arXiv:2404.08447, 2024. Cited by 3.
Quasi-Newton methods with provable efficiency guarantees. A Rodomanov. PhD thesis, UCL-Université Catholique de Louvain, 2022. Cited by 3.
Universality of AdaGrad Stepsizes for Stochastic Optimization: Inexact Oracle, Acceleration and Variance Reduction. A Rodomanov, X Jiang, S Stich. arXiv preprint arXiv:2406.06398, 2024. Cited by 2.
Universal Gradient Methods for Stochastic Convex Optimization. A Rodomanov, A Kavis, Y Wu, K Antonakopoulos, V Cevher. arXiv preprint arXiv:2402.03210, 2024. Cited by 2.
Polynomial preconditioning for gradient methods. N Doikov, A Rodomanov. International Conference on Machine Learning, 8162-8187, 2023. Cited by 2.
Optimizing (L0, L1)-Smooth Functions by Gradient Methods. D Vankov, A Rodomanov, A Nedich, L Sankar, SU Stich. arXiv preprint arXiv:2410.10800, 2024. Cited by 1.
Stabilized proximal-point methods for federated optimization. X Jiang, A Rodomanov, SU Stich. arXiv preprint arXiv:2407.07084, 2024. Cited by 1.
Global Complexity Analysis of BFGS. A Rodomanov. arXiv preprint arXiv:2404.15051, 2024. Cited by 1.
Non-Convex Stochastic Composite Optimization with Polyak Momentum. Y Gao, A Rodomanov, SU Stich. arXiv preprint arXiv:2403.02967, 2024. Cited by 1.
Gradient Methods for Stochastic Optimization in Relative Scale. Y Nesterov, A Rodomanov. arXiv preprint arXiv:2301.08352, 2023.