Anton Rodomanov
Verified email at uclouvain.be
Putting MRFs on a tensor train
A Novikov, A Rodomanov, A Osokin, D Vetrov
International Conference on Machine Learning, 811-819, 2014
Cited by 22
A superlinearly-convergent proximal Newton-type method for the optimization of finite sums
A Rodomanov, D Kropotov
International Conference on Machine Learning, 2597-2605, 2016
Cited by 21
Primal-dual method for searching equilibrium in hierarchical congestion population games
P Dvurechensky, A Gasnikov, E Gasnikova, S Matsievsky, A Rodomanov, ...
arXiv preprint arXiv:1606.08988, 2016
Cited by 18
Greedy quasi-Newton methods with explicit superlinear convergence
A Rodomanov, Y Nesterov
arXiv preprint arXiv:2002.00657, 2020
Cited by 3
A randomized coordinate descent method with volume sampling
A Rodomanov, D Kropotov
SIAM Journal on Optimization 30 (3), 1878-1904, 2020
Cited by 3
Rates of superlinear convergence for classical quasi-Newton methods
A Rodomanov, Y Nesterov
arXiv preprint arXiv:2003.09174, 2020
Cited by 2
Smoothness Parameter of Power of Euclidean Norm
A Rodomanov, Y Nesterov
Journal of Optimization Theory and Applications 185 (2), 303-326, 2020
Cited by 1
New results on superlinear convergence of classical quasi-Newton methods
A Rodomanov, Y Nesterov
arXiv preprint arXiv:2004.14866, 2020
Linear Coupling of Gradient and Mirror Descent: Version for Composite Functions with Adaptive Estimation of the Lipschitz Constant
A Rodomanov
2016
A Newton-type Incremental Method with a Superlinear Convergence Rate
A Rodomanov, D Kropotov