Nikita Doikov
Title · Cited by · Year
Stochastic Subspace Cubic Newton Method
F Hanzely, N Doikov, P Richtárik, Y Nesterov
ICML 2020 (International Conference on Machine Learning), 2020
Cited by: 39 · 2020
Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method
N Doikov, Y Nesterov
Journal of Optimization Theory and Applications, 2021
Cited by: 35 · 2021
Randomized Block Cubic Newton Method
N Doikov, P Richtárik
ICML 2018 (International Conference on Machine Learning), 2018
Cited by: 34 · 2018
Contracting Proximal Methods for Smooth Convex Optimization
N Doikov, Y Nesterov
SIAM Journal on Optimization 30 (4), 2020
Cited by: 27 · 2020
Local convergence of tensor methods
N Doikov, Y Nesterov
Mathematical Programming, 2021
Cited by: 26 · 2021
Gradient regularization of Newton method with Bregman distances
N Doikov, Y Nesterov
Mathematical Programming, 1-25, 2023
Cited by: 20 · 2023
Inexact Tensor Methods with Dynamic Accuracies
N Doikov, Y Nesterov
ICML 2020 (International Conference on Machine Learning), 2020
Cited by: 20 · 2020
Affine-invariant contracting-point methods for Convex Optimization
N Doikov, Y Nesterov
Mathematical Programming, 1-23, 2022
Cited by: 12 · 2022
Super-Universal Regularized Newton Method
N Doikov, K Mishchenko, Y Nesterov
arXiv preprint arXiv:2208.05888, 2022
Cited by: 11 · 2022
Convex optimization based on global lower second-order models
N Doikov, Y Nesterov
NeurIPS 2020 (Advances in Neural Information Processing Systems 33), 2020
Cited by: 8 · 2020
New second-order and tensor methods in Convex Optimization
N Doikov
Université catholique de Louvain, 2021
Cited by: 7 · 2021
Second-order optimization with lazy Hessians
N Doikov, EM Chayti, M Jaggi
ICML 2023 (International Conference on Machine Learning), 2023
Cited by: 4 · 2022
Optimization Methods for Fully Composite Problems
N Doikov, Y Nesterov
arXiv preprint arXiv:2103.12632, 2021
Cited by: 3 · 2021
High-Order Optimization Methods for Fully Composite Problems
N Doikov, Y Nesterov
SIAM Journal on Optimization 32 (3), 2402-2427, 2022
Cited by: 2 · 2022
Multicriteria and multimodal probabilistic topic models of text document collections
KV Vorontsov, AA Potapenko, AI Frey, MA Apishev, NV Doikov, ...
10th International Conference "Intellectualization of Information Processing" (IOI), 198, 2014
Cited by: 2 · 2014
Shuffle SGD is Always Better than SGD: Improved Analysis of SGD with Arbitrary Data Orders
A Koloskova, N Doikov, SU Stich, M Jaggi
arXiv preprint arXiv:2305.19259, 2023
Cited by: 1 · 2023
Lower Complexity Bounds for Minimizing Regularized Functions
N Doikov
arXiv preprint arXiv:2202.04545, 2022
Cited by: 1 · 2022
First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians
N Doikov, GN Grapiglia
arXiv preprint arXiv:2309.02412, 2023
2023
Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method
N Doikov
arXiv preprint arXiv:2308.14742, 2023
2023
Linearization Algorithms for Fully Composite Optimization
ML Vladarean, N Doikov, M Jaggi, N Flammarion
COLT 2023 (Conference on Learning Theory), 2023
2023