Aleksandr Beznosikov
Verified email at phystech.edu
Title · Cited by · Year
On biased compression for distributed learning
A Beznosikov, S Horváth, P Richtárik, M Safaryan
NeurIPS 2020, Workshop on Scalability, Privacy, and Security in Federated …, 2020
Cited by 48 · 2020
Derivative-free method for composite optimization with applications to decentralized distributed optimization
A Beznosikov, E Gorbunov, A Gasnikov
IFAC-PapersOnLine 53 (2), 4038-4043, 2020
Cited by 20* · 2020
Gradient-free methods with inexact oracle for convex-concave stochastic saddle-point problem
A Beznosikov, A Sadiev, A Gasnikov
International Conference on Mathematical Optimization Theory and Operations …, 2020
Cited by 14* · 2020
Distributed Saddle-Point Problems: Lower Bounds, Optimal and Robust Algorithms
A Beznosikov, V Samokhin, A Gasnikov
arXiv preprint arXiv:2010.13112, 2021
Cited by 10* · 2021
Recent theoretical advances in decentralized distributed convex optimization
E Gorbunov, A Rogozin, A Beznosikov, D Dvinskikh, A Gasnikov
arXiv preprint arXiv:2011.13259, 2020
Cited by 10 · 2020
Decentralized distributed optimization for saddle point problems
A Rogozin, A Beznosikov, D Dvinskikh, D Kovalev, P Dvurechensky, ...
arXiv preprint arXiv:2102.07758, 2021
Cited by 6 · 2021
Zeroth-order algorithms for smooth saddle-point problems
A Sadiev, A Beznosikov, P Dvurechensky, A Gasnikov
International Conference on Mathematical Optimization Theory and Operations …, 2021
Cited by 5 · 2021
Decentralized Local Stochastic Extra-Gradient for Variational Inequalities
A Beznosikov, P Dvurechensky, A Koloskova, V Samokhin, SU Stich, ...
arXiv preprint arXiv:2106.08315, 2021
Cited by 5 · 2021
Solving smooth min-min and min-max problems by mixed oracle algorithms
E Gladin, A Sadiev, A Gasnikov, P Dvurechensky, A Beznosikov, ...
arXiv preprint arXiv:2103.00434, 2021
Cited by 4 · 2021
Distributed Saddle-Point Problems Under Data Similarity
A Beznosikov, G Scutari, A Rogozin, A Gasnikov
Advances in Neural Information Processing Systems 34, 2021
Cited by 2* · 2021
Decentralized Personalized Federated Min-Max Problems
A Beznosikov, V Sushko, A Sadiev, A Gasnikov
arXiv preprint arXiv:2106.07289, 2021
Cited by 2 · 2021
Linearly Convergent Gradient-Free Methods for Minimization of Parabolic Approximation
A Bazarova, A Beznosikov, A Gasnikov
arXiv preprint arXiv:2009.04906, 2020
Cited by 1* · 2020
Random-reshuffled SARAH does not need a full gradient computations
A Beznosikov, M Takáč
arXiv preprint arXiv:2111.13322, 2021
2021
Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees
A Beznosikov, P Richtárik, M Diskin, M Ryabinin, A Gasnikov
arXiv preprint arXiv:2110.03313, 2021
2021
Near-Optimal Decentralized Algorithms for Saddle Point Problems over Time-Varying Networks
A Beznosikov, A Rogozin, D Kovalev, A Gasnikov
International Conference on Optimization and Applications, 246-257, 2021
2021
One-Point Gradient-Free Methods for Composite Optimization with Applications to Distributed Optimization
I Stepanov, A Voronov, A Beznosikov, A Gasnikov
arXiv preprint arXiv:2107.05951, 2021
2021
One-point gradient-free methods for smooth and non-smooth saddle-point problems
A Beznosikov, V Novitskii, A Gasnikov
International Conference on Mathematical Optimization Theory and Operations …, 2021
2021
Decentralized Personalized Federated Learning: Lower Bounds and Optimal Algorithm for All Personalization Modes
A Sadiev, E Borodich, A Beznosikov, D Dvinskikh, A Gasnikov