Dmitry Vilensky (Pasechnyuk)
Inexact model: A framework for optimization and variational inequalities
F Stonyakin, A Tyurin, A Gasnikov, P Dvurechensky, A Agafonov, ...
Optimization Methods and Software 36 (6), 1155-1201, 2021
Gradient methods for problems with inexact model of the objective
FS Stonyakin, D Dvinskikh, P Dvurechensky, A Kroshnin, O Kuznetsova, ...
Mathematical Optimization Theory and Operations Research: 18th International …, 2019
Adaptive catalyst for smooth convex optimization
A Ivanova, D Pasechnyuk, D Grishchenko, E Shulgin, A Gasnikov, ...
International Conference on Optimization and Applications, 20-37, 2021
Accelerated meta-algorithm for convex optimization problems
AV Gasnikov, DM Dvinskikh, PE Dvurechensky, DI Kamzolov, ...
Computational Mathematics and Mathematical Physics 61, 17-28, 2021
Oracle complexity separation in convex optimization
A Ivanova, P Dvurechensky, E Vorontsova, D Pasechnyuk, A Gasnikov, ...
Journal of Optimization Theory and Applications 193 (1-3), 462-490, 2022
Inexact Relative Smoothness and Strong Convexity for Optimization and Variational Inequalities by Inexact Model
F Stonyakin, A Tyurin, A Gasnikov, P Dvurechensky, A Agafonov, ...
arXiv preprint arXiv:2001.09013, 2020
A Damped Newton Method Achieves Global and Local Quadratic Convergence Rate
S Hanzely, D Kamzolov, D Pasechnyuk, A Gasnikov, P Richtárik, M Takáč
Advances in Neural Information Processing Systems 35, 25320-25334, 2022
Solving strongly convex-concave composite saddle point problems with a small dimension of one of the variables
MS Alkousa, AV Gasnikov, EL Gladin, IA Kuruzov, DA Pasechnyuk, ...
Matematicheskii Sbornik 214 (3), 3-53, 2023
One Method for Minimization a Convex Lipschitz-Continuous Function of 2 Variables on a Fixed Square
DA Pasechnyuk, FS Stonyakin
arXiv preprint arXiv:1812.10300, 2018
Gradient-type adaptive methods for relatively Lipschitz convex optimization problems
F Stonyakin, A Titov, M Alkousa, O Savchuk, D Pasechnyuk
arXiv preprint arXiv:2107.05765, 2021
Non-convex optimization in digital pre-distortion of the signal
D Pasechnyuk, A Maslovskiy, A Gasnikov, A Anikin, A Rogozin, A Gornov, ...
arXiv preprint arXiv:2103.10552, 2021
A unified analysis of variational inequality methods: variance reduction, sampling, quantization, and coordinate descent
AN Beznosikov, AV Gasnikov, KE Zainullina, AY Maslovskii, ...
Computational Mathematics and Mathematical Physics 63 (2), 147-174, 2023
Adaptive mirror descent for the network utility maximization problem
A Ivanova, F Stonyakin, D Pasechnyuk, E Vorontsova, A Gasnikov
IFAC-PapersOnLine 53 (2), 7851-7856, 2020
Numerical methods for the resource allocation problem in networks
A Ivanova, D Pasechnyuk, P Dvurechensky, A Gasnikov, E Vorontsova
arXiv preprint arXiv:1909.13321, 2019
Upper bounds on maximum admissible noise in zeroth-order optimisation
DA Pasechnyuk, A Lobanov, A Gasnikov
arXiv preprint arXiv:2306.16371, 2023
Accelerated proximal envelopes: application to componentwise methods
AS Anikin, VV Matyukhin, DA Pasechnyuk
Computational Mathematics and Mathematical Physics 62 (2), 336-345, 2022
Primal-dual gradient methods for searching network equilibria in combined models with nested choice structure and capacity constraints
M Kubentayeva, D Yarmoshik, M Persiianov, A Kroshnin, E Kotliarova, ...
Computational Management Science 21 (1), 15, 2024
Convergence analysis of stochastic gradient descent with adaptive preconditioning for non-convex and convex functions
DA Pasechnyuk, A Gasnikov, M Takáč
arXiv preprint arXiv:2308.14192, 2023
Effects of momentum scaling for SGD
DA Pasechnyuk, A Gasnikov, M Takáč
arXiv preprint arXiv:2210.11869, 2022
Stochastic optimization in digital pre-distortion of the signal
AV Alpatov, EA Peters, DA Pasechnyuk, AM Raigorodskii
arXiv preprint arXiv:2201.12159, 2022