Filip Hanzely
Unknown affiliation
No verified email address - Homepage
Title | Cited by | Year
Federated learning of a mixture of global and local models
F Hanzely, P Richtárik
arXiv preprint arXiv:2002.05516, 2020
355 | 2020
A field guide to federated optimization
J Wang, Z Charles, Z Xu, G Joshi, HB McMahan, M Al-Shedivat, G Andrew, ...
arXiv preprint arXiv:2107.06917, 2021
300 | 2021
Lower bounds and optimal algorithms for personalized federated learning
F Hanzely, S Hanzely, S Horváth, P Richtárik
NeurIPS 2020, 2020
157 | 2020
A unified theory of SGD: Variance reduction, sampling, quantization and coordinate descent
E Gorbunov, F Hanzely, P Richtárik
International Conference on Artificial Intelligence and Statistics, 680-690, 2020
145 | 2020
Local sgd: Unified theory and new efficient methods
E Gorbunov, F Hanzely, P Richtárik
International Conference on Artificial Intelligence and Statistics, 3556-3564, 2021
98 | 2021
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
F Hanzely, P Richtárik, L Xiao
Computational Optimization and Applications, 2018
83 | 2018
SEGA: Variance reduction via gradient sketching
F Hanzely, K Mishchenko, P Richtárik
NeurIPS 2018, 2018
74 | 2018
Fastest rates for stochastic mirror descent methods
F Hanzely, P Richtárik
Computational Optimization and Applications 79, 717-766, 2021
52 | 2021
Accelerated stochastic matrix inversion: general theory and speeding up BFGS rules for faster second-order optimization
RM Gower, F Hanzely, P Richtárik, S Stich
NeurIPS 2018, 2018
47 | 2018
Stochastic Subspace Cubic Newton Method
F Hanzely, N Doikov, P Richtárik, Y Nesterov
ICML 2020, 2020
44 | 2020
Accelerated coordinate descent with arbitrary sampling and best rates for minibatches
F Hanzely, P Richtárik
AISTATS 2019, 2019
43 | 2019
Personalized federated learning: A unified framework and universal optimization techniques
F Hanzely, B Zhao, M Kolar
arXiv preprint arXiv:2102.09743, 2021
34 | 2021
99% of worker-master communication in distributed optimization is not needed
K Mishchenko, F Hanzely, P Richtárik
Conference on Uncertainty in Artificial Intelligence, 979-988, 2020
33* | 2020
Testing for causality in reconstructed state spaces by an optimized mixed prediction method
A Krakovská, F Hanzely
Physical Review E 94 (5), 052203, 2016
32 | 2016
One method to rule them all: Variance reduction for data, parameters and many new methods
F Hanzely, P Richtárik
arXiv preprint arXiv:1905.11266, 2019
27 | 2019
Privacy preserving randomized gossip algorithms
F Hanzely, J Konečný, N Loizou, P Richtárik, D Grishchenko
arXiv preprint arXiv:1706.07636, 2017
21 | 2017
Smoothness matrices beat smoothness constants: Better communication compression techniques for distributed optimization
M Safaryan, F Hanzely, P Richtárik
Advances in Neural Information Processing Systems 34, 25688-25702, 2021
20 | 2021
Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
F Hanzely, D Kovalev, P Richtárik
ICML 2020, 2020
20 | 2020
A nonconvex projection method for robust PCA
A Dutta, F Hanzely, P Richtárik
Proceedings of the AAAI conference on artificial intelligence 33 (01), 1468-1476, 2019
20 | 2019
A privacy preserving randomized gossip algorithm via controlled noise insertion
F Hanzely, J Konečný, N Loizou, P Richtárik, D Grishchenko
NeurIPS PPML workshop 2018, 2018
8 | 2018