Max Ryabinin
Yandex, HSE University
Verified email at yandex-team.ru
Title
Cited by
Year
Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts
M Ryabinin, A Gusev
Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 3659–3672, 2020
Cited by 13 · 2020
Embedding Words in Non-Vector Space with Unsupervised Graph Learning
M Ryabinin, S Popov, L Prokhorenkova, E Voita
Empirical Methods in Natural Language Processing (EMNLP 2020), 7317–7331, 2020
Cited by 8 · 2020
Distributed Deep Learning in Open Collaborations
M Diskin*, A Bukhtiyarov*, M Ryabinin*, L Saulnier, Q Lhoest, A Sinitsin, ...
Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021
Cited by 5 · 2021
Scaling Ensemble Distribution Distillation to Many Classes With Proxy Targets
M Ryabinin, A Malinin, M Gales
Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021
Cited by 3 · 2021
Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices
M Ryabinin*, E Gorbunov*, V Plokhotnyuk, G Pekhimenko
Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021
Cited by 3 · 2021
It's All in the Heads: Using Attention Heads as a Baseline for Cross-Lingual Transfer in Commonsense Reasoning
A Tikhonov*, M Ryabinin*
Findings of the ACL 2021, 3534–3546, 2021
Cited by 2 · 2021
Secure Distributed Training at Scale
E Gorbunov, A Borzunov, M Diskin, M Ryabinin
arXiv preprint arXiv:2106.11257, 2021
Cited by 1 · 2021
Adaptive Prediction Time for Sequence Classification
M Ryabinin, E Lobacheva
Cited by 1 · 2018
Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees
A Beznosikov, P Richtárik, M Diskin, M Ryabinin, A Gasnikov
arXiv preprint arXiv:2110.03313, 2021
2021
SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient
M Ryabinin, T Dettmers, M Diskin, A Borzunov
2021
Unsupervised Discovery of Interpretable Latent Manipulations in Language VAEs
M Ryabinin, A Babenko, E Voita
2020