Julia Gusak
Verified email at inria.fr - Homepage
Title · Cited by · Year
Stable Low-rank Tensor Decomposition for Compression of Convolutional Neural Network
AH Phan, K Sobolev, K Sozykin, D Ermilov, J Gusak, P Tichavsky, ...
European Conference on Computer Vision 2020, 2020
172 · 2020
Automated Multi-Stage Compression of Neural Networks
J Gusak, M Kholiavchenko, E Ponomarev, L Markeeva, ...
Proceedings of the IEEE International Conference on Computer Vision …, 2019
85* · 2019
Active subspace of neural networks: Structural analysis and universal attacks
C Cui, K Zhang, T Daulbaev, J Gusak, I Oseledets, Z Zhang
SIAM Journal on Mathematics of Data Science 2 (4), 1096-1122, 2020
39 · 2020
Interpolation technique to speed up gradients propagation in neural odes
T Daulbaev, A Katrutsa, L Markeeva, J Gusak, A Cichocki, I Oseledets
Advances in Neural Information Processing Systems 33, 16689-16700, 2020
30 · 2020
Optimal control and sensitivity analysis for two risk models
E Bulinskaya, J Gusak
Communications in Statistics-Simulation and Computation 45 (5), 1451-1466, 2016
22 · 2016
Discrete-time insurance model with capital injections and reinsurance
E Bulinskaya, J Gusak, A Muromskaya
Methodology and Computing in Applied Probability 17 (4), 899-914, 2015
21 · 2015
Towards Understanding Normalization in Neural ODEs
J Gusak, L Markeeva, T Daulbaev, A Katrutsa, A Cichocki, I Oseledets
International Conference on Learning Representations 2020 Workshop on …, 2020
19 · 2020
Survey on Efficient Training of Large Neural Networks
J Gusak, D Cherniuk, A Shilova, A Katrutsa, D Bershatsky, X Zhao, ...
IJCAI-ECAI, 2022
17* · 2022
Interpolated adjoint method for neural odes
T Daulbaev, A Katrutsa, L Markeeva, J Gusak, A Cichocki, I Oseledets
arXiv preprint arXiv:2003.05271, 2020
15 · 2020
Reduced-order modeling of deep neural networks
J Gusak, T Daulbaev, E Ponomarev, A Cichocki, I Oseledets
Computational Mathematics and Mathematical Physics 61 (5), 774-785, 2021
11 · 2021
Few-bit backward: Quantized gradients of activation functions for memory footprint reduction
GS Novikov, D Bershatsky, J Gusak, A Shonenkov, DV Dimitrov, ...
International Conference on Machine Learning, 26363-26381, 2023
10 · 2023
Automated multi-stage compression of neural networks
J Gusak, M Kholyavchenko, E Ponomarev, L Markeeva, ...
2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW), pp. 2501-2508, 2019
6 · 2019
Insurance Models Under Incomplete Information
E Bulinskaya, J Gusak
Springer Proceedings in Mathematics and Statistics 231, 2018
5 · 2018
Memory-efficient backpropagation through large linear layers
D Bershatsky, A Mikhalev, A Katrutsa, J Gusak, D Merkulov, I Oseledets
arXiv preprint arXiv:2201.13195, 2022
4 · 2022
Efficient GPT Model Pre-training using Tensor Train Matrix Representation
V Chekalina, G Novikov, J Gusak, I Oseledets, A Panchenko
Pacific Asia Conference on Language, Information and Computation, 2023
3 · 2023
Rockmate: an Efficient, Fast, Automatic and Generic Tool for Re-materialization in PyTorch
X Zhao, T Le Hellard, L Eyraud-Dubois, J Gusak, O Beaumont
International Conference on Machine Learning, 2023
2 · 2023
Meta-solver for neural ordinary differential equations
J Gusak, A Katrutsa, T Daulbaev, A Cichocki, I Oseledets
arXiv preprint arXiv:2103.08561, 2021
2 · 2021
OFFMATE: full fine-tuning of LLMs on a single GPU by re-materialization and offloading
X Zhao, L Eyraud-Dubois, T Le Hellard, J Gusak, O Beaumont
2024
HiRemate: Hierarchical Approach for Efficient Re-materialization of Large Neural Networks
J Gusak, X Zhao, T Le Hellard, Z Li, L Eyraud-Dubois, O Beaumont
https://hal.science/hal-04403844, 2024
2024
Quantization Aware Factorization for Deep Neural Network Compression
D Cherniuk, S Abukhovich, AH Phan, I Oseledets, A Cichocki, J Gusak
arXiv preprint arXiv:2308.04595, 2023
2023
Articles 1–20