Mihaela Rosca
DeepMind
Verified email at google.com
Monte Carlo gradient estimation in machine learning
S Mohamed, M Rosca, M Figurnov, A Mnih
Journal of Machine Learning Research 21 (132), 1-62, 2020
Cited by 407 · 2020
Variational approaches for auto-encoding generative adversarial networks
M Rosca, B Lakshminarayanan, D Warde-Farley, S Mohamed
arXiv preprint arXiv:1706.04987, 2017
Cited by 311 · 2017
Many paths to equilibrium: GANs do not need to decrease a divergence at every step
W Fedus, M Rosca, B Lakshminarayanan, AM Dai, S Mohamed, ...
arXiv preprint arXiv:1710.08446, 2017
Cited by 245 · 2017
Deep compressed sensing
Y Wu, M Rosca, T Lillicrap
International Conference on Machine Learning, 6850-6860, 2019
Cited by 175 · 2019
Distribution matching in variational inference
M Rosca, B Lakshminarayanan, S Mohamed
arXiv preprint arXiv:1802.06847, 2018
Cited by 103 · 2018
Training language GANs from scratch
C de Masson d'Autume, S Mohamed, M Rosca, J Rae
Advances in Neural Information Processing Systems 32, 2019
Cited by 82 · 2019
Sequence-to-sequence neural network models for transliteration
M Rosca, T Breuel
arXiv preprint arXiv:1610.09565, 2016
Cited by 64 · 2016
Optax: composable gradient transformation and optimisation
M Hessel, D Budden, F Viola, M Rosca, E Sezener, T Hennigan
JAX, http://github.com/deepmind/optax (last access: 4 July 2023), version 0.0.1, 2020
Cited by 47 · 2020
Spectral normalisation for deep reinforcement learning: an optimisation perspective
F Gogianu, T Berariu, MC Rosca, C Clopath, L Busoniu, R Pascanu
International Conference on Machine Learning, 3734-3744, 2021
Cited by 43 · 2021
A case for new neural network smoothness constraints
M Rosca, T Weber, A Gretton, S Mohamed
PMLR, 2020
Cited by 37 · 2020
Learning implicit generative models with the method of learned moments
S Ravuri, S Mohamed, M Rosca, O Vinyals
International Conference on Machine Learning, 4314-4323, 2018
Cited by 30 · 2018
Optax: composable gradient transformation and optimisation, in JAX!, 2020
M Hessel, D Budden, F Viola, M Rosca, E Sezener, T Hennigan
URL http://github.com/deepmind/optax 16, 2010
Cited by 28 · 2010
Why neural networks find simple solutions: The many regularizers of geometric complexity
B Dherin, M Munn, M Rosca, D Barrett
Advances in Neural Information Processing Systems 35, 2333-2349, 2022
Cited by 19 · 2022
Optax: composable gradient transformation and optimisation, in JAX
M Hessel, D Budden, F Viola, M Rosca, E Sezener, T Hennigan
GitHub, http://github.com/google/jax, 2020
Cited by 16 · 2020
Discretization drift in two-player games
MC Rosca, Y Wu, B Dherin, D Barrett
International Conference on Machine Learning, 9064-9074, 2021
Cited by 13 · 2021
On a continuous time model of gradient descent dynamics and instability in deep learning
M Rosca, Y Wu, C Qin, B Dherin
arXiv preprint arXiv:2302.01952, 2023
Cited by 4 · 2023
Compressed sensing using neural networks
Y Wu, TP Lillicrap, M Rosca
US Patent App. 16/818,895, 2020
Cited by 4 · 2020
Measure-valued derivatives for approximate Bayesian inference
M Rosca, M Figurnov, S Mohamed, A Mnih
NeurIPS Workshop on Approximate Bayesian Inference, 2019
Cited by 4 · 2019
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
M Reid, N Savinov, D Teplyashin, D Lepikhin, T Lillicrap, J Alayrac, ...
arXiv preprint arXiv:2403.05530, 2024
Cited by 3 · 2024
Implicit regularisation in stochastic gradient descent: from single-objective to two-player games
M Rosca, MP Deisenroth
arXiv preprint arXiv:2307.05789, 2023
Cited by 3 · 2023
Articles 1–20
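The Optax entries above refer to the DeepMind library of composable gradient transformations for JAX. For readers unfamiliar with it, the following is a minimal usage sketch, assuming the publicly documented optax API; the toy quadratic loss and the Adam hyperparameters are illustrative choices, not taken from the cited papers.

```python
# Minimal Optax training loop: an optimiser is a composable gradient
# transformation with init/update functions, applied to JAX gradients.
import jax
import jax.numpy as jnp
import optax

def loss_fn(params):
    # Toy quadratic loss, minimised at params == 3.0 (illustrative only).
    return jnp.sum((params - 3.0) ** 2)

params = jnp.zeros(4)
optimiser = optax.adam(learning_rate=1e-1)  # a composable gradient transformation
opt_state = optimiser.init(params)

for _ in range(100):
    grads = jax.grad(loss_fn)(params)                              # JAX gradients
    updates, opt_state = optimiser.update(grads, opt_state, params)  # transform them
    params = optax.apply_updates(params, updates)                  # apply the step

print(params)  # close to [3. 3. 3. 3.]
```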