Shyam Upadhyay
Staff Research Scientist, Google
Verified email at google.com · Homepage
Title
Cited by
Year
Gemini: a family of highly capable multimodal models
G Team, R Anil, S Borgeaud, Y Wu, JB Alayrac, J Yu, R Soricut, ...
ArXiv Preprint, 2023
2665* · 2023
Beyond the imitation game: Quantifying and extrapolating the capabilities of language models
A Srivastava, A Rastogi, A Rao, AAM Shoeb, A Abid, A Fisch, AR Brown, ...
ArXiv Preprint, 2022
1366* · 2022
Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context
M Reid, N Savinov, D Teplyashin, D Lepikhin, T Lillicrap, J Alayrac, ...
ArXiv Preprint, 2024
744* · 2024
Looking beyond the surface: A challenge set for reading comprehension over multiple sentences
D Khashabi, S Chaturvedi, M Roth, S Upadhyay, D Roth
NAACL 2018 (Long Paper), 2018
541 · 2018
Attention Interpretability across NLP Tasks
S Vashishth, S Upadhyay, GS Tomar, M Faruqui
ArXiv Preprint, 2019
235* · 2019
Cross-lingual models of word embeddings: An empirical comparison
S Upadhyay, M Faruqui, C Dyer, D Roth
ACL 2016 (Long Paper), 2016
222 · 2016
Tableformer: Robust transformer modeling for table-text encoding
J Yang, A Gupta, S Upadhyay, L He, R Goel, S Paul
ACL 2022 (Long Paper), 2022
108 · 2022
Annotating derivations: A new evaluation strategy and dataset for algebra word problems
S Upadhyay, MW Chang
EACL 2017 (Long Paper), 2017
70 · 2017
TIMEDIAL: Temporal commonsense reasoning in dialog
L Qin, A Gupta, S Upadhyay, L He, Y Choi, M Faruqui
ACL 2021 (Long Paper), 2021
63 · 2021
Learning from explicit and implicit supervision jointly for algebra word problems
S Upadhyay, MW Chang, KW Chang, W Yih
EMNLP 2016 (Long Paper), 2016
60 · 2016
Joint multilingual supervision for cross-lingual entity linking
S Upadhyay, N Gupta, D Roth
EMNLP 2018 (Long Paper), 2018
51 · 2018
How FaR Are Large Language Models From Agents with Theory-of-Mind?
P Zhou, A Madaan, SP Potharaju, A Gupta, KR McKee, A Holtzman, ...
ArXiv Preprint, 2023
50 · 2023
Gemini: a family of highly capable multimodal models. arXiv.org
G Team, R Anil, S Borgeaud, J Alayrac, J Yu, R Soricut, J Schalkwyk, ...
44* · 2023
Equation parsing: Mapping sentences to grounded equations
S Roy, S Upadhyay, D Roth
EMNLP 2016 (Long Paper), 2016
39 · 2016
Cogcompnlp: Your swiss army knife for nlp
D Khashabi, M Sammons, B Zhou, T Redman, C Christodoulopoulos, ...
LREC 2018, 2018
38* · 2018
Beyond bilingual: Multi-sense word embeddings using multilingual context
S Upadhyay, KW Chang, M Taddy, A Kalai, J Zou
Repl4NLP Workshop 2017 (Best Paper Award), 2017
33 · 2017
Disfl-QA: A benchmark dataset for understanding disfluencies in question answering
A Gupta, J Xu, S Upadhyay, D Yang, M Faruqui
Findings of ACL-IJCNLP 2021, 2021
31 · 2021
(Almost) zero-shot cross-lingual spoken language understanding
S Upadhyay, M Faruqui, G Tür, HT Dilek, L Heck
ICASSP 2018, 2018
29 · 2018
Robust cross-lingual hypernymy detection using dependency context
S Upadhyay, Y Vyas, M Carpuat, D Roth
NAACL 2018 (Long Paper), 2018
26 · 2018
Cross-Lingual Dataless Classification for Many Languages.
Y Song, S Upadhyay, H Peng, D Roth
IJCAI 2016, 2016
24 · 2016
Articles 1–20