Sewon Min
Verified email at cs.washington.edu - Homepage
Title
Cited by
Year
Dense Passage Retrieval for Open-Domain Question Answering
V Karpukhin, B Oğuz, S Min, L Wu, S Edunov, D Chen, W Yih
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
2690 · 2020
Rethinking the role of demonstrations: What makes in-context learning work?
S Min, X Lyu, A Holtzman, M Artetxe, M Lewis, H Hajishirzi, L Zettlemoyer
arXiv preprint arXiv:2202.12837, 2022
848 · 2022
UnifiedQA: Crossing format boundaries with a single QA system
D Khashabi, S Min, T Khot, A Sabharwal, O Tafjord, P Clark, H Hajishirzi
Findings of Empirical Methods in Natural Language Processing (EMNLP), 2020
627* · 2020
MetaICL: Learning to learn in context
S Min, M Lewis, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:2110.15943, 2021
336 · 2021
Measuring and narrowing the compositionality gap in language models
O Press, M Zhang, S Min, L Schmidt, NA Smith, M Lewis
arXiv preprint arXiv:2210.03350, 2022
290* · 2022
REPLUG: Retrieval-augmented black-box language models
W Shi, S Min, M Yasunaga, M Seo, R James, M Lewis, L Zettlemoyer, ...
arXiv preprint arXiv:2301.12652, 2023
268* · 2023
Multi-hop Reading Comprehension through Question Decomposition and Rescoring
S Min, V Zhong, L Zettlemoyer, H Hajishirzi
Annual Meeting of the Association for Computational Linguistics (ACL), 2019
231 · 2019
AmbigQA: Answering Ambiguous Open-domain Questions
S Min, J Michael, H Hajishirzi, L Zettlemoyer
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
200 · 2020
FActScore: Fine-grained atomic evaluation of factual precision in long form text generation
S Min, K Krishna, X Lyu, M Lewis, W Yih, PW Koh, M Iyyer, L Zettlemoyer, ...
arXiv preprint arXiv:2305.14251, 2023
184 · 2023
Efficient and Robust Question Answering from Minimal Context over Documents
S Min, V Zhong, R Socher, C Xiong
Annual Meeting of the Association for Computational Linguistics (ACL), 2018
175 · 2018
Noisy channel language model prompting for few-shot text classification
S Min, M Lewis, H Hajishirzi, L Zettlemoyer
arXiv preprint arXiv:2108.04106, 2021
173 · 2021
A Discrete Hard EM Approach for Weakly Supervised Question Answering
S Min, D Chen, H Hajishirzi, L Zettlemoyer
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019
162 · 2019
Compositional Questions Do Not Necessitate Multi-hop Reasoning
S Min, E Wallace, S Singh, M Gardner, H Hajishirzi, L Zettlemoyer
Annual Meeting of the Association for Computational Linguistics (ACL), 2019
145 · 2019
Question Answering through Transfer Learning from Large Fine-grained Supervision Data
S Min, M Seo, H Hajishirzi
Annual Meeting of the Association for Computational Linguistics (ACL), 2017
137 · 2017
Query-reduction networks for question answering
M Seo, S Min, A Farhadi, H Hajishirzi
International Conference on Learning Representations (ICLR), 2017
132* · 2017
Efficient One-Pass End-to-End Entity Linking for Questions
BZ Li, S Min, S Iyer, Y Mehdad, W Yih
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
124 · 2020
Towards understanding chain-of-thought prompting: An empirical study of what matters
B Wang, S Min, X Deng, J Shen, Y Wu, L Zettlemoyer, H Sun
arXiv preprint arXiv:2212.10001, 2022
113* · 2022
Knowledge guided text retrieval and reading for open domain question answering
S Min, D Chen, L Zettlemoyer, H Hajishirzi
arXiv preprint arXiv:1911.03868, 2019
106 · 2019
Neural Speed Reading via Skim-RNN
M Seo, S Min, A Farhadi, H Hajishirzi
International Conference on Learning Representations (ICLR), 2018
104* · 2018
NeurIPS 2020 EfficientQA Competition: Systems, Analyses and Lessons Learned
S Min, J Boyd-Graber, C Alberti, D Chen, E Choi, M Collins, K Guu, ...
Proceedings of Machine Learning Research, 2021
69 · 2021
Articles 1–20