Language is not all you need: Aligning perception with language models. S Huang, L Dong, W Wang, Y Hao, S Singhal, S Ma, T Lv, L Cui, et al. Advances in Neural Information Processing Systems 36, 72096-72109, 2023. Cited by 470.
InfoXLM: An information-theoretic framework for cross-lingual language model pre-training. Z Chi, L Dong, F Wei, N Yang, S Singhal, W Wang, X Song, XL Mao, et al. arXiv preprint arXiv:2007.07834, 2020. Cited by 351.
Cross-lingual natural language generation via pre-training. Z Chi, L Dong, F Wei, W Wang, XL Mao, H Huang. Proceedings of the AAAI Conference on Artificial Intelligence 34 (05), 7570-7577, 2020. Cited by 148.
Optimizing prompts for text-to-image generation. Y Hao, Z Chi, L Dong, F Wei. Advances in Neural Information Processing Systems 36, 2024. Cited by 145.
XLM-E: Cross-lingual language model pre-training via ELECTRA. Z Chi. arXiv preprint arXiv:2106.16138, 2021. Cited by 133.
Complicated table structure recognition. Z Chi, H Huang, HD Xu, H Yu, W Yin, XL Mao. arXiv preprint arXiv:1908.04729, 2019. Cited by 126.
Language models are general-purpose interfaces. Y Hao, H Song, L Dong, S Huang, Z Chi, W Wang, S Ma, F Wei. arXiv preprint arXiv:2206.06336, 2022. Cited by 109.
mT6: Multilingual pretrained text-to-text transformer with translation pairs. Z Chi. arXiv preprint arXiv:2104.08692, 2021. Cited by 84.
On the representation collapse of sparse mixture of experts. Z Chi, L Dong, S Huang, D Dai, S Ma, B Patra, S Singhal, P Bajaj, X Song, et al. Advances in Neural Information Processing Systems 35, 34600-34613, 2022. Cited by 77.
Improving pretrained cross-lingual language models via self-labeled word alignment. Z Chi, L Dong, B Zheng, S Huang, XL Mao, H Huang, F Wei. arXiv preprint arXiv:2106.06381, 2021. Cited by 71.
Food recommendation with graph convolutional network. X Gao, F Feng, H Huang, XL Mao, T Lan, Z Chi. Information Sciences 584, 170-183, 2022. Cited by 62.
Language is not all you need: Aligning perception with language models. S Huang, L Dong, W Wang, Y Hao, S Singhal, S Ma, T Lv, L Cui, et al. arXiv preprint arXiv:2302.14045, 2023. Cited by 49.
Consistency regularization for cross-lingual fine-tuning. B Zheng, L Dong, S Huang, W Wang, Z Chi, S Singhal, W Che, T Liu, et al. arXiv preprint arXiv:2106.08226, 2021. Cited by 47.
XLM-T: Scaling up multilingual machine translation with pretrained cross-lingual transformer encoders. S Ma, J Yang, H Huang, Z Chi, L Dong, D Zhang, HH Awadalla, A Muzio, et al. arXiv preprint arXiv:2012.15547, 2020. Cited by 29.
Beyond English-centric bitexts for better multilingual language representation learning. B Patra, S Singhal, S Huang, Z Chi, L Dong, F Wei, V Chaudhary, X Song. arXiv preprint arXiv:2210.14867, 2022. Cited by 19.
TorchScale: Transformers at scale. S Ma, H Wang, S Huang, W Wang, Z Chi, L Dong, A Benhaim, B Patra, et al. arXiv preprint arXiv:2211.13184, 2022. Cited by 16.
Can monolingual pretrained models help cross-lingual classification? Z Chi, L Dong, F Wei, XL Mao, H Huang. arXiv preprint arXiv:1911.03913, 2019. Cited by 14.
A robust and domain-adaptive approach for low-resource named entity recognition. H Yu, XL Mao, Z Chi, W Wei, H Huang. 2020 IEEE International Conference on Knowledge Graph (ICKG), 297-304, 2020. Cited by 13.
ProtLLM: An interleaved protein-language LLM with protein-as-word pre-training. L Zhuo, Z Chi, M Xu, H Huang, H Zheng, C He, XL Mao, W Zhang. arXiv preprint arXiv:2403.07920, 2024. Cited by 12.