Hua Wu
Baidu NLP
Verified email at baidu.com - Homepage
Title
Cited by
Year
Ernie: Enhanced representation through knowledge integration
Y Sun, S Wang, Y Li, S Feng, X Chen, H Zhang, X Tian, D Zhu, H Tian, ...
arXiv preprint arXiv:1904.09223, 2019
Cited by 1169 · 2019
Ernie 2.0: A continual pre-training framework for language understanding
Y Sun, S Wang, Y Li, S Feng, H Tian, H Wu, H Wang
Proceedings of the AAAI conference on artificial intelligence 34 (05), 8968-8975, 2020
Cited by 921 · 2020
Multi-task learning for multiple language translation
D Dong, H Wu, W He, D Yu, H Wang
Proceedings of the 53rd Annual Meeting of the Association for Computational …, 2015
Cited by 711 · 2015
RocketQA: An optimized training approach to dense passage retrieval for open-domain question answering
Y Qu, Y Ding, J Liu, K Liu, R Ren, WX Zhao, D Dong, H Wu, H Wang
arXiv preprint arXiv:2010.08191, 2020
Cited by 568 · 2020
Minimum risk training for neural machine translation
S Shen, Y Cheng, Z He, W He, H Wu, M Sun, Y Liu
arXiv preprint arXiv:1512.02433, 2015
Cited by 515 · 2015
Ernie 3.0: Large-scale knowledge enhanced pre-training for language understanding and generation
Y Sun, S Wang, S Feng, S Ding, C Pang, J Shang, J Liu, X Chen, Y Zhao, ...
arXiv preprint arXiv:2107.02137, 2021
Cited by 467 · 2021
An end-to-end model for question answering over knowledge base with cross-attention combining global knowledge
Y Hao, Y Zhang, K Liu, S He, Z Liu, H Wu, J Zhao
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 443 · 2017
Multi-turn response selection for chatbots with deep attention matching network
X Zhou, L Li, D Dong, Y Liu, Y Chen, WX Zhao, D Yu, H Wu
Proceedings of the 56th Annual Meeting of the Association for Computational …, 2018
Cited by 403 · 2018
Learning to respond with deep neural networks for retrieval-based human-computer conversation system
R Yan, Y Song, H Wu
Proceedings of the 39th International ACM SIGIR conference on Research and …, 2016
Cited by 401 · 2016
Ernie-vil: Knowledge enhanced vision-language representations through scene graphs
F Yu, J Tang, W Yin, Y Sun, H Tian, H Wu, H Wang
Proceedings of the AAAI conference on artificial intelligence 35 (4), 3208-3216, 2021
Cited by 399 · 2021
Geometry-enhanced molecular representation learning for property prediction
X Fang, L Liu, J Lei, D He, S Zhang, J Zhou, F Wang, H Wu, H Wang
Nature Machine Intelligence 4 (2), 127-134, 2022
Cited by 398 · 2022
Unimo: Towards unified-modal understanding and generation via cross-modal contrastive learning
W Li, C Gao, G Niu, X Xiao, H Liu, J Liu, H Wu, H Wang
arXiv preprint arXiv:2012.15409, 2020
Cited by 398 · 2020
Unified structure generation for universal information extraction
Y Lu, Q Liu, D Dai, X Xiao, H Lin, X Han, L Sun, H Wu
arXiv preprint arXiv:2203.12277, 2022
Cited by 397 · 2022
Semi-supervised learning for neural machine translation
Y Cheng
Joint training for neural machine translation, 25-40, 2019
Cited by 321 · 2019
Dureader: a chinese machine reading comprehension dataset from real-world applications
W He, K Liu, J Liu, Y Lyu, S Zhao, X Xiao, Y Liu, Y Wang, H Wu, Q She, ...
arXiv preprint arXiv:1711.05073, 2017
Cited by 317 · 2017
Multi-view response selection for human-computer conversation
X Zhou, D Dong, H Wu, S Zhao, D Yu, H Tian, X Liu, R Yan
Proceedings of the 2016 conference on empirical methods in natural language …, 2016
Cited by 275 · 2016
SKEP: Sentiment knowledge enhanced pre-training for sentiment analysis
H Tian, C Gao, X Xiao, H Liu, B He, H Wu, H Wang, F Wu
arXiv preprint arXiv:2005.05635, 2020
Cited by 274 · 2020
PLATO: Pre-trained dialogue generation model with discrete latent variable
S Bao, H He, F Wang, H Wu, H Wang
arXiv preprint arXiv:1910.07931, 2019
Cited by 271 · 2019
Pivot language approach for phrase-based statistical machine translation
H Wu, H Wang
Machine Translation 21, 165-181, 2007
Cited by 269 · 2007
STACL: Simultaneous translation with implicit anticipation and controllable latency using prefix-to-prefix framework
M Ma, L Huang, H Xiong, R Zheng, K Liu, B Zheng, C Zhang, Z He, H Liu, ...
arXiv preprint arXiv:1810.08398, 2018
Cited by 259 · 2018
Articles 1–20