
Academic Seminar 508: Unsupervised Machine Translation

Published: 2021/06/28  周时强

Speaker: Rui Wang (王瑞), Associate Professor

Affiliation: Department of Computer Science, Shanghai Jiao Tong University

Time: July 1, 2021 (Thursday), 15:00-16:00

Venue: Room 402, Computer Building, Baoshan Campus

Host: Rui Wang

Abstract:

Machine translation (MT) is a classic topic in NLP and AI, with a history dating back to the 1940s. Starting from the 1990s, when abundant parallel corpora became available, statistical machine translation (SMT), which combines a series of linguistically motivated models, was groundbreaking. From the 2010s, when abundant computational resources became available, neural machine translation (NMT) reached state-of-the-art performance. Recently, unsupervised NMT (UNMT), which relies only on monolingual corpora, has achieved impressive results.

Meanwhile, several challenges remain for UNMT. This tutorial first introduces the background and the latest progress of UNMT. We then examine a number of challenges to UNMT and give empirical results on how well the technology currently holds up.

Speaker Bio:

Dr. Rui Wang is a computational linguist who has been an associate professor at Shanghai Jiao Tong University since 2021. Before that, he was a researcher (tenured in 2020) at the National Institute of Information and Communications Technology (NICT), Japan, from 2016 to 2020. His research focuses on NLP, especially machine translation (MT). He has published more than 40 papers in top-tier NLP/ML/AI conferences and journals, such as ACL, EMNLP, ICLR, AAAI, IJCAI, TPAMI, and TASLP. He has also won several first places in top-tier MT/NLP shared tasks, such as WMT-2018, WMT-2019, WMT-2020, and CoNLL-2019. He served as an area chair for ICLR-2021/2022, NAACL-2021, and CoNLL-2021, and gave cutting-edge tutorials at EACL-2021 and EMNLP-2021.

Organizer: School of Computer Engineering and Science, Shanghai University


