Title:
Massively Multilingual Neural Machine Translation at Google (please note that the seminar will be in English) - George FOSTER
Abstract:
In this talk I will describe recent work by Google Translate aimed at building a universal neural machine translation (NMT) system capable of translating between any pair of languages. Our current milestone on the road to this goal is a single massively multilingual NMT model handling 103 languages, trained on over 25 billion sentence pairs. Our system demonstrates effective transfer learning, significantly improving translation quality for low-resource languages while keeping high-resource translation quality on par with competitive bilingual baselines. I will discuss various aspects of model design that are crucial to achieving quality and practicality in this setting. I will also present empirical analyses that identify the remaining weaknesses in our system and suggest directions for future research.

Biography:
George Foster is a Research Scientist at Google Montreal, working on machine translation. Before joining Google, he was a researcher at the National Research Council of Canada from 2004 to 2014, where he co-founded and co-led the Portage Machine Translation project. George holds a PhD in Computer Science from the University of Montreal. He is currently an Action Editor for the TACL journal and serves on the board of the Machine Translation journal. He is a past member of the editorial board of the journal Computational Linguistics (2010-2012) and a past President of the Association for Machine Translation in the Americas (2014-2016).