Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation
Paper summary TLDR; The authors train a multilingual Neural Machine Translation (NMT) system based on the Google NMT architecture by prepending a special `2[lang]` token (e.g. `2fr`) to the input sequence to specify the target language. They empirically evaluate model performance on many-to-one, one-to-many, and many-to-many translation tasks, and demonstrate evidence for shared representations (an interlingua).
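The key mechanism is purely a preprocessing step: the model architecture is unchanged, and the desired target language is signaled by an artificial token prepended to the source sentence. A minimal sketch of that step, assuming the `2[lang]` token spelling described in the summary (the function name `add_target_token` is illustrative, not from the paper):

```python
def add_target_token(source_sentence: str, target_lang: str) -> str:
    """Prepend the artificial '2[lang]' token that tells the shared
    multilingual model which language to translate into.

    Hypothetical helper illustrating the paper's preprocessing idea;
    the exact token spelling may differ in the actual implementation.
    """
    return f"2{target_lang} {source_sentence}"

# The same English input can be routed to different target languages:
print(add_target_token("Hello, how are you?", "fr"))  # → 2fr Hello, how are you?
print(add_target_token("Hello, how are you?", "es"))  # → 2es Hello, how are you?
```

Because the target language is encoded in the data rather than the model, a single shared encoder–decoder can serve every language pair it was trained on, and can even attempt pairs never seen together during training (zero-shot translation).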
Melvin Johnson, Mike Schuster, Quoc V. Le, Maxim Krikun, Yonghui Wu, Zhifeng Chen, Nikhil Thorat, Fernanda Viégas, Martin Wattenberg, Greg Corrado, Macduff Hughes, Jeffrey Dean
arXiv e-Print archive, 2016
Keywords: cs.CL, cs.AI