Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies
Paper summary
An investigation of how well LSTMs capture long-distance syntactic dependencies. The task is to predict the number (singular or plural) of a verb when its subject noun is separated from it by varying numbers of intervening distractor nouns. They find that an LSTM trained explicitly on this task handles even most of the difficult cases, whereas a regular language model is more easily misled by the distractors.
https://i.imgur.com/0kYhawn.png
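As a rough sketch of the task setup (helper names here are illustrative, not from the paper; the authors' actual corpus construction differs in detail), each training example is a sentence prefix up to the verb, labeled with the verb's grammatical number, and difficulty is measured by the count of intervening nouns whose number differs from the subject's:

```python
# Hypothetical sketch of the number-agreement prediction task.
# Each example: (tokens_before_verb, label), where the label is the
# number the upcoming verb must take ("sg" or "pl").

def make_example(prefix_tokens, subject_number):
    """Pair a sentence prefix with the number its verb must agree with."""
    return (prefix_tokens, subject_number)

def count_attractors(intervening_noun_numbers, subject_number):
    """Count intervening nouns whose number differs from the subject's;
    more such distractors make the agreement prediction harder."""
    return sum(1 for n in intervening_noun_numbers if n != subject_number)

# "The keys to the cabinet [are]": subject "keys" is plural, and the
# intervening noun "cabinet" is singular, so there is one distractor.
tokens, label = make_example(["the", "keys", "to", "the", "cabinet"], "pl")
print(label)                          # "pl"
print(count_attractors(["sg"], "pl"))  # 1
```

A model is then scored on whether it predicts the correct label from the prefix alone, with accuracy broken down by the distractor count.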
Linzen, Tal and Dupoux, Emmanuel and Goldberg, Yoav
TACL - 2016 via Local Bibsonomy


Summary by Marek Rei

