Incorporating Copying Mechanism in Sequence-to-Sequence Learning
Paper summary

TLDR: The authors introduce CopyNet, a variation on the seq2seq model that incorporates a "copying mechanism". With this mechanism, the effective vocabulary at each decoding step is the union of the standard vocabulary and the words in the current source sentence. CopyNet predicts each word from a mixture of two modes normalized by a single softmax: a generate mode (the standard attention-based softmax over the vocabulary) and a new copy mode that scores positions in the source sentence. The authors show empirically on toy and summarization tasks that CopyNet behaves as expected: the decoder is dominated by the copy mode when it tries to replicate something from the source.
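The mixture can be made concrete with a small sketch. The NumPy snippet below is my own illustration, not the authors' code: the function name `copynet_mixture`, the weight matrices `W_o` and `W_c`, and all shapes are assumptions, and it omits CopyNet's selective read and state-update machinery. It only shows how generate-mode and copy-mode scores are normalized together by one softmax, and how copy probabilities of repeated source tokens accumulate into a single extended-vocabulary distribution.

```python
import numpy as np

def copynet_mixture(s_t, enc_states, src_ids, W_o, W_c):
    """Sketch of CopyNet's mixed output distribution (assumed shapes).

    s_t        : (d,)          decoder state at step t
    enc_states : (src_len, d)  encoder hidden states h_j
    src_ids    : (src_len,)    extended-vocab id of each source word
    W_o        : (vocab, d)    generate-mode projection
    W_c        : (d, d)        copy-mode projection
    """
    vocab_size = W_o.shape[0]

    # Generate-mode scores: one per vocabulary word, psi_g(v) = W_o[v] . s_t
    gen_scores = W_o @ s_t                         # (vocab,)
    # Copy-mode scores: one per source position, psi_c(x_j) = tanh(h_j W_c) . s_t
    copy_scores = np.tanh(enc_states @ W_c) @ s_t  # (src_len,)

    # Joint softmax over both score sets (shared normalizer)
    all_scores = np.concatenate([gen_scores, copy_scores])
    probs = np.exp(all_scores - all_scores.max())
    probs /= probs.sum()
    p_gen, p_copy = probs[:vocab_size], probs[vocab_size:]

    # A word's total probability is its generate probability plus the copy
    # probability of every source position holding that word, so repeated
    # source words accumulate mass. OOV source words extend the vocabulary.
    extra = max(0, int(src_ids.max()) + 1 - vocab_size)
    p_word = np.concatenate([p_gen, np.zeros(extra)])
    for j, w in enumerate(src_ids):
        p_word[w] += p_copy[j]
    return p_word
```

A toy usage (again with made-up dimensions) to check that the result is a distribution over the extended vocabulary:

```python
rng = np.random.default_rng(0)
d, vocab, src_len = 4, 10, 3
s_t = rng.normal(size=d)
enc = rng.normal(size=(src_len, d))
W_o = rng.normal(size=(vocab, d))
W_c = rng.normal(size=(d, d))
src_ids = np.array([7, 10, 7])   # middle source word is OOV (id 10 >= vocab)
p = copynet_mixture(s_t, enc, src_ids, W_o, W_c)
print(p.shape, p.sum())          # (11,) and ~1.0
```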
Incorporating Copying Mechanism in Sequence-to-Sequence Learning
Jiatao Gu and Zhengdong Lu and Hang Li and Victor O. K. Li
arXiv e-Print archive - 2016 via arXiv
Keywords: cs.CL, cs.AI, cs.LG, cs.NE
