A Neural Attention Model for Abstractive Sentence Summarization
Paper summary

TLDR; The authors apply a neural seq2seq model to sentence summarization. The model uses an attention mechanism (soft alignment); a small sketch of this component follows the list below.

#### Key Points

- Summaries are generated at the sentence level, not the paragraph level.
- Summaries have a fixed-length output.
- Decoding uses a beam search decoder.
- Extractive tuning of the scoring function encourages the model to take words from the input sequence.
- Training data: headline + first-sentence pairs.
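The soft-alignment idea can be made concrete with a short sketch. Below is a minimal NumPy rendition of an attention-based encoder in the spirit of the paper: the current decoder context induces a distribution over the input words, and the encoder returns the alignment-weighted average of their embeddings. The function and variable names, the shapes, and the omission of the paper's local smoothing window are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_encoder(x_emb, y_ctx_emb, P):
    """Soft-alignment ("attention") encoder sketch.

    x_emb     : (M, d)  embedded input sentence of M words
    y_ctx_emb : (k,)    concatenated embeddings of the C previous
                        output words (the decoder context)
    P         : (d, k)  learned alignment matrix (hypothetical name)

    Returns a (d,) context vector: the input embeddings averaged
    under the soft alignment induced by the current context.
    """
    scores = x_emb @ (P @ y_ctx_emb)  # (M,) alignment score per input word
    p = softmax(scores)               # soft alignment over input positions
    return p @ x_emb                  # attention-weighted average

# Toy usage with random embeddings, purely for illustration.
rng = np.random.default_rng(0)
M, d, C = 10, 8, 3
x_emb = rng.normal(size=(M, d))
y_ctx_emb = rng.normal(size=(C * d,))
P = rng.normal(size=(d, C * d))
ctx = attention_encoder(x_emb, y_ctx_emb, P)  # (8,) context vector
```

At each decoding step, a vector like `ctx` would be combined with the context embedding to score the next output word; re-running the encoder per step is what makes the alignment "soft" rather than a hard extractive choice.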
Reference: Rush, Alexander M.; Chopra, Sumit; Weston, Jason. "A Neural Attention Model for Abstractive Sentence Summarization." arXiv e-Print archive, 2015.