Sentence Level Recurrent Topic Model: Letting Topics Speak for Themselves
Paper summary TLDR; The authors train an RNN-based topic model that takes word order into account and assumes that words in the same sentence share the same topic. They sample topic mixtures from a Dirichlet distribution and train a "topic embedding" jointly with the rest of the generative LSTM. The model is evaluated quantitatively using perplexity on generated sentences and on classification tasks, where it clearly beats competing models. Qualitative evaluation shows that the model can generate sensible sentences conditioned on a topic.
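To make the generative story concrete, here is a minimal sketch in PyTorch of the idea as summarized above: sample a document-level topic mixture from a Dirichlet, draw one topic per sentence, and condition an LSTM language model on that topic's learned embedding. This is not the authors' implementation; the class name `TopicLSTM`, the layer sizes, and the way the topic vector is concatenated to each word embedding are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above), not the authors' code.
import torch
import torch.nn as nn

class TopicLSTM(nn.Module):
    def __init__(self, vocab_size, n_topics=50, emb_dim=128, hid_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # One learned vector per topic, trained jointly with the LSTM.
        self.topic_emb = nn.Embedding(n_topics, emb_dim)
        self.lstm = nn.LSTM(emb_dim * 2, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tokens, topic_ids):
        # tokens: (batch, seq_len) word ids; topic_ids: (batch,) one topic
        # per sentence -- every word in the sentence sees the same topic.
        w = self.word_emb(tokens)                       # (B, T, E)
        t = self.topic_emb(topic_ids).unsqueeze(1)      # (B, 1, E)
        t = t.expand(-1, tokens.size(1), -1)            # (B, T, E)
        h, _ = self.lstm(torch.cat([w, t], dim=-1))
        return self.out(h)                              # next-word logits

# Generative story for one sentence of one document:
vocab_size, n_topics = 10_000, 50
model = TopicLSTM(vocab_size, n_topics)
theta = torch.distributions.Dirichlet(torch.ones(n_topics)).sample()  # topic mixture
topic = torch.multinomial(theta, num_samples=1)     # one topic for the sentence
tokens = torch.randint(vocab_size, (1, 12))         # placeholder word ids
logits = model(tokens, topic)                       # (1, 12, vocab_size)
```

Sampling the topic once per sentence (rather than per word, as in classic LDA) is what lets the LSTM model word order within a topically coherent unit.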
Tian, Fei and Gao, Bin and He, Di and Liu, Tie-Yan
arXiv e-Print archive, 2016