Semi-supervised sequence tagging with bidirectional language models
Paper summary

The paper proposes integrating a pre-trained language model into a sequence labeling model. The baseline tagger is a two-layer bidirectional LSTM/GRU. The hidden states of pre-trained forward and backward language models are concatenated onto the output of the first LSTM layer before it is passed to the second layer. This yields improvements on NER and chunking tasks.

Figure: https://i.imgur.com/Hso3mL9.png
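The key step is simple to express in code. Below is a minimal PyTorch sketch of the idea, not the authors' implementation: the name `TagLMSketch`, the `lm_embeddings` argument, and all dimensions are illustrative assumptions. The paper's full model also includes character-level embeddings and a CRF output layer, which are omitted here.

```python
import torch
import torch.nn as nn

class TagLMSketch(nn.Module):
    """Two-layer bidirectional LSTM tagger whose first-layer outputs are
    concatenated with frozen, pre-trained LM states before the second layer.
    All dimensions are illustrative, not the paper's."""

    def __init__(self, vocab_size, emb_dim=100, hidden_dim=100,
                 lm_dim=512, num_tags=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # First bi-LSTM layer over token embeddings.
        self.lstm1 = nn.LSTM(emb_dim, hidden_dim,
                             bidirectional=True, batch_first=True)
        # Second bi-LSTM layer sees [first-layer output ; LM embedding].
        self.lstm2 = nn.LSTM(2 * hidden_dim + lm_dim, hidden_dim,
                             bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids, lm_embeddings):
        # lm_embeddings: (batch, seq_len, lm_dim) hidden states from a
        # pre-trained bidirectional LM, computed separately and kept frozen.
        h1, _ = self.lstm1(self.embed(token_ids))
        h1 = torch.cat([h1, lm_embeddings], dim=-1)  # the key concatenation
        h2, _ = self.lstm2(h1)
        return self.proj(h2)                         # per-token tag scores
```

A quick shape check with random stand-ins for the LM states:

```python
model = TagLMSketch(vocab_size=20000)
tokens = torch.randint(0, 20000, (2, 7))  # batch of 2 sentences, length 7
lm_emb = torch.randn(2, 7, 512)           # stand-in for frozen LM states
scores = model(tokens, lm_emb)            # -> (2, 7, num_tags)
```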
Peters, Matthew E.; Ammar, Waleed; Bhagavatula, Chandra; Power, Russell. Association for Computational Linguistics, 2017.


Summary by Marek Rei