Semi-supervised Multitask Learning for Sequence Labeling
Paper summary

This paper incorporates an unsupervised language modeling objective into the training of a bidirectional LSTM for sequence labeling. While the tagger is being trained, the forward-moving LSTM is also optimised to predict the next word and the backward-moving LSTM to predict the previous word. The auxiliary objective pushes the model to learn a better composition function and improves performance on NER, error detection, chunking and POS tagging, without using any additional data.

Model overview figure: https://i.imgur.com/pXLSsAR.png
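The combined cost weights the two language modeling losses against the main tagging loss, roughly E = E_tag + γ(E_LM_fw + E_LM_bw). Below is a minimal PyTorch sketch of this setup. It is an illustration under my own assumptions, not the released implementation: the module names, dimensions and the `lm_weight` value are placeholders, and the paper's full model additionally passes each LSTM state through a small hidden layer before its LM softmax.

```python
import torch
import torch.nn as nn

class MultitaskTagger(nn.Module):
    """Sketch: BiLSTM tagger with auxiliary language-modeling heads.
    Hyperparameters here are illustrative placeholders."""

    def __init__(self, vocab_size, num_tags, emb_dim=300, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Separate directions so each LM head only sees one-sided context.
        self.fwd_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.bwd_lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Tagging head sees both directions concatenated.
        self.tag_head = nn.Linear(2 * hidden_dim, num_tags)
        # Forward states predict the next word; backward states the previous.
        self.fwd_lm_head = nn.Linear(hidden_dim, vocab_size)
        self.bwd_lm_head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        emb = self.embed(tokens)               # (batch, time, emb_dim)
        h_fwd, _ = self.fwd_lstm(emb)          # left-to-right states
        h_bwd, _ = self.bwd_lstm(emb.flip(1))  # right-to-left states
        h_bwd = h_bwd.flip(1)                  # realign to original order
        tag_logits = self.tag_head(torch.cat([h_fwd, h_bwd], dim=-1))
        return tag_logits, self.fwd_lm_head(h_fwd), self.bwd_lm_head(h_bwd)

def multitask_loss(tag_logits, fwd_lm_logits, bwd_lm_logits,
                   tags, tokens, lm_weight=0.1):
    """Tagging loss plus weighted forward/backward LM losses."""
    ce = nn.CrossEntropyLoss()
    tag_loss = ce(tag_logits.flatten(0, 1), tags.flatten())
    # Forward LM: state at position t predicts token t+1.
    fwd_loss = ce(fwd_lm_logits[:, :-1].flatten(0, 1),
                  tokens[:, 1:].flatten())
    # Backward LM: state at position t predicts token t-1.
    bwd_loss = ce(bwd_lm_logits[:, 1:].flatten(0, 1),
                  tokens[:, :-1].flatten())
    return tag_loss + lm_weight * (fwd_loss + bwd_loss)
```

Keeping each LM head tied to a single direction matters: the forward states never see the word they are asked to predict (and likewise for the backward states), so the auxiliary objective remains a genuine language modeling task rather than a trivial copy task.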
Marek Rei. Semi-supervised Multitask Learning for Sequence Labeling. Association for Computational Linguistics, 2017.


Summary by Marek Rei