A Neural Layered Model for Nested Named Entity Recognition

Citation formats

Standard

A Neural Layered Model for Nested Named Entity Recognition. / Ju, Meizhi; Miwa, Makoto; Ananiadou, Sophia.

Proceedings of NAACL 2018. 2018. p. 1446-1459.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Harvard

Ju, M, Miwa, M & Ananiadou, S 2018, A Neural Layered Model for Nested Named Entity Recognition. in Proceedings of NAACL 2018. pp. 1446-1459.

APA

Ju, M., Miwa, M., & Ananiadou, S. (2018). A Neural Layered Model for Nested Named Entity Recognition. In Proceedings of NAACL 2018 (pp. 1446-1459).

Vancouver

Ju M, Miwa M, Ananiadou S. A Neural Layered Model for Nested Named Entity Recognition. In Proceedings of NAACL 2018. 2018. p. 1446-1459.

Author

Ju, Meizhi ; Miwa, Makoto ; Ananiadou, Sophia. / A Neural Layered Model for Nested Named Entity Recognition. Proceedings of NAACL 2018. 2018. pp. 1446-1459

Bibtex

@inproceedings{117a0b55af3440eda7d5bc7c7ab12297,
title = "A Neural Layered Model for Nested Named Entity Recognition",
abstract = "Entity mentions embedded in longer entity mentions are referred to as nested entities. Most named entity recognition (NER) systems deal only with the flat entities and ignore the inner nested ones, which fails to capture finer-grained semantic information in underlying texts. To address this issue, we propose a novel neural model to identify nested entities by dynamically stacking flat NER layers. Each flat NER layer is based on the state-of-the-art flat NER model that captures sequential context representation with bidirectional long short-term memory (LSTM) layer and feeds it to the cascaded CRF layer. Our model merges the output of the LSTM layer in the current flat NER layer to build new representation for detected entities and subsequently feeds them into the next flat NER layer. This allows our model to extract outer entities by taking full advantage of information encoded in their corresponding inner entities, in an inside-to-outside way. Our model dynamically stacks the flat NER layers until no outer entities are extracted. Extensive evaluation shows that our dynamic model outperforms state-of-the-art feature-based systems on nested NER, achieving 74.7{\%} and 72.2{\%} on GENIA and ACE2005 datasets, respectively, in terms of F-score.",
author = "Meizhi Ju and Makoto Miwa and Sophia Ananiadou",
year = "2018",
month = "6",
day = "1",
language = "English",
pages = "1446--1459",
booktitle = "Proceedings of NAACL 2018",

}
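The abstract above outlines the layered mechanism: run a flat NER layer, merge the LSTM outputs of each detected entity region into one representation, feed the shortened sequence to the next flat layer, and stop once a layer detects nothing. As a minimal sketch of that control flow (not the authors' implementation — the function names are hypothetical, the flat layer is stubbed out, and averaging the region vectors is an assumption standing in for the paper's merge step):

```python
# Toy sketch of the layered nested-NER loop described in the abstract.
# Assumptions: `flat_layer` stands in for a BiLSTM-CRF tagger; entity
# regions are merged by averaging their token vectors; spans within one
# layer do not overlap (as guaranteed by flat BIO tagging).
from typing import Callable, List, Tuple

Vector = List[float]
Span = Tuple[int, int]  # half-open [start, end) over the current sequence


def merge_entities(reps: List[Vector], spans: List[Span]) -> List[Vector]:
    """Collapse each detected entity's token vectors into one averaged
    vector; tokens outside any span pass through unchanged."""
    span_start_to_end = {s: e for s, e in spans}
    merged: List[Vector] = []
    i = 0
    while i < len(reps):
        if i in span_start_to_end:
            end = span_start_to_end[i]
            region = reps[i:end]
            # element-wise average over the region's vectors
            merged.append([sum(col) / len(region) for col in zip(*region)])
            i = end
        else:
            merged.append(reps[i])
            i += 1
    return merged


def layered_ner(reps: List[Vector],
                flat_layer: Callable[[List[Vector]], List[Span]]) -> List[List[Span]]:
    """Dynamically stack flat NER layers, inside-to-outside, until a
    layer extracts no entities."""
    layers: List[List[Span]] = []
    while True:
        spans = flat_layer(reps)
        if not spans:
            break
        layers.append(spans)
        reps = merge_entities(reps, spans)  # inner entities become single tokens
    return layers
```

In the paper the merged region representation lets the next layer treat an inner entity as a single unit, so an outer mention spanning it can be tagged by an ordinary flat tagger; the loop terminates because each merge strictly shortens the sequence.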

RIS

TY - GEN

T1 - A Neural Layered Model for Nested Named Entity Recognition

AU - Ju, Meizhi

AU - Miwa, Makoto

AU - Ananiadou, Sophia

PY - 2018/6/1

Y1 - 2018/6/1

N2 - Entity mentions embedded in longer entity mentions are referred to as nested entities. Most named entity recognition (NER) systems deal only with the flat entities and ignore the inner nested ones, which fails to capture finer-grained semantic information in underlying texts. To address this issue, we propose a novel neural model to identify nested entities by dynamically stacking flat NER layers. Each flat NER layer is based on the state-of-the-art flat NER model that captures sequential context representation with bidirectional long short-term memory (LSTM) layer and feeds it to the cascaded CRF layer. Our model merges the output of the LSTM layer in the current flat NER layer to build new representation for detected entities and subsequently feeds them into the next flat NER layer. This allows our model to extract outer entities by taking full advantage of information encoded in their corresponding inner entities, in an inside-to-outside way. Our model dynamically stacks the flat NER layers until no outer entities are extracted. Extensive evaluation shows that our dynamic model outperforms state-of-the-art feature-based systems on nested NER, achieving 74.7% and 72.2% on GENIA and ACE2005 datasets, respectively, in terms of F-score.

AB - Entity mentions embedded in longer entity mentions are referred to as nested entities. Most named entity recognition (NER) systems deal only with the flat entities and ignore the inner nested ones, which fails to capture finer-grained semantic information in underlying texts. To address this issue, we propose a novel neural model to identify nested entities by dynamically stacking flat NER layers. Each flat NER layer is based on the state-of-the-art flat NER model that captures sequential context representation with bidirectional long short-term memory (LSTM) layer and feeds it to the cascaded CRF layer. Our model merges the output of the LSTM layer in the current flat NER layer to build new representation for detected entities and subsequently feeds them into the next flat NER layer. This allows our model to extract outer entities by taking full advantage of information encoded in their corresponding inner entities, in an inside-to-outside way. Our model dynamically stacks the flat NER layers until no outer entities are extracted. Extensive evaluation shows that our dynamic model outperforms state-of-the-art feature-based systems on nested NER, achieving 74.7% and 72.2% on GENIA and ACE2005 datasets, respectively, in terms of F-score.

M3 - Conference contribution

SP - 1446

EP - 1459

BT - Proceedings of NAACL 2018

ER -