Paper published in a book (Scientific congresses, symposiums and conference proceedings)
Evaluating Pretrained Transformer-based Models on the Task of Fine-Grained Named Entity Recognition
Lothritz, Cedric; Allix, Kevin; Veiber, Lisa et al.
2020. In Proceedings of the 28th International Conference on Computational Linguistics
Peer reviewed
 

Files


Full Text
2020.coling-main.334.pdf
Publisher postprint (2.22 MB)
Details



Keywords :
Natural Language Processing; fine-grained Named Entity Recognition; Transformers
Abstract :
[en] Named Entity Recognition (NER) is a fundamental Natural Language Processing (NLP) task and remains an active research field. In recent years, transformer models, most notably the BERT model developed at Google, have revolutionised the field of NLP. While the performance of transformer-based approaches such as BERT has been studied for NER, it has not yet been studied for the fine-grained Named Entity Recognition (FG-NER) task. In this paper, we compare three transformer-based models (BERT, RoBERTa, and XLNet) to two non-transformer-based models (CRF and BiLSTM-CNN-CRF). Furthermore, we apply each model to a multitude of distinct domains. We find that transformer-based models incrementally outperform the studied non-transformer-based models in most domains with respect to the F1 score. We also find that the choice of domain significantly influences performance regardless of the respective data size or the model chosen.
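The abstract states that models are compared by F1 score. As a minimal illustration (not the authors' code, and making the common assumption that FG-NER systems are scored on exact-match entity spans with fine-grained type labels), entity-level precision, recall, and F1 can be computed like this:

```python
# Illustrative sketch: entity-level F1 over exact-match spans, the kind of
# metric used to compare NER models. An entity is a (start, end, type) tuple;
# the fine-grained type strings below are hypothetical examples.

def entity_f1(gold, pred):
    """Return (precision, recall, F1) over exact span+type matches."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)  # entities matching span and type exactly
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example: a fine-grained type mismatch (CITY vs COUNTRY) counts as an
# error even though the coarse span is right.
gold = [(0, 2, "PERSON/ARTIST"), (5, 6, "LOCATION/CITY")]
pred = [(0, 2, "PERSON/ARTIST"), (5, 6, "LOCATION/COUNTRY")]
print(entity_f1(gold, pred))  # → (0.5, 0.5, 0.5)
```

This strict matching is what makes fine-grained NER harder than coarse NER: predictions with the correct span but the wrong subtype earn no credit.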
Disciplines :
Computer science
Author, co-author :
Lothritz, Cedric; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
Allix, Kevin; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
Veiber, Lisa; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
Klein, Jacques; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
Bissyande, Tegawendé François D Assise; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > TruX
External co-authors :
no
Language :
English
Title :
Evaluating Pretrained Transformer-based Models on the Task of Fine-Grained Named Entity Recognition
Alternative titles :
[en] Evaluating Pretrained Transformer-based Models on the Task of Fine-Grained Named Entity Recognition
Publication date :
December 2020
Event name :
28th International Conference on Computational Linguistics
Event date :
08.12.2020 - 13.12.2020
Main work title :
Proceedings of the 28th International Conference on Computational Linguistics
Pages :
3750–3760
Peer reviewed :
Peer reviewed
Focus Area :
Security, Reliability and Trust
Available on ORBilu :
since 22 December 2020
