Named Entity Recognition for Nepali Using BERT Based Models

Advances and Trends in Artificial Intelligence. Theory and Applications, Volume 13926 of Lecture Notes in Computer Science (LNAI 13926), pages 93--104. Cham, Springer Nature Switzerland, July 2023.
DOI: https://doi.org/10.1007

Abstract

Named Entity Recognition (NER) is a vital component of many Natural Language Processing (NLP) tasks. In recent times, transformer-based models have become very popular for NLP tasks, including NER, achieving state-of-the-art results. The Bidirectional Encoder Representations from Transformers (BERT) model in particular has been found to perform very well on NER tasks. However, limited work has been done with these models for Nepali, with existing work mostly relying on more traditional techniques. In this work, we show that a combination of preprocessing techniques and better-initialized BERT models can improve the performance of a Nepali NER system. We show a significant improvement in results using the multilingual RoBERTa model, achieving a 6% overall improvement in F1 score on the EverestNER dataset. Among the entity types, we achieve an increase of up to 22% in F1 score for the Event entity, which has the lowest support.
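No code accompanies this entry, but the approach the abstract describes, fine-tuning a multilingual RoBERTa model for token-level NER, can be sketched with the Hugging Face transformers library as below. The checkpoint name (xlm-roberta-base), the BIO tag set, and the toy Nepali sentence are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of fine-tuning a multilingual RoBERTa model for Nepali NER.
    # Assumptions (not from the paper): the xlm-roberta-base checkpoint, a BIO
    # tag set, and the toy sentence below stand in for the EverestNER data.
    import torch
    from transformers import AutoTokenizer, AutoModelForTokenClassification

    labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC",
              "B-ORG", "I-ORG", "B-EVENT", "I-EVENT"]  # hypothetical tag set
    id2label = dict(enumerate(labels))
    label2id = {l: i for i, l in id2label.items()}

    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
    model = AutoModelForTokenClassification.from_pretrained(
        "xlm-roberta-base", num_labels=len(labels),
        id2label=id2label, label2id=label2id)

    # One toy training example: words with word-level BIO tags.
    words = ["रामले", "काठमाडौं", "घुम्यो", "।"]
    word_tags = ["B-PER", "B-LOC", "O", "O"]

    # Tokenize into subwords and align the word-level tags to subword tokens,
    # labeling only the first subword of each word; -100 is ignored by the loss.
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    aligned, prev = [], None
    for wid in enc.word_ids():
        if wid is None or wid == prev:
            aligned.append(-100)   # special token or continuation subword
        else:
            aligned.append(label2id[word_tags[wid]])
        prev = wid
    enc["labels"] = torch.tensor([aligned])

    # One gradient step; in practice this loops over the full training set.
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    loss = model(**enc).loss
    loss.backward()
    optimizer.step()
    print(float(loss))

Labeling only the first subword of each word is a common convention for subword tokenizers; the -100 value tells the cross-entropy loss to skip the remaining subword positions so each word contributes exactly one prediction.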

Links and Resources

Tags

Community

  • @amanshakya
  • @dblp