Bidirectional Recurrent Neural Network Approach for Arabic Named Entity Recognition


Bibliographic Details
Published in: Future Internet, Vol. 10, No. 12, p. 123
Main Authors: Ali, Mohammed; Tan, Guanzheng; Hussain, Aamir
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.12.2018

Summary: Recurrent neural networks (RNNs) have achieved remarkable success in sequence labeling tasks that require memory. An RNN can remember earlier parts of a sequence and can thus be applied to natural language processing (NLP) tasks. Named entity recognition (NER), a common NLP task, can be framed as a classification problem. We propose a bidirectional long short-term memory (LSTM) model for entity recognition in Arabic text. The LSTM network can process a sequence while relating each part of it to the rest, which makes it well suited to NER. Moreover, we use pre-trained word embeddings to represent the inputs fed into the LSTM network. The proposed model is evaluated on the popular “ANERcorp” dataset. Experimental results show that the model with word embeddings achieves an F-score of approximately 88.01%.
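The architecture described in the summary — token embeddings fed into a bidirectional LSTM whose per-token outputs are classified into entity tags — can be sketched as follows. This is a minimal illustration, not the authors' exact model: the vocabulary size, dimensions, and tag count (nine tags, as in a typical B/I scheme over the four ANERcorp entity types plus O) are assumptions, and a randomly initialized embedding layer stands in for the pre-trained Arabic word embeddings.

```python
# Hedged sketch of a BiLSTM tagger for NER (PyTorch); hyperparameters
# and tag inventory are illustrative assumptions, not the paper's values.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_tags):
        super().__init__()
        # In the paper this layer would be initialized from pre-trained
        # word embeddings; here it is randomly initialized.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True gives each token access to both left and
        # right context, the key property for sequence labeling.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Forward and backward hidden states are concatenated,
        # hence 2 * hidden_dim input features.
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq, embed_dim)
        lstm_out, _ = self.lstm(embedded)      # (batch, seq, 2*hidden_dim)
        return self.classifier(lstm_out)       # (batch, seq, num_tags)

model = BiLSTMTagger(vocab_size=1000, embed_dim=50, hidden_dim=64, num_tags=9)
tokens = torch.randint(0, 1000, (2, 7))  # batch of 2 sentences, 7 tokens each
scores = model(tokens)                   # one tag-score vector per token
print(scores.shape)
```

Treating NER as per-token classification, as the summary suggests, means the model emits a score vector over the tag set at every position; training would minimize a cross-entropy loss between these scores and the gold tags.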
ISSN: 1999-5903
DOI: 10.3390/fi10120123