Investigating Word Vectors for the Negation of Verbs

Bibliographic Details
Published in: SN Computer Science, Vol. 5, No. 2, p. 222
Main Authors: Sasaki, Tomoya; Kikuchi, Yuto; Hara, Kazuo; Suzuki, Ikumi
Format: Journal Article
Language: English
Published: Singapore: Springer Nature Singapore / Springer Nature B.V., 01.02.2024
Summary: Recent advances in natural language processing (NLP) technology have been remarkable. However, NLP technology still has limitations to address; in particular, methods for computationally processing text that contains negation remain underdeveloped. Recently, Tang et al. (Trans Assoc Comput Linguist 9:740–55, 2021) investigated why machine translation fails on negation by examining contextualized word vectors. However, how to construct word vectors that accurately retain the meaning of negation remains an open question. Therefore, in this study, to investigate word vectors in an easy-to-understand problem setting, we focus on static word vectors constructed with word2vec and on the negation of verbs in the word analogy task. We report corpus statistics of verbs and their negations for the cases in which the word analogy task becomes sufficiently accurate. Furthermore, we demonstrate that removing the mean vector from the word vectors improves the accuracy of the word analogy task.
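As a concrete illustration of the setup described in the summary, the following Python sketch (not the authors' code) solves analogies of the form a : b :: c : ? by the usual vector-offset method over pre-trained word2vec vectors, with an option to remove the mean vector before scoring. The model file name and the example verb/negation pair are placeholders, not taken from the paper.

    import numpy as np
    from gensim.models import KeyedVectors

    # Hypothetical path to pre-trained word2vec vectors in the binary format.
    kv = KeyedVectors.load_word2vec_format("word2vec.bin", binary=True)

    def analogy(kv, a, b, c, center=False, topn=1):
        # Solve "a : b :: c : ?" via the offset vector b - a + c and cosine similarity.
        vecs = kv.vectors.astype(np.float64)
        if center:
            vecs = vecs - vecs.mean(axis=0)  # remove the mean vector from all word vectors
        vecs = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
        idx = {w: i for i, w in enumerate(kv.index_to_key)}
        query = vecs[idx[b]] - vecs[idx[a]] + vecs[idx[c]]
        query = query / np.linalg.norm(query)
        sims = vecs @ query
        for w in (a, b, c):  # exclude the three query words themselves
            sims[idx[w]] = -np.inf
        return [kv.index_to_key[i] for i in np.argsort(-sims)[:topn]]

    # Example verb/negation analogy (placeholder words): "like : dislike :: agree : ?"
    print(analogy(kv, "like", "dislike", "agree", center=True))

With center=True, the analogy is scored on mean-centered vectors, which is the operation the summary reports as improving analogy accuracy.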
ISSN: 2662-995X; 2661-8907
DOI: 10.1007/s42979-023-02554-x