Combining and learning word embedding with WordNet for semantic relatedness and similarity measurement

Bibliographic Details
Published in: Journal of the Association for Information Science and Technology, Vol. 71, No. 6, pp. 657-670
Main Authors: Lee, Yang‐Yin; Ke, Hao; Yen, Ting‐Yu; Huang, Hen‐Hsen; Chen, Hsin‐Hsi
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc. (Wiley Periodicals Inc.), 01.06.2020
Summary: In this research, we propose 3 different approaches to measure the semantic relatedness between 2 words: (i) boost the performance of the GloVe word embedding model by removing or transforming abnormal dimensions; (ii) linearly combine the information extracted from WordNet and word embeddings; and (iii) utilize word embeddings together with 12 kinds of linguistic information extracted from WordNet as features for Support Vector Regression. We conducted our experiments on 8 benchmark data sets and computed Spearman correlations between the outputs of our methods and the ground truth. We report our results together with those of 3 state‐of‐the‐art approaches. The experimental results show that our methods can outperform the state‐of‐the‐art approaches on all the selected English benchmark data sets.
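As a rough illustration of the second approach described in the summary, the sketch below linearly combines a GloVe cosine similarity with a WordNet-based similarity and evaluates the combined scores against human judgements using Spearman correlation. This is a minimal sketch under stated assumptions: the `vectors` dictionary (word to pre-loaded GloVe vector), the Wu-Palmer measure, and the mixing weight `alpha` are illustrative choices made here and are not taken from the paper.

```python
# Illustrative sketch only: combines an embedding similarity with a WordNet
# similarity via a linear mixture and scores the result with Spearman's rho.
# The choice of Wu-Palmer similarity and the weight alpha are assumptions.
import numpy as np
from nltk.corpus import wordnet as wn
from scipy.stats import spearmanr


def embedding_similarity(w1, w2, vectors):
    """Cosine similarity between two word vectors (assumed pre-loaded GloVe)."""
    v1, v2 = vectors[w1], vectors[w2]
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))


def wordnet_similarity(w1, w2):
    """Best Wu-Palmer similarity over all synset pairs (0 if none comparable)."""
    scores = [
        s1.wup_similarity(s2) or 0.0
        for s1 in wn.synsets(w1)
        for s2 in wn.synsets(w2)
    ]
    return max(scores, default=0.0)


def combined_score(w1, w2, vectors, alpha=0.5):
    """Linear combination of the embedding-based and WordNet-based scores."""
    return (alpha * embedding_similarity(w1, w2, vectors)
            + (1 - alpha) * wordnet_similarity(w1, w2))


def evaluate(word_pairs, gold_scores, vectors, alpha=0.5):
    """Spearman correlation between combined scores and human judgements."""
    predicted = [combined_score(w1, w2, vectors, alpha) for w1, w2 in word_pairs]
    return spearmanr(predicted, gold_scores).correlation
```

On a benchmark such as WordSim-353, `evaluate` would return a single Spearman coefficient per choice of `alpha`, which is the kind of score the summary reports for comparison with prior approaches.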
ISSN: 2330-1635, 2330-1643
DOI: 10.1002/asi.24289