The MeSH-Gram Neural Network Model: Extending Word Embedding Vectors with MeSH Concepts for Semantic Similarity

Bibliographic Details
Published in: Studies in Health Technology and Informatics, Vol. 264; p. 5
Main Authors: Abdeddaïm, Saïd; Vimard, Sylvestre; Soualmia, Lina F.
Format: Journal Article
Language: English
Published: Netherlands, 21.08.2019
Summary: Eliciting semantic similarity between concepts remains a challenging task. Recent approaches based on embedding vectors have gained popularity because they efficiently capture semantic relationships. The underlying idea is that two words with close meanings occur in similar contexts. In this study, we propose a new neural network model, named MeSH-gram, which relies on a straightforward approach: it extends the skip-gram neural network model by considering MeSH (Medical Subject Headings) descriptors instead of words. Trained on the publicly available PubMed/MEDLINE corpus, MeSH-gram is evaluated on reference standards manually annotated for semantic similarity. MeSH-gram is first compared to skip-gram with vectors of size 300 over several context window sizes. A deeper comparison is then performed with twenty existing models. All results, measured by Spearman's rank correlation between human scores and computed similarities, show that MeSH-gram (i) outperforms the skip-gram model and (ii) is comparable to the best methods, which require more computation and external resources.
ISSN: 1879-8365
DOI: 10.3233/SHTI190172
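
The summary outlines the approach at a high level: a skip-gram model trained over MeSH descriptors rather than words, evaluated by Spearman's rank correlation against manually annotated similarity scores. The sketch below illustrates one way such a pipeline could look, assuming gensim's Word2Vec in skip-gram mode and scipy; the descriptor sequences, the benchmark pairs, and all settings except the 300-dimensional vectors are illustrative assumptions, not the authors' actual setup.

    from gensim.models import Word2Vec
    from scipy.stats import spearmanr

    # Each "sentence" is the list of MeSH descriptor UIs indexed for one
    # PubMed/MEDLINE citation, playing the role that words play in skip-gram.
    # These records are illustrative placeholders, not real corpus data.
    mesh_sentences = [
        ["D003920", "D007333", "D008279"],
        ["D003920", "D020739", "D007333"],
        ["D008279", "D020739"],
    ]

    # Skip-gram (sg=1) with 300-dimensional vectors, matching the vector size
    # mentioned in the summary; the window size and other settings are guesses.
    model = Word2Vec(
        sentences=mesh_sentences,
        vector_size=300,
        sg=1,
        window=5,
        min_count=1,
        workers=4,
    )

    # Hypothetical reference standard: descriptor pairs with human similarity scores.
    benchmark = [
        ("D003920", "D007333", 3.2),
        ("D003920", "D008279", 1.0),
        ("D007333", "D020739", 2.1),
    ]

    # Spearman's rank correlation between human scores and cosine similarities.
    human_scores, model_scores = [], []
    for a, b, score in benchmark:
        if a in model.wv and b in model.wv:
            human_scores.append(score)
            model_scores.append(model.wv.similarity(a, b))

    rho, _ = spearmanr(human_scores, model_scores)
    print(f"Spearman rho = {rho:.3f}")

In this sketch the only change from a standard word-level skip-gram setup is the input: sequences of MeSH descriptors per citation instead of word tokens, which is the core idea the summary attributes to MeSH-gram.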