Text mining by Tsallis entropy
| Published in | Physica A, Vol. 490, pp. 1368–1376 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 15.01.2018 |
| Subjects | |
Summary: Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural languages, motivates us to apply nonextensive statistical mechanics to text mining. Tsallis entropy appropriately ranks terms' relevance to the document subject by taking advantage of their spatial correlation length. We apply this statistical concept as a new, powerful word ranking metric for extracting the keywords of a single document. An experimental evaluation shows the capability of the presented method in keyword extraction. We find that Tsallis entropy ranks words reliably, on a par with the best previous ranking methods.
Highlights:

• Non-extensive statistical mechanics appropriately describes complex systems.
• We apply Tsallis entropy to rank terms' relevance to the document subject.
• Experimental evaluations confirm the capability of this metric in keyword detection.
• This metric gives reliable word ranking results in comparison with previous methods.
| ISSN | 0378-4371, 1873-2119 |
|---|---|
| DOI | 10.1016/j.physa.2017.09.020 |
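
The record above describes ranking the words of a single document by the Tsallis entropy of their spatial distribution. As a rough illustration of that idea, and not the authors' exact formulation (whose entropic index q and normalization follow the paper), the sketch below scores each word by the Tsallis entropy of its occurrence counts across equal-size windows of the text. The window count, the value of q, the minimum-count filter, and the assumption that more clustered (lower-entropy) words are more keyword-like are all illustrative choices.

```python
import math
from collections import defaultdict

def tsallis_entropy(probs, q=2.0):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1); recovers Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0.0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def rank_words(tokens, q=2.0, n_windows=20, min_count=5):
    """Score each word by the Tsallis entropy of its occurrence
    distribution over equal-size windows of the document; words whose
    occurrences cluster (lower entropy) are ranked first."""
    window = max(1, len(tokens) // n_windows)
    positions = defaultdict(list)
    for i, w in enumerate(tokens):
        positions[w].append(i)

    scores = {}
    for w, pos in positions.items():
        if len(pos) < min_count:          # skip rare words: their entropy estimate is too noisy
            continue
        counts = defaultdict(int)
        for i in pos:
            counts[min(i // window, n_windows - 1)] += 1
        probs = [c / len(pos) for c in counts.values()]
        scores[w] = tsallis_entropy(probs, q)

    return sorted(scores, key=scores.get)  # ascending entropy: most clustered words first

# Example usage (hypothetical file name):
# tokens = open("document.txt", encoding="utf-8").read().lower().split()
# print(rank_words(tokens)[:10])
```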