Dispersion Entropy: A Measure for Time-Series Analysis
| Published in | IEEE Signal Processing Letters, Vol. 23, No. 5, pp. 610-614 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2016 |
Summary: One of the most powerful tools for assessing the dynamical characteristics of a time series is entropy. Sample entropy (SE), though powerful, is computationally slow, especially for long signals. Permutation entropy (PE), a widely used irregularity indicator, considers only the order of the amplitude values, so information carried by the amplitudes themselves may be discarded. To address these problems, we introduce a new method, termed dispersion entropy (DE), to quantify the regularity of time series. We gain insight into the dependency of DE on several straightforward signal-processing concepts via a set of synthetic time series. The results show that DE, unlike PE, can detect changes in noise bandwidth and simultaneous changes in frequency and amplitude. We also apply DE to three publicly available real datasets. These experiments on real-valued signals show that DE considerably outperforms PE in discriminating the different groups within each dataset. In addition, the computation time of DE is significantly lower than that of SE and PE. (A minimal code sketch of the DE computation follows the record below.)
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2016.2542881
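
The abstract describes DE only at a high level. Below is a minimal Python sketch of the standard dispersion-entropy construction: map samples into c classes via the normal CDF, form embedding vectors of dimension m with delay d, and take the Shannon entropy of the dispersion-pattern frequencies. The function name, parameter defaults, and the white-noise usage example are illustrative assumptions, not details quoted from this record.

```python
import numpy as np
from scipy.stats import norm
from collections import Counter

def dispersion_entropy(x, m=3, c=6, d=1, normalize=True):
    """Dispersion entropy of a 1-D signal x.

    m: embedding dimension, c: number of classes, d: time delay.
    """
    x = np.asarray(x, dtype=float)
    # Map samples to (0, 1) with the normal CDF, then to integer classes 1..c.
    y = norm.cdf(x, loc=x.mean(), scale=x.std())
    z = np.clip(np.round(c * y + 0.5), 1, c).astype(int)
    # Collect dispersion patterns: embedding vectors of length m with delay d.
    n_vectors = len(z) - (m - 1) * d
    patterns = [tuple(z[i:i + (m - 1) * d + 1:d]) for i in range(n_vectors)]
    # Shannon entropy of the relative frequencies of the observed patterns.
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    de = -np.sum(p * np.log(p))
    # Dividing by ln(c**m) maps the result into [0, 1].
    return de / np.log(c ** m) if normalize else de

# White noise is highly irregular, so normalized DE should be close to 1.
rng = np.random.default_rng(0)
print(dispersion_entropy(rng.standard_normal(2000)))
```

Normalizing by ln(c^m) keeps the value in [0, 1] and makes results comparable across choices of m and c; this normalization is a common convention and is assumed here rather than taken from the record above.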