Kullback-Leibler and Rényi divergence rate for Gaussian stationary ARMA processes comparison
Published in: Digital Signal Processing, Vol. 116, p. 103089
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.09.2021
Summary: In signal processing, ARMA processes are widely used to model short-memory processes. In various applications, comparing or classifying ARMA processes is required. In this paper, our purpose is to provide analytical expressions of the divergence rates of the Kullback-Leibler divergence, the Rényi divergence (RD) of order α and their symmetric versions for two Gaussian ARMA processes, by taking advantage of results such as the Yule-Walker equations and notions such as inverse filtering. The divergence rates can be interpreted as the sum of different quantities: power of one ARMA process filtered by the inverse filter associated with the second ARMA process, cepstrum, etc. Finally, illustrations show that the ranges of values taken by the divergence rates of the RD are sensitive to α, especially when the latter is close to 1.
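
As a rough companion to the abstract, the sketch below numerically approximates the Kullback-Leibler divergence rate between two zero-mean Gaussian stationary ARMA processes from their power spectral densities, using the classical spectral-domain expression D = (1/4π) ∫_{-π}^{π} [S1(ω)/S2(ω) − ln(S1(ω)/S2(ω)) − 1] dω. This is not the paper's analytical derivation (which relies on the Yule-Walker equations and inverse filtering), and the function names, grid size, and ARMA parameters are illustrative assumptions.

```python
import numpy as np
from scipy.signal import freqz


def arma_psd(ar, ma, sigma2, n_freq=8192):
    """Power spectral density of a Gaussian ARMA process.

    ar, ma : AR and MA polynomial coefficients in z^{-1},
             both starting with 1 (e.g. ar = [1, a1, ..., ap]).
    sigma2 : variance of the white driving noise.
    """
    # H(e^{jw}) = B(e^{-jw}) / A(e^{-jw}) evaluated on [0, 2*pi)
    w, h = freqz(ma, ar, worN=n_freq, whole=True)
    return w, sigma2 * np.abs(h) ** 2


def kl_divergence_rate(ar1, ma1, s1, ar2, ma2, s2, n_freq=8192):
    """Numerical KL divergence rate between two zero-mean Gaussian
    stationary processes with spectra S1 and S2, via
        D = (1 / (4*pi)) * int_{-pi}^{pi} [S1/S2 - ln(S1/S2) - 1] dw.
    """
    _, S1 = arma_psd(ar1, ma1, s1, n_freq)
    _, S2 = arma_psd(ar2, ma2, s2, n_freq)
    ratio = S1 / S2
    # Averaging over the frequency grid approximates (1 / (2*pi)) * integral;
    # dividing by 2 then yields the 1 / (4*pi) normalisation.
    return np.mean(ratio - np.log(ratio) - 1) / 2


if __name__ == "__main__":
    # Hypothetical comparison of two AR(1) processes (parameters made up).
    rate = kl_divergence_rate(ar1=[1, -0.5], ma1=[1], s1=1.0,
                              ar2=[1, -0.8], ma2=[1], s2=1.2)
    print(f"Approximate KL divergence rate: {rate:.4f} nats per sample")
```

The same spectral grid could in principle be reused to approximate the Rényi divergence rate of order α, but since that expression is the subject of the paper's analytical results, only the standard KL rate formula is sketched here.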
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2021.103089