Kullback-Leibler and Rényi divergence rate for Gaussian stationary ARMA processes comparison


Bibliographic Details
Published in: Digital Signal Processing, Vol. 116, p. 103089
Main Authors: Grivel, Eric; Diversi, Roberto; Merchan, Fernando
Format: Journal Article
Language: English
Published: Elsevier Inc, 01.09.2021

Summary: In signal processing, ARMA processes are widely used to model short-memory processes, and many applications require comparing or classifying them. In this paper, our purpose is to provide analytical expressions for the divergence rates of the Kullback-Leibler divergence, the Rényi divergence (RD) of order α, and their symmetric versions for two Gaussian ARMA processes, by taking advantage of results such as the Yule-Walker equations and notions such as inverse filtering. The divergence rates can be interpreted as the sum of several quantities: the power of one ARMA process filtered by the inverse filter associated with the second ARMA process, a cepstrum term, etc. Finally, illustrations show that the range of values taken by the RD divergence rate is sensitive to α, especially when α is close to 1.
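The paper's contribution is the closed-form expressions themselves, which are not reproduced in this record. As a point of comparison, the Kullback-Leibler divergence rate between two zero-mean Gaussian stationary processes can also be evaluated numerically from the classical spectral formula (1/2)·(1/2π)∫ [S₁(ω)/S₂(ω) − ln(S₁(ω)/S₂(ω)) − 1] dω, where S₁ and S₂ are the two ARMA power spectral densities. The sketch below is illustrative only (function names and parameters are ours, not the authors'), assuming ARMA processes driven by white noise:

```python
import numpy as np
from scipy.signal import freqz

def arma_psd(b, a, sigma2, w):
    """PSD of an ARMA process with MA coefficients b, AR coefficients a,
    driven by zero-mean white noise of variance sigma2, at frequencies w."""
    _, h = freqz(b, a, worN=w)
    return sigma2 * np.abs(h) ** 2

def kl_divergence_rate(b1, a1, s1, b2, a2, s2, n_freq=4096):
    """Numerical KL divergence rate between two zero-mean Gaussian
    stationary processes, via the spectral formula
    (1/2) * (1/2pi) * integral of [S1/S2 - ln(S1/S2) - 1] over (-pi, pi)."""
    w = np.linspace(-np.pi, np.pi, n_freq, endpoint=False)
    ratio = arma_psd(b1, a1, s1, w) / arma_psd(b2, a2, s2, w)
    # Mean over a uniform grid approximates (1/2pi) * integral over (-pi, pi).
    return 0.5 * np.mean(ratio - np.log(ratio) - 1.0)
```

For identical processes the rate is zero, and it is strictly positive otherwise, e.g. `kl_divergence_rate([1.0], [1.0, -0.5], 1.0, [1.0], [1.0, 0.3], 1.0) > 0` for two distinct AR(1) models.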
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2021.103089