On the saddlepoint approximation of the dependence testing bound in memoryless channels


Bibliographic Details
Published in: ICC 2020 - 2020 IEEE International Conference on Communications (ICC), pp. 1-5
Main Authors: Anade, Dadja; Gorce, Jean-Marie; Mary, Philippe; Perlaza, Samir M.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2020
Summary: This paper introduces an upper bound on the absolute difference between (a) the cumulative distribution function (c.d.f.) of the sum of a finite number of independent and identically distributed (i.i.d.) random variables and (b) a saddlepoint approximation of this c.d.f. The upper bound is general and is particularly precise in the regime of large deviations. This result is used to study the dependence testing (DT) bound on the minimum decoding error probability (DEP) in memoryless channels. In this context, the main results include new lower and upper bounds on the DT bound. As a byproduct, an upper bound on the absolute difference between the exact value of the DT bound and its saddlepoint approximation is obtained. Numerical analyses of these bounds are presented for the binary symmetric channel and the additive white Gaussian noise channel, in which the new bounds are observed to be tight.
ISSN: 1938-1883
DOI: 10.1109/ICC40277.2020.9148654
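To illustrate the kind of approximation the abstract refers to, the sketch below applies a standard Lugannani-Rice saddlepoint approximation to the tail of a sum of n i.i.d. exponential random variables and compares it with the exact Erlang tail. This is a generic textbook construction under our own assumptions, not the paper's specific bound or its DT-bound application; the function names are ours.

```python
# Saddlepoint (Lugannani-Rice) approximation of P(S_n > a) for the sum
# S_n of n i.i.d. Exp(1) random variables, checked against the exact
# Erlang(n, 1) tail. Illustrative sketch only: the paper bounds the gap
# between an exact c.d.f. and this kind of approximation; nothing below
# is taken from the paper itself.
import math

def lugannani_rice_tail_exp1(n: int, a: float) -> float:
    """Approximate P(S_n > a) for S_n a sum of n i.i.d. Exp(1) variables.

    CGF of Exp(1): K(t) = -log(1 - t) for t < 1, so K'(t) = 1/(1 - t)
    and K''(t) = 1/(1 - t)^2.  The saddlepoint t* solves n K'(t*) = a,
    which here has the closed form t* = 1 - n/a (assumes a != n).
    """
    t = 1.0 - n / a                          # saddlepoint
    K = -math.log(1.0 - t)                   # CGF at the saddlepoint
    w = math.copysign(math.sqrt(2.0 * (t * a - n * K)), t)
    u = t * math.sqrt(n / (1.0 - t) ** 2)    # t* sqrt(n K''(t*))
    Phi = 0.5 * (1.0 + math.erf(w / math.sqrt(2.0)))       # normal c.d.f.
    phi = math.exp(-0.5 * w * w) / math.sqrt(2.0 * math.pi)  # normal p.d.f.
    return 1.0 - Phi + phi * (1.0 / u - 1.0 / w)

def erlang_tail(n: int, a: float) -> float:
    """Exact P(S_n > a) for the Erlang(n, 1) distribution."""
    return math.exp(-a) * sum(a ** k / math.factorial(k) for k in range(n))

approx = lugannani_rice_tail_exp1(10, 20.0)
exact = erlang_tail(10, 20.0)
```

With n = 10 and a = 20 both values are close to 5e-3 and agree to within a fraction of a percent, which reflects the abstract's observation that saddlepoint approximations are particularly precise in the large-deviations regime.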