Effective sample size approximations as entropy measures


Bibliographic Details
Published in: Computational Statistics
Main Authors: Martino, L.; Elvira, V.
Format: Journal Article
Language: English
Published: 29.07.2025
ISSN: 0943-4062; 1613-9658
DOI: 10.1007/s00180-025-01665-8

Summary: In this work, we analyze alternative effective sample size (ESS) metrics for importance sampling algorithms and discuss a possible extended range of applications. We show the relationship between the ESS expressions used in the literature and two entropy families, the Rényi and the Tsallis entropy. The Rényi entropy is connected to the Huggins-Roy ESS family introduced in Huggins and Roy (2015). We prove that all the ESS functions included in the Huggins-Roy family fulfill all the desirable theoretical conditions. We also remark on connections with several other fields, such as the Hill numbers introduced in ecology, the Gini inequality coefficient employed in economics, and the Gini impurity index used mainly in machine learning, to name a few. Finally, by numerical simulations, we study how well different ESS expressions contained in these families approximate the theoretical ESS definition, and we show an application of the ESS formulas in a variable selection problem.
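The entropy connection mentioned in the summary can be illustrated with a short sketch. Exponentiating the Rényi entropy of order β of the normalized importance weights yields a one-parameter ESS family (the Huggins-Roy family): β = 2 recovers the classical ESS = 1/Σ w̄ᵢ², and the β → 1 limit gives the perplexity, exp of the Shannon entropy. The function name and parameterization below are illustrative, not taken from the paper.

```python
import numpy as np

def ess_renyi(weights, beta=2.0):
    """ESS as exp of the Renyi entropy of order beta of the
    normalized weights: (sum_i wbar_i^beta)^(1/(1-beta)).
    beta=2 gives the classical ESS = 1 / sum_i wbar_i^2."""
    w = np.asarray(weights, dtype=float)
    wbar = w / w.sum()                      # normalize the weights
    if np.isclose(beta, 1.0):
        # beta -> 1 limit: perplexity, exp of the Shannon entropy
        wpos = wbar[wbar > 0]
        return float(np.exp(-np.sum(wpos * np.log(wpos))))
    return float(np.sum(wbar ** beta) ** (1.0 / (1.0 - beta)))
```

For uniform weights over N samples every member of the family returns N, while a degenerate weight vector (all mass on one sample) returns 1, matching the two extreme cases an ESS measure should capture.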