Generalised information and entropy measures in physics

Bibliographic Details
Published in: Contemporary Physics, Vol. 50, No. 4, pp. 495-510
Main Author: Beck, Christian
Format: Journal Article
Language: English
Published: Colchester: Taylor & Francis, 01.07.2009

Summary: The formalism of statistical mechanics can be generalised by starting from more general measures of information than the Shannon entropy and maximising those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadakis entropy, Sharma-Mittal entropies, and a few more. Important concepts such as the axiomatic foundations, composability and Lesche stability of information measures are briefly discussed. Potential applications in physics include complex systems with long-range interactions and metastable states, scattering processes in particle physics, hydrodynamic turbulence, defect turbulence, optical lattices, and quite generally driven nonequilibrium systems with fluctuations of temperature.
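As a brief illustration of two of the generalised measures named in the summary (a sketch based on their standard textbook definitions, not code from the article itself): the Rényi entropy H_q = ln(Σ p_i^q)/(1-q) and the Tsallis entropy S_q = (1 - Σ p_i^q)/(q-1) both reduce to the Shannon entropy in the limit q → 1.

```python
import math

def shannon(p):
    """Shannon entropy S = -sum p_i ln p_i (natural log, k_B = 1)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, q):
    """Renyi entropy H_q = ln(sum p_i^q) / (1 - q); Shannon in the limit q -> 1."""
    if q == 1:
        return shannon(p)
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1 - q)

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); Shannon in the limit q -> 1."""
    if q == 1:
        return shannon(p)
    return (1 - sum(pi ** q for pi in p if pi > 0)) / (q - 1)

# For the uniform distribution over W states, the Renyi entropy equals ln W
# for every q, while the Tsallis entropy depends on q -- one way the two
# families differ even though both generalise Shannon.
p_uniform = [0.25, 0.25, 0.25, 0.25]
print(renyi(p_uniform, 2.0))    # ln 4 for any q
print(tsallis(p_uniform, 2.0))  # differs from ln 4
```

The probabilities and the choice q = 2 here are illustrative only; the article surveys these measures in far greater generality (composability, Lesche stability, axiomatics).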
ISSN: 0010-7514, 1366-5812
DOI: 10.1080/00107510902823517