Editors: Andres M. Kowalski, Raúl D. Rossignoli, Evaldo M. F. Curado

Concepts and Recent Advances in Generalized Information Measures and Statistics

ISBN: 978-1-60805-761-0 (Print)
ISBN: 978-1-60805-760-3 (Online)
Year of Publication: 2013
DOI: 10.2174/97816080576031130101

Introduction: Summary of Contents

The goal of this book is to offer an updated overview of generalized information measures and statistics, covering the basic concepts as well as some recent relevant applications.

The book begins with a historical introduction describing the fascinating development of the concepts of heat and entropy. Starting from the ideas of ancient Greece, it gives an account of the main historical breakthroughs, allowing the reader to appreciate the fundamental contributions of Nicolas Sadi Carnot, Rudolf Clausius, Ludwig Boltzmann, Josiah Willard Gibbs and others. It ends with the seminal works of Claude Shannon, which laid the foundation of Information Theory, and of Edwin Jaynes, which connected the latter with Statistical Mechanics.

The second chapter is a basic tutorial on the essentials of information entropy, describing at an accessible level the concepts and quantities used in the rest of this book. The Shannon entropy and its associated measures, such as the conditional entropy, the mutual information (a measure of correlations) and the relative entropy (also known as the Kullback-Leibler divergence, a measure of the discrepancy between two probability distributions), are all presented, together with their main properties and the most important proofs. We also provide the main features of the Fisher Information, which can be expressed in terms of the relative entropy between two slightly displaced distributions, and the associated Cramer-Rao bound. Other topics include the definition of entropy in quantum systems, the fundamental property of concavity and a brief introduction to the maximum entropy approach and its connection with statistical mechanics. The chapter ends with the Shannon-Khinchin axioms leading to the uniqueness theorem for the Shannon entropy, together with an introduction to the concept of generalized entropies.
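
For concreteness, the following minimal sketch evaluates the discrete versions of some of these quantities (Shannon entropy, Kullback-Leibler divergence and mutual information) with NumPy; the function names and the example distributions are illustrative choices of ours, not taken from the chapter.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i (in nats); zero-probability terms contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_i p_i log(p_i/q_i); assumes q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution given as a 2D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    px, py = p_xy.sum(axis=1), p_xy.sum(axis=0)
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(p_xy.ravel())

# Example: a correlated joint distribution of two binary variables (illustrative values).
p_joint = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
print(shannon_entropy([0.5, 0.5]))             # ln 2, the maximum for two outcomes
print(kl_divergence([0.4, 0.6], [0.5, 0.5]))   # > 0; vanishes only when the distributions coincide
print(mutual_information(p_joint))             # > 0, signalling correlation between X and Y
```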

Chapters 3, 4 and 5 are devoted precisely to the generalized entropy concept. Chapter 3 presents a review by Constantino Tsallis of the famous non-additive entropy Sq which now bears his name, together with the associated generalized statistical mechanics and q-distributions. As described there, this generalized framework allows for the possibility of an extensive thermodynamic entropy in strongly correlated systems where the standard additive Boltzmann-Gibbs entropy is non-extensive. The chapter includes a comprehensive list of relevant recent applications of the formalism in the most diverse fields, together with the concomitant references. It also comments on recent results relating Sq to number theory through the Riemann zeta function.

Chapter 4 presents an axiomatic approach for deriving the form of a generalized entropy. After describing in full detail the four Shannon-Khinchin axioms, it considers the situation where only the first three are retained, together with the requirement of two newly discovered scaling laws which the generalized entropy should fulfill. It is shown that this leads to a general form of entropy depending essentially on two parameters, which define entropic equivalence classes. The connection with the Shannon, Tsallis, Rényi and other entropies is described in detail, together with the associated distribution functions and some related aspects. The chapter includes an appendix containing the technical details and the proofs of four associated theorems.
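
For reference while reading these chapters, the non-additive entropy Sq reviewed in chapter 3 and its characteristic pseudo-additivity rule for two independent subsystems A and B can be written (in the standard notation, not reproduced from the chapter) as

$$S_q = k\,\frac{1-\sum_i p_i^{\,q}}{q-1}, \qquad \lim_{q\to 1} S_q = -k\sum_i p_i \ln p_i ,$$

$$\frac{S_q(A+B)}{k} = \frac{S_q(A)}{k} + \frac{S_q(B)}{k} + (1-q)\,\frac{S_q(A)}{k}\,\frac{S_q(B)}{k},$$

so that additivity is recovered only in the limit q = 1 of the Boltzmann-Gibbs-Shannon entropy.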

Chapter 5 discusses the relation between generalized entropies and the concept of majorization. The latter is a powerful and elegant mathematical theory for comparing probability distributions, which leads to a rigorous concept of mixedness and disorder. This chapter first describes the concept of majorization at an accessible level. It then considers its connection with entropy, and shows that by means of generalized entropies it is possible to express the majorization relation in terms of entropic inequalities. It also describes the majorization properties of the probability distributions determined by the maximization of general entropic forms, and the concept of mixing parameters, i.e., parameters whose increase ensures majorization. Finally, the concept of majorization in the quantum case, i.e., for density operators, is also examined. As an application, the problem of quantum entanglement detection is considered, where it is shown that majorization leads to a generalized entropic criterion for separability which is much stronger than the standard entropic criterion.

In chapter 6, the notion of distance measures for probability distributions is reviewed. An overview of the most frequently used metrics and distances, such as the Euclidean metric, Wootters's distance, the Fisher metric and the Kullback-Leibler divergence, is given, with the analysis centered on the distance known as the Jensen-Shannon divergence, in both its classical and quantum versions. The application of the latter as a measure of quantum entanglement is also discussed. This chapter is related to the next two chapters, which are devoted to Statistical Measures of Complexity, since those measures depend on distances in probability space.
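
As a computational illustration of both ideas, the sketch below checks the (classical) majorization relation through partial sums of the decreasingly ordered probabilities and evaluates the classical Jensen-Shannon divergence; the function names and example vectors are assumptions of ours, using natural logarithms.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability terms."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def majorizes(p, q):
    """True if p majorizes q: partial sums of the decreasingly sorted p dominate those of q."""
    p_sorted, q_sorted = np.sort(p)[::-1], np.sort(q)[::-1]
    return bool(np.all(np.cumsum(p_sorted) >= np.cumsum(q_sorted) - 1e-12))

def jensen_shannon(p, q):
    """JSD(p, q) = H((p+q)/2) - [H(p) + H(q)]/2; symmetric and bounded by ln 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

p, q = np.array([0.7, 0.2, 0.1]), np.array([0.4, 0.35, 0.25])
print(majorizes(p, q))                            # True: p is more "ordered" than q
print(shannon_entropy(p) <= shannon_entropy(q))   # entropy respects the majorization order
print(jensen_shannon(p, q))                       # a symmetric, bounded divergence
```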

There is no universally accepted definition of complexity, nor of quantifiers of complexity. An extensive list of relevant contributions can be found in the introduction of chapter 8, as well as in the references of chapter 7. A comparative classification of various complexity measures, by Wackerbauer, Witt, Atmanspacher, Kurths and Scheingraber, can be found in reference [10] of chapter 8. We will here consider just a particular class of complexity measures based on information theory, which are essentially a combination of an entropy with a distance measure in probability space. Such measures vanish when the probability distribution implies either full certainty or complete uncertainty, being maximum at some “intermediate” distribution. This approach is precisely the one adopted in chapter 7, where Ricardo López-Ruiz, Hector Mancini and Xavier Calbet introduce the well-known measure of complexity known by their surnames (LMC Statistical Measure of Complexity). Its properties are discussed in full detail and some interesting applications (Gaussian and exponential distributions, and complexity in a two-level laser model) are also provided.

In chapter 8 the properties of a Generalized Statistical Complexity Measure are discussed. The authors adopt the functional product form of the LMC Statistical Measure of Complexity, but consider different entropic forms and different definitions of distance between probability distributions, beyond the Shannon Entropy and the Euclidean distance used in chapter 7. In particular, the use of the Jensen divergence introduced in chapter 6, together with the Shannon Entropy (Shannon Jensen Statistical Complexity), is analyzed in depth. Another important aspect considered in this chapter is the methodology for the proper determination of the underlying probability distribution function (PDF) associated with a given dynamical system or time series. We should also mention here the Statistical Complexity of Shiner, Davison and Landsberg (ref. [13] of chapter 8).
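
A minimal sketch of the product form underlying the LMC measure, C = H · D (normalized Shannon entropy times the Euclidean disequilibrium with respect to the uniform distribution), is given below; the normalization convention and the example vectors are illustrative choices of ours, not prescriptions from chapter 7.

```python
import numpy as np

def lmc_complexity(p):
    """C = H * D: normalized Shannon entropy times the Euclidean disequilibrium
    (squared distance to the uniform distribution over the same number of outcomes)."""
    p = np.asarray(p, dtype=float)
    n = p.size
    nz = p > 0
    h = -np.sum(p[nz] * np.log(p[nz])) / np.log(n)   # normalized entropy, in [0, 1]
    d = np.sum((p - 1.0 / n) ** 2)                    # disequilibrium
    return h * d

print(lmc_complexity(np.ones(4) / 4))                       # 0: complete uncertainty (uniform PDF)
print(lmc_complexity(np.array([1.0, 0.0, 0.0, 0.0])))       # 0: full certainty
print(lmc_complexity(np.array([0.55, 0.25, 0.15, 0.05])))   # > 0: "intermediate" distribution
```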

In chapters 9-15, different applications of generalized information measures are considered. Chapter 9 deals with the Fisher Information, whose basic properties were introduced in chapter 2. The chapter describes its use for radial probability distributions associated with quantum states, determining the related Cramer-Rao inequalities and the explicit expressions of the Fisher information for both ground and excited states of D-dimensional hydrogenic systems. It then considers its application to some physico-chemical processes, showing that the Fisher information can be a valuable tool for detecting the transition rate and the stationary points of a chemical reaction.

Chapters 10 and 11 deal with problems in physics, while chapters 12-15 deal with applications in other fields. In particular, chapters 14 and 15 are devoted to biological applications. In chapter 10, the links between the entanglement concept (see also chapter 5) and the information entropy are analyzed, as represented by different measures such as the Shannon, Rényi (see chapters 2 and 4) and Tsallis (see chapter 3) ones. In chapter 11, the authors review the difference between quantum statistical treatments and semiclassical ones, using a semiclassical Fisher Information measure built up with Husimi distributions.

Chapter 12 deals with the use of Information Theory tools for characterizing pseudorandom number generators obtained from chaotic dynamical systems. The authors make use of the conjunction of the Entropy and the Shannon Jensen Statistical Complexity introduced in chapter 8 to evaluate the quality of pseudorandom number generators. This is done by quantifying the equiprobability of all the generator's values and the statistical independence between consecutive outputs, comparing a Shannon Entropy calculated with a Histogram PDF and a Shannon Jensen Statistical Complexity calculated with a Symbolic Bandt–Pompe PDF (see chapter 8) in an Entropy–Statistical Complexity plane. In chapter 13, the authors employ different information measures, such as the Shannon Entropy, the Fisher Information measure (see chapter 2) and the Shannon Jensen Statistical Complexity (introduced in chapter 8), to analyze sedimentary data corresponding to the Holocene and thus characterize changes in the dynamical behavior of ENSO (El Niño/Southern Oscillation) during this period.
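
As an illustration of one ingredient of this kind of analysis, the sketch below computes the Bandt–Pompe ordinal-pattern PDF of a time series and its normalized permutation entropy, which should approach 1 for a good pseudorandom generator; the embedding dimension, delay and test signals are assumptions of ours, and the full treatment of chapters 12 and 13 also involves the Shannon Jensen Statistical Complexity in the Entropy–Complexity plane.

```python
import numpy as np
from itertools import permutations
from math import factorial

def bandt_pompe_pdf(x, d=4, tau=1):
    """Probabilities of the d! ordinal patterns of embedding dimension d and time delay tau."""
    x = np.asarray(x, dtype=float)
    counts = {perm: 0 for perm in permutations(range(d))}
    n = len(x) - (d - 1) * tau
    for i in range(n):
        window = x[i:i + d * tau:tau]                   # d successive values, tau steps apart
        counts[tuple(np.argsort(window).tolist())] += 1  # ordinal pattern of the window
    return np.array(list(counts.values()), dtype=float) / n

def permutation_entropy(x, d=4, tau=1):
    """Normalized Shannon entropy of the Bandt-Pompe PDF (1 when all patterns are equiprobable)."""
    p = bandt_pompe_pdf(x, d, tau)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz])) / np.log(factorial(d))

rng = np.random.default_rng(0)
print(permutation_entropy(rng.random(100_000)))                # close to 1 for uniform noise
print(permutation_entropy(np.sin(0.1 * np.arange(100_000))))   # well below 1 for a regular signal
```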

In chapter 14, the authors present an application of wavelet-based information measures to characterize the viscoelasticity of red blood cell membranes. The Relative Energy, the Shannon Entropy and the Shannon Jensen Statistical Complexity calculated with a Wavelet PDF, a technique introduced in chapter 8, together with an Entropy–Complexity plane, are used to analyze a human haematological disease.

Finally, in chapter 15 the authors apply an information-theoretic approach to analyze the role of spike correlations in the neuronal code. By considering certain brain structures as communication channels, the application of Information Theory becomes feasible, allowing in particular the investigation of correlations through the pertinent mutual information. It is a nice example of the important role played by information-theoretic methods in current problems of Theoretical Neuroscience. The chapter also includes a comprehensive list of references on the subject.

Foreword by Fernando Dantas Nobre

- Pp. i-ii (2)
Fernando Dantas Nobre

Foreword by Héctor Vucetich

- Pp. iii
Hector Vucetich

Preface

- Pp. iv
Andres M. Kowalski, Raul D. Rossignoli, Evaldo M. F. Curado

List of Contributors

- Pp. v-ix (5)
Andres M. Kowalski, Raul D. Rossignoli, Evaldo M. F. Curado

Introduction - Summary of Contents

- Pp. x-xiii (4)
Andres M. Kowalski, Raul D. Rossignoli, Evaldo M. F. Curado

Heat and Entropy: A Brief History

- Pp. 3-29 (27)
Evaldo M. F. Curado, Andres M. Kowalski, Raul D. Rossignoli

Essentials of Information Entropy and Related Measures

- Pp. 30-56 (27)
Raul D. Rossignoli, Andres M. Kowalski, Evaldo M. F. Curado

The Nonadditive Entropy Sq: A Door Open to the Nonuniversality of the Mathematical Expression of the Clausius Thermodynamic Entropy in Terms of the Probabilities of the Microscopic Configurations

- Pp. 57-80 (24)
Constantino Tsallis

What do Generalized Entropies Look Like? An Axiomatic Approach for Complex, Non-Ergodic Systems

- Pp. 81-99 (19)
Stefan Thurner, Rudolf Hanel

Majorization and Generalized Entropies

- Pp. 100-129 (30)
Norma Canosa, Raul D. Rossignoli

Distance Measures for Probability Distributions

- Pp. 130-146 (17)
Pedro W. Lamberti, Ana P. Majtey

A Statistical Measure of Complexity

- Pp. 147-168 (22)
Ricardo Lopez-Ruiz, Hector Mancini, Xavier Calbet

Generalized Statistical Complexity: A New Tool for Dynamical Systems

- Pp. 169-215 (47)
Osvaldo A. Rosso, Maria Teresa Martin, Hilda A. Larrondo, Andres M. Kowalski, Angelo Plastino

The Fisher Information: Properties and Physico-Chemical Applications

- Pp. 216-233 (18)
Jesus S. Dehesa, Rodolfo O. Esquivel, Angel Ricardo Plastino, Pablo Sanchez-Moreno

Entanglement and Entropy

- Pp. 234-255 (22)
J. Batle, A. Plastino, A. R. Plastino, M. Casas

Semiclassical Treatments and Information Theory

- Pp. 256-282 (27)
Flavia Pennini, Angelo Plastino, Gustavo Luis Ferri

Statistical Complexity of Chaotic Pseudorandom Number Generators

- Pp. 283-308 (26)
Hilda A. Larrondo, Luciana De Micco, Claudio M. Gonzalez, Angelo Plastino, Osvaldo A. Rosso

Analysis of an El Niño-Southern Oscillation proxy record using Information Theory quantifiers

- Pp. 309-340 (32)
Laura C. Carpi, Patricia M. Saco, Alejandra Figliola, Eduardo Serrano, Osvaldo A. Rosso

Erythrocytes Viscoelasticity Under the lens of Wavelet–Information Theory Quantifiers

- Pp. 341-374 (34)
Ana Maria Korol, Maria Teresa Martin, Bibiana Riquelme, Mabel D’Arrigo, Osvaldo A. Rosso

Information-Theoretic Analysis of the Role of Correlations in Neural Spike Trains

- Pp. 375-407 (33)
Fernando Montani, Simon R. Schultz

Index

- Pp. 408-414 (7)
Andres M. Kowalski, Raul D. Rossignoli, Evaldo M. F. Curado
