Concepts and Recent Advances in Generalized Information Measures and Statistics


by

Andres M. Kowalski, Raúl D. Rossignoli, Evaldo M. F. Curado

DOI: 10.2174/97816080576031130101
eISBN: 978-1-60805-760-3, 2013
ISBN: 978-1-60805-761-0







Table of Contents

- Foreword by Fernando Dantas Nobre. Pp. i-ii (2)
- Foreword by Héctor Vucetich. Pp. iii
- Preface, by Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado. Pp. iv
- List of Contributors, by Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado. Pp. v-ix (5)
- Introduction: Summary of Contents, by Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado. Pp. x-xiii (4)
- Heat and Entropy: A Brief History, by Evaldo M. F. Curado, Andres M. Kowalski and Raul D. Rossignoli. Pp. 3-29 (27)
- Essentials of Information Entropy and Related Measures, by Raul D. Rossignoli, Andres M. Kowalski and Evaldo M. F. Curado. Pp. 30-56 (27)
- The Nonadditive Entropy Sq: A Door Open to the Nonuniversality of the Mathematical Expression of the Clausius Thermodynamic Entropy in Terms of the Probabilities of the Microscopic Configurations, by Constantino Tsallis. Pp. 57-80 (24)
- What do Generalized Entropies Look Like? An Axiomatic Approach for Complex, Non-Ergodic Systems, by Stefan Thurner and Rudolf Hanel. Pp. 81-99 (19)
- Majorization and Generalized Entropies, by Norma Canosa and Raul D. Rossignoli. Pp. 100-129 (30)
- Distance Measures for Probability Distributions, by Pedro W. Lamberti and Ana P. Majtey. Pp. 130-146 (17)
- A Statistical Measure of Complexity, by Ricardo Lopez-Ruiz, Hector Mancini and Xavier Calbet. Pp. 147-168 (22)
- Generalized Statistical Complexity: A New Tool for Dynamical Systems, by Osvaldo A. Rosso, Maria Teresa Martin, Hilda A. Larrondo, Andres M. Kowalski and Angelo Plastino. Pp. 169-215 (47)
- The Fisher Information: Properties and Physico-Chemical Applications, by Jesus S. Dehesa, Rodolfo O. Esquivel, Angel Ricardo Plastino and Pablo Sanchez-Moreno. Pp. 216-233 (18)
- Entanglement and Entropy, by J. Batle, A. Plastino, A. R. Plastino and M. Casas. Pp. 234-255 (22)
- Semiclassical Treatments and Information Theory, by Flavia Pennini, Angelo Plastino and Gustavo Luis Ferri. Pp. 256-282 (27)
- Statistical Complexity of Chaotic Pseudorandom Number Generators, by Hilda A. Larrondo, Luciana De Micco, Claudio M. Gonzalez, Angelo Plastino and Osvaldo A. Rosso. Pp. 283-308 (26)
- Analysis of an El Niño-Southern Oscillation Proxy Record Using Information Theory Quantifiers, by Laura C. Carpi, Patricia M. Saco, Alejandra Figliola, Eduardo Serrano and Osvaldo A. Rosso. Pp. 309-340 (32)
- Erythrocytes Viscoelasticity Under the Lens of Wavelet-Information Theory Quantifiers, by Ana Maria Korol, Maria Teresa Martin, Bibiana Riquelme, Mabel D’Arrigo and Osvaldo A. Rosso. Pp. 341-374 (34)
- Information-Theoretic Analysis of the Role of Correlations in Neural Spike Trains, by Fernando Montani and Simon R. Schultz. Pp. 375-407 (33)
- Index, by Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado. Pp. 408-414 (7)


Foreword by Nobre

The concept of entropy appears in many areas of knowledge, such as thermodynamics, statistical mechanics, information theory, biology, economics, and the human sciences. From the historical point of view, it was introduced in 1865 by Rudolf Clausius through an elegant formulation of the second law of thermodynamics. A nice historical review of how the concept of entropy emerged in physics is presented in Chapter 1, written by the editors of this book. According to the second law, the total entropy of an isolated system can never decrease in thermodynamic transformations, and in the case of irreversible transformations it always increases. Since most natural processes are irreversible, entropy has been associated with the “arrow of time”. Some years later (1872), Ludwig Boltzmann wrote down an equation (known as the Boltzmann equation) to describe the evolution of the single-particle distribution of a rarefied gas. Considering this distribution, he defined a quantity H and proved the famous H-theorem, showing that H always decreases in time. Boltzmann realized that for a perfect gas in equilibrium, the quantity H was related to Clausius’ entropy S (apart from a minus sign and some multiplicative constants). This identification led to the definition of statistical entropy, i.e., the entropy defined in terms of a probability distribution, P(x̄, t), associated with the occurrence of a given physical quantity x̄ (in the case of gases, x̄ may represent the position, the velocity, or both the position and velocity of a molecule) at a time t. Moreover, the H-theorem yielded a microscopic interpretation of the second law of thermodynamics.

The dynamical approach of Boltzmann, together with the elegant theory of statistical ensembles at equilibrium proposed by Josiah Willard Gibbs, led to the Boltzmann-Gibbs theory of statistical mechanics, which represents one of the most successful theoretical frameworks of physics. The knowledge of the equilibrium distribution associated with a given statistical ensemble allows one to calculate average values that can be related to thermodynamic quantities. This theory is based on a fundamental assumption, namely, the ergodic hypothesis, which requires that the system pass through all its microstates after a sufficiently long time. Only if ergodicity holds can one replace a given time average (defined within the Boltzmann framework) by the corresponding average over a statistical ensemble (defined at equilibrium).

In Chapter 2 (also written by the editors of this book) the essentials of information theory are introduced and discussed. This theory was created in 1948 by Claude Shannon through the definition of a measure of information whose form is similar to the quantity H introduced by Boltzmann. Consequently, from this measure of information one constructs a statistical entropy, sometimes called the Boltzmann-Gibbs-Shannon entropy. In 1957, E. T. Jaynes introduced the Maximum Entropy Principle, which allows one to obtain equilibrium distributions by maximizing the statistical entropy under given constraints. In this way, one can derive the equilibrium distributions of statistical mechanics from the entropy maximization procedure, by considering the Boltzmann-Gibbs-Shannon entropy under the constraints suitable for each statistical ensemble.
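As a compact illustration of this maximization procedure (a standard textbook sketch for the canonical case, not taken from the book), maximizing the Boltzmann-Gibbs-Shannon entropy under normalization and a fixed mean energy yields the familiar exponential equilibrium distribution:

\[
S[p] = -k \sum_i p_i \ln p_i , \qquad
\sum_i p_i = 1, \quad \sum_i p_i E_i = U
\;\;\Longrightarrow\;\;
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
\]

where the inverse temperature β enters as the Lagrange multiplier associated with the energy constraint.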

The interest in a wide variety of complex systems, which are usually characterized by a large number of constituents that interact through long-range forces and may present peculiar features, such as strong correlations, long-time memory, and breakdown of ergodicity, has increased in recent years. Due to these characteristics, complex systems are expected to exhibit collective behaviors very different from those of the rarefied gas considered by Boltzmann in 1872. Many experiments, as well as computational studies of complex systems, have shown properties in disagreement with the predictions of Boltzmann-Gibbs statistical mechanics, suggesting the need for a more general theory for their description. A breakthrough occurred in 1988 with the introduction of a generalized entropy Sq by Constantino Tsallis; this proposal is discussed in detail in Chapter 3. It is characterized by a real index q, such that in the limit q → 1 one recovers the Boltzmann-Gibbs-Shannon entropy. The entropy Sq, and more particularly the distribution that comes from its maximization, has been very successful in describing many situations where Boltzmann-Gibbs statistical mechanics fails.
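For reference, a standard form of this generalized entropy and of its limit (well-known expressions written here with Boltzmann's constant k and probabilities p_i; the precise conventions used in Chapter 3 may differ) is

\[
S_q = k \, \frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i .
\]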

Although the entropy Sq has so far been the most successful in describing complex systems, other generalized entropic forms have also been proposed in the literature, as discussed in Chapter 4 by S. Thurner and R. Hanel. This is what the present book is about, with interesting chapters on history, theory, computational methods, experimental verifications, and applications. The book is addressed not only to physicists, but also to researchers in a wide diversity of fields, such as biology, medicine, economics, and the human sciences, and to all those interested in understanding the mysteries within the realm of complex systems.

Fernando Dantas Nobre
Centro Brasileiro de Pesquisas Físicas
Rio de Janeiro - RJ - Brazil

Foreword by Vucetich

In the fourth century BC, Aristotle claimed to have at his disposal an infallible method for finding the truth, namely, the inductive one. And in the seventeenth century, Leibniz proposed the construction of a “Calculus of Ideas” that would put an end to vain debates. Neither project succeeded, but each gave rise to developments that enriched science: logic and statistics. In the nineteenth century a new, powerful idea was developed, namely, that of entropy: an ever-growing physical quantity that measures the degree of decay of order in a physical system. This powerful idea not only explained quantitatively the behavior of gases, dilute solutions and chemical reactions, but also addressed the philosophical problem of decay that Aristotle attributed to earthly matter. With the introduction of entropy, thermodynamics became a model of theoretical science.

In 1948 Shannon developed a statistical theory of communication, taking ideas from both fields, which in turn opened new paths for research. The powerful notion of information entropy played a major part in the development of new statistical techniques, overhauled the Bayesian approach to probability and statistics, and provided new techniques and approaches in several fields of science.

Later on, several generalizations of the concept of information entropy were introduced, which extended the field and shed new light on it. These generalizations are already applied in statistical problems and may have interesting applications in areas of science such as critical phenomena or neuroscience.

These and related topics are treated in this book, which reviews an old subject from a young point of view. Starting from its historical roots, the book proceeds to the mathematical foundations, generalizations, and properties of the powerful notion of information entropy, and to its applications in different branches of mathematics and natural science. As such, it gives a state-of-the-art perspective of the subject in the second decade of the twenty-first century.

Reader: enjoy!

Héctor Vucetich
Observatorio Astronómico
Universidad Nacional de La Plata
La Plata – Argentina


Preface

Since the introduction of the concept of information entropy by Claude Shannon in his famous 1948 article [1], quantifiers based on information theory have played an increasingly fundamental role in several fields. Different generalizations of the Shannon entropy have been developed, among them the Rényi and Tsallis entropies, which have found important applications not only in physics but also in quite distinct areas such as biology, economics, and the cognitive sciences. In addition, other information measures, such as the Fisher information, which predates the Shannon entropy, and the more recent statistical complexities, have also proved to be useful and powerful tools in different scenarios, allowing, in particular, the analysis of time series and data series independently of their sources.

It is our goal to present in this eBook, at a broadly accessible level, the basic concepts and some of the latest developments in the field of generalized information measures, understanding as such all those quantities that allow one to obtain and quantify information from a probability distribution. Addressed not only to physicists, but also to researchers in other fields such as biology, medicine, and economics, it offers through its chapters an overview of the main measures and techniques, together with some recent relevant applications which illustrate their potential. Its scope ranges from generalized entropies and the majorization-based concept of disorder to complexity measures and metrics in probability space. It includes methods for extracting probability distributions from general data series and applications ranging from quantum entanglement to biology and brain modeling. A comprehensive list of references is also included.
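As a minimal illustration of the kind of quantifier surveyed in the book (a sketch under assumptions of our own: the histogram binning, the logistic-map test series, and the value q = 1.5 are arbitrary illustrative choices, not taken from any chapter), the following Python snippet extracts a probability distribution from a data series and evaluates its Shannon and Tsallis entropies:

import numpy as np

def probabilities_from_series(x, bins=32):
    # Estimate a probability distribution from a data series via a histogram.
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    return p[p > 0]  # empty bins do not contribute to the entropies below

def shannon_entropy(p):
    # Boltzmann-Gibbs-Shannon entropy S = -sum_i p_i ln p_i (with k = 1).
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q=1.5):
    # Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), recovering S as q -> 1.
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Example data series: iterates of the chaotic logistic map x_{n+1} = 4 x_n (1 - x_n).
x = np.empty(10000)
x[0] = 0.1
for n in range(1, x.size):
    x[n] = 4.0 * x[n - 1] * (1.0 - x[n - 1])

p = probabilities_from_series(x)
print(shannon_entropy(p), tsallis_entropy(p, q=1.5))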

Reference

[1] Claude E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal 27, 379–423 and 623–656 (1948).

Andres M. Kowalski (a), Raúl D. Rossignoli (a), Evaldo M. F. Curado (b)

(a) Departamento de Física-IFLP, Universidad Nacional de La Plata and Comisión de Investigaciones Científicas, La Plata, Argentina

(b) Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rio de Janeiro, Brazil

List of Contributors

Editor(s):
Andres M. Kowalski
CIC - Departamento de Física-IFLP
Universidad Nacional de La Plata
C.C.67
La Plata, 1900
Argentina


Raúl D. Rossignoli
Departamento de Física-IFLP
Universidad Nacional de La Plata and Comisión de Investigaciones Científicas
La Plata
Argentina


Evaldo M. F. Curado
Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems
Rio de Janeiro
Brazil



