Table of Contents
Foreword by Fernando Dantas Nobre
 Pp. i-ii (2) Fernando Dantas Nobre
Foreword by Héctor Vucetich
 Pp. iii Héctor Vucetich
Preface
 Pp. iv Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado
List of Contributors
 Pp. v-ix (5) Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado
Introduction - Summary of Contents
 Pp. x-xiii (4) Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado
Heat and Entropy: A Brief History
 Pp. 3-29 (27) Evaldo M. F. Curado, Andres M. Kowalski and Raul D. Rossignoli
Essentials of Information Entropy and Related Measures
 Pp. 30-56 (27) Raul D. Rossignoli, Andres M. Kowalski and Evaldo M. F. Curado
The Nonadditive Entropy Sq: A Door Open to the Nonuniversality of the Mathematical Expression of the Clausius Thermodynamic Entropy in Terms of the Probabilities of the Microscopic Configurations
 Pp. 57-80 (24) Constantino Tsallis
What do Generalized Entropies Look Like? An Axiomatic Approach for Complex, Non-Ergodic Systems
 Pp. 81-99 (19) Stefan Thurner and Rudolf Hanel
Distance Measures for Probability Distributions
 Pp. 130-146 (17) Pedro W. Lamberti and Ana P. Majtey
A Statistical Measure of Complexity
 Pp. 147-168 (22) Ricardo Lopez-Ruiz, Hector Mancini and Xavier Calbet
Generalized Statistical Complexity: A New Tool for Dynamical Systems
 Pp. 169-215 (47) Osvaldo A. Rosso, Maria Teresa Martin, Hilda A. Larrondo, Andres M. Kowalski and Angelo Plastino
The Fisher Information: Properties and Physico-Chemical Applications
 Pp. 216-233 (18) Jesus S. Dehesa, Rodolfo O. Esquivel, Angel Ricardo Plastino and Pablo Sanchez-Moreno
Semiclassical Treatments and Information Theory
 Pp. 256-282 (27) Flavia Pennini, Angelo Plastino and Gustavo Luis Ferri
Statistical Complexity of Chaotic Pseudorandom Number Generators
 Pp. 283-308 (26) Hilda A. Larrondo, Luciana De Micco, Claudio M. Gonzalez, Angelo Plastino and Osvaldo A. Rosso
Analysis of an El Niño-Southern Oscillation proxy record using Information Theory quantifiers
 Pp. 309-340 (32) Laura C. Carpi, Patricia M. Saco, Alejandra Figliola, Eduardo Serrano and Osvaldo A. Rosso
Erythrocytes Viscoelasticity Under the Lens of Wavelet–Information Theory Quantifiers
 Pp. 341-374 (34) Ana Maria Korol, Maria Teresa Martin, Bibiana Riquelme, Mabel D'Arrigo and Osvaldo A. Rosso
Information-Theoretic Analysis of the Role of Correlations in Neural Spike Trains
 Pp. 375-407 (33) Fernando Montani and Simon R. Schultz
Index
 Pp. 408-414 (7) Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado
Foreword
Foreword by Nobre
The concept of entropy appears in many areas of knowledge, like thermodynamics, statistical mechanics, information theory, biology, economics, and the human sciences. From the historical point of view, it was introduced in 1865 by Rudolf Clausius through an elegant formulation of the second law of thermodynamics. A nice historical review of how the concept of entropy emerged in physics is presented in Chapter 1, written by the editors of this book. According to the second law, the total entropy of an isolated system can never decrease in thermodynamic transformations, and in the case of irreversible transformations it always increases. Since most natural processes are irreversible, entropy has been associated with the “arrow of time”. Some years later (1872), Ludwig Boltzmann wrote down an equation (known as the Boltzmann equation) to describe the evolution of the single-particle distribution of a rarefied gas. Considering this distribution, he defined a quantity H and proved the famous H-theorem by showing that H always decreases in time. Boltzmann realized that for a perfect gas in equilibrium the quantity H was related to Clausius’ entropy S (apart from a minus sign and some multiplicative constants). This identification led to the definition of statistical entropy, i.e., the entropy defined in terms of a probability distribution P(x̄, t) associated with the occurrence of a given physical quantity x̄ (in the case of gases, x̄ may represent the position, the velocity, or both position and velocity of a molecule) at a time t. Moreover, the H-theorem yielded a microscopic interpretation of the second law of thermodynamics.
The dynamical approach of Boltzmann, together with the elegant theory of statistical ensembles at equilibrium proposed by Josiah Willard Gibbs, led to the Boltzmann-Gibbs theory of statistical mechanics, which represents one of the most successful theoretical frameworks of physics. Knowledge of the equilibrium distribution associated with a given statistical ensemble allows one to calculate average values to be related to thermodynamic quantities. This theory is based on a fundamental assumption, namely the ergodic hypothesis, which requires that the system pass through all its microstates after a sufficiently long time. Only if ergodicity holds can one replace a given time average (defined within the Boltzmann framework) by the corresponding average over a statistical ensemble (defined at equilibrium).
In Chapter 2 (also written by the editors of this book) the essentials of information theory are introduced and discussed. This theory was created in 1948 by Claude Shannon through the definition of a measure of information with a form similar to the quantity H introduced by Boltzmann. Consequently, from this measure of information one constructs a statistical entropy, sometimes called the Boltzmann-Gibbs-Shannon entropy. In 1957, E. T. Jaynes introduced the Maximum Entropy Principle, which allows one to obtain equilibrium distributions by maximizing the statistical entropy under given constraints. In this way, one can derive the equilibrium distributions of statistical mechanics from the entropy-maximization procedure, by considering the Boltzmann-Gibbs-Shannon entropy under the constraints suitable for each statistical ensemble.
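The entropy maximization described above can be illustrated with a minimal numerical sketch (an illustration added here, not taken from the book): with only the normalization constraint, the Shannon entropy H = -Σᵢ pᵢ ln pᵢ is maximized by the uniform distribution, where it attains ln n.

```python
import math

def shannon_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy H = -sum_i p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
uniform = [1.0 / n] * n           # maximum-entropy distribution under normalization alone
biased = [0.7, 0.1, 0.1, 0.1]     # any other normalized distribution over n outcomes

print(shannon_entropy(uniform))   # equals ln(4), about 1.3863
print(shannon_entropy(biased))    # strictly smaller
```

Adding further constraints (e.g. a fixed mean energy) and maximizing under them is precisely how the canonical and other ensemble distributions follow from Jaynes' principle.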
The interest in a wide variety of complex systems, which are usually characterized by a large number of constituents that interact through long-range forces and may present peculiar features like strong correlations, long-time memory, and breakdown of ergodicity, has increased in recent years. Due to these characteristics, complex systems are expected to exhibit collective behaviors very different from those of the rarefied gas considered by Boltzmann in 1872. Many experiments, as well as computational studies of complex systems, have shown properties in disagreement with the predictions of Boltzmann-Gibbs statistical mechanics, suggesting the need for a more general theory for their description. A breakthrough occurred in 1988 with the introduction of a generalized entropy S_q by Constantino Tsallis; this proposal is discussed in detail in Chapter 3. It is characterized by a real index q, such that in the limit q → 1 one recovers the Boltzmann-Gibbs-Shannon entropy. The entropy S_q, and more particularly the distribution that comes from its maximization, has been very successful in describing many situations where Boltzmann-Gibbs statistical mechanics fails.
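The q → 1 limit can be checked numerically. The sketch below (an illustration added here, not from the book) evaluates the Tsallis form S_q = (1 − Σᵢ pᵢ^q)/(q − 1) and shows it approaching the Boltzmann-Gibbs-Shannon value as q approaches 1:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), for q != 1 (units k = 1)."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def bgs_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy, recovered from S_q in the q -> 1 limit."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
for q in (2.0, 1.5, 1.1, 1.001):
    print(f"S_{q} = {tsallis_entropy(p, q):.4f}")
print(f"S_BGS = {bgs_entropy(p):.4f}")  # the values above converge to this as q -> 1
```

For q > 1 rare events are de-emphasized and for q < 1 they are enhanced, which is one way the index q tunes the measure to the statistics of a given complex system.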
Although the entropy S_q has so far been the most successful in describing complex systems, other generalized entropic forms have also been proposed in the literature, as discussed in Chapter 4 by S. Thurner and R. Hanel. This is what the present book is about, containing interesting chapters on history, theory, computational methods, experimental verifications, and applications. The book is addressed not only to physicists, but to researchers in a large diversity of fields, like biology, medicine, economics, and the human sciences, and to all those interested in understanding the mysteries within the realm of complex systems.
Fernando Dantas Nobre
Centro Brasileiro de Pesquisas Físicas
Rio de Janeiro, RJ, Brazil
Foreword by Vucetich
In the fourth century BC, Aristotle stated that he had at his disposal an infallible method to find the truth, namely, the inductive one. And in the seventeenth century, Leibniz proposed the construction of a “Calculus of Ideas” that would put an end to vain debates. Neither project succeeded, but each of them started several projects that enriched science: logic and statistics. In the nineteenth century a new, powerful idea was developed, namely, that of entropy: an ever-growing physical magnitude that measured the degree of decay of order in a physical system. This powerful idea not only explained quantitatively the behavior of gases, dilute solutions and chemical reactions, but also addressed the philosophical problem of decay that Aristotle attributed to earthly matter. With the introduction of entropy, thermodynamics became a model of theoretical science.
In 1948 Shannon developed his statistical theory of communication, taking ideas from both fields and in turn opening new paths for research. The powerful notion of information entropy played a major part in the development of new statistical techniques, overhauled the Bayesian approach to probability and statistics, and provided powerful new techniques and approaches in several fields of science.
Later on, several generalizations of the concept of information entropy were introduced, which extended the field and shed new light on it. These generalizations are already applied in statistical problems and may find interesting applications in areas of science such as critical behavior or neuroscience.
These and related topics are treated in this book, which reviews an old subject from a young point of view. Starting from its historical roots, the book proceeds to the mathematical foundations, generalizations, properties and applications of the powerful notion of information entropy in different branches of mathematics and natural science. As such, it gives a state-of-the-art perspective on the subject in the second decade of the twenty-first century.
Reader: enjoy!
Héctor Vucetich
Observatorio Astronómico
Universidad Nacional de La Plata
La Plata, Argentina
Preface
Since the introduction of the concept of information entropy by Claude Shannon in his famous 1948 article [1], quantifiers based on information theory have played an increasingly fundamental role in several fields. Different generalizations of the Shannon entropy have been developed, among them the Rényi and Tsallis entropies, which have found important applications not only in physics but also in quite distinct areas such as biology, economics, and the cognitive sciences. In addition, other information measures, such as the Fisher information, which predates the Shannon entropy, and the more recent statistical complexities, have also proved to be useful and powerful tools in different scenarios, allowing, in particular, the analysis of time series and data series independently of their sources.
It is our goal to present in this Ebook, at a broadly accessible level, the basic concepts and some of the latest developments in the field of generalized information measures, understood as all those quantities which allow one to obtain and quantify the information contained in a probability distribution. Addressed not only to physicists, but also to researchers in other fields like biology, medicine, and economics, it offers through its chapters an overview of the main measures and techniques, together with some recent relevant applications which illustrate their potential. Its scope ranges from generalized entropies and the majorization-based concept of disorder to complexity measures and metrics in probability space. It includes methods for extracting probability distributions from general data series, and applications ranging from quantum entanglement to biology and brain modeling. A comprehensive list of references is also included.
Reference
[1] Claude E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal 27, 379–423 and 623–656 (1948).
Andres M. Kowalski^{a}, Raúl D. Rossignoli^{a}, Evaldo M. F. Curado^{b}
^{a} Departamento de Física-IFLP, Universidad Nacional de La Plata and Comisión de Investigaciones Científicas, La Plata, Argentina
^{b} Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rio de Janeiro, Brazil
List of Contributors
Editor(s):
Andres M. Kowalski, CIC - Departamento de Física-IFLP, Universidad Nacional de La Plata, C.C. 67, 1900 La Plata, Argentina
Raúl D. Rossignoli, Departamento de Física-IFLP, Universidad Nacional de La Plata and Comisión de Investigaciones Científicas, La Plata, Argentina
Evaldo M. F. Curado, Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rio de Janeiro, Brazil