Chapter 2

Essentials of Information Entropy and Related Measures

Raul D. Rossignoli, Andres M. Kowalski and Evaldo M. F. Curado

Abstract

This introductory chapter provides a basic review of the Shannon entropy and of some important related quantities, such as the joint entropy, the conditional entropy, the mutual information and the relative entropy. We also discuss the Fisher information, the fundamental property of concavity, the basic elements of the maximum entropy approach and the definition of entropy in the quantum case. We close the chapter with the axioms that determine the Shannon entropy and a brief description of other information measures.
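
As a brief illustration of the quantities listed above, the following sketch (not taken from the chapter) computes the Shannon entropy, joint entropy, conditional entropy, mutual information and relative entropy for an assumed joint distribution of two binary variables, using the standard base-2 definitions.

import numpy as np

def shannon_entropy(p):
    # Shannon entropy H = -sum_i p_i log2 p_i, skipping zero-probability terms.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Assumed joint distribution p(x, y) of two binary variables (rows: x, columns: y).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)                # marginal distribution p(x)
p_y = p_xy.sum(axis=0)                # marginal distribution p(y)

H_x = shannon_entropy(p_x)            # H(X)
H_y = shannon_entropy(p_y)            # H(Y)
H_xy = shannon_entropy(p_xy.ravel())  # joint entropy H(X, Y)
H_y_given_x = H_xy - H_x              # conditional entropy H(Y|X)
I_xy = H_x + H_y - H_xy               # mutual information I(X;Y)

# Relative entropy (Kullback-Leibler divergence) of p(x, y) from the product
# p(x)p(y); for a joint distribution this coincides with I(X;Y).
D_kl = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))

print(H_x, H_y, H_xy, H_y_given_x, I_xy, D_kl)

For this example H(X) = H(Y) = 1 bit, H(X, Y) ≈ 1.72 bits and I(X;Y) ≈ 0.28 bits, consistent with the identity I(X;Y) = H(X) + H(Y) - H(X, Y).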
