Essentials of Information Entropy and Related Measures
- Authors: Raul D. Rossignoli (1), Andres M. Kowalski (2), Evaldo M. F. Curado (3)
- Affiliations: (1, 2) Departamento de Física-IFLP, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, C.C. 727, 1900 La Plata, Argentina; (3) Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rio de Janeiro, Brasil
- Source: Concepts and Recent Advances in Generalized Information Measures and Statistics, pp. 30-56
- Publication Date: December 2013
- Language: English
Abstract: This introductory chapter provides a basic review of the Shannon entropy and of some important related quantities such as the joint entropy, the conditional entropy, the mutual information and the relative entropy. We also discuss the Fisher information, the fundamental property of concavity, the basic elements of the maximum entropy approach and the definition of entropy in the quantum case. We close the chapter with the axioms that determine the Shannon entropy and a brief description of other information measures.
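For quick reference, the standard discrete-case definitions of several of the quantities listed above (stated here in their generic textbook form, not quoted from the chapter itself) are:

\[
H(X) = -\sum_{x} p(x)\,\log p(x) \qquad \text{(Shannon entropy)}
\]
\[
H(X,Y) = -\sum_{x,y} p(x,y)\,\log p(x,y), \qquad H(Y\mid X) = H(X,Y) - H(X)
\]
\[
I(X;Y) = H(X) + H(Y) - H(X,Y) \qquad \text{(mutual information)}
\]
\[
D(p\,\|\,q) = \sum_{x} p(x)\,\log\frac{p(x)}{q(x)} \qquad \text{(relative entropy)}
\]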
- Hardbound ISBN: 9781608057610
- Ebook ISBN: 9781608057603