Axiomatic characterizations of information measures

Research output: Contribution to journal › Article

130 Citations (Scopus)


Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, . . . , N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
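The two classical measures named in the abstract, Shannon entropy and Kullback I-divergence, can be illustrated with a minimal sketch (this example is not from the paper; function names are chosen for illustration):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback I-divergence D(p||q) = sum_i p_i * log2(p_i / q_i)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The uniform distribution on 4 outcomes attains the maximal entropy
# log2(4) = 2 bits -- the MaxEnt solution under no constraints beyond
# normalization.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform))  # 2.0

# I-divergence of a biased coin from a fair one; it is zero iff p == q.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

The uniform-distribution case hints at why MaxEnt inference, treated in direction (C), selects the "least committal" distribution consistent with given constraints.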

Original language: English
Pages (from-to): 261-273
Number of pages: 13
Issue number: 3
Publication status: Published - Sep 2008


Keywords

  • Bregman distance
  • Functional equation
  • Kullback I-divergence
  • Maximum entropy
  • Proper score
  • Rényi information measures
  • Shannon entropy
  • Transitive inference rule
  • f-divergence
  • f-entropy

ASJC Scopus subject areas

  • Physics and Astronomy (all)
