Axiomatic characterizations of information measures

Research output: Contribution to journal › Article

118 Citations (Scopus)

Abstract

Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, . . . , N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
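As an illustrative aside (not part of the original record), the two central quantities surveyed in the paper can be computed directly from their standard definitions. The sketch below is a minimal, self-contained example assuming finite probability distributions given as lists; the function names are chosen for illustration only.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    Terms with p_i = 0 contribute 0 by convention."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback I-divergence D(p || q) = sum_i p_i * log2(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0 (absolute continuity)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform))         # 2.0 bits (maximal for 4 outcomes)
print(kl_divergence(uniform, uniform))  # 0.0 (divergence from itself)
```

The uniform distribution maximizing entropy and the divergence vanishing only when the two distributions coincide are exactly the kinds of properties the axiomatic characterizations in the paper single out.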

Original language: English
Pages (from-to): 261-273
Number of pages: 13
Journal: Entropy
Volume: 10
Issue number: 3
DOIs: 10.3390/e10030261
Publication status: Published - Sep 2008

Keywords

  • Bregman distance
  • f-divergence
  • f-entropy
  • Functional equation
  • Kullback I-divergence
  • Maximum entropy
  • Proper score
  • Rényi information measures
  • Shannon entropy
  • Transitive inference rule

ASJC Scopus subject areas

  • Physics and Astronomy(all)

Cite this

Axiomatic characterizations of information measures. / Csiszár, Imre.

In: Entropy, Vol. 10, No. 3, 09.2008, p. 261-273.

Research output: Contribution to journal › Article

@article{2540492a539a4fe4b980832d2965cd5a,
title = "Axiomatic characterizations of information measures",
abstract = "Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, . . . , N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.",
keywords = "Bregman distance, f-divergence, f-entropy, Functional equation, Kullback I-divergence, Maximum entropy, Proper score, R{\'e}nyi information measures, Shannon entropy, Transitive inference rule",
author = "Imre Csisz{\'a}r",
year = "2008",
month = "9",
doi = "10.3390/e10030261",
language = "English",
volume = "10",
pages = "261--273",
journal = "Entropy",
issn = "1099-4300",
publisher = "Multidisciplinary Digital Publishing Institute (MDPI)",
number = "3",
}

TY - JOUR

T1 - Axiomatic characterizations of information measures

AU - Csiszár, Imre

PY - 2008/9

Y1 - 2008/9

N2 - Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, . . . , N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.

AB - Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, . . . , N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.

KW - Bregman distance

KW - f-divergence

KW - f-entropy

KW - Functional equation

KW - Kullback I-divergence

KW - Maximum entropy

KW - Proper score

KW - Rényi information measures

KW - Shannon entropy

KW - Transitive inference rule

UR - http://www.scopus.com/inward/record.url?scp=54749100076&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=54749100076&partnerID=8YFLogxK

U2 - 10.3390/e10030261

DO - 10.3390/e10030261

M3 - Article

VL - 10

SP - 261

EP - 273

JO - Entropy

JF - Entropy

SN - 1099-4300

IS - 3

ER -