A class of measures of informativity of observation channels

Research output: Contribution to journal › Article

88 Citations (Scopus)

Abstract

A class of numerical measures of the informativity of observation channels, or statistical experiments, is defined with the aid of f-divergences, introduced by the author as measures of the difference between two probability distributions. For observation channels with given prior probabilities, the f-informativity measures are generalizations of Shannon's mutual information and also include Gallager's function E0(ρ, Q), which appears in the derivation of the error exponent for noisy channels. For observation channels without prior probabilities, the suggested informativity measures have the geometric interpretation of a radius. The f-informativity defined for the Bayesian case shares several useful properties of the mutual information, such as the "data processing theorem". Its maximum over all possible prior distributions is shown by a minimax argument to equal the f-radius; the latter is thus a generalization of channel capacity. The f-informativity measures can also be used to characterize the statistical sufficiency of indirect observations.
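To make the abstract's terminology concrete, here is a minimal sketch of the central quantities in standard modern notation; the paper's own definitions and normalizations may differ slightly. For a convex function f with f(1) = 0, distributions P and Q on a finite set, and a channel W with prior P, the f-divergence, the f-informativity, and the f-radius can be written as

\[
D_f(P\,\|\,Q) = \sum_{y} Q(y)\, f\!\left(\frac{P(y)}{Q(y)}\right), \qquad
I_f(P, W) = \min_{Q'} \sum_{x} P(x)\, D_f\bigl(W(\cdot \mid x)\,\big\|\,Q'\bigr), \qquad
R_f(W) = \min_{Q'} \max_{x} D_f\bigl(W(\cdot \mid x)\,\big\|\,Q'\bigr).
\]

Under these conventions, the choice f(t) = t log t recovers Shannon's mutual information, and the minimax identity max_P I_f(P, W) = R_f(W) is the sense in which the f-radius generalizes channel capacity, as stated in the abstract.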

Original language: English
Pages (from-to): 191-213
Number of pages: 23
Journal: Periodica Mathematica Hungarica
Volume: 2
Issue number: 1-4
DOIs
Publication status: Published - Mar 1 1972

ASJC Scopus subject areas

  • Mathematics (all)
