### Abstract

A class of numerical measures of the informativity of observation channels or statistical experiments is defined with the aid of f-divergences, introduced by the author as measures of the difference between two probability distributions. For observation channels with given prior probabilities, the f-informativity measures generalize Shannon's mutual information and also include Gallager's function E_0(ρ, Q), which appears in the derivation of the error exponent for noisy channels. For observation channels without prior probabilities, the suggested informativity measures admit the geometric interpretation of a radius. The f-informativity defined for the Bayesian case shares several useful properties of the mutual information, such as the "data processing theorem". Its maximum over all possible prior distributions is shown by a minimax argument to be exactly the f-radius, so the latter is a generalization of channel capacity. The f-informativity measures can also be used to characterize the statistical sufficiency of indirect observations.
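The construction described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's formal development: for a discrete channel W with rows `channel[x]` = W(·|x) and a prior p, the f-informativity is taken as the prior-weighted f-divergence of each conditional output distribution from the output marginal; with f(t) = t·log t it reduces to Shannon's mutual information (in nats). The function names and the convention of skipping zero-probability terms are this sketch's assumptions.

```python
import math

def f_divergence(p, q, f):
    # D_f(p || q) = sum_y q(y) * f(p(y)/q(y)); zero-mass terms of q skipped
    # (a simplifying convention of this sketch)
    return sum(qy * f(py / qy) for py, qy in zip(p, q) if qy > 0)

def f_informativity(prior, channel, f):
    # I_f = sum_x prior(x) * D_f(W(.|x) || output marginal),
    # where channel[x][y] = W(y|x)
    n_out = len(channel[0])
    marginal = [sum(prior[x] * channel[x][y] for x in range(len(prior)))
                for y in range(n_out)]
    return sum(prior[x] * f_divergence(channel[x], marginal, f)
               for x in range(len(prior)))

# f(t) = t log t turns D_f into the Kullback-Leibler divergence,
# so I_f becomes the mutual information
kl = lambda t: t * math.log(t) if t > 0 else 0.0

# Binary symmetric channel with crossover probability 0.1, uniform prior
bsc = [[0.9, 0.1], [0.1, 0.9]]
mi = f_informativity([0.5, 0.5], bsc, kl)
```

For the binary symmetric channel above, `mi` agrees with the classical formula log 2 − H_b(0.1) in nats, consistent with the abstract's claim that mutual information is the special case f(t) = t·log t. Other convex f (e.g. f(t) = |t − 1| for total variation) can be passed in unchanged.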

| Original language | English |
| ---|--- |
| Pages (from-to) | 191-213 |
| Number of pages | 23 |
| Journal | Periodica Mathematica Hungarica |
| Volume | 2 |
| Issue number | 1-4 |
| DOIs | |
| Publication status | Published - Mar 1 1972 |

### ASJC Scopus subject areas

- Mathematics (all)