Distribution Estimation Consistent in Total Variation and in Two Types of Information Divergence

Andrew R. Barron, László Györfi, Edward C. van der Meulen

Research output: Contribution to journal › Article

71 Citations (Scopus)

Abstract

The problem of the nonparametric estimation of a probability distribution is considered from three viewpoints: consistency in total variation, consistency in information divergence, and consistency in reversed order information divergence. These are relatively strong criteria of convergence, and a probability distribution cannot be consistently estimated in any of these senses without some restriction on the class of probability distributions allowed. Histogram-based estimators of the distribution are presented which, under certain conditions, converge to the unknown probability distribution in total variation, in information divergence, and in reversed order information divergence. Some a priori information about the true probability distribution is assumed in each case. Since consistency in information divergence is stronger than consistency in total variation, additional assumptions are imposed in the cases of the information divergences.
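The abstract refers to histogram-based estimators and to divergence criteria taken in both orderings. The following is a minimal illustrative sketch, not the paper's construction or its conditions: it builds a plain equal-width histogram estimate from a simulated sample and compares it to the discretized truth in total variation and in the Kullback-Leibler divergence in both orderings. The Beta(2, 5) truth, the sample size, and the bin count are arbitrary assumptions chosen for illustration.

```python
# Illustrative sketch only: an equal-width histogram estimate on [0, 1] and its
# distance to an assumed true distribution, measured in total variation and in
# the Kullback-Leibler (information) divergence taken in both orderings.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)
n, m = 5000, 50                       # assumed sample size and number of bins
true_a, true_b = 2.0, 5.0             # assumed Beta(2, 5) truth, for illustration

sample = rng.beta(true_a, true_b, size=n)
edges = np.linspace(0.0, 1.0, m + 1)
counts, _ = np.histogram(sample, bins=edges)
p_hat = counts / n                    # estimated probability of each bin

# Discretize the true distribution onto the same partition.
p_true = np.diff(beta.cdf(edges, true_a, true_b))


def kl(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions;
    infinite if q assigns zero mass to a cell that p charges."""
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))


# Total variation distance between the two distributions on the partition.
tv = 0.5 * np.sum(np.abs(p_hat - p_true))

print(f"TV distance            : {tv:.4f}")
print(f"D(true || estimate)    : {kl(p_true, p_hat):.4f}")
print(f"D(estimate || true)    : {kl(p_hat, p_true):.4f}")
```

Note that either divergence is infinite whenever the histogram leaves empty a cell the true distribution charges, which is one reason divergence consistency requires additional assumptions beyond those needed for total variation, as the abstract indicates.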

Original language: English
Pages (from-to): 1437-1454
Number of pages: 18
Journal: IEEE Transactions on Information Theory
Volume: 38
Issue number: 5
DOIs
Publication status: Published - Sep 1992

Keywords

  • Consistent distribution estimation
  • histogram-based estimate
  • information divergence
  • reversed order information divergence
  • total variation

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
