Density-free convergence properties of various estimators of entropy

László Györfi, Edward C. van der Meulen

Research output: Contribution to journal › Article

48 Citations (Scopus)

Abstract

Let f(x) be a probability density function, x ∈ R^d. The Shannon (or differential) entropy is defined as H(f) = -∫ f(x) log f(x) dx. In this paper we propose, based on a random sample X1, ..., Xn generated from f, two new nonparametric estimators of H(f). Both entropy estimators are histogram-based in the sense that they involve a histogram-based density estimator f̂n. We prove their a.s. consistency under the sole condition on f that H(f) is finite.
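
The paper's estimators are not reproduced on this page. As a rough illustration of the plug-in idea the abstract describes, the Python sketch below computes -∫ f̂n log f̂n for a one-dimensional histogram density estimate f̂n. The function name histogram_entropy, the √n bin rule, and the restriction to d = 1 are illustrative assumptions, not the paper's construction, whose partition conditions are more delicate.

```python
import numpy as np

def histogram_entropy(sample, n_bins=None):
    """Plug-in estimate of H(f) = -integral f(x) log f(x) dx from a 1-D
    sample, via a histogram density estimate (illustrative sketch only)."""
    sample = np.asarray(sample, dtype=float)
    n = sample.size
    if n_bins is None:
        n_bins = max(1, int(np.sqrt(n)))  # heuristic partition size (assumption)
    counts, edges = np.histogram(sample, bins=n_bins)
    widths = np.diff(edges)
    p = counts / n          # empirical probability of each histogram cell
    nz = p > 0              # empty cells contribute nothing to the sum
    # The histogram density estimate is p_j / width_j on cell j, so
    # -integral f_hat log f_hat = -sum_j p_j * log(p_j / width_j).
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

# Example: for N(0, 1) the true differential entropy is
# 0.5 * log(2 * pi * e) ~ 1.4189.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
print(histogram_entropy(x))
```

Note that the consistency proved in the paper requires only that H(f) be finite; the √n bin count above is merely a convenient heuristic for the demonstration.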

Original language: English
Pages (from-to): 425-436
Number of pages: 12
Journal: Computational Statistics and Data Analysis
Volume: 5
Issue number: 4
DOIs
Publication status: Published - Sep 1987

Keywords

  • Differential entropy
  • Histogram-based density estimates
  • L1 and almost sure convergence
  • Nonparametric entropy estimation

ASJC Scopus subject areas

  • Statistics and Probability
  • Computational Mathematics
  • Computational Theory and Mathematics
  • Applied Mathematics
