Almost sure consistency of a general class of entropy estimators

László Györfi, Edward C. van der Meulen

Research output: Contribution to conference › Paper

Abstract

Summary form only given, as follows. Let f(x) be a probability density function, x ∈ R. The Shannon (or differential) entropy is defined as H(f) = -∫f(x) log f(x) dx, integrated from x = -∞ to +∞. The authors propose, based on a random sample X1, ..., Xn generated from f, two new nonparametric estimators for H(f). Both entropy estimators are histogram-based in the sense that they involve a histogram-based density estimator fn(x). They prove the almost sure consistency of these new entropy estimators with the only condition on f being that H(f) is finite. Subsequently, they determine which additional properties one should impose on an L1-consistent density estimator fn(x) (not necessarily histogram-based) such that the corresponding empirical entropies are almost surely consistent.
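To illustrate the general scheme the abstract describes, the following is a minimal Python sketch of a histogram-based plug-in (resubstitution) entropy estimator: it builds a histogram density estimate fn from the sample and averages -log fn(Xi) over the same sample. The fixed bin width, the function name, and the resubstitution form are assumptions for illustration; this is not the paper's exact construction.

```python
import numpy as np

def histogram_entropy_estimate(sample, bin_width):
    """Plug-in entropy estimate -(1/n) * sum_i log fn(X_i), where fn is a
    histogram density estimate built from the same sample (an assumed,
    generic variant; not necessarily the estimator in the paper)."""
    x = np.asarray(sample, dtype=float)
    n = x.size
    # Partition the data range into cells of width `bin_width`.
    edges = np.arange(x.min(), x.max() + 2 * bin_width, bin_width)
    counts, edges = np.histogram(x, bins=edges)
    # Histogram density estimate: fn(x) = count(cell of x) / (n * bin_width).
    density = counts / (n * bin_width)
    # Evaluate fn at each sample point via its cell index; each sample's own
    # cell has count >= 1, so fn(X_i) > 0 and the logarithm is well defined.
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, counts.size - 1)
    return -np.mean(np.log(density[idx]))

# Example: for a standard normal density, H(f) = 0.5 * log(2*pi*e) ≈ 1.4189.
rng = np.random.default_rng(0)
print(histogram_entropy_estimate(rng.standard_normal(100_000), bin_width=0.1))
```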

Original language: English
Number of pages: 1
Publication status: Published - Dec 1, 1988

ASJC Scopus subject areas

  • Engineering (all)
