Summary form only given, as follows. Let f(x) be a probability density function, x ∈ ℝ. The Shannon (or differential) entropy is defined as H(f) = -∫ f(x) log f(x) dx, integrated from x = -∞ to +∞. Based on a random sample X1, ..., Xn drawn from f, the authors propose two new nonparametric estimators of H(f). Both entropy estimators are histogram-based in the sense that they involve a histogram-based density estimator fn(x). They prove the almost sure consistency of these new entropy estimators under the sole condition that H(f) is finite. Subsequently, they determine which additional properties an L1-consistent density estimator fn(x) (not necessarily histogram-based) must satisfy for the corresponding empirical entropies to be almost surely consistent.
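The abstract does not specify the authors' estimators in detail, but the general idea of a histogram-based plug-in entropy estimate can be sketched as follows. This is an illustrative sketch only, not the paper's construction: the function name, the bin-count rule, and the use of NumPy are all assumptions.

```python
import numpy as np

def histogram_entropy(sample, bins=None):
    """Plug-in estimate of H(f) = -int f(x) log f(x) dx from a
    histogram density estimate fn(x) (illustrative, not the paper's
    exact estimator)."""
    n = len(sample)
    if bins is None:
        bins = int(np.ceil(np.sqrt(n)))  # a common rule-of-thumb bin count
    counts, edges = np.histogram(sample, bins=bins)
    widths = np.diff(edges)
    p = counts / n                # empirical probability of each bin
    mask = p > 0                  # convention: 0 * log 0 = 0
    fn = p[mask] / widths[mask]   # height of the histogram density fn(x)
    # H_hat = -sum over bins of p_i * log fn_i
    return -np.sum(p[mask] * np.log(fn))

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
# For N(0,1) the true entropy is 0.5 * log(2*pi*e) ~ 1.4189,
# so the estimate should land in that vicinity for large n.
print(histogram_entropy(x))
```

The almost-sure consistency result cited in the abstract says that, under the condition H(f) finite, estimators of this histogram-based type converge to H(f) as n grows.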
Number of pages: 1
Publication status: Published - Dec 1 1988