### Abstract

Summary form only given. Let f(x) be a probability density function, x ∈ R. The Shannon (or differential) entropy is defined as H(f) = -∫f(x) log f(x) dx, integrated from x = -∞ to +∞. The authors propose, based on a random sample X_{1}, ..., X_{n} drawn from f, two new nonparametric estimators of H(f). Both entropy estimators are histogram-based in the sense that they involve a histogram-based density estimator f_{n}(x). The authors prove the almost sure consistency of these new entropy estimators under the sole condition that H(f) is finite. They then determine which additional properties an L_{1}-consistent density estimator f_{n}(x) (not necessarily histogram-based) must satisfy so that the corresponding empirical entropies are almost surely consistent.
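The abstract does not spell out the estimators, but the general histogram plug-in idea can be sketched as follows: replace f with a histogram density estimate f_n and evaluate -∫ f_n log f_n. For a histogram with bin probabilities p_i and bin widths w_i, the density estimate is f_n = p_i / w_i on bin i, so the plug-in entropy reduces to -Σ p_i log(p_i / w_i). The function name and the √n bin-count rule below are illustrative choices, not details taken from the paper.

```python
import numpy as np

def histogram_entropy(sample, n_bins=None):
    """Plug-in estimate of the differential entropy
    H(f) = -∫ f(x) log f(x) dx (natural log, so nats),
    using a histogram density estimator f_n."""
    sample = np.asarray(sample, dtype=float)
    n = sample.size
    if n_bins is None:
        # Illustrative bin-count rule; the paper does not prescribe one.
        n_bins = max(1, int(round(np.sqrt(n))))
    counts, edges = np.histogram(sample, bins=n_bins)
    widths = np.diff(edges)
    p = counts / n          # empirical bin probabilities
    mask = p > 0            # empty bins contribute nothing to the sum
    # f_n equals p_i / w_i on bin i, hence -sum p_i * log(p_i / w_i)
    return -np.sum(p[mask] * np.log(p[mask] / widths[mask]))

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
est = histogram_entropy(x)
# True entropy of N(0,1) is 0.5 * log(2*pi*e) ≈ 1.4189 nats,
# so est should be close to that for a large sample.
```

As the sample size grows (and the bins shrink appropriately), such a plug-in estimate converges to H(f); the paper's contribution is proving almost sure consistency under the minimal condition that H(f) is finite.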

| Original language | English |
| --- | --- |
| Number of pages | 1 |
| Publication status | Published - Dec 1 1988 |

### ASJC Scopus subject areas

- Engineering(all)

### Cite this

*Almost sure consistency of a general class of entropy estimators*.