The Shannon entropy of a random variable X with density function f(x) is defined as H(f) = - ∫ f(x) log f(x) dx. A nonparametric estimator of H(f), based on randomly censored observations, is proposed for the case where H(f) is finite and nonnegative. The estimator is histogram-based in the sense that it involves a histogram-based density estimator f_n constructed from the censored data. We prove the a.s. (almost sure) consistency of this estimator.
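The plug-in idea behind a histogram-based entropy estimator can be sketched as follows. This is a minimal illustration for *uncensored* data only; the function name `histogram_entropy` and the parameter `n_bins` are hypothetical, and the paper's actual estimator additionally accounts for random censoring, which this sketch omits.

```python
import math
import random

def histogram_entropy(sample, n_bins=20):
    """Plug-in entropy estimate H_n = -sum_j f_n(b_j) log f_n(b_j) * width,
    where f_n is a histogram density estimator.  Illustrative sketch only:
    it ignores censoring, unlike the estimator discussed in the abstract."""
    lo, hi = min(sample), max(sample)
    width = (hi - lo) / n_bins or 1.0  # guard against a degenerate sample
    counts = [0] * n_bins
    for x in sample:
        # clamp the rightmost point into the last bin
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    n = len(sample)
    h = 0.0
    for c in counts:
        if c:
            f = c / (n * width)  # histogram density height on this bin
            h -= f * math.log(f) * width
    return h

random.seed(0)
# Exponential(1) sample: the true entropy is H(f) = 1
sample = [random.expovariate(1.0) for _ in range(5000)]
print(histogram_entropy(sample))
```

On a large Exponential(1) sample the estimate should be close to the true entropy H(f) = 1, illustrating the consistency property that the paper establishes (under censoring) for its estimator.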
|Number of pages||11|
|Journal||Problems of control and information theory|
|Publication status||Published - Dec 1 1991|