Universal coding of non-discrete sources based on distribution estimation consistent in expected information divergence

Andrew R. Barron, L. Györfi, Edward C. van der Meulen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

An attempt is made to show how a given distribution estimator that is consistent in expected information divergence leads to a universal code for the class of all probability measures on a given non-discrete space which are dominated in I-divergence by a known probability measure.
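
The connection asserted in the abstract can be sketched by the standard redundancy identity. The display below is a hedged reconstruction under the usual assumptions (a memoryless source $P$ with density $p$ and an estimate $\hat{P}_n$ with density $\hat{p}_n$ built from $X_1,\dots,X_n$, both with respect to the known dominating measure, and ideal codelengths $-\log \hat{p}_n$); it is not a quotation of the paper's argument, and all symbols are introduced here for illustration:

% Hedged sketch, not quoted from the paper: the excess expected codelength of a
% code matched to the estimate \hat{P}_n equals the expected information
% divergence, so consistency of the estimator in expected I-divergence makes
% the per-symbol redundancy vanish.
\[
  \mathbb{E}\bigl[-\log \hat{p}_n(X_{n+1})\bigr]
  \;-\; \mathbb{E}\bigl[-\log p(X_{n+1})\bigr]
  \;=\; \mathbb{E}\, D\bigl(P \,\|\, \hat{P}_n\bigr)
  \;\longrightarrow\; 0
  \qquad (n \to \infty),
\]

so the per-symbol redundancy of the induced code tends to zero for every $P$ dominated in I-divergence by the known reference measure, which is what makes the code universal over that class.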

Original language: English
Title of host publication: Proceedings of the 1993 IEEE International Symposium on Information Theory
Publisher: Publ by IEEE
Pages: 51
Number of pages: 1
ISBN (Print): 0780308786
Publication status: Published - 1993
Event: 1993 IEEE International Symposium on Information Theory - San Antonio, TX, USA
Duration: Jan 17, 1993 – Jan 22, 1993

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Barron, A. R., Györfi, L., & van der Meulen, E. C. (1993). Universal coding of non-discrete sources based on distribution estimation consistent in expected information divergence. In Proceedings of the 1993 IEEE International Symposium on Information Theory (p. 51). Publ by IEEE.

@inproceedings{66412c567993483e849880aad4379ace,
title = "Universal coding of non-discrete sources based on distribution estimation consistent in expected information divergence",
abstract = "An attempt is made to show how a certain given distribution estimator, which is consistent in expected information divergence, leads to a universal code for the class of all probability measures on a given nondiscrete space which are dominated in I-divergence by a known given probability measure.",
author = "Barron, {Andrew R.} and L. Gy{\"o}rfi and {van der Meulen}, {Edward C.}",
year = "1993",
language = "English",
isbn = "0780308786",
pages = "51",
booktitle = "Proceedings of the 1993 IEEE International Symposium on Information Theory",
publisher = "Publ by IEEE",

}

TY - GEN

T1 - Universal coding of non-discrete sources based on distribution estimation consistent in expected information divergence

AU - Barron, Andrew R.

AU - Györfi, L.

AU - van der Meulen, Edward C.

PY - 1993

Y1 - 1993

N2 - An attempt is made to show how a certain given distribution estimator, which is consistent in expected information divergence, leads to a universal code for the class of all probability measures on a given nondiscrete space which are dominated in I-divergence by a known given probability measure.

AB - An attempt is made to show how a certain given distribution estimator, which is consistent in expected information divergence, leads to a universal code for the class of all probability measures on a given nondiscrete space which are dominated in I-divergence by a known given probability measure.

UR - http://www.scopus.com/inward/record.url?scp=0027210219&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0027210219&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0027210219

SN - 0780308786

SP - 51

BT - Proceedings of the 1993 IEEE International Symposium on Information Theory

PB - Publ by IEEE

ER -