Information theoretic methods in probability and statistics

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Some applications of information theoretic ideas within mathematics are discussed, drawn from the field of probability and mathematical statistics. One of the first significant applications of information theory (IT) to probability was Hájek's proof that two Gaussian measures are either mutually absolutely continuous or mutually singular, according as their I-divergence is finite or infinite. Starting with the work of Sanov, I-divergence became a key concept in what is now known as large deviations theory. An information theoretic approach to statistics was first put forward by Kullback, based on the concept of I-divergence. Some IT-based techniques related to universal hypothesis testing and the analysis of contingency tables are reviewed.
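The I-divergence central to the abstract is what is now usually called the Kullback-Leibler divergence, D(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ). A minimal sketch of its computation for finite discrete distributions (the function name `i_divergence` and the sample distributions are our own illustration, not from the paper); note how the finite/infinite dichotomy mentioned in the abstract shows up already in the discrete case, where D(P‖Q) is infinite exactly when P puts mass somewhere Q does not:

```python
import math

def i_divergence(p, q):
    """I-divergence (Kullback-Leibler divergence) D(P||Q) between two
    discrete probability distributions given as sequences of probabilities.

    Returns math.inf when P is not absolutely continuous with respect
    to Q, i.e. when P puts positive mass where Q puts none.
    """
    d = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue  # 0 * log(0/q) = 0 by the usual convention
        if qi == 0.0:
            return math.inf  # P not absolutely continuous w.r.t. Q
        d += pi * math.log(pi / qi)
    return d

# D(P||P) = 0, and D is asymmetric in general
p = [0.5, 0.5]
q = [0.9, 0.1]
```

Here `i_divergence(p, p)` is 0, `i_divergence(p, q)` is positive and finite, and swapping in a `q` with a zero where `p` is positive yields infinity.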

Original language: English
Title of host publication: IEEE International Symposium on Information Theory - Proceedings
Publisher: IEEE
Pages: 2
Number of pages: 1
Publication status: Published - 1997
Event: Proceedings of the 1997 IEEE International Symposium on Information Theory - Ulm, Germany
Duration: Jun 29, 1997 to Jul 4, 1997



ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Applied Mathematics
  • Modelling and Simulation
  • Theoretical Computer Science
  • Information Systems

Cite this

Csiszár, I. (1997). Information theoretic methods in probability and statistics. In IEEE International Symposium on Information Theory - Proceedings (p. 2). IEEE.

@inproceedings{704e6fbed6bd4f2bada3e8ee8c02fd37,
title = "Information theoretic methods in probability and statistics",
abstract = "Some applications of information theoretic ideas within mathematics are discussed, drawn from the field of probability and mathematical statistics. One of the first significant applications of information theory (IT) to probability was Hájek's proof that two Gaussian measures are either mutually absolutely continuous or mutually singular, according as their I-divergence is finite or infinite. Starting with the work of Sanov, I-divergence became a key concept in what is now known as large deviations theory. An information theoretic approach to statistics was first put forward by Kullback, based on the concept of I-divergence. Some IT-based techniques related to universal hypothesis testing and the analysis of contingency tables are reviewed.",
author = "I. Csisz{\'a}r",
year = "1997",
language = "English",
pages = "2",
booktitle = "IEEE International Symposium on Information Theory - Proceedings",
publisher = "IEEE",

}
