### Abstract

Some applications of information-theoretic ideas within mathematics are discussed, drawn from probability and mathematical statistics. One of the first significant applications of information theory (IT) to probability was Hajek's proof that two Gaussian measures are either mutually absolutely continuous or mutually singular, according as their I-divergence is finite or infinite. Starting with the work of Sanov, I-divergence became a key concept in what is now known as large deviations theory. An information-theoretic approach to statistics, based on the concept of I-divergence, was first put forward by Kullback. Some IT-based techniques related to universal hypothesis testing and the analysis of contingency tables are reviewed.
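As a concrete illustration of the I-divergence central to this abstract, the following is a minimal sketch (the function name and example values are ours, not from the paper) of its closed form for two univariate Gaussian measures. Note that for nondegenerate univariate Gaussians the divergence is always finite; the finite-or-infinite dichotomy proved by Hajek becomes nontrivial for infinite-dimensional (process-level) Gaussian measures.

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """I-divergence (Kullback-Leibler divergence) D(P || Q), in nats,
    between P = N(mu1, sigma1^2) and Q = N(mu2, sigma2^2)."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# Identical measures: divergence is 0 (mutual absolute continuity is trivial).
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))   # 0.0

# Shifted mean: divergence is positive but finite.
print(kl_gaussian(1.0, 1.0, 0.0, 1.0))   # 0.5
```

The divergence is asymmetric in general (`kl_gaussian(a, b, c, d)` need not equal `kl_gaussian(c, d, a, b)`), which is why it is a divergence rather than a metric.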

Original language | English
---|---
Title of host publication | IEEE International Symposium on Information Theory - Proceedings
Publisher | IEEE
Pages | 2
Number of pages | 1
Publication status | Published - 1997
Event | Proceedings of the 1997 IEEE International Symposium on Information Theory, Ulm, Germany; Jun 29 1997 → Jul 4 1997

### Other

Other | Proceedings of the 1997 IEEE International Symposium on Information Theory
---|---
City | Ulm, Germany
Period | 6/29/97 → 7/4/97


### ASJC Scopus subject areas

- Electrical and Electronic Engineering
- Applied Mathematics
- Modelling and Simulation
- Theoretical Computer Science
- Information Systems

### Cite this

**Information theoretic methods in probability and statistics.** / Csiszár, I.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Csiszár, I. (1997). Information theoretic methods in probability and statistics. In *IEEE International Symposium on Information Theory - Proceedings* (p. 2). IEEE. Proceedings of the 1997 IEEE International Symposium on Information Theory, Ulm, Germany, 6/29/97.

TY - GEN

T1 - Information theoretic methods in probability and statistics

AU - Csiszár, I.

PY - 1997

Y1 - 1997

N2 - Some applications of information-theoretic ideas within mathematics are discussed, drawn from probability and mathematical statistics. One of the first significant applications of information theory (IT) to probability was Hajek's proof that two Gaussian measures are either mutually absolutely continuous or mutually singular, according as their I-divergence is finite or infinite. Starting with the work of Sanov, I-divergence became a key concept in what is now known as large deviations theory. An information-theoretic approach to statistics, based on the concept of I-divergence, was first put forward by Kullback. Some IT-based techniques related to universal hypothesis testing and the analysis of contingency tables are reviewed.

AB - Some applications of information-theoretic ideas within mathematics are discussed, drawn from probability and mathematical statistics. One of the first significant applications of information theory (IT) to probability was Hajek's proof that two Gaussian measures are either mutually absolutely continuous or mutually singular, according as their I-divergence is finite or infinite. Starting with the work of Sanov, I-divergence became a key concept in what is now known as large deviations theory. An information-theoretic approach to statistics, based on the concept of I-divergence, was first put forward by Kullback. Some IT-based techniques related to universal hypothesis testing and the analysis of contingency tables are reviewed.

UR - http://www.scopus.com/inward/record.url?scp=0030677832&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030677832&partnerID=8YFLogxK

M3 - Conference contribution

SP - 2

BT - IEEE International Symposium on Information Theory - Proceedings

PB - IEEE

ER -