Conditional Limit Theorems under Markov Conditioning

Imre Csiszár, Thomas M. Cover, Byoung Seon Choi

Research output: Contribution to journal › Letter

69 Citations (Scopus)

Abstract

Let X_1, X_2, … be independent identically distributed random variables taking values in a finite set X, and consider the conditional joint distribution of the first m elements of the sample X_1, …, X_n on the condition that X_1 = x_1 and the sliding block sample average of a function h(·,·) defined on X² exceeds a threshold α > Eh(X_1, X_2). For m fixed and n → ∞, this conditional joint distribution is shown to converge to the m-step joint distribution of a Markov chain started in x_1 which is closest to X_1, X_2, … in Kullback-Leibler information divergence among all Markov chains whose two-dimensional stationary distribution P(·,·) satisfies Σ_{x,y} P(x, y)h(x, y) ≥ α, provided some distribution P on X² having equal marginals does satisfy this constraint with strict inequality. Similar conditional limit theorems are obtained when X_1, X_2, … is an arbitrary finite-order Markov chain and more general conditioning is allowed.
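
To make the statement concrete, here is a minimal numerical sketch (ours, not from the paper): for binary i.i.d. X_i with h(x, y) = xy, it conditions Monte Carlo samples on the sliding-block average exceeding α and compares the empirical first-step transition frequencies against the exponentially tilted Markov chain minimizing the divergence subject to Σ P(x, y)h(x, y) ≥ α. The Perron-Frobenius tilting used to build the minimizer is the standard construction for such I-projections; all function names below are our own, and agreement is only approximate at finite n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite alphabet {0, 1}, i.i.d. uniform base law, h(x, y) = xy.
P = np.array([0.5, 0.5])
h = np.array([[0.0, 0.0], [0.0, 1.0]])
alpha = 0.4  # threshold above E h(X1, X2) = 0.25

def tilted_chain(beta):
    """Transition matrix built from the tilt M(x, y) = P(y) exp(beta h(x, y))."""
    M = P[None, :] * np.exp(beta * h)
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    lam, r = vals.real[k], np.abs(vecs[:, k].real)  # Perron root and eigenvector
    Q = M * r[None, :] / (lam * r[:, None])         # rows sum to 1 since M r = lam r
    return Q / Q.sum(axis=1, keepdims=True)         # guard tiny numerical drift

def stationary_pair_mean(Q):
    """E[h] under the stationary pair law pi(x) Q(x, y)."""
    vals, vecs = np.linalg.eig(Q.T)
    pi = np.abs(vecs[:, np.argmax(vals.real)].real)
    pi /= pi.sum()
    return float((pi[:, None] * Q * h).sum())

# Bisect on beta until the pair constraint holds with (near) equality.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if stationary_pair_mean(tilted_chain(mid)) < alpha:
        lo = mid
    else:
        hi = mid
Q_star = tilted_chain(0.5 * (lo + hi))

# Monte Carlo: keep samples whose sliding-block average of h exceeds alpha.
n, trials = 60, 200_000
X = rng.integers(0, 2, size=(trials, n), dtype=np.int8)
avg = (X[:, :-1] * X[:, 1:]).mean(axis=1)
sel = X[avg >= alpha]

for x1 in (0, 1):
    rows = sel[sel[:, 0] == x1]
    freq = np.bincount(rows[:, 1], minlength=2) / len(rows)
    print(f"x1={x1}: conditional P(X2=. | event) ~ {freq.round(3)}, "
          f"tilted chain row: {Q_star[x1].round(3)}")
```

For moderate n the empirical conditional transition frequencies already track the tilted chain's rows closely; the theorem concerns the n → ∞ limit with m fixed.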

Original language: English
Pages (from-to): 788-801
Number of pages: 14
Journal: IEEE Transactions on Information Theory
Volume: 33
Issue number: 6
DOIs
Publication status: Published - Nov 1987

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
