### Abstract

Let X₁, X₂, … be independent, identically distributed random variables taking values in a finite set 𝒳, and consider the conditional joint distribution of the first m elements of the sample X₁, …, Xₙ, on the condition that X₁ = x₁ and the sliding block sample average of a function h defined on 𝒳² exceeds a threshold a > Eh(X₁, X₂). For m fixed and n → ∞, this conditional joint distribution is shown to converge to the m-step joint distribution of a Markov chain started in x₁ that is closest to X₁, X₂, … in Kullback–Leibler information divergence among all Markov chains whose two-dimensional stationary distribution P(·, ·) satisfies Σ P(x, y)h(x, y) ≥ a, provided some distribution P on 𝒳² having equal marginals satisfies this constraint with strict inequality. Similar conditional limit theorems are obtained when X₁, X₂, … is an arbitrary finite-order Markov chain and more general conditioning is allowed.
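In the abstract's notation, the limit statement can be sketched as follows. This is a schematic reconstruction from the abstract alone; the exact hypotheses and the form of the minimizing chain are given in the paper itself.

```latex
% Conditioning event: sliding block average of h over consecutive pairs
% exceeds the threshold a > E h(X_1, X_2).
\Pr\!\Big( X_1 = y_1, \dots, X_m = y_m \,\Big|\,
  X_1 = x_1,\ \frac{1}{n-1} \sum_{i=1}^{n-1} h(X_i, X_{i+1}) \ge a \Big)
\;\xrightarrow[n \to \infty]{}\;
\mathbf{1}\{ y_1 = x_1 \} \prod_{i=1}^{m-1} Q^{*}(y_{i+1} \mid y_i),
```

where Q* denotes the transition kernel of the Markov chain that minimizes the Kullback–Leibler divergence from the i.i.d. process among all stationary Markov chains whose pair distribution P(·, ·) satisfies Σ P(x, y)h(x, y) ≥ a.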

| Original language | English |
| --- | --- |
| Pages (from-to) | 788-801 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 33 |
| Issue number | 6 |
| DOIs | 10.1109/TIT.1987.1057385 |
| Publication status | Published - Nov 1987 |

### ASJC Scopus subject areas

- Information Systems
- Computer Science Applications
- Library and Information Sciences

## Fingerprint

Research topics of 'Conditional Limit Theorems under Markov Conditioning'.

## Cite this

Conditional Limit Theorems under Markov Conditioning. *IEEE Transactions on Information Theory*, *33*(6), 788-801. https://doi.org/10.1109/TIT.1987.1057385