Let X1, X2, … be independent identically distributed random variables taking values in a finite set 𝒳, and consider the conditional joint distribution of the first m elements of the sample X1, …, Xn, conditioned on the event that X1 = x1 and the sliding-block sample average of a function h defined on 𝒳² exceeds a threshold α > Eh(X1, X2). For m fixed and n → ∞, this conditional joint distribution is shown to converge to the m-step joint distribution of a Markov chain started in x1 that is closest to X1, X2, … in Kullback–Leibler information divergence among all Markov chains whose two-dimensional stationary distribution P(·, ·) satisfies Σ_{x,y} P(x, y)h(x, y) ≥ α, provided some distribution P on 𝒳² having equal marginals satisfies this constraint with strict inequality. Similar conditional limit theorems are obtained when X1, X2, … is an arbitrary finite-order Markov chain and more general conditioning is allowed.
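The divergence-minimizing Markov chain in such problems is commonly constructed by exponentially tilting the i.i.d. law toward larger values of h via the Perron–Frobenius eigendata of a tilted kernel. Below is a minimal NumPy sketch of that standard construction, not the paper's own derivation; the function names and the example alphabet/function (binary 𝒳, h(x, y) = (x + y)/2) are illustrative assumptions.

```python
import numpy as np

def tilted_markov_chain(Q, h, lam):
    """Transition matrix of the Markov chain obtained by exponentially
    tilting the i.i.d. distribution Q with parameter lam >= 0 toward
    larger values of h.  (Illustrative helper; this is the usual
    Perron-Frobenius tilting construction, not code from the paper.)"""
    A = Q[None, :] * np.exp(lam * h)            # A(x, y) = Q(y) e^{lam h(x, y)}
    w, V = np.linalg.eig(A)
    i = np.argmax(w.real)                       # Perron eigenvalue rho ...
    rho, r = w[i].real, np.abs(V[:, i].real)    # ... and right eigenvector r > 0
    # P(y | x) = A(x, y) r(y) / (rho r(x)); rows sum to 1 since A r = rho r.
    return A * r[None, :] / (rho * r[:, None])

def stationary_pair_mean(P, h):
    """E[h(X, Y)] under the stationary pair distribution mu(x) P(x, y)."""
    w, V = np.linalg.eig(P.T)
    mu = np.abs(V[:, np.argmax(w.real)].real)
    mu /= mu.sum()
    return float((mu[:, None] * P * h).sum())

# Example: binary alphabet, uniform Q, h(x, y) = (x + y) / 2, so Eh = 0.5.
x = np.array([0.0, 1.0])
h = (x[:, None] + x[None, :]) / 2
Q = np.array([0.5, 0.5])

P0 = tilted_markov_chain(Q, h, 0.0)   # lam = 0 recovers the i.i.d. chain
P1 = tilted_markov_chain(Q, h, 2.0)   # lam > 0 pushes the pair mean above Eh
```

Increasing lam raises the stationary pair mean of h continuously, so the constraint Σ P(x, y)h(x, y) ≥ α can be met with equality by tuning lam (e.g. by bisection).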
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences