On estimating the memory for finitarily Markovian processes

G. Morvai, Benjamin Weiss

Research output: Contribution to journal › Article

12 Citations (Scopus)

Abstract

Finitarily Markovian processes are those processes $\{X_n\}_{n=-\infty}^{\infty}$ for which there is a finite $K$ ($K = K(\{X_n\}_{n=-\infty}^{0})$) such that the conditional distribution of $X_1$ given the entire past is equal to the conditional distribution of $X_1$ given only $\{X_n\}_{n=1-K}^{0}$. The least such value of $K$ is called the memory length. We give a rather complete analysis of the problems of universally estimating the least such value of $K$, both in the backward sense that we have just described and in the forward sense, where one observes successive values of $\{X_n\}$ for $n \ge 0$ and asks for the least value $K$ such that the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=n-K+1}^{n}$ is the same as the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=-\infty}^{n}$. We allow for finite or countably infinite alphabet size.
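
For readability, the definitions in the abstract can be restated in display notation. This is only a transcription of the abstract's wording, not an additional result from the paper; $\mathbb{P}$ denotes the law of the stationary process, and the convention that $K = 0$ corresponds to the empty context is an assumption of this transcription.

% Backward memory length: the least K for which the last K coordinates of the
% past already determine the conditional distribution of X_1.
\[
K\bigl(\{X_n\}_{n=-\infty}^{0}\bigr)
  \;=\; \min\Bigl\{\, K \ge 0 \;:\;
        \mathbb{P}\bigl(X_1 \in \cdot \,\bigm|\, \{X_n\}_{n=-\infty}^{0}\bigr)
        \;=\;
        \mathbb{P}\bigl(X_1 \in \cdot \,\bigm|\, \{X_n\}_{n=1-K}^{0}\bigr) \,\Bigr\}.
\]

% Forward sense: after observing X_0, X_1, ..., X_n, one asks for the least
% K such that
\[
\mathbb{P}\bigl(X_{n+1} \in \cdot \,\bigm|\, \{X_i\}_{i=n-K+1}^{n}\bigr)
  \;=\;
  \mathbb{P}\bigl(X_{n+1} \in \cdot \,\bigm|\, \{X_i\}_{i=-\infty}^{n}\bigr).
\]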

Original language: English
Pages (from-to): 15-30
Number of pages: 16
Journal: Annales de l'institut Henri Poincare (B) Probability and Statistics
Volume: 43
Issue number: 1
DOIs: 10.1016/j.anihpb.2005.11.001
Publication status: Published - Jan 2007

Keywords

  • Nonparametric estimation
  • Stationary processes

ASJC Scopus subject areas

  • Statistics and Probability

Cite this

On estimating the memory for finitarily Markovian processes. / Morvai, G.; Weiss, Benjamin.

In: Annales de l'institut Henri Poincare (B) Probability and Statistics, Vol. 43, No. 1, 01.2007, p. 15-30.

Research output: Contribution to journal › Article

@article{5e9f4bc85f8948e984a2dc9264d9979c,
title = "On estimating the memory for finitarily Markovian processes",
abstract = "Finitarily Markovian processes are those processes $\{X_n\}_{n=-\infty}^{\infty}$ for which there is a finite $K$ ($K = K(\{X_n\}_{n=-\infty}^{0})$) such that the conditional distribution of $X_1$ given the entire past is equal to the conditional distribution of $X_1$ given only $\{X_n\}_{n=1-K}^{0}$. The least such value of $K$ is called the memory length. We give a rather complete analysis of the problems of universally estimating the least such value of $K$, both in the backward sense that we have just described and in the forward sense, where one observes successive values of $\{X_n\}$ for $n \ge 0$ and asks for the least value $K$ such that the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=n-K+1}^{n}$ is the same as the conditional distribution of $X_{n+1}$ given $\{X_i\}_{i=-\infty}^{n}$. We allow for finite or countably infinite alphabet size.",
keywords = "Nonparametric estimation, Stationary processes",
author = "G. Morvai and Benjamin Weiss",
year = "2007",
month = jan,
doi = "10.1016/j.anihpb.2005.11.001",
language = "English",
volume = "43",
pages = "15--30",
journal = "Annales de l'institut Henri Poincare (B) Probability and Statistics",
issn = "0246-0203",
publisher = "Institute of Mathematical Statistics",
number = "1",
}
