M. Malyutov
SCOT Approximation and Asymptotic Inference I
Approximation
of stationary strongly mixing processes by SCOT models and the Le Cam-Hajek-Ibragimov-Khasminsky locally minimax theory
of statistical inference for them are outlined. SCOT is
an m-Markov model with sparse memory
structure. In our previous IP papers we proved the equivalence of SCOT to a 1-Markov
chain whose state space (alphabet) consists of the SCOT contexts. For
a fixed alphabet size and growing sample size, Local Asymptotic Normality
is proved and applied to establish asymptotically optimal inference. We
outline the obstacles that arise when the SCOT alphabet size is large and the
sample size is not necessarily vast.
KEYWORDS: strong mixing, strongly stationary sequences,
Local Asymptotic Normality, Local Asymptotic Minimaxity, SCOT models, Edgeworth
expansion