### Abstract

Markov chains of higher order are popular models for a wide variety of applications in natural language and DNA sequence processing. However, since the number of parameters grows exponentially with the order of a Markov chain, several alternative model classes have been proposed that allow for stability and a higher rate of data compression. The notion common to these models is that they cluster the possible sample paths used to predict the next state into invariance classes with identical conditional distributions assigned to the same class. The models vary in particular with respect to the constraints imposed on legitimate partitions of the sample paths. Here we consider the class of sparse Markov chains for which the partition is left unconstrained a priori. A recursive computation scheme based on Delaunay triangulation of the parameter space is introduced to enable fast approximation of the posterior mode partition. Comparisons with stochastic optimization, k-means and nearest neighbor algorithms show that our approach is both considerably faster and leads on average to a more accurate estimate of the underlying partition. We show additionally that the criterion used in the recursive steps for comparison of triangulation cell contents leads to consistent estimation of the local structure in the sparse Markov model.

| Original language | English |
| --- | --- |
| Journal | Bayesian Analysis |
| Volume | 11 |
| Issue number | 1 |
| Pages (from-to) | 247-263 |
| Number of pages | 17 |
| ISSN | 1931-6690 |
| DOIs | https://doi.org/10.1214/15-BA949 |
| Publication status | Published - 2016 |
| MoE publication type | A1 Journal article-refereed |

### Fields of Science

- 112 Statistics and probability
- Clustering
- Recursive learning
- Sequence analysis
- Sparse Markov chains

### Cite this

Xiong, J., Jääskinen, V., & Corander, J. (2016). Recursive learning for sparse Markov models. *Bayesian Analysis*, *11*(1), 247-263. https://doi.org/10.1214/15-BA949

Xiong, J, Jääskinen, V & Corander, J 2016, 'Recursive learning for sparse Markov models', *Bayesian Analysis*, vol. 11, no. 1, pp. 247-263. https://doi.org/10.1214/15-BA949

**Recursive learning for sparse Markov models.** / Xiong, Jie; Jääskinen, Väinö; Corander, Jukka.
In: *Bayesian Analysis*, Vol. 11, No. 1, 2016, p. 247-263.

Research output: Contribution to journal › Article › Scientific › peer-review

TY - JOUR

T1 - Recursive learning for sparse Markov models

AU - Xiong, Jie

AU - Jääskinen, Väinö

AU - Corander, Jukka

PY - 2016

Y1 - 2016

N2 - Markov chains of higher order are popular models for a wide variety of applications in natural language and DNA sequence processing. However, since the number of parameters grows exponentially with the order of a Markov chain, several alternative model classes have been proposed that allow for stability and a higher rate of data compression. The notion common to these models is that they cluster the possible sample paths used to predict the next state into invariance classes with identical conditional distributions assigned to the same class. The models vary in particular with respect to the constraints imposed on legitimate partitions of the sample paths. Here we consider the class of sparse Markov chains for which the partition is left unconstrained a priori. A recursive computation scheme based on Delaunay triangulation of the parameter space is introduced to enable fast approximation of the posterior mode partition. Comparisons with stochastic optimization, k-means and nearest neighbor algorithms show that our approach is both considerably faster and leads on average to a more accurate estimate of the underlying partition. We show additionally that the criterion used in the recursive steps for comparison of triangulation cell contents leads to consistent estimation of the local structure in the sparse Markov model.

AB - Markov chains of higher order are popular models for a wide variety of applications in natural language and DNA sequence processing. However, since the number of parameters grows exponentially with the order of a Markov chain, several alternative model classes have been proposed that allow for stability and a higher rate of data compression. The notion common to these models is that they cluster the possible sample paths used to predict the next state into invariance classes with identical conditional distributions assigned to the same class. The models vary in particular with respect to the constraints imposed on legitimate partitions of the sample paths. Here we consider the class of sparse Markov chains for which the partition is left unconstrained a priori. A recursive computation scheme based on Delaunay triangulation of the parameter space is introduced to enable fast approximation of the posterior mode partition. Comparisons with stochastic optimization, k-means and nearest neighbor algorithms show that our approach is both considerably faster and leads on average to a more accurate estimate of the underlying partition. We show additionally that the criterion used in the recursive steps for comparison of triangulation cell contents leads to consistent estimation of the local structure in the sparse Markov model.

KW - 112 Statistics and probability

KW - Clustering

KW - Recursive learning

KW - Sequence analysis

KW - Sparse Markov chains

U2 - 10.1214/15-BA949

DO - 10.1214/15-BA949

M3 - Article

VL - 11

SP - 247

EP - 263

JO - Bayesian Analysis

JF - Bayesian Analysis

SN - 1931-6690

IS - 1

ER -
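
The abstract's central idea, clustering the contexts (sample paths) of a higher-order Markov chain into invariance classes with shared conditional distributions, can be sketched in a few lines. This is an illustrative toy, not the authors' Delaunay-triangulation scheme: it estimates empirical conditional distributions for every order-2 context of a simulated DNA-like sequence, then groups contexts whose (rounded) distributions agree. The sequence, alphabet, and grouping rule are all assumptions made for the demo.

```python
from collections import Counter, defaultdict
from itertools import product
import random

ALPHABET = "ACGT"

def context_distributions(seq, order):
    """Empirical P(next symbol | preceding `order` symbols) for every context.

    An order-m chain over k symbols has k**m contexts, each with its own
    conditional distribution, which is why the parameter count grows
    exponentially with the order.
    """
    counts = defaultdict(Counter)
    for i in range(order, len(seq)):
        counts[seq[i - order:i]][seq[i]] += 1
    dists = {}
    for ctx in map("".join, product(ALPHABET, repeat=order)):
        c, total = counts[ctx], sum(counts[ctx].values())
        # Uniform fallback for contexts never observed in the sequence.
        dists[ctx] = tuple(c[a] / total if total else 1 / len(ALPHABET) for a in ALPHABET)
    return dists

# Toy sequence whose transition law depends only on the last symbol, so the
# 16 order-2 contexts collapse into a handful of invariance classes.
rng = random.Random(1)
seq = ["A"]
step = {"A": "CCCG", "C": "AATT", "G": "AATT", "T": "GGGA"}
for _ in range(20000):
    seq.append(rng.choice(step[seq[-1]]))
seq = "".join(seq)

dists = context_distributions(seq, order=2)

# Naive grouping: contexts whose empirical conditionals agree after rounding
# to the nearest 0.25 share an invariance class. This crude rule stands in
# for the clustering step (the paper compares k-means and nearest-neighbor
# baselines against its recursive Delaunay-based scheme).
classes = defaultdict(list)
for ctx, d in sorted(dists.items()):
    classes[tuple(round(4 * p) / 4 for p in d)].append(ctx)
for key, members in classes.items():
    print(members, "->", key)
```

On this toy chain the grouping recovers the true structure: contexts ending in the same symbol (plus the never-observed contexts, which fall back to the uniform distribution) end up in the same class, so four classes suffice where a full order-2 model would fit sixteen conditional distributions.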