Pruning Rules for Learning Parsimonious Context Trees

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-reviewed


We give a novel algorithm for finding a parsimonious context tree (PCT) that best fits a given data set.
PCTs extend traditional context trees by allowing context-specific grouping of the states of a context variable, and by allowing the variable to be skipped altogether.
However, they gain statistical efficiency at the cost of computational efficiency, as the search space of PCTs is of tremendous size.
We propose pruning rules based on efficiently computable score upper bounds with the aim of reducing this search space significantly.
While our concrete bounds exploit properties of the BIC score, the ideas apply also to other scoring functions.
Empirical results show that our algorithm is typically an order of magnitude faster than a recently proposed memory-intensive algorithm, or, alternatively, about equally fast while using dramatically less memory.
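The abstract does not spell out the pruning mechanism, but the general idea it names, discarding candidates whose score upper bound cannot beat the incumbent, can be sketched as follows. This is a generic best-first branch-and-bound loop, not the paper's actual PCT algorithm; the functions `score` and `upper_bound` are hypothetical stand-ins for an expensive exact score (such as BIC) and a cheap optimistic bound on it.

```python
def search_with_pruning(candidates, score, upper_bound):
    """Return the best-scoring candidate, pruning by an upper bound.

    Assumes upper_bound(c) >= score(c) for every candidate c, with
    upper_bound cheap to evaluate and score expensive. (Hypothetical
    interface for illustration; not the paper's concrete algorithm.)
    """
    best, best_score = None, float("-inf")
    # Visit the most promising candidates first so the bound prunes early.
    for c in sorted(candidates, key=upper_bound, reverse=True):
        if upper_bound(c) <= best_score:
            break  # sorted order: no remaining candidate can improve
        s = score(c)
        if s > best_score:
            best, best_score = c, s
    return best, best_score


# Toy usage: candidates are integers, the "score" peaks at 3, and the
# bound overestimates each score by 0.5 (a valid optimistic bound).
best, best_score = search_with_pruning(
    candidates=[1, 2, 3, 4],
    score=lambda c: -abs(c - 3),
    upper_bound=lambda c: -abs(c - 3) + 0.5,
)
```

Because candidates are visited in decreasing order of the bound, the `break` is safe: once the bound of the current candidate falls below the best exact score found, no later candidate can do better, so the rest of the search space is pruned without being scored.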
Original language: English
Title of host publication: Uncertainty in Artificial Intelligence: Proceedings of the Thirty-Second Conference (2016), June 25-29, 2016, Jersey City, New Jersey, USA
Editors: Alexander Ihler, Dominik Janzing
Number of pages: 10
Place of publication: Corvallis, Oregon
Publisher: AUAI Press
Publication date: 2016
ISBN (Print): 978-0-9966431-1-5
Publication status: Published - 2016
MoE publication type: A4 Article in conference proceedings
Event: Conference on Uncertainty in Artificial Intelligence - Jersey City, New Jersey, United States
Duration: 25 Jun 2016 - 29 Jun 2016
Conference number: 32

Fields of Science

  • 113 Computer and information sciences
