Abstract
Recently, two search algorithms, A* and breadth-first branch and bound (BFBnB), were developed based on a simple admissible heuristic for learning Bayesian network structures that optimize a scoring function. The heuristic represents a relaxation of the learning problem in which each variable chooses optimal parents independently. As a result, the relaxed solution may contain many directed cycles, resulting in a loose bound. This paper introduces an improved admissible heuristic that tries to avoid directed cycles within small groups of variables. A sparse representation is also introduced to store only the unique optimal parent choices. Empirical results show that the new techniques significantly improved the efficiency and scalability of A* and BFBnB on most of the datasets tested in this paper.
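To make the relaxation concrete, here is a minimal sketch of the simple heuristic the abstract describes: each variable independently takes its cheapest parent set, ignoring that the combined choices may form directed cycles. The function name, score values, and two-variable example are illustrative assumptions, not taken from the paper.

```python
def simple_heuristic(variables, local_score):
    # For each variable, take the cheapest parent set independently,
    # ignoring acyclicity (lower scores are better). The relaxation never
    # overestimates the true optimum, so it is admissible, but mutually
    # "best" parents can form directed cycles and loosen the bound.
    return sum(min(local_score[v].values()) for v in variables)

# Hypothetical local scores: each variable scores best with the other as
# its parent, so the independent choices form the cycle A <-> B.
scores = {
    "A": {frozenset(): 10.0, frozenset({"B"}): 3.0},
    "B": {frozenset(): 10.0, frozenset({"A"}): 3.0},
}
h = simple_heuristic(["A", "B"], scores)  # relaxed bound: 3 + 3 = 6
```

In this toy example any acyclic network must leave one variable parentless, so the true optimum is 10 + 3 = 13; the relaxed bound of 6 is valid (it never exceeds 13) but loose, which is exactly the weakness the paper's improved heuristic targets by forbidding cycles within small groups of variables.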
Original language | English |
---|---|
Title of host publication | Proceedings of the 28th Conference on Uncertainty in Artificial Intelligence |
Number of pages | 10 |
Publication date | 2012 |
Status | Published - 2012 |
OKM publication type | A4 Article in conference proceedings |
Event | Conference on Uncertainty in Artificial Intelligence - Catalina Island, United States. Duration: 15 Aug 2012 → 17 Aug 2012. Conference number: 28 |
Fields of Science
- 113 Computer and information sciences