Abstract
Recently, two search algorithms, A* and breadth-first branch and bound (BFBnB), were developed for learning Bayesian network structures that optimize a scoring function, both based on a simple admissible heuristic. The heuristic relaxes the learning problem by letting each variable choose its optimal parents independently. As a result, the relaxed solution may contain many directed cycles, and the resulting bound can be loose. This paper introduces an improved admissible heuristic that avoids directed cycles within small groups of variables. A sparse representation is also introduced that stores only the unique optimal parent choices. Empirical results show that the new techniques significantly improve the efficiency and scalability of A* and BFBnB on most of the datasets tested in this paper.
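The relaxation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes local scores are costs to be minimized, and the function and variable names (`simple_heuristic`, `toy_score`) are hypothetical. Each variable independently picks its cheapest parent set from among all other variables, ignoring acyclicity, which is why the relaxed solution can contain directed cycles.

```python
from itertools import chain, combinations

def simple_heuristic(variables, local_score):
    """Relaxed lower bound: each variable independently chooses its
    best-scoring (lowest-cost) parent set from among all the other
    variables, with no acyclicity constraint."""
    total = 0.0
    for x in variables:
        others = [v for v in variables if v != x]
        # Enumerate every subset of the other variables as a candidate
        # parent set (exponential; real implementations prune heavily).
        candidates = chain.from_iterable(
            combinations(others, r) for r in range(len(others) + 1))
        total += min(local_score(x, frozenset(ps)) for ps in candidates)
    return total

# Hypothetical toy scores: for each variable, the single other variable
# is the cheapest parent set, so the relaxed solution is the directed
# cycle A -> B -> A and the bound is loose.
def toy_score(x, parents):
    return 0.2 if len(parents) == 1 else 1.0

bound = simple_heuristic(["A", "B"], toy_score)  # 0.2 + 0.2 = 0.4
```

In this toy case any acyclic network must give at least one variable the empty parent set, so the best DAG costs 1.0 + 0.2 = 1.2, while the relaxed bound is only 0.4. Avoiding cycles within small groups of variables, as the paper proposes, tightens exactly this kind of gap.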
Original language | English |
---|---|
Title of host publication | Proceedings of the 28th Conference on Uncertainty in Artificial Intelligence |
Number of pages | 10 |
Publication date | 2012 |
Publication status | Published - 2012 |
MoE publication type | A4 Article in conference proceedings |
Event | Conference on Uncertainty in Artificial Intelligence, Catalina Island, United States. Duration: 15 Aug 2012 → 17 Aug 2012. Conference number: 28 |
Fields of Science
- 113 Computer and information sciences