Abstract
We present a method for learning Bayesian networks from data sets containing
thousands of variables without the need for structure constraints. Our approach
consists of two parts. The first is a novel algorithm that effectively explores the
space of possible parent sets of a node. It guides the exploration towards the
most promising parent sets on the basis of an approximate score function
computed in constant time. The second part is an improvement of an existing
ordering-based algorithm for structure optimization. The new algorithm provably
achieves a higher score than its original formulation. Our novel approach
consistently outperforms the state of the art on very large data sets.
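The abstract refers to scoring candidate parent sets of a node. As a point of reference (this is not the paper's approximate score, which the abstract does not specify), structure-learning methods of this kind typically evaluate parent sets with a decomposable score such as BIC. The sketch below, with invented toy data and variable names, shows how a BIC score for one node under a candidate parent set can be computed from discrete data:

```python
# Hedged illustration: BIC score of one node given a candidate parent set,
# the kind of decomposable score used in Bayesian network structure learning.
# The data and variable indices are invented for this sketch; this is NOT
# the approximate score function proposed in the paper.
import math
from collections import Counter

def bic_score(data, child, parents, arities):
    """BIC score of `child` given `parents` over rows of discrete `data`."""
    n = len(data)
    # Joint counts of (parent configuration, child value) and marginal
    # counts of each parent configuration.
    joint = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    marginal = Counter(tuple(row[p] for p in parents) for row in data)
    # Maximum-likelihood log-likelihood term.
    loglik = sum(c * math.log(c / marginal[pa]) for (pa, _), c in joint.items())
    # Complexity penalty: free parameters = (r_child - 1) * prod of parent arities.
    q = 1
    for p in parents:
        q *= arities[p]
    penalty = 0.5 * math.log(n) * (arities[child] - 1) * q
    return loglik - penalty

# Toy data: each row assigns values to binary variables 0, 1, 2.
data = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0), (0, 0, 0), (1, 1, 1)]
arities = [2, 2, 2]
# Compare the empty parent set against {0, 1} as parents of variable 2.
print(bic_score(data, 2, (), arities))
print(bic_score(data, 2, (0, 1), arities))
```

Because the score decomposes over nodes, exploring the space of parent sets amounts to evaluating many such local scores, which is why a fast (here exact, in the paper approximated in constant time) score computation matters at scale.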
| Original language | English |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 28 (NIPS 2015) |
| Editors | C. Cortes, N.D. Lawrence, D.D. Lee, M. Sugiyama, R. Garnett |
| Publisher | NIPS Foundation, Inc. |
| Pages | 1855-1863 |
| Number of pages | 9 |
| Publication status | Published - Dec 2015 |
| Event | NIPS 2015, Montreal, Canada; 07 Dec 2015 → 12 Dec 2015 |
Conference
| Conference | NIPS 2015 |
|---|---|
| Country/Territory | Canada |
| City | Montreal |
| Period | 07/12/2015 → 12/12/2015 |