Abstract
By accounting for context-specific independences, the size of a model can be drastically reduced, making the underlying inference problem more manageable. Switched probabilistic relational models contain explicit context-specific independences. To efficiently answer multiple queries in such models, we combine the advantages of propositional gate models for context-specific independences with the lifted junction tree algorithm for answering multiple queries in probabilistic relational models. Specifically, this paper contributes (i) variable elimination in gate models, (ii) applying the lifting idea to gate models, which yields switched probabilistic relational models and enables lifted variable elimination, and (iii) the switched lifted junction tree algorithm for answering multiple queries in such models efficiently. Empirical results show that exploiting context-specific independence significantly speeds up even lifted inference.
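To illustrate the core idea behind gates, here is a minimal sketch (not the paper's algorithm, and with made-up toy numbers): a switch variable S selects which factor connects A to B, so in the context S = 0 the variable B is independent of A, and elimination in that context never has to touch A at all.

```python
from itertools import product

# Hypothetical toy gate model: P(S) chooses which factor governs B.
p_s = {0: 0.7, 1: 0.3}            # P(S), the switch / gate selector
p_a = {0: 0.4, 1: 0.6}            # P(A)
p_b_given_a = {                    # active only in context S = 1
    (0, 0): 0.9, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.8,
}
p_b_flat = {0: 0.5, 1: 0.5}        # active only in context S = 0 (B ⊥ A)

def marginal_b_enumeration():
    """Brute force: sum the full joint over S and A."""
    out = {0: 0.0, 1: 0.0}
    for s, a, b in product((0, 1), repeat=3):
        pb = p_b_given_a[(a, b)] if s == 1 else p_b_flat[b]
        out[b] += p_s[s] * p_a[a] * pb
    return out

def marginal_b_gated():
    """Context-specific elimination: per context, only the active
    factors are multiplied, so the S = 0 branch never touches A."""
    out = {0: 0.0, 1: 0.0}
    for b in (0, 1):
        # context S = 0: B is independent of A; summing A out is trivial
        out[b] += p_s[0] * p_b_flat[b]
        # context S = 1: eliminate A inside the active factor only
        out[b] += p_s[1] * sum(p_a[a] * p_b_given_a[(a, b)]
                               for a in (0, 1))
    return out
```

Both routines compute the same marginal P(B), but the gated version works with smaller per-context factors; the savings grow with the number of variables that become independent in each context.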
Original language | English |
---|---|
Title of host publication | AI 2019: Advances in Artificial Intelligence |
Editors | Jixue Liu, James Bailey |
Number of pages | 13 |
Volume | 11919 LNAI |
Publisher | Springer, Cham |
Publication date | 25.11.2019 |
Pages | 104-116 |
ISBN (Print) | 978-3-030-35287-5 |
ISBN (Electronic) | 978-3-030-35288-2 |
DOIs | |
Publication status | Published - 25.11.2019 |
Event | 32nd Australasian Joint Conference on Artificial Intelligence, Adelaide, Australia, 02.12.2019 – 05.12.2019 (Conference number: 234489)
Research Areas and Centers
- Centers: Center for Artificial Intelligence Luebeck (ZKIL)
- Research Area: Intelligent Systems