Efficient Multiple Query Answering in Switched Probabilistic Relational Models

Marcel Gehrke*, Tanya Braun, Ralf Möller

*Corresponding author for this work

Abstract

By accounting for context-specific independences, the size of a model can be drastically reduced, thereby making the underlying inference problem more manageable. Switched probabilistic relational models contain explicit context-specific independences. To efficiently answer multiple queries in switched probabilistic relational models, we combine the advantages of propositional gate models for context-specific independences and the lifted junction tree algorithm for answering multiple queries in probabilistic relational models. Specifically, this paper contributes (i) variable elimination in gate models, (ii) applying the lifting idea to gate models, defining switched probabilistic relational models, enabling lifted variable elimination in computations, and (iii) the switched lifted junction tree algorithm to answer multiple queries in such models efficiently. Empirical results show that using context-specific independence speeds up even lifted inference significantly.
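To make the gate idea behind the paper concrete, below is a minimal, hedged sketch of propositional variable elimination with a single gate: a factor over A is only "switched on" when a context variable G holds and collapses to a neutral potential otherwise, so ordinary variable elimination can still sum the gate out. All variable names and potentials here are invented for illustration and are not taken from the paper; the lifting step and the junction tree construction are omitted.

# Minimal sketch (assumed example, not the authors' implementation):
# variable elimination over discrete factors with one gate variable G.
from itertools import product

class Factor:
    """Discrete factor over binary variables, stored as assignment -> potential."""
    def __init__(self, variables, table):
        self.variables = list(variables)   # ordered variable names
        self.table = dict(table)           # tuple of values -> float

    def __mul__(self, other):
        """Pointwise product over the union of both scopes."""
        joint_vars = self.variables + [v for v in other.variables
                                       if v not in self.variables]
        table = {}
        for assignment in product((0, 1), repeat=len(joint_vars)):
            env = dict(zip(joint_vars, assignment))
            a1 = tuple(env[v] for v in self.variables)
            a2 = tuple(env[v] for v in other.variables)
            table[assignment] = self.table[a1] * other.table[a2]
        return Factor(joint_vars, table)

    def sum_out(self, var):
        """Eliminate 'var' by summing it out of the factor."""
        idx = self.variables.index(var)
        rest = [v for v in self.variables if v != var]
        table = {}
        for assignment, value in self.table.items():
            key = tuple(v for i, v in enumerate(assignment) if i != idx)
            table[key] = table.get(key, 0.0) + value
        return Factor(rest, table)

# Gate construct: the factor over A only applies in the context G = 1;
# for G = 0 it contributes a vacuous potential of 1 (the gate is "off").
phi_A_given_G = Factor(["G", "A"], {
    (1, 1): 0.9, (1, 0): 0.1,   # gate on: phi_A applies
    (0, 1): 1.0, (0, 0): 1.0,   # gate off: neutral factor
})
phi_G = Factor(["G"], {(1,): 0.3, (0,): 0.7})

# Variable elimination for the query P(A): multiply all factors, sum out G, normalise.
joint = phi_G * phi_A_given_G
marginal_A = joint.sum_out("G")
total = sum(marginal_A.table.values())
print({a: v / total for a, v in marginal_A.table.items()})

Encoding the gate as a factor that is neutral whenever its context is false keeps standard variable elimination applicable; the paper's contribution is to exploit such context-specific structure directly, and to lift it, rather than multiplying it into full factors as done in this toy sketch.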

Original language: English
Title of host publication: AI 2019: Advances in Artificial Intelligence
Editors: Jixue Liu, James Bailey
Number of pages: 13
Volume: 11919 LNAI
Publisher: Springer, Cham
Publication date: 25.11.2019
Pages: 104-116
ISBN (Print): 978-3-030-35287-5
ISBN (Electronic): 978-3-030-35288-2
Publication status: Published - 25.11.2019
Event: 32nd Australasian Joint Conference on Artificial Intelligence - Adelaide, Australia
Duration: 02.12.2019 - 05.12.2019
Conference number: 234489

Research Areas and Centers

  • Centers: Center for Artificial Intelligence Luebeck (ZKIL)
  • Research Area: Intelligent Systems
