Efficient Multiple Query Answering in Switched Probabilistic Relational Models

Marcel Gehrke*, Tanya Braun, Ralf Möller

*Corresponding author for this work

Abstract

By accounting for context-specific independences, the size of a model can be drastically reduced, thereby making the underlying inference problem more manageable. Switched probabilistic relational models contain explicit context-specific independences. To efficiently answer multiple queries in switched probabilistic relational models, we combine the advantages of propositional gate models for context-specific independences and the lifted junction tree algorithm for answering multiple queries in probabilistic relational models. Specifically, this paper contributes (i) variable elimination in gate models, (ii) applying the lifting idea to gate models, defining switched probabilistic relational models, enabling lifted variable elimination in computations, and (iii) the switched lifted junction tree algorithm to answer multiple queries in such models efficiently. Empirical results show that using context-specific independence speeds up even lifted inference significantly.
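To make the notion of a gate and of context-specific independence concrete, the following is a minimal propositional sketch, not the paper's lifted or switched algorithm: plain variable elimination over Boolean factors, where a gate factor is only active in the context that a selector variable takes a particular value. All names (S, A, B), the helper functions (multiply, sum_out, gate), and the probability tables are invented for illustration.

```python
from itertools import product

def multiply(f, g):
    """Pointwise product of two factors, each given as (variables, table)."""
    fv, ft = f
    gv, gt = g
    vs = list(dict.fromkeys(fv + gv))  # union of scopes, order preserved
    table = {}
    for assign in product([0, 1], repeat=len(vs)):
        a = dict(zip(vs, assign))
        table[assign] = ft[tuple(a[v] for v in fv)] * gt[tuple(a[v] for v in gv)]
    return vs, table

def sum_out(f, var):
    """Eliminate `var` from factor f by summing it out."""
    fv, ft = f
    idx = fv.index(var)
    keep = [v for v in fv if v != var]
    table = {}
    for assign, val in ft.items():
        key = tuple(x for i, x in enumerate(assign) if i != idx)
        table[key] = table.get(key, 0.0) + val
    return keep, table

def gate(selector, key, body):
    """Gate: `body` holds only in the context selector == key; otherwise the
    factor is vacuous (all ones), encoding context-specific independence."""
    bv, bt = body
    vs = [selector] + bv
    table = {}
    for assign in product([0, 1], repeat=len(vs)):
        a = dict(zip(vs, assign))
        table[assign] = bt[tuple(a[v] for v in bv)] if a[selector] == key else 1.0
    return vs, table

# Hypothetical model: switch S selects whether B depends on A (S=1) or not (S=0).
f_S = (["S"], {(0,): 0.4, (1,): 0.6})
f_A = (["A"], {(0,): 0.7, (1,): 0.3})
g0 = gate("S", 0, (["B"], {(0,): 0.5, (1,): 0.5}))            # context S=0: B independent of A
g1 = gate("S", 1, (["A", "B"], {(0, 0): 0.9, (0, 1): 0.1,
                                (1, 0): 0.2, (1, 1): 0.8}))   # context S=1: B depends on A

# Query P(B): multiply all factors, then eliminate A and S.
joint = multiply(multiply(multiply(f_S, f_A), g0), g1)
msg = sum_out(sum_out(joint, "A"), "S")
z = sum(msg[1].values())
print({b[0]: v / z for b, v in msg[1].items()})  # prints approximately {0: 0.614, 1: 0.386}
```

The point of the sketch is that in the context S=0 no joint table over A and B is needed at all, which is the kind of size reduction the abstract attributes to context-specific independence; the paper's contribution lifts this idea to the relational level and combines it with a junction tree so that work can be reused across multiple queries.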

Original language: English
Title: AI 2019: Advances in Artificial Intelligence
Editors: Jixue Liu, James Bailey
Number of pages: 13
Volume: 11919 LNAI
Publisher: Springer, Cham
Publication date: 25.11.2019
Pages: 104-116
ISBN (print): 978-3-030-35287-5
ISBN (electronic): 978-3-030-35288-2
DOIs
Publication status: Published - 25.11.2019
Event: 32nd Australasian Joint Conference on Artificial Intelligence - Adelaide, Australia
Duration: 02.12.2019 - 05.12.2019
Conference number: 234489

Strategic Research Areas and Centres

  • Centres: Zentrum für Künstliche Intelligenz Lübeck (ZKIL)
  • Cross-cutting area: Intelligent Systems
