How to Encode Dynamic Gaussian Bayesian Networks as Gaussian Processes?

Mattis Hartwig*, Ralf Möller

*Corresponding author for this work

Abstract

One-dimensional versions of the Markov chain and the hidden Markov model have been generalized as Gaussian processes. Currently, these approaches support only a single dimension, which limits their usability. In this paper we encode the more general dynamic Gaussian Bayesian network as a Gaussian process, thereby allowing an arbitrary number of dimensions and arbitrary connections between time steps. The developed Gaussian-process-based formalism has the advantages of supporting direct inference from any time point to any other without propagating evidence throughout the whole network, offering the flexibility to combine the covariance function with others if needed, and keeping all properties of the dynamic Gaussian Bayesian network.
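To make the idea concrete, here is a minimal sketch, not the paper's construction, for the simplest one-dimensional case: a stationary AR(1) Markov chain x_{t+1} = a·x_t + ε with ε ~ N(0, σ²) encoded as a Gaussian process via the closed-form covariance k(s, t) = a^|s−t| · σ² / (1 − a²). Conditioning the resulting joint Gaussian answers a query at any time point directly from evidence at any other time points, with no forward-backward message passing; all parameter values and function names below are illustrative assumptions.

```python
import numpy as np

def ar1_kernel(s, t, a=0.8, noise_var=1.0):
    # Covariance of the stationary AR(1) chain x_{t+1} = a * x_t + eps,
    # eps ~ N(0, noise_var):  k(s, t) = a^|s-t| * noise_var / (1 - a^2).
    # This closed form is what turns the Markov chain into a GP kernel.
    return (a ** np.abs(s - t)) * noise_var / (1.0 - a ** 2)

# Evidence at two time steps; query a third one directly, without
# propagating messages through the intermediate steps of the chain.
t_obs = np.array([0.0, 10.0])
y_obs = np.array([1.5, -0.5])
t_query = np.array([4.0])

K = ar1_kernel(t_obs[:, None], t_obs[None, :])         # evidence covariance
k_star = ar1_kernel(t_query[:, None], t_obs[None, :])  # query-evidence covariance

# Standard Gaussian conditioning: posterior mean and variance at the query time.
post_mean = k_star @ np.linalg.solve(K, y_obs)
post_var = ar1_kernel(t_query, t_query) - k_star @ np.linalg.solve(K, k_star.T)
print(post_mean, post_var)
```

The multidimensional case described in the paper generalizes this pattern: the network structure determines the covariance function, and inference between arbitrary time steps reduces to the same Gaussian conditioning.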

Original language: English
Title of host publication: AI 2020: Advances in Artificial Intelligence
Editors: Marcus Gallagher, Nour Moustafa, Erandi Lakshika
Number of pages: 12
Volume: 12576 LNAI
Publisher: Springer, Cham
Publication date: 27.11.2020
Pages: 371-382
ISBN (Print): 978-3-030-64983-8
ISBN (Electronic): 978-3-030-64984-5
Publication status: Published - 27.11.2020
Event: 33rd Australasian Joint Conference on Artificial Intelligence - Canberra, Australia
Duration: 29.11.2020 - 30.11.2020
Conference number: 252419

Research Areas and Centers

  • Centers: Center for Artificial Intelligence Luebeck (ZKIL)
  • Research Area: Intelligent Systems
