How to Encode Dynamic Gaussian Bayesian Networks as Gaussian Processes?

Mattis Hartwig*, Ralf Möller

*Corresponding author for this work

Abstract

One-dimensional versions of the Markov chain and the hidden Markov model have been generalized as Gaussian processes. Currently, these approaches support only a single dimension, which limits their usability. In this paper we encode the more general dynamic Gaussian Bayesian network as a Gaussian process, thus allowing an arbitrary number of dimensions and arbitrary connections between time steps. Our Gaussian-process-based formalism has the advantage of supporting direct inference from any time point to any other without propagating evidence throughout the whole network, of remaining flexible enough to combine its covariance function with others if needed, and of retaining all properties of the dynamic Gaussian Bayesian network.
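To illustrate the direct-inference property described above, the following minimal sketch conditions a Gaussian process on evidence at a few time steps and queries an arbitrary later time point in one step, without propagating through intermediate time slices. It is not the covariance function derived in the paper: a generic squared-exponential kernel stands in for the dynamic-Gaussian-Bayesian-network covariance, and the helper names (rbf_kernel, gp_posterior) are illustrative assumptions only.

    import numpy as np

    def rbf_kernel(t1, t2, variance=1.0, lengthscale=1.0):
        # Squared-exponential covariance between two vectors of time points.
        # Stand-in for the paper's DGBN-specific covariance function.
        d = t1[:, None] - t2[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    def gp_posterior(t_obs, y_obs, t_query, noise=1e-6, **kernel_args):
        # Posterior mean and covariance at t_query given evidence (t_obs, y_obs).
        K = rbf_kernel(t_obs, t_obs, **kernel_args) + noise * np.eye(len(t_obs))
        K_s = rbf_kernel(t_query, t_obs, **kernel_args)
        K_ss = rbf_kernel(t_query, t_query, **kernel_args)
        K_inv = np.linalg.inv(K)
        mean = K_s @ K_inv @ y_obs
        cov = K_ss - K_s @ K_inv @ K_s.T
        return mean, cov

    # Evidence at three time steps; query a distant time point directly.
    t_obs = np.array([0.0, 1.0, 2.0])
    y_obs = np.array([0.5, -0.2, 0.3])
    t_query = np.array([7.5])
    mean, cov = gp_posterior(t_obs, y_obs, t_query)
    print(mean, cov)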

Original language: English
Title: AI 2020: Advances in Artificial Intelligence
Editors: Marcus Gallagher, Nour Moustafa, Erandi Lakshika
Number of pages: 12
Volume: 12576 LNAI
Publisher: Springer, Cham
Publication date: 27.11.2020
Pages: 371-382
ISBN (print): 978-3-030-64983-8
ISBN (electronic): 978-3-030-64984-5
DOIs
Publication status: Published - 27.11.2020
Event: 33rd Australasian Joint Conference on Artificial Intelligence - Canberra, Australia
Duration: 29.11.2020 - 30.11.2020
Conference number: 252419

Strategic Research Areas and Centres

  • Centres: Zentrum für Künstliche Intelligenz Lübeck (ZKIL)
  • Cross-sectional area: Intelligent Systems

