Registration with probabilistic correspondences — Accurate and robust registration for pathological and inhomogeneous medical data

Julia Krüger, Sandra Schultz, Heinz Handels, Jan Ehrhardt

Abstract

The registration of two medical images is usually based on the assumption that corresponding regions exist in both images. If this assumption is violated, e.g. by pathologies, most approaches encounter problems. The proposed registration approach is based on probabilistic correspondences between sparse image representations and enables a robust handling of potentially missing correspondences. A maximum a-posteriori framework is used to derive an optimization criterion with respect to the deformation parameters that aims to reduce the shape and appearance differences between the registered images. A multi-resolution scheme speeds up the optimization and increases the robustness of the registration. The computed probabilistic correspondences enable the approach to deal with missing correspondences in the images; furthermore, they provide additional information about the quality of fit and about potentially non-corresponding or pathological image regions. The approach is compared to two state-of-the-art registration methods on MR brain and cardiac images: a variational intensity-based registration algorithm and a feature-based registration approach using a discrete optimization scheme. A comprehensive quantitative evaluation, including simulated stroke lesions, shows a significantly higher accuracy and robustness of the proposed approach. Furthermore, the correspondence probability maps are used to characterize pathological regions in the brain MR data.
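A generic sketch of such a maximum a-posteriori criterion may help make the abstract concrete (the notation below is illustrative only and not taken from the paper: $F = \{f_i\}$ and $M = \{m_j\}$ denote sparse feature representations of the fixed and moving image, $\theta$ the deformation parameters, and $z_i$ a latent variable assigning feature $f_i$ to a counterpart $m_j$):

$$\hat{\theta} \;=\; \arg\max_{\theta}\; p(F \mid M, \theta)\, p(\theta), \qquad p(F \mid M, \theta) \;=\; \prod_i \sum_j p(z_i = j)\; p\big(f_i \mid m_j, \theta\big).$$

Because the likelihood marginalizes over all candidate correspondences $z_i$, a feature without a plausible counterpart yields uniformly low posterior correspondence probabilities instead of forcing a wrong match; this is the mechanism by which missing correspondences (e.g. lesions) can be both tolerated during optimization and localized afterwards via correspondence probability maps.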

Original language: English
Article number: 102839
Journal: Computer Vision and Image Understanding
Volume: 190
ISSN: 1077-3142
DOIs
Publication status: Published - 01.2020
