Revisiting Iterative Highly Efficient Optimisation Schemes in Medical Image Registration

Lasse Hansen*, Mattias P. Heinrich

*Corresponding author for this work


3D registration remains one of the major challenges in medical imaging, especially when dealing with highly deformed anatomical structures such as those encountered in inter- or intra-patient registration of abdominal scans. In a recent MICCAI registration challenge (Learn2Reg), deep learning based network architectures with inference times of <2 s showed great success for supervised alignment tasks. In unsupervised settings, however, deep learning methods have not yet outperformed their conventional algorithmic counterparts based on continuous iterative optimisation (and probably will not, as they share the same objective function (image metric)). This finding has led us to revisit conventional optimisation schemes and to investigate an iterative message passing approach that enables fast runtimes (using iterative optimisation with only a few displacement candidates) and high registration accuracy. We conduct experiments on three challenging abdominal datasets ((pre-aligned) inter-patient CT, intra-patient MR-CT) and carry out an in-depth evaluation with a set of selected comparison methods. Our results clearly indicate that optimisation based methods are highly competitive with deep learning methods in both accuracy and runtime. Moreover, we show that semantic label information (when available) can be efficiently exploited by our approach (cf. weakly supervised learning). Data and code will be made publicly available to ensure reproducibility and accelerate research in the field of 3D medical registration.
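The core idea named in the abstract, iterative optimisation that evaluates only a small discrete set of displacement candidates per position, can be illustrated with a minimal 2D sketch. This is an assumption-laden toy, not the authors' implementation: the function names (`warp`, `register`), the SSD data term, and the simple neighbourhood averaging (standing in for the regularisation that proper message passing provides) are all illustrative choices.

```python
import numpy as np


def warp(img, flow):
    """Warp a 2D image by a (2, H, W) displacement field (nearest neighbour)."""
    H, W = img.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    y = np.clip(np.round(ys + flow[0]), 0, H - 1).astype(int)
    x = np.clip(np.round(xs + flow[1]), 0, W - 1).astype(int)
    return img[y, x]


def register(fixed, moving, iters=6, cand=(-1, 0, 1)):
    """Iterative discrete optimisation with few displacement candidates.

    Each iteration evaluates only len(cand)**2 candidate offsets per pixel
    (SSD data term), keeps the per-pixel minimiser, and then averages the
    field over a 4-neighbourhood as a crude stand-in for the smoothness
    term that message passing would enforce.
    """
    flow = np.zeros((2,) + fixed.shape)
    offsets = np.array([(dy, dx) for dy in cand for dx in cand])
    for _ in range(iters):
        # data term: SSD cost of every candidate offset at every pixel
        costs = np.stack([
            (warp(moving, flow + o.reshape(2, 1, 1)) - fixed) ** 2
            for o in offsets
        ])
        best = offsets[np.argmin(costs, axis=0)]   # (H, W, 2)
        flow = flow + best.transpose(2, 0, 1)
        # regularisation: average each component with its four neighbours
        flow = (flow
                + np.roll(flow, 1, axis=1) + np.roll(flow, -1, axis=1)
                + np.roll(flow, 1, axis=2) + np.roll(flow, -1, axis=2)) / 5.0
    return flow


# demo: recover a 2-pixel vertical shift between two Gaussian blobs
ys, xs = np.meshgrid(np.arange(24), np.arange(24), indexing="ij")
moving = np.exp(-((ys - 10) ** 2 + (xs - 12) ** 2) / 18.0)
fixed = np.exp(-((ys - 12) ** 2 + (xs - 12) ** 2) / 18.0)
flow = register(fixed, moving)
```

Because each iteration considers only a handful of candidates, the per-iteration cost stays low while repeated sweeps still cover large displacements, which is the efficiency argument the abstract makes.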

Original language: English
Title of host publication: International Conference on Medical Image Computing and Computer-Assisted Intervention
Publication date: 2021
Publication status: Published - 2021


