Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models

Dirk Fortmeier, Matthias Wilms, André Mastmeyer, Heinz Handels


This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed based on patient-specific 4D CT image data sequences. Virtual patient models are visualized in real-time by ray-casting-based rendering of a reference CT image warped by a time-variant displacement field, which is computed using the motion models at run-time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements, computed at high rendering rates, to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and to the haptic simulation of inserting a virtual bendable needle. To this end, different motion models that are applicable in real-time are presented, and the methods are integrated into a needle puncture training simulation framework that can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. It is shown that the presented methods achieve mean update rates around 2000 Hz for the haptic simulation and interactive frame rates for volume rendering, and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
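The core idea of the haptic coupling described above is to map the device position from the animated (breathing) world space back into the static reference CT space via the motion model's displacement field. The following is a minimal illustrative sketch of that mapping, not the authors' implementation: it assumes a displacement field u(x, t) relating reference to world coordinates, x_world = x_ref + u(x_ref, t), and inverts it by fixed-point iteration. The analytic toy field, its 5 mm amplitude, and the function names are hypothetical stand-ins for a real patient-specific motion model.

```python
import math

def displacement(x, t):
    """Toy respiratory displacement u(x, t): a small sinusoidal shift along
    the z axis, driven by the breathing phase t in [0, 1). A real motion
    model would sample a patient-specific displacement field here."""
    amp = 5.0 * math.sin(2.0 * math.pi * t)  # mm, assumed amplitude
    return (0.0, 0.0, amp)

def world_to_reference(x_world, t, iters=10):
    """Invert x_world = x_ref + u(x_ref, t) by fixed-point iteration:
    x_ref <- x_world - u(x_ref, t), starting from x_ref = x_world."""
    x_ref = x_world
    for _ in range(iters):
        u = displacement(x_ref, t)
        x_ref = tuple(w - d for w, d in zip(x_world, u))
    return x_ref

# Example: near peak displacement (t = 0.25), the device position is pulled
# back along z before it is used to probe the reference CT volume.
p_ref = world_to_reference((10.0, 20.0, 30.0), t=0.25)
print(p_ref)  # z reduced by about 5 mm relative to the world position
```

Because only the (cheap) displacement lookup runs per haptic step, this style of mapping is compatible with kilohertz-rate force loops while the volume itself stays static in memory.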
Original language: English
Journal: IEEE Transactions on Haptics
Pages (from-to): 371-383
Publication status: Published - 01.10.2015

