Patient localization for robotized ultrasound-guided radiation therapy

Ivo Kuhlemann, Philipp Jauer, Achim Schweikard, Floris Ernst


Accurate localization and tracking of moving targets is one of the major challenges in high-precision radiotherapy today. Typically, the position of the treatment target is determined using either infrequent X-ray images or cone-beam CT scans. A fundamentally different approach currently under active development uses ultrasound imaging to continuously track the target region. We have evaluated a robotized setup in which Microsoft’s Kinect v2 sensor is used to localize the patient and specific ultrasonic view ports previously defined in the planning CT. The setup was validated using an anthropomorphic torso phantom and four predefined view ports (apical and parasternal echocardiography, liver sonography, suprapubic prostate sonography). The Kinect sensor and an optical tracking system (used to determine the position of the torso phantom) were calibrated to the robot using the QR24 hand-eye calibration algorithm. Each view port was then approached fifteen times from different directions, yielding an average positioning accuracy of approximately 2.1 cm. This error can mostly be attributed to the difficulty of accurately calibrating the geometric relationship between the robot and the Kinect sensor. It was observed that the Kinect sensor suffers from substantial distortion in the centimeter range, severely compromising the accuracy of the whole setup.
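The QR24 hand-eye calibration mentioned above simultaneously estimates two rigid transforms, X and Y, from paired robot poses M_i and tracking measurements N_i satisfying M_i·X = Y·N_i, by treating the 24 unknown matrix entries (12 per transform) as a linear least-squares problem. The sketch below illustrates this linear formulation under that assumed model; the function names are illustrative, and the final re-orthonormalization of the rotation blocks (here done via SVD) is a common post-processing step, not necessarily the paper's exact procedure.

```python
import numpy as np

def qr24_calibrate(robot_poses, tracking_poses):
    """Hand-eye calibration in the style of QR24 (illustrative sketch).

    Solves M_i @ X = Y @ N_i for both X and Y at once by stacking the
    entries of their first three rows (24 unknowns) into one linear
    least-squares system, then projecting the rotation blocks onto SO(3).
    """
    A_rows, b_rows = [], []
    for M, N in zip(robot_poses, tracking_poses):
        for r in range(3):          # last row of each 4x4 transform is fixed
            for c in range(4):
                row = np.zeros(24)
                for k in range(3):  # unknown entries X[k, c], k = 0..2
                    row[4 * k + c] = M[r, k]
                for k in range(4):  # unknown entries Y[r, k], k = 0..3
                    row[12 + 4 * r + k] = -N[k, c]
                A_rows.append(row)
                # known term M[r, 3] * X[3, c] moves to the right-hand side
                b_rows.append(-M[r, 3] if c == 3 else 0.0)
    u, *_ = np.linalg.lstsq(np.array(A_rows), np.array(b_rows), rcond=None)

    X, Y = np.eye(4), np.eye(4)
    X[:3, :] = u[:12].reshape(3, 4)
    Y[:3, :] = u[12:].reshape(3, 4)
    # Least squares does not enforce orthogonality; project back onto SO(3).
    for T in (X, Y):
        U, _, Vt = np.linalg.svd(T[:3, :3])
        T[:3, :3] = U @ Vt
    return X, Y

def random_pose(rng):
    """Random rigid transform for testing (proper rotation + translation)."""
    Q, R = np.linalg.qr(rng.standard_normal((3, 3)))
    Q *= np.sign(np.diag(R))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1
    T = np.eye(4)
    T[:3, :3] = Q
    T[:3, 3] = rng.standard_normal(3)
    return T
```

With noise-free synthetic poses the system has a unique solution and both transforms are recovered exactly; in practice, errors in the Kinect and tracking measurements (such as the centimeter-range distortion reported above) propagate directly into X and Y.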
Original language: English
Number of pages: 8
Publication status: Published - 01.10.2015
Event: 18th International Conference on Medical Image Computing and Computer-Assisted Intervention - MICCAI 2015, Munich, Germany
Duration: 05.10.2015 - 09.10.2015



