Nearest neighbor 3D segmentation with context features

Evelin Hristova, Heinrich Schulz, Tom Brosch, Mattias P. Heinrich, Hannes Nickisch

Abstract

Automated and fast multi-label segmentation of medical images is challenging and clinically important. This paper builds upon a supervised machine learning framework that uses training data sets with dense organ annotations and vantage point trees to classify voxels in unseen images based on similarity of binary feature vectors extracted from the data. Without explicit model knowledge, the algorithm is applicable to different modalities and organs, and achieves high accuracy. The method is successfully tested on 70 abdominal CT and 42 pelvic MR images. With respect to ground truth, an average Dice overlap score of 0.76 for the CT segmentation of liver, spleen and kidneys is achieved. The mean score for the MR delineation of bladder, bones, prostate and rectum is 0.65. Additionally, we benchmark several variations of the main components of the method and reduce the computation time by up to 47% without significant loss of accuracy. The segmentation results are, for a nearest neighbor method, surprisingly accurate and robust, as well as data and time efficient.
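The abstract describes classifying each voxel of an unseen image by retrieving similar binary context feature vectors from annotated training data via a vantage point (VP) tree. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it builds a VP tree over bit-packed binary feature vectors under Hamming distance and returns the label of the nearest training sample. All names (BITS, build_vp_tree, nearest) and the toy data are assumptions made for illustration.

```python
import random

BITS = 64  # assumed length of the binary context feature vector


def hamming(a: int, b: int) -> int:
    """Hamming distance between two bit-packed binary feature vectors."""
    return bin(a ^ b).count("1")


class VPNode:
    __slots__ = ("point", "label", "radius", "inside", "outside")

    def __init__(self, point, label, radius=0, inside=None, outside=None):
        self.point, self.label = point, label
        self.radius, self.inside, self.outside = radius, inside, outside


def build_vp_tree(items):
    """Recursively build a VP tree from (feature, label) pairs."""
    if not items:
        return None
    vp_feat, vp_label = items[0]            # pick the first item as vantage point
    rest = items[1:]
    if not rest:
        return VPNode(vp_feat, vp_label)
    dists = [hamming(vp_feat, f) for f, _ in rest]
    radius = sorted(dists)[len(dists) // 2]  # median distance splits the set
    inside = [it for it, d in zip(rest, dists) if d <= radius]
    outside = [it for it, d in zip(rest, dists) if d > radius]
    return VPNode(vp_feat, vp_label, radius,
                  build_vp_tree(inside), build_vp_tree(outside))


def nearest(node, query, best=(None, float("inf"))):
    """Return (label, distance) of the stored feature closest to `query`."""
    if node is None:
        return best
    d = hamming(query, node.point)
    if d < best[1]:
        best = (node.label, d)
    # Search the half that contains the query first; visit the other half
    # only if it could still contain something closer than the current best.
    near, far = ((node.inside, node.outside) if d <= node.radius
                 else (node.outside, node.inside))
    best = nearest(near, query, best)
    if abs(d - node.radius) < best[1]:
        best = nearest(far, query, best)
    return best


if __name__ == "__main__":
    random.seed(0)
    # Toy training set: random binary feature vectors tagged with organ labels.
    train = [(random.getrandbits(BITS), random.choice(["liver", "spleen", "kidney"]))
             for _ in range(1000)]
    tree = build_vp_tree(train)
    query = random.getrandbits(BITS)  # feature vector of an unseen voxel
    label, dist = nearest(tree, query)
    print(f"predicted label: {label} (Hamming distance {dist})")
```

In the paper's setting, the labels would come from dense organ annotations and the prediction could be extended to a k-nearest-neighbor majority vote per voxel; the sketch shows only the single-nearest-neighbor query.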

Original language: English
Title of host publication: Medical Imaging 2018: Image Processing
Editors: Elsa D. Angelini, Bennett A. Landman
Number of pages: 8
Volume: 10574
Publisher: SPIE
Publication date: 20.03.2018
Article number: 105740M
ISBN (Print): 978-151061637-0
DOIs
Publication status: Published - 20.03.2018
Event: SPIE Medical Imaging 2018 - Marriott Marquis Houston, Houston, United States
Duration: 10.02.2018 - 15.02.2018
http://spie.org/conferences-and-exhibitions/past-conferences-and-exhibitions/medical-imaging-2017-x128747
https://spie.org/conferences-and-exhibitions/medical-imaging
