Very fast digital 2D rigid motion estimation directly on continuous k-space data using an RNN

Marius Krusen*, Floris Ernst

*Corresponding author for this work

Abstract

Objective: Motion of the target during radiotherapy reduces treatment effectiveness and can cause damage to nearby tissue. MRI offers excellent soft-tissue contrast for visualizing the target, but its long acquisition time requires heavy undersampling of k-space to monitor this motion in real time. Even with undersampling, the achievable latency is typically limited, although every millisecond saved can increase the effectiveness of the treatment.

Methods: In this study, a recurrent neural network (RNN) continuously estimates motion directly from incoming k-space data. A golden-angle radial k-space trajectory continuously scans the target region, and the acquired data are fed spoke by spoke into the RNN. Skipping image reconstruction and focusing only on the motion encoded in the data allows motion monitoring on the order of milliseconds. To improve network training and generalization, different amounts of peripheral values are removed from the k-space spokes during preprocessing. To train and evaluate the network, 2D MRI motion datasets with different motion characteristics were generated by synthetically transforming slices of 25 MRI head scans.

Results: The RNN takes less than a millisecond to accurately estimate motion. Keeping only the inner 10% of each spoke yields a mean rotational error of 0.37° and a mean translational error of 0.26 mm. No patient-specific preparation or retraining is necessary.

Conclusion and significance: The network provides very fast motion estimates with sub-millimeter accuracy. The results demonstrate the feasibility of this approach and lay the groundwork for further reducing latency in real-time motion monitoring systems for radiotherapy without sacrificing accuracy.
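The paper itself is not reproduced here, so the following is only a minimal sketch, in PyTorch, of the pipeline the abstract describes: simulating golden-angle radial spokes from an image slice, keeping only the inner fraction of each spoke, and feeding the spokes one at a time into a recurrent network that regresses a 2D rigid transform (rotation plus translation). All names (`golden_angle_spokes`, `MotionRNN`), the GRU architecture, the hidden size, and the nearest-neighbour sampling of the Cartesian FFT as a crude stand-in for a proper non-uniform FFT are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the described pipeline (not the authors' code).
import numpy as np
import torch
import torch.nn as nn

GOLDEN_ANGLE = np.deg2rad(111.246)  # golden-angle increment used in radial MRI

def golden_angle_spokes(image, n_spokes, n_samples, keep_frac=0.1):
    """Simulate radial k-space spokes and keep only the central fraction."""
    k = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(image)))
    h, w = k.shape
    radius = min(h, w) / 2 - 1
    t = np.linspace(-1.0, 1.0, n_samples)  # position along each spoke
    spokes = []
    for i in range(n_spokes):
        theta = i * GOLDEN_ANGLE
        ys = h / 2 + t * radius * np.sin(theta)
        xs = w / 2 + t * radius * np.cos(theta)
        # nearest-neighbour lookup on the Cartesian FFT grid (NUFFT stand-in)
        spokes.append(k[np.round(ys).astype(int), np.round(xs).astype(int)])
    spokes = np.stack(spokes)  # (n_spokes, n_samples), complex
    # keep only the inner fraction of each spoke (low spatial frequencies)
    n_keep = max(2, int(keep_frac * n_samples))
    lo = n_samples // 2 - n_keep // 2
    return spokes[:, lo:lo + n_keep]

class MotionRNN(nn.Module):
    """GRU that ingests one cropped spoke per step and regresses rigid motion."""
    def __init__(self, spoke_len, hidden=128):  # hidden size is an assumption
        super().__init__()
        self.gru = nn.GRU(input_size=2 * spoke_len, hidden_size=hidden,
                          batch_first=True)
        self.head = nn.Linear(hidden, 3)  # (rotation [rad], tx, ty) per step

    def forward(self, spokes, h0=None):
        # spokes: (batch, time, spoke_len) complex -> stack real/imag channels
        x = torch.cat([spokes.real, spokes.imag], dim=-1).float()
        out, h = self.gru(x, h0)
        return self.head(out), h  # one motion estimate per incoming spoke

if __name__ == "__main__":
    img = np.random.rand(256, 256)          # stand-in for an MRI slice
    spokes = golden_angle_spokes(img, n_spokes=32, n_samples=256)
    model = MotionRNN(spoke_len=spokes.shape[1])
    batch = torch.from_numpy(spokes)[None]  # (1, 32, n_keep)
    motion, hidden = model(batch)
    print(motion.shape)                     # torch.Size([1, 32, 3])
```

In an online setting, the spokes would arrive from the scanner as a stream and the hidden state `hidden` would be carried across calls, so that each newly acquired spoke yields an updated motion estimate without waiting for image reconstruction.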

Original language: English
Article number: 105413
Journal: Biomedical Signal Processing and Control
Volume: 87
ISSN: 1746-8094
DOIs
Publication status: Published - 01.2024
