UCLA Electronic Theses and Dissertations

Computer-Assisted Navigation Techniques for MRI-Guided Percutaneous Interventions

Abstract

Image-guided minimally invasive percutaneous interventions, including needle-based targeted biopsy and focal therapy, play key roles in cancer diagnosis and treatment. With excellent soft-tissue contrast and the absence of ionizing radiation, magnetic resonance imaging (MRI) has become a promising imaging modality for intra-procedural and real-time guidance of percutaneous interventions. However, MRI-guided percutaneous interventions still face two main challenges: limited access to patients inside the scanner and tissue displacement due to physiological motion. MRI-compatible remotely controlled systems for instrument manipulation inside the scanner are being developed to address the first challenge. To address the second challenge, computer-assisted navigation methods using intra-procedural and real-time MRI are being investigated to provide essential information regarding tissue and instrument positions to guide percutaneous interventions. Accurate tissue motion tracking and automatic needle localization techniques are the core components of computer-assisted navigation. Respiratory motion remains the main challenge for procedures in abdominal organs. Tissue target tracking accuracy is degraded by the system latency of the MRI acquisition, reconstruction, and processing pipeline. In addition, passive needle tracking using MR images is challenged by variations in the needle-induced signal-void feature across imaging conditions. Discrepancies between the needle feature position on MRI and the underlying physical needle position could increase needle localization errors during procedures. Therefore, this dissertation aims to address these challenges by establishing new computer-assisted navigation techniques for MRI-guided interventions, including prediction of tissue motion due to respiration using fusion-based multi-rate Kalman filtering and deep learning-based needle localization methods.

First, this work investigated image-based and surrogate-based motion tracking methods using real-time golden-angle radial MRI to achieve real-time MRI guidance for interventions in organs affected by respiration (e.g., the liver). Images with different temporal footprints were reconstructed from the same golden-angle radial MRI data stream to simultaneously enable image-based and surrogate-based tracking at 10 Hz. Phantom experiments confirmed that the median online tracking error of image-based tracking was lower than that of surrogate-based tracking, albeit with higher median system latency. This work proposed a new fusion-based respiratory motion prediction framework to combine the lower tracking error of image-based tracking with the lower latency of surrogate-based tracking. The fusion-based method was evaluated in retrospective studies using in vivo real-time free-breathing liver MRI, where it provided low-latency feedback with improved motion prediction accuracy compared with image-based and surrogate-based methods alone.
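
To illustrate the fusion idea, the sketch below (an assumption for exposition, not the dissertation's exact implementation) shows a 1-D constant-velocity Kalman filter on tissue displacement that is advanced at the fast surrogate rate, corrected whenever a more accurate image-based measurement becomes available, and extrapolated forward by the known system latency to produce low-latency feedback. All noise levels and the 10 Hz interval are illustrative values.

```python
import numpy as np

class MultiRateKalmanFusion:
    """Sketch of a 1-D constant-velocity Kalman filter that fuses noisy,
    low-latency surrogate measurements with more accurate, delayed
    image-based measurements of tissue displacement.
    State x = [position, velocity]^T; parameter values are illustrative."""

    def __init__(self, dt=0.1, q=1e-3, r_surrogate=1.0, r_image=0.25):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])         # state transition
        self.H = np.array([[1.0, 0.0]])                    # only position observed
        self.Q = q * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                               [dt**2 / 2.0, dt]])         # process noise
        self.R_surr = np.array([[r_surrogate]])            # surrogate noise variance
        self.R_img = np.array([[r_image]])                 # image-based noise variance
        self.x = np.zeros((2, 1))                          # [position, velocity]
        self.P = np.eye(2)                                 # state covariance

    def predict(self):
        """Time update, run at the fast (surrogate) rate."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, image_based=False):
        """Measurement update with either a surrogate or an image-based sample."""
        R = self.R_img if image_based else self.R_surr
        y = np.array([[z]]) - self.H @ self.x              # innovation
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P

    def forecast(self, latency):
        """Extrapolate the fused position estimate forward by the system latency."""
        return float(self.x[0, 0] + latency * self.x[1, 0])
```

In a hypothetical 10 Hz loop, `predict()` and `update(z_surrogate)` would run every frame, `update(z_image, image_based=True)` would run whenever a delayed image-based estimate arrives, and `forecast(latency)` would compensate for the acquisition-to-feedback delay.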

Next, to develop an automatic needle tracking algorithm for MRI-guided interventions, this work adapted the Mask Region-based Convolutional Neural Network (Mask R-CNN) to localize passive needle features on MRI. Mask R-CNN was adapted and trained to segment the needle feature using intra-procedural images from MRI-guided prostate biopsy cases and real-time images from MRI-guided needle insertion in ex vivo tissue. The segmentation masks were passed to a needle feature localization algorithm to extract the needle feature tip location and axis orientation. The proposed algorithm was evaluated on MR images from in vivo intra-procedural prostate biopsy cases and ex vivo real-time MRI experiments under a range of conditions. It achieved pixel-level tracking accuracy in real time and has the potential to assist MRI-guided percutaneous interventions.
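
One simple way to turn a segmentation mask into a tip location and axis orientation, sketched below under the assumption of a roughly linear signal-void feature (the dissertation's exact post-processing may differ), is to fit the principal axis of the mask pixels and take the mask extrema along that axis as the candidate endpoints.

```python
import numpy as np

def localize_needle_feature(mask):
    """Illustrative post-processing of a binary needle-feature mask (2-D array).

    Returns (tip_xy, axis_unit_vector) in pixel coordinates, or (None, None)
    if the mask is empty. Which endpoint is the tip versus the entry side is
    assumed to be resolved from the known insertion direction."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None, None
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    # Principal component of the pixel cloud approximates the feature axis.
    _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
    axis = vt[0]
    # Project pixels onto the axis; the two extremes are the feature endpoints.
    proj = (pts - centroid) @ axis
    endpoint_a = pts[np.argmax(proj)]
    endpoint_b = pts[np.argmin(proj)]
    tip = endpoint_a  # placeholder: choose the endpoint on the tip side
    return tip, axis
```

Scaling the returned pixel coordinates by the in-plane resolution would convert the tip and axis estimates to physical units.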

Lastly, to overcome in-plane and through-plane discrepancies between the needle feature position on MRI and the underlying physical needle position, this work developed a deep learning-based framework to automatically localize the physical needle position using single-slice and 3-slice MRI. The proposed framework consists of two Mask R-CNN stages. Physics-based simulations were performed to generate single-slice and 3-slice images with needle features from a range of underlying needle positions and MRI parameters, forming datasets for training the single-slice and 3-slice physical needle Mask R-CNN models. Using the single-slice model, the proposed physics-driven Mask R-CNN framework achieved sub-millimeter physical needle localization accuracy on single-slice images aligned with the needle. The 3-slice model further reduced the through-plane physical needle localization error in situations where the imaging plane may be misaligned with the needle. Both models can achieve physical needle localization in real time for interventional MRI.
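
A hedged inference-time sketch is given below, using torchvision's generic Mask R-CNN as a stand-in for the trained physical-needle model; the 3-slice input is assumed to be formed by stacking three adjacent slices as image channels, and the predicted mask would then be converted to a physical tip position (e.g., with the tip/axis extraction above). The model construction, input convention, and thresholds are assumptions, not the dissertation's exact configuration.

```python
import numpy as np
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

# Stand-in for the trained 3-slice physical-needle model: two classes
# (background + needle). In practice, weights trained on the simulated
# physics-based dataset would be loaded here instead of random weights.
model = maskrcnn_resnet50_fpn(weights=None, weights_backbone=None, num_classes=2)
model.eval()

def predict_physical_needle_mask(slices):
    """slices: float array of shape (3, H, W) holding three adjacent MRI
    slices normalized to [0, 1], stacked as image channels (an assumed
    input convention). Returns the highest-scoring predicted mask or None."""
    image = torch.as_tensor(np.asarray(slices), dtype=torch.float32)
    with torch.no_grad():
        pred = model([image])[0]          # torchvision returns one dict per image
    if pred["scores"].numel() == 0:
        return None
    best = int(torch.argmax(pred["scores"]))
    mask = (pred["masks"][best, 0] > 0.5).cpu().numpy()
    return mask                           # post-process into a physical tip position
```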
