eScholarship
Open Access Publications from the University of California

UCLA Electronic Theses and Dissertations

Towards Building Autonomy and Intelligence for Surgical Robotic Systems using Trajectory Optimization, Stochastic Estimation, Vision-based Control, and Machine Learning Algorithms

  • Author(s): Aghajani Pedram, Sahba
  • Advisor(s): Rosen, Jacob
Abstract

While the teleoperation framework has been successfully implemented for surgical robots, especially for soft tissue interventions, its main challenge is that the surgeons are responsible for all actions taken, and all decisions made, during the entire surgery. As robotic technology extends to more complicated surgical interventions, the teleoperation framework can become increasingly overwhelming for human surgeons, who have limited sensing and motor-control bandwidth, and can result in degraded surgical performance. Introducing automation and intelligence into robot-assisted interventions, where some surgical responsibilities are delegated to an AI agent, can substantially improve this framework and enhance the overall surgical outcome. Among the many challenges of bringing autonomy into surgical interventions, the two main technological ones pertain to the complexity of soft tissue environments and the inaccuracies of surgical robotic systems. This dissertation aims to address these two challenges and proposes solutions for different surgical robotic systems with applications to laparoscopic, orthopedic, and ophthalmologic surgeries.

Regarding the planning of surgical subtasks, suturing and tissue manipulation, which occur frequently in soft tissue surgeries, are considered. For the suturing task, two novel optimization-based needle motion planning algorithms, Fixed Center Motion (FCM) and Moving Center Motion (MCM), are proposed, in which tissue trauma is minimized and a wide variety of suturing criteria (e.g., adequate depth) are met. Extensive simulations for each method were performed to (I) confirm the mathematical formulations and (II) obtain optimal strategies under various suturing conditions. The FCM needle planner was deployed on the Raven IV system with an open-loop controller (i.e., no vision feedback), and experimental results confirmed the simulation and optimization outcomes. Regarding the tissue manipulation task, a new synergic learning method is proposed in which human knowledge contributes to selecting intuitive features of tissue manipulation while the algorithm learns to take optimal actions. The method was tested on four different configurations in simulation, and the robot was able to accomplish the tissue manipulation task autonomously in all of them.
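The FCM/MCM formulations themselves are given in the dissertation body, not the abstract. As a rough illustration of the kind of constrained optimization involved in needle path planning, the sketch below (entirely hypothetical geometry and constants, not the dissertation's formulation) minimizes a simple trauma proxy — the needle arc length swept inside the tissue — over the needle radius and the height of a fixed rotation center, subject to minimum bite-depth and bite-width constraints:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical suturing requirements (mm) -- illustrative only.
D_MIN = 4.0   # minimum bite depth
W_MIN = 8.0   # minimum bite width

def arc_in_tissue(x):
    """Trauma proxy: arc length of a circular needle (radius r, rotation
    center at height h above a flat tissue surface) swept below the surface."""
    r, h = x
    return 2.0 * r * np.arccos(np.clip(h / r, -1.0, 1.0))

constraints = [
    # bite depth r - h must reach D_MIN
    {"type": "ineq", "fun": lambda x: (x[0] - x[1]) - D_MIN},
    # bite width 2*sqrt(r^2 - h^2) must reach W_MIN
    {"type": "ineq",
     "fun": lambda x: 2.0 * np.sqrt(max(x[0]**2 - x[1]**2, 0.0)) - W_MIN},
]

res = minimize(arc_in_tissue, x0=[6.0, 1.0],
               bounds=[(3.0, 12.0), (0.0, 12.0)],
               constraints=constraints)
r_opt, h_opt = res.x
```

The solver trades needle size against center height: a deeper bite forces more arc through tissue, so the optimum sits on the active depth/width constraints.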
To improve the estimation and control accuracy of three surgical robotic systems, multiple frameworks are proposed. For the first category, which pertains to cable-driven serial manipulators used in soft tissue surgeries, a 6-DoF visual servo controller using robot-camera calibration and real-time vision feedback was developed. The framework enabled the Raven IV surgical system to perform autonomous suturing across various suturing trajectories and tissue compliances. For the second category, which pertains to continuum manipulators with applications to orthopedic surgery and bronchoscopy, a novel stochastic sensor fusion algorithm, called Simultaneous Sensor Calibration and Deformation Estimation (SCADE), was introduced. SCADE addresses the problem of simultaneously estimating, in real time, the calibration bias of fiber Bragg grating (FBG) sensors and the shape/tip of continuum surgical manipulators. The algorithm was tested on estimating the tip position of a continuum manipulator in free and obstructed environments and showed superior performance compared to estimates from the FBG sensor alone. For the third category, which pertains to a robot-assisted cataract surgery system, a new hardware and software solution was proposed to estimate the tip location of surgical tools inside the eye during cataract surgery. The framework was developed and tested using a total of 31 pig eyes, and the results demonstrated the efficacy of the proposed solution.
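The abstract does not reproduce the 6-DoF visual servo law. As a generic, heavily simplified illustration of the idea (a proportional resolved-rate law driving a task-space pose error to zero; all gains and the direct task-space error are assumptions — a real image-based scheme would servo on image features through an interaction matrix), one step might look like:

```python
import numpy as np

LAMBDA = 0.5  # hypothetical proportional servo gain

def servo_step(pose, target, dt=0.05):
    """One step of a proportional resolved-rate servo: command the twist
    v = -lambda * e, where e is the 6-DoF task-space pose error."""
    e = pose - target
    v = -LAMBDA * e
    return pose + v * dt

pose = np.array([0.1, -0.05, 0.2, 0.0, 0.1, 0.0])  # x, y, z, roll, pitch, yaw
target = np.zeros(6)
for _ in range(200):
    pose = servo_step(pose, target)
```

With this law the error shrinks geometrically by a factor of (1 - LAMBDA*dt) per step, which is why such controllers tolerate moderate calibration error as long as the gain keeps the loop stable.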
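SCADE's full stochastic formulation is in the dissertation; the underlying idea — augment the state with the slowly varying sensor bias and estimate both jointly — can be illustrated with a toy Kalman filter. Everything below is a hypothetical 1-D stand-in (a biased "FBG-like" channel plus an unbiased but noisier second channel, e.g. a kinematic model), not the dissertation's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_BIAS = 0.8   # hypothetical constant FBG calibration bias (mm)
N = 500

# Augmented state: [tip position, sensor bias], both modeled as near-constant.
F = np.eye(2)
Q = np.diag([1e-4, 1e-8])           # position random-walks; bias nearly fixed
H = np.array([[1.0, 1.0],           # FBG channel sees position + bias
              [1.0, 0.0]])          # unbiased channel sees position only
R = np.diag([0.05**2, 0.5**2])      # FBG precise but biased; other channel noisy

x = np.zeros(2)                     # state estimate [position, bias]
P = np.eye(2)                       # estimate covariance
tip = 0.0
for _ in range(N):
    tip += 0.01 * rng.standard_normal()                  # true tip random walk
    z = np.array([tip + TRUE_BIAS + 0.05 * rng.standard_normal(),
                  tip + 0.5 * rng.standard_normal()])
    # Predict.
    P = F @ P @ F.T + Q
    # Update.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
```

Because the two channels disagree by exactly the bias (plus noise), the filter can separate bias from deformation — the same observability argument that lets SCADE calibrate the FBG sensor online while tracking the manipulator shape.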
