Recent advances in computer-assisted surgical navigation systems address the critical need for safer and more efficient surgical procedures. State-of-the-art navigation systems rely on invasive retro-reflective markers detected by infrared cameras to provide clinicians with real-time guidance on patient anatomy and instrument trajectory. While these systems have revolutionized computer-assisted surgery, they cost between $250,000 and $500,000, making them prohibitively expensive for small healthcare centers.
This paper explores optical tracking, an alternative to infrared tracking that relies on visible light to track specific objects or markers. Using two low-cost web cameras, a stereoscopic camera system was calibrated and programmed to detect the 3D position of moving ArUco fiducial markers, non-invasive printed tags whose encoded patterns allow their positions to be recovered from camera images.
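As a rough illustration of this step (not the authors' exact implementation), the sketch below shows how OpenCV's ArUco module, assuming the OpenCV >= 4.7 Python API, could detect a marker in both camera views and triangulate its 3D position; the intrinsics, baseline, and marker dictionary are placeholder assumptions rather than values from the study.

```python
import cv2
import numpy as np

# Projection matrices would come from stereo calibration of the two web cameras
# (e.g. cv2.stereoCalibrate); the values below are placeholder assumptions.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                       # assumed intrinsics, shared by both cameras
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-60.0], [0.0], [0.0]])])  # ~60 mm baseline (assumed)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)    # assumed marker dictionary
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def marker_center(frame):
    """Return the pixel center of the first detected ArUco marker, or None."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    return corners[0][0].mean(axis=0)                 # mean of the four corner points

def triangulate(center_left, center_right):
    """Convert one pair of corresponding pixel points into a 3D position."""
    pts_l = np.asarray(center_left, dtype=float).reshape(2, 1)
    pts_r = np.asarray(center_right, dtype=float).reshape(2, 1)
    X = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
    return (X[:3] / X[3]).ravel()                     # homogeneous -> Euclidean (same units as baseline)
```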
First, using a positioning platform, the markers were moved in the X and Y directions with the objective of tracking the minor movements patients make during surgery.
Next, a marker attached to a surgical needle was moved in five degrees of freedom (5 DOF) using the Arduino Tinkerkit Braccio robotic arm to mimic the handling of surgical instruments. The movements were recorded, and the videos were processed in OpenCV in three color spaces, red-green-blue (RGB), hue-saturation-lightness (HSL), and hue-saturation-value (HSV), to calculate the tracking accuracy of the cameras.
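A minimal sketch of this color-space step is given below, assuming the recorded videos are read frame by frame with OpenCV; the file name and loop structure are illustrative, and OpenCV labels the HSL space as HLS.

```python
import cv2

cap = cv2.VideoCapture("recording_left.avi")  # hypothetical recorded video file
while True:
    ok, frame_bgr = cap.read()                # OpenCV decodes frames as BGR
    if not ok:
        break
    frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)  # RGB
    frame_hls = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HLS)  # HSL (called HLS in OpenCV)
    frame_hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)  # HSV
    # ...run the same marker-detection step in each space and compare accuracy
cap.release()
```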
On the positioning platform, differently colored markers (white, pink, orange, and yellow) were tested to determine which color gave the highest detection accuracy. Using various post-processing techniques, an accuracy of 5.5 mm was achieved in the RGB color space with a white marker. When the marker was moved by the robotic arm, an accuracy of 2.27 mm was obtained in the RGB color space with a white marker.
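The abstract does not define the accuracy metric; one plausible reading, shown as an assumption below, is the mean Euclidean distance between the triangulated marker positions and the ground-truth positions commanded on the platform or robotic arm.

```python
import numpy as np

def mean_tracking_error(estimated_xyz, ground_truth_xyz):
    """Mean Euclidean distance between estimates and ground truth (same units as inputs, e.g. mm)."""
    estimated_xyz = np.asarray(estimated_xyz, dtype=float)
    ground_truth_xyz = np.asarray(ground_truth_xyz, dtype=float)
    return float(np.linalg.norm(estimated_xyz - ground_truth_xyz, axis=1).mean())
```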