Visualizing the hidden 3D signatures of surgical mastery using objective motion data from the JIGSAWS dataset.
In robotic-assisted surgery (RAS), technical proficiency has traditionally been assessed through subjective observation. Kinematic data, however, offers an objective "fingerprint" of a surgeon's motor strategies: by analyzing the 3D trajectory of the robotic instrument tip, we can transform raw motion into a data-driven map of how efficiently a surgeon executes their intent.
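One simple way to turn a raw 3D trajectory into an efficiency measure is the ratio of straight-line displacement to total path length. This is a minimal sketch, not the project's exact pipeline: it assumes the tooltip trajectory has already been extracted from a JIGSAWS kinematics file into an (N, 3) NumPy array, and the function name is illustrative.

```python
import numpy as np

def path_efficiency(xyz):
    """Ratio of straight-line displacement to total path length.

    xyz: (N, 3) array of tooltip positions sampled over time.
    Returns a value in (0, 1]; 1.0 means a perfectly direct motion.
    """
    steps = np.diff(xyz, axis=0)                    # per-sample displacement vectors
    path_len = np.linalg.norm(steps, axis=1).sum()  # total distance traveled
    direct = np.linalg.norm(xyz[-1] - xyz[0])       # start-to-end displacement
    return direct / path_len if path_len > 0 else 0.0
```

A direct motion scores near 1.0, while hesitant or meandering motion accumulates path length without displacement and scores lower, which is one candidate signal for separating efficient from inefficient trials.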
The Expert fingerprint is defined by Vertical Intent and Interface Transparency. By executing a precise "Surgical Arc," lifting the instrument clear of the tissue rather than sliding across it, the Expert minimizes lateral shear force and reduces collateral tissue trauma. This signature indicates that the robotic system has become an intuitive extension of the surgeon's hand, reflecting a highly automated and efficient motor program.
The Outlier fingerprint reveals Spatial Uncertainty. Instead of lifting, the Outlier "wanders" horizontally across the tissue surface, significantly increasing dragging forces. This erratic motion suggests the surgeon is "fighting the interface," producing a high visual-motor workload and failing to bridge the gap between human intuition and robotic precision.
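The Expert/Outlier contrast above can be quantified by decomposing each motion step into vertical and lateral components. The sketch below is an assumption-laden illustration: it takes the z-axis as vertical (the actual JIGSAWS frame convention should be checked), and the metric name is hypothetical.

```python
import numpy as np

def vertical_lateral_ratio(xyz):
    """Ratio of cumulative vertical motion to cumulative lateral (in-plane) motion.

    xyz: (N, 3) array of tooltip positions; assumes column 2 (z) is vertical.
    High values suggest a lifting "Surgical Arc"; values near zero suggest
    horizontal wandering along the tissue surface.
    """
    steps = np.diff(xyz, axis=0)
    vertical = np.abs(steps[:, 2]).sum()                 # total up/down travel
    lateral = np.linalg.norm(steps[:, :2], axis=1).sum() # total in-plane travel
    return vertical / lateral if lateral > 0 else np.inf
```

Under this toy metric, an Expert-style lift-translate-lower arc yields a high ratio, while an Outlier-style surface drag yields a ratio near zero.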
[Figure: Spatial Variance of instrument-tip trajectories]
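Spatial variance itself can be summarized as the total variance of the tooltip point cloud, i.e. the trace of the trajectory's 3×3 covariance matrix. This is a minimal sketch of one plausible definition, not necessarily the exact formula behind the figure.

```python
import numpy as np

def spatial_variance(xyz):
    """Total spatial variance of a tooltip trajectory.

    xyz: (N, 3) array of positions. Returns the trace of the 3x3
    covariance matrix, i.e. the sum of per-axis variances.
    """
    return float(np.trace(np.cov(xyz.T)))
```

A tightly clustered Expert trajectory produces a small value, while an Outlier trajectory spread across the workspace produces a large one.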
By moving from subjective grading to Objective Kinematic Fingerprinting, we can identify spatial inefficiencies that are invisible to the naked eye. This project demonstrates how data-driven benchmarks can redefine surgical training and patient safety.