Learning Surgical Motion Pattern from Small Data in Endoscopic Sinus and Skull Base Surgeries

Existing studies have demonstrated that surgical motion patterns are strongly correlated with surgical outcomes. However, real surgeries are complex and surgical data are expensive to collect. Consequently, existing research on surgical motion patterns focuses on specific, concise surgical tasks or simple surgical procedures. This paper presents a surgical motion pattern modeling technique that learns from small data and can be applied to virtually any of the Endoscopic Sinus and Skull Base Surgeries (ESSBSs). The proposed method reduces the dimensionality of the feature space by projecting surgical instrument motions into the endoscope coordinate frame, guided by human expert domain knowledge. It then extracts kinematic features and learns the motion pattern with Gaussian Process learning techniques. Compared with existing surgical motion pattern modeling methods, the proposed method: (1) learns the motion model from small data; (2) generalizes across ESSBSs because it neither assumes nor depends on specific surgical tasks; and (3) provides informative results in real time, supporting the optimization of surgical motions to improve surgical outcomes. The proposed method was validated by predicting surgical skill levels in cadaver surgeries. The results show that the real-time prediction precision exceeds 81% and the offline accumulated precision reaches 100%.
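
To make the pipeline described above concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation: it projects instrument tip positions into an assumed endoscope coordinate frame via a 4x4 pose, computes simple kinematic features (speed and acceleration statistics), and fits a scikit-learn Gaussian Process classifier to distinguish expert from novice motion. The function names, feature choices, synthetic trajectories, and kernel are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF


def to_endoscope_frame(tip_world, T_world_endoscope):
    """Project instrument tip positions (N x 3, world frame) into the
    endoscope coordinate frame given a 4x4 endoscope pose (assumed known)."""
    T_endo_world = np.linalg.inv(T_world_endoscope)
    tips_h = np.hstack([tip_world, np.ones((tip_world.shape[0], 1))])
    return (T_endo_world @ tips_h.T).T[:, :3]


def kinematic_features(traj, dt):
    """Simple kinematic descriptors of one trajectory window:
    mean/std of speed and acceleration magnitude (illustrative choice)."""
    vel = np.diff(traj, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    speed = np.linalg.norm(vel, axis=1)
    accel = np.linalg.norm(acc, axis=1)
    return np.array([speed.mean(), speed.std(), accel.mean(), accel.std()])


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def fake_traj(noise):
        # Synthetic stand-in for instrument motion: smooth path plus noise,
        # where larger noise mimics less skilled (jerkier) motion.
        t = np.linspace(0, 1, 50)[:, None]
        path = np.hstack([t, np.sin(3 * t), np.cos(3 * t)])
        return path + rng.normal(scale=noise, size=path.shape)

    T_cam = np.eye(4)  # placeholder endoscope pose (world -> endoscope is identity)
    X = np.array([
        kinematic_features(to_endoscope_frame(fake_traj(n), T_cam), dt=0.02)
        for n in [0.001] * 10 + [0.02] * 10
    ])
    y = np.array([1] * 10 + [0] * 10)  # 1 = expert-like, 0 = novice-like

    gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0)).fit(X, y)
    print(gp.predict_proba(X[:2])[:, 1])  # probability of expert-like motion
```

In practice, a sliding window over the live instrument and endoscope pose stream would feed such per-window features to the learned model, which is what makes real-time skill estimates possible.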