Modeling of Surgical Procedures Using Statecharts for Semi-Autonomous Robotic Surgery

In this paper we propose a new methodology for modeling surgical procedures that is specifically tailored to semi-autonomous robotic surgery. We propose to use a restricted version of statecharts to merge the bottom-up approach, based on data-driven techniques (e.g., machine learning), with the top-down approach, based on knowledge representation techniques. Medical knowledge about the procedure and sensing of the environment are handled in two concurrent regions of the statechart, which facilitates reusability and adaptability of the modules. Our approach produces a well-defined procedural model that exploits the hierarchical structure of statecharts, while machine learning modules act as soft sensors that trigger state transitions. Integrating data-driven techniques with prior knowledge yields a robust, modular, flexible, and reconfigurable methodology for defining a surgical procedure in a form comprehensible to both humans and machines. We validate our approach on the three surgical phases of a Robot-Assisted Radical Prostatectomy (RARP) that directly involve the assistant surgeon: bladder mobilization, bladder neck transection, and vesicourethral anastomosis, all performed on synthetic manikins.
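To make the architecture described above concrete, the following is a minimal Python sketch of the two-concurrent-region idea: a "knowledge" region holding the explicit, human-readable procedural model, and a "sensing" region wrapping a machine learning module that acts as a soft sensor whose output triggers state transitions. It deliberately uses plain Python instead of a statechart library, and the phase names follow the RARP phases in the abstract; the `fake_classifier` stub, the event names, and the frame format are hypothetical placeholders, not the paper's actual implementation.

```python
# Sketch of two concurrent statechart regions: a top-down procedural model
# (knowledge) and a bottom-up ML soft sensor (sensing) that emits events.
# The classifier stub and event names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple


@dataclass
class KnowledgeRegion:
    """Top-down procedural model: an explicit, human-readable state machine."""
    state: str = "bladder_mobilization"
    # Transition table: (current state, event) -> next state.
    table: Dict[Tuple[str, str], str] = field(default_factory=lambda: {
        ("bladder_mobilization", "mobilization_done"): "bladder_neck_transection",
        ("bladder_neck_transection", "transection_done"): "vesicourethral_anastomosis",
        ("vesicourethral_anastomosis", "anastomosis_done"): "finished",
    })

    def on_event(self, event: str) -> None:
        nxt = self.table.get((self.state, event))
        if nxt is not None:
            print(f"[knowledge] {self.state} --{event}--> {nxt}")
            self.state = nxt


@dataclass
class SensingRegion:
    """Bottom-up soft sensor: a (stubbed) ML module that classifies
    sensor frames into events consumed by the knowledge region."""
    classifier: Callable[[dict], str]   # stands in for a trained model
    emit: Callable[[str], None]         # callback into the other region

    def step(self, frame: dict) -> None:
        event = self.classifier(frame)  # e.g. "mobilization_done"
        if event != "none":
            self.emit(event)


def fake_classifier(frame: dict) -> str:
    """Hypothetical stand-in for a learned model; echoes the frame's label."""
    return frame.get("label", "none")


if __name__ == "__main__":
    knowledge = KnowledgeRegion()
    sensing = SensingRegion(classifier=fake_classifier, emit=knowledge.on_event)

    # Simulated stream of sensor frames (labels stand in for ML predictions).
    stream: List[dict] = [
        {"label": "none"},
        {"label": "mobilization_done"},
        {"label": "transection_done"},
        {"label": "anastomosis_done"},
    ]
    for frame in stream:
        sensing.step(frame)
    print("final state:", knowledge.state)
```

Keeping the two regions behind a narrow event interface is what gives the modularity claimed in the abstract: the soft sensor (the learned model) can be retrained or swapped without touching the procedural model, and vice versa.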