Surgical Phase Recognition using Movement Data from Video Imagery and Location Sensor Data

The automatic recognition of surgical phases has strong potential to help medical staff understand individual and group patterns, optimize workflows, and identify workflow risks that can lead to adverse medical events in the operating room. In this chapter, we investigate the performance of context recognition based on the movement of operating room staff through their work environment, measured by video imaging and location tracking. We employed an optical flow algorithm to extract movement characteristics of surgical staff from video imagery, and trajectory clustering techniques to extract them from time-stamped location data collected by an ultrasonic location-aware system. We then applied a Support Vector Machine to the time-stamped location data, the optical flow estimates, the trajectory clusters, and combinations of these three feature sets to examine the intraoperative context recognition rate. Our results show that integrating video imagery with location sensor data improves context awareness in neurosurgical operations.
