Organic Boundary Location Based on Color-Texture of Visual Perception in Wireless Capsule Endoscopy Video

This paper addresses the problem of automatically locating the boundary between the stomach and the small intestine (the pylorus) in wireless capsule endoscopy (WCE) video. For efficient image segmentation, a color-saliency region detection (CSD) method is developed to obtain the potentially valid region of the frame (VROF). To improve the accuracy of locating the pylorus, we design a Monitor-Judge model. On the one hand, a color-texture fusion feature of visual perception (CTVP) is constructed from grey-level co-occurrence matrix (GLCM) features computed on the maximum moment of the phase congruency covariance, combined with a hue-saturation histogram feature in the HSI color space. On the other hand, a support vector machine (SVM) classifier trained on the CTVP feature is used to locate the pylorus. Experimental results on 30 real WCE videos demonstrate that the proposed method outperforms related existing techniques.
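As a rough illustration of how such a pipeline fits together, the sketch below builds a CTVP-style feature (GLCM statistics of the phase-congruency maximum moment map concatenated with a hue-saturation histogram) and feeds it to an SVM for per-frame classification. This is only a sketch under stated assumptions, not the authors' implementation: the third-party `phasepack` package is assumed to supply phase congruency, HSV is used as a stand-in for the HSI color space, the helper names (`hs_histogram`, `glcm_texture`, `extract_ctvp`) are hypothetical, and the CSD/VROF segmentation step and the Monitor-Judge model are not reproduced here.

```python
# Minimal sketch (not the authors' code) of a CTVP-style feature pipeline,
# assuming each frame has already been reduced to its valid region (VROF).
import numpy as np
from skimage.color import rgb2gray, rgb2hsv
from skimage.feature import graycomatrix, graycoprops
from phasepack import phasecong      # assumed dependency: pip install phasepack
from sklearn.svm import SVC

def hs_histogram(rgb, h_bins=16, s_bins=8):
    """Joint hue-saturation histogram (HSV used here as a stand-in for HSI)."""
    hsv = rgb2hsv(rgb)
    hist, _, _ = np.histogram2d(hsv[..., 0].ravel(), hsv[..., 1].ravel(),
                                bins=[h_bins, s_bins], range=[[0, 1], [0, 1]])
    return (hist / max(hist.sum(), 1)).ravel()

def glcm_texture(img01, levels=32):
    """Haralick-style GLCM statistics of an image scaled to [0, 1]."""
    q = np.clip((img01 * (levels - 1)).astype(np.uint8), 0, levels - 1)
    glcm = graycomatrix(q, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "correlation", "energy", "homogeneity")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def extract_ctvp(rgb):
    """Color-texture fusion feature: GLCM statistics of the phase-congruency
    maximum moment map concatenated with the hue-saturation histogram."""
    M, *_ = phasecong(rgb2gray(rgb))          # M: maximum moment (edge strength)
    M = (M - M.min()) / (np.ptp(M) + 1e-9)    # normalize before quantization
    return np.hstack([glcm_texture(M), hs_histogram(rgb)])

# Frame classification (stomach vs. small intestine) with an SVM on CTVP features:
# clf = SVC(kernel="rbf").fit([extract_ctvp(f) for f in train_frames], train_labels)
# label = clf.predict([extract_ctvp(test_frame)])
```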
