Burn image segmentation based on Mask Regions with Convolutional Neural Network deep learning framework: more accurate and more convenient

Background: Burns are life-threatening injuries with high morbidity and mortality. Reliable diagnosis, supported by accurate assessment of burn area and depth, is critical to treatment decisions and can, in some cases, save the patient's life. Current techniques such as the straight-ruler method, the aseptic film trimming method, and digital camera photography are neither repeatable nor comparable, which leads to large variation in the judgment of burn wounds and impedes the establishment of uniform evaluation criteria. Hence, to semi-automate the burn diagnosis process, reduce the impact of human error, and improve diagnostic accuracy, we introduce deep learning into the assessment of burn wounds.

Method: This article proposes a novel method that employs a state-of-the-art deep learning technique to segment burn wounds in images. We designed the segmentation framework on the basis of Mask Regions with Convolutional Neural Network (Mask R-CNN). For training, we labeled 1150 images in the format of the Common Objects in Context (COCO) data set and trained the model on 1000 of them. In the evaluation, we compared different backbone networks within the framework: Residual Network-101 with Atrous Convolution in a Feature Pyramid Network (R101FA), Residual Network-101 with Atrous Convolution (R101A), and InceptionV2-Residual Network with Atrous Convolution (IV2RA). Finally, we used the Dice coefficient (DC) to assess model accuracy.

Result: On the remaining 150 images, the R101FA backbone achieves the highest accuracy, 84.51%. We also selected images of different burn depths to evaluate the three backbone networks. The R101FA backbone gives the best segmentation for superficial, superficial partial-thickness, and deep partial-thickness burns, while the R101A backbone gives the best segmentation for full-thickness burns.

Conclusion: The deep learning framework shows excellent segmentation of burn wounds and is extremely robust across different burn depths. Moreover, it requires only a suitable image of the burn wound for analysis, making it more convenient and better suited to clinical use than traditional methods. It also facilitates the calculation of the total body surface area (TBSA) burned.
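The framework is built on Mask R-CNN with ResNet-101/FPN and atrous-convolution backbones trained on COCO-format annotations; the authors' exact implementation is not reproduced here. As a rough illustration of how a Mask R-CNN can be fine-tuned on COCO-style burn-wound masks, the sketch below uses torchvision's stock ResNet-50-FPN Mask R-CNN as a stand-in for the R101FA backbone; the class count, optimizer settings, and data pipeline are assumptions, not the paper's configuration.

```python
# Hypothetical sketch: fine-tuning a Mask R-CNN on COCO-format burn images.
# This is NOT the authors' implementation; torchvision's ResNet-50-FPN
# Mask R-CNN stands in for the paper's R101FA backbone.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 2  # background + burn wound (assumed)

def build_model(num_classes: int = NUM_CLASSES):
    # Start from a COCO-pretrained Mask R-CNN and replace its box and mask
    # heads so the model predicts only the burn-wound class.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    in_features_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_features_mask, 256, num_classes)
    return model

# Training skeleton (a data loader yielding COCO-style targets is assumed):
# model = build_model()
# optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
# for images, targets in data_loader:       # targets: boxes, labels, masks
#     losses = model(images, targets)       # dict of classification/box/mask losses
#     loss = sum(losses.values())
#     optimizer.zero_grad(); loss.backward(); optimizer.step()
```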

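Model accuracy is reported as the Dice coefficient, DC = 2|A ∩ B| / (|A| + |B|), where A and B are the predicted and ground-truth wound masks. The minimal NumPy sketch below shows how such an overlap score can be computed; the function name and smoothing term are illustrative and not taken from the paper.

```python
# Minimal sketch of the Dice coefficient (DC) between a predicted binary
# mask and a ground-truth binary mask: DC = 2|A ∩ B| / (|A| + |B|).
# Illustrative only; the epsilon smoothing term is an assumption.
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, true_mask: np.ndarray, eps: float = 1e-7) -> float:
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)

# Example: identical masks yield DC close to 1.0.
# dice_coefficient(np.ones((4, 4)), np.ones((4, 4)))
```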