SHA-MTL: soft and hard attention multi-task learning for automated breast cancer ultrasound image segmentation and classification

Purpose: The automatic analysis of ultrasound images facilitates effective and objective diagnosis of breast cancer. However, owing to the characteristics of ultrasound images, performing this analysis automatically remains challenging. We hypothesize that an algorithm will extract lesion regions and distinguish categories more easily if it is guided to focus on the lesion regions.

Method: We propose a multi-task learning model (SHA-MTL) based on soft and hard attention mechanisms for simultaneous segmentation and binary classification of breast ultrasound (BUS) images. The SHA-MTL model consists of a dense CNN encoder and an upsampling decoder connected by attention-gated (AG) units that implement the soft attention mechanism. Cross-validation experiments are performed on BUS datasets with category and mask labels, and multiple comprehensive analyses are performed on the two tasks.

Results: We assess the SHA-MTL model on a public BUS image dataset. For the segmentation task, the sensitivity and Dice score on the lesion regions increase by 2.27% and 1.19%, respectively, compared with the single-task model. The classification accuracy and F1 score increase by 2.45% and 3.82%, respectively.

Conclusion: The results validate the effectiveness of our model and indicate that, compared with other recent models, the SHA-MTL model requires less a priori knowledge to achieve better results. We therefore conclude that paying more attention to the lesion region of a BUS image is conducive to discriminating lesion types.
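As a rough illustration of the kind of architecture described above, the sketch below is a minimal PyTorch encoder-decoder with attention-gated (soft attention) skip connections for segmentation and a classification head on the bottleneck features. It is not the authors' implementation: the plain convolutional encoder (the paper uses a dense CNN encoder), the channel sizes, the gating formulation, and the placement of the classification head are illustrative assumptions, and the hard-attention component is omitted.

```python
# Minimal sketch (not the authors' code): encoder-decoder with attention-gated
# skip connections plus a classification head on the shared bottleneck.
# Channel sizes, depth, and the gating form are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionGate(nn.Module):
    """Additive soft-attention gate: re-weights encoder features x using a
    coefficient map computed from x and the coarser gating signal g."""
    def __init__(self, x_ch, g_ch, inter_ch):
        super().__init__()
        self.theta_x = nn.Conv2d(x_ch, inter_ch, kernel_size=1)
        self.phi_g = nn.Conv2d(g_ch, inter_ch, kernel_size=1)
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1)

    def forward(self, x, g):
        g = F.interpolate(g, size=x.shape[2:], mode="bilinear", align_corners=False)
        alpha = torch.sigmoid(self.psi(F.relu(self.theta_x(x) + self.phi_g(g))))
        return x * alpha  # soft attention: scale skip features by [0, 1] map


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class MultiTaskSegCls(nn.Module):
    """Shared encoder, attention-gated decoder producing the lesion mask,
    and a classification head on globally pooled bottleneck features."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.enc1, self.enc2, self.enc3 = conv_block(1, 32), conv_block(32, 64), conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.ag2 = AttentionGate(64, 128, 32)
        self.ag1 = AttentionGate(32, 64, 16)
        self.dec2 = conv_block(128 + 64, 64)
        self.dec1 = conv_block(64 + 32, 32)
        self.seg_head = nn.Conv2d(32, 1, 1)        # lesion-mask logits
        self.cls_head = nn.Linear(128, n_classes)  # benign / malignant logits

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))  # shared bottleneck
        d2 = self.dec2(torch.cat([F.interpolate(e3, scale_factor=2), self.ag2(e2, e3)], dim=1))
        d1 = self.dec1(torch.cat([F.interpolate(d2, scale_factor=2), self.ag1(e1, d2)], dim=1))
        seg = self.seg_head(d1)
        cls = self.cls_head(e3.mean(dim=(2, 3)))  # global average pooling
        return seg, cls


if __name__ == "__main__":
    model = MultiTaskSegCls()
    seg, cls = model(torch.randn(2, 1, 128, 128))
    print(seg.shape, cls.shape)  # (2, 1, 128, 128) and (2, 2)
```

In a multi-task setup of this kind, the segmentation and classification losses (e.g., Dice plus cross-entropy) would be combined with a weighting factor and back-propagated through the shared encoder, which is what lets the mask supervision guide the classifier toward the lesion region.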
