BDCN: Semantic Embedding Self-explanatory Breast Diagnostic Capsules Network

Building an interpretable AI diagnosis system for breast cancer is an important embodiment of AI-assisted medicine. Traditional machine-learning-based methods for breast cancer diagnosis are easy to explain, but their accuracy is low. Deep neural networks greatly improve diagnostic accuracy, but as black-box models they offer neither transparency nor interpretation. In this work, we propose a semantic-embedding, self-explanatory Breast Diagnostic Capsule Network (BDCN). This model is the first to combine a capsule network with semantic embedding for the AI diagnosis of breast tumors, using capsules to simulate semantics. We pre-trained word vectors by embedding a semantic tree into BERT, used multi-head attention to enrich the semantic representation from which the capsule network constructs features, and thereby extended the capsule network from image classification to text classification. Both the backpropagation principle and the dynamic routing algorithm are used to realize the local interpretability of the diagnostic model. Experimental results show that the model improves diagnostic performance while offering good interpretability, making it better suited to clinical settings.
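The routing-by-agreement step that the abstract relies on for interpretability can be illustrated with a minimal sketch. This is not the BDCN implementation (the paper's code is not shown here); it is a generic NumPy rendering of the standard dynamic routing algorithm between two capsule layers, with all shapes and the iteration count chosen for illustration.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash non-linearity: keeps the vector's direction and
    # maps its length into [0, 1), so length can act as a probability.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: prediction vectors from lower to upper capsules,
    # shape (num_lower, num_upper, dim_upper).
    num_lower, num_upper, _ = u_hat.shape
    b = np.zeros((num_lower, num_upper))  # routing logits
    for _ in range(num_iters):
        # Coupling coefficients: softmax over upper capsules.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        # Weighted sum of predictions per upper capsule, then squash.
        s = (c[..., None] * u_hat).sum(axis=0)
        v = squash(s)
        # Agreement (dot product) increases the logits, so lower capsules
        # route more strongly to upper capsules they agree with --
        # these coefficients are what makes the routing inspectable.
        b += (u_hat * v[None]).sum(axis=-1)
    return v

# Toy usage: 6 lower capsules routing to 2 class capsules of dimension 4.
rng = np.random.default_rng(0)
v = dynamic_routing(rng.normal(size=(6, 2, 4)))
print(v.shape)  # (2, 4)
```

Because the coupling coefficients `c` are recomputed from explicit agreement scores rather than learned opaquely, they can be read out after inference, which is the mechanism the abstract points to for local interpretability.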
