Article title
Identifiers
Title variants
Publication languages
Abstracts
Automatic segmentation of breast lesions from ultrasound images plays an important role in computer-aided breast cancer diagnosis. Many deep learning methods based on convolutional neural networks (CNNs) have been proposed for breast ultrasound image segmentation, yet the task remains challenging because lesion boundaries are often ambiguous. We propose a novel dual-stage framework based on the Transformer and the multi-layer perceptron (MLP) for the segmentation of breast lesions. We combine the Swin Transformer block with an efficient pyramid squeezed attention block in a parallel design and introduce bi-directional interactions across the branches, which efficiently extracts multi-scale long-range dependencies and improves the segmentation performance and robustness of the model. Furthermore, we introduce a tokenized MLP block in the MLP stage to extract global contextual information while retaining fine-grained detail, allowing more complex breast lesions to be segmented. We conducted extensive comparisons with state-of-the-art methods on three breast ultrasound datasets: BUSI, BUL, and MT_BUS. On benign lesions, the Dice coefficient reached 0.8127 ± 0.2178 and the intersection over union reached 0.7269 ± 0.2370, with the Hausdorff distance maintained at 3.75 ± 1.83. For the BUSI dataset, the Dice coefficient on malignant lesions improved by 3.09%. The results on the BUL and MT_BUS datasets likewise show that the proposed model achieves better segmentation than the compared methods, and external experiments indicate that it provides better generalization capability for breast lesion segmentation. Overall, the dual-stage scheme and the proposed Transformer module capture both fine-grained local information and long-range dependencies, helping to relieve the burden on radiologists.
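For readers unfamiliar with the evaluation measures quoted above, the Dice coefficient, intersection over union (IoU), and Hausdorff distance have standard definitions over binary segmentation masks. The sketch below is a minimal illustration in Python using NumPy and SciPy; it is not code from the paper, and the function names and the symmetric-Hausdorff convention are assumptions made for this example.

# Minimal sketch (not from the paper): standard Dice, IoU, and Hausdorff
# distance computed between a predicted binary mask and a ground-truth mask.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_coefficient(pred, gt, eps=1e-7):
    # Dice = 2|P ∩ G| / (|P| + |G|)
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)

def iou(pred, gt, eps=1e-7):
    # IoU = |P ∩ G| / |P ∪ G|
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return (inter + eps) / (union + eps)

def hausdorff_distance(pred, gt):
    # Symmetric Hausdorff distance between the two foreground pixel sets.
    p_pts = np.argwhere(pred.astype(bool))
    g_pts = np.argwhere(gt.astype(bool))
    return max(directed_hausdorff(p_pts, g_pts)[0],
               directed_hausdorff(g_pts, p_pts)[0])

if __name__ == "__main__":
    # Toy 8x8 masks: the predicted square is slightly offset from the ground truth.
    gt = np.zeros((8, 8), dtype=np.uint8); gt[2:6, 2:6] = 1
    pred = np.zeros((8, 8), dtype=np.uint8); pred[3:7, 3:7] = 1
    print(dice_coefficient(pred, gt), iou(pred, gt), hausdorff_distance(pred, gt))

In practice such per-image metrics are averaged over a test set, which is consistent with the mean ± standard deviation values reported in the abstract.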
Publisher
Journal
Year
Volume
Pages
656--671
Physical description
Bibliography: 58 items, figures, tables
Authors
author
- Computer School, University of South China, Hengyang, China
author
- School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
author
- Computer School, University of South China, Hengyang, China
author
- Computer School, University of South China, Hengyang 421001, China
author
- Affiliated Nanhua Hospital, University of South China, Hengyang, China
Bibliography
- [1] Shin SY, Lee S, Yun ID, Kim SM, Lee KM. Joint weakly and semi-supervised deep learning for localization and classification of masses in breast ultrasound images. IEEE Trans Med Imag 2018;38:762-74.
- [2] Bleicher RJ, Ruth K, Sigurdson ER, Beck JR, Ross E, Wong YN, et al. Time to surgery and breast cancer survival in the united states. JAMA Oncol 2016;2:330-9.
- [3] Xian M, Zhang Y, Cheng HD, Xu F, Zhang B, Ding J. Automatic breast ultrasound image segmentation: A survey. Pattern Recogn 2018;79:340-55.
- [4] Zhou Q, Wang Q, Bao Y, Kong L, Jin X, Ou W. Laednet: A lightweight attention encoder-decoder network for ultrasound medical image segmentation. Comput Electr Eng 2022;99:107777.
- [5] Long J, Shelhamer E, Darrell T. Fully convolutional networks for semantic segmentation. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2015. pp. 3431-3440.
- [6] Ronneberger O, Fischer P, Brox T. U-net: Convolutional networks for biomedical image segmentation. In: International Conference on Medical Image Computing and Computer-assisted Intervention. Springer; 2015. p. 234-41.
- [7] Zhou Z, Rahman Siddiquee MM, Tajbakhsh N, Liang J. Unet++: A nested u-net architecture for medical image segmentation. In: Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support. Springer; 2018. p. 3-11.
- [8] Ibtehaz N, Rahman MS. Multiresunet: Rethinking the u-net architecture for multimodal biomedical image segmentation. Neural Networks 2020;121:74-87.
- [9] Zhang H, Zu K, Lu J, Zou Y, Meng D. Epsanet: An efficient pyramid squeeze attention block on convolutional neural network. In: Proceedings of the Asian Conference on Computer Vision; 2022. pp. 1161-1177.
- [10] Hu J, Shen L, Sun G. Squeeze-and-excitation networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2018. pp. 7132-7141.
- [11] Wang Y, Wang N, Xu M, Yu J, Qin C, Luo X, et al. Deeply-supervised networks with threshold loss for cancer detection in automated breast ultrasound. IEEE Trans Med Imag 2019;39:866-76.
- [12] Houssein EH, Emam MM, Ali AA, Suganthan PN. Deep and machine learning techniques for medical imaging-based breast cancer: A comprehensive review. Expert Syst Appl 2021;167:114161.
- [13] Irfan R, Almazroi AA, Rauf HT, Damaševičius R, Nasr EA, Abdelgawad AE. Dilated semantic segmentation for breast ultrasonic lesion detection using parallel feature fusion. Diagnostics 2021;11:1212.
- [14] Hu Y, Guo Y, Wang Y, Yu J, Li J, et al. Automatic tumor segmentation in breast ultrasound images using a dilated fully convolutional network combined with an active contour model. Med Phys 2019;46:215-28.
- [15] Luo W, Li Y, Urtasun R, Zemel R. Understanding the effective receptive field in deep convolutional neural networks. Adv Neural Informat Process Syst 2016;29.
- [16] Byra M, Jarosik P, Szubert A, Galperin M, Ojeda-Fournier H, Olson L, et al. Breast mass segmentation in ultrasound with selective kernel u-net convolutional neural network. Biomed Signal Process Control 2020;61:102027.
- [17] Oktay O, Schlemper J, Folgoc LL, Lee M, Heinrich M, Misawa K, et al. Attention u-net: Learning where to look for the pancreas; 2018. arXiv preprint arXiv:1804.03999.
- [18] Huang X, Chen J, Chen M, Wan Y, Chen L. Fre-net: Full-region enhanced network for nuclei segmentation in histopathology images. Biocybernet Biomed Eng 2023;43:386-401.
- [19] Qi K, Yang H, Li C, Liu Z, Wang M, Liu Q, et al. X-net: Brain stroke lesion segmentation based on depthwise separable convolution and long-range dependencies. In: International Conference on Medical Image Computing and Computer-assisted Intervention. Springer; 2019. p. 247-55.
- [20] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, et al. Attention is all you need. Adv Neural Informat Process Syst 2017;30.
- [21] Dosovitskiy A, Beyer L, Kolesnikov A, Weissenborn D, Zhai X, Unterthiner T, et al. An image is worth 16x16 words: Transformers for image recognition at scale; 2020. arXiv preprint arXiv:2010.11929.
- [22] Cao H, Wang Y, Chen J, Jiang D, Zhang X, Tian Q, et al. Swin-unet: Unet-like pure transformer for medical image segmentation. In: Computer Vision-ECCV 2022 Workshops: Tel Aviv, Israel, October 23-27, 2022, Proceedings, Part III. Springer; 2023. p. 205-18.
- [23] Chen J, Lu Y, Yu Q, Luo X, Adeli E, Wang Y, et al. Transunet: Transformers make strong encoders for medical image segmentation; 2021. arXiv preprint arXiv:2102.04306.
- [24] Tolstikhin IO, Houlsby N, Kolesnikov A, Beyer L, Zhai X, Unterthiner T, et al. Mlp-mixer: An all-mlp architecture for vision. Adv Neural Informat Process Syst 2021;34:24261-72.
- [25] Valanarasu JMJ, Patel VM. Unext: Mlp-based rapid medical image segmentation network. In: Medical Image Computing and Computer Assisted Intervention-MICCAI 2022: 25th International Conference, Singapore, September 18-22, 2022. Springer; 2022. p. 23-33.
- [26] Iqbal A, Sharif M. Mda-net: Multiscale dual attention-based network for breast lesion segmentation using ultrasound images. J King Saud Univ-Comput Informat Sci 2022;34:7283-99.
- [27] Al-Dhabyani W, Gomaa M, Khaled H, Fahmy A. Dataset of breast ultrasound images. Data Brief 2020;28:104863.
- [28] Yap MH, Pons G, Marti J, Ganau S, Sentis M, Zwiggelaar R, et al. Automated breast ultrasound lesions detection using convolutional neural networks. IEEE J Biomed Health Informat 2017;22:1218-26.
- [29] Badawy SM, Mohamed AENA, Hefnawy AA, Zidan HE, GadAllah MT, El-Banby GM. Automatic semantic segmentation of breast tumors in ultrasound images based on combining fuzzy logic and deep learning-a feasibility study. PloS One 2021;16:e0251899.
- [30] Xian M, Zhang Y, Cheng HD. Fully automatic segmentation of breast ultrasound images based on breast characteristics in space and frequency domains. Pattern Recogn 2015;48:485-97.
- [31] Xian M, Huang J, Zhang Y, Tang X. Multiple-domain knowledge based mrf model for tumor segmentation in breast ultrasound images. In: 2012 19th IEEE International Conference on Image Processing. IEEE; 2012. p. 2021-4.
- [32] Madabhushi A, Metaxas DN. Combining low-, high-level and empirical domain knowledge for automated segmentation of ultrasonic breast lesions. IEEE Trans Med Imag 2003;22:155-69.
- [33] Huang Q, Bai X, Li Y, Jin L, Li X. Optimized graph-based segmentation for ultrasound images. Neurocomputing 2014;129:216-24.
- [34] Daoud MI, Atallah AA, Awwad F, Al-Najjar M, Alazrai R. Automatic superpixel-based segmentation method for breast ultrasound images. Expert Syst Appl 2019;121:78-96.
- [35] Huang Q, Huang Y, Luo Y, Yuan F, Li X. Segmentation of breast ultrasound image with semantic classification of superpixels. Med Image Anal 2020;61:101657.
- [36] Huang K, Cheng HD, Zhang Y, Zhang B, Xing P, Ning C. Medical knowledge constrained semantic breast ultrasound image segmentation. In: 2018 24th International Conference on Pattern Recognition (ICPR). IEEE; 2018. p. 1193-8.
- [37] Almajalid R, Shan J, Du Y, Zhang M. Development of a deep-learning-based method for breast ultrasound image segmentation. In: 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA). IEEE; 2018. p. 1103-8.
- [38] Hu Y, Guo Y, Wang Y, Yu J, Li J, Zhou S, et al. Automatic tumor segmentation in breast ultrasound images using a dilated fully convolutional network combined with an active contour model. Med Phys 2019;46:215-28.
- [39] Shareef B, Xian M, Vakanski A. Stan: Small tumor-aware network for breast ultrasound image segmentation. In: International Symposium on Biomedical Imaging (ISBI). IEEE; 2020. p. 1-5.
- [40] Xu C, Qi Y, Wang Y, Lou M, Pi J, Ma Y. Arf-net: An adaptive receptive field network for breast mass segmentation in whole mammograms and ultrasound images. Biomed Signal Process Control 2022;71:103178.
- [41] Zhang X, Li X, Hu K, Gao X. Bgra-net: Boundary-guided and region-aware convolutional neural network for the segmentation of breast ultrasound images. In: 2021 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). IEEE; 2021. p. 1619-22.
- [42] Huang R, Lin M, Dou H, Lin Z, Ying Q, Jia X, et al. Boundary-rendering network for breast lesion segmentation in ultrasound images. Med Image Anal 2022;80:102478.
- [43] Lee H, Park J, Hwang JY. Channel attention module with multiscale grid average pooling for breast cancer segmentation in an ultrasound image. IEEE Trans Ultrasonics Ferroelect Freq Control 2020;67:1344-53.
- [44] Wu H, Liu J, Wang W, Wen Z, Qin J. Region-aware global context modeling for automatic nerve segmentation from ultrasound images. In: Proceedings of the AAAI Conference on Artificial Intelligence; 2021. pp. 2907-2915.
- [45] Xue C, Zhu L, Fu H, Hu X, Li X, Zhang H, et al. Global guidance network for breast lesion segmentation in ultrasound images. Med Image Anal 2021;70:101989.
- [46] Chen G, Li L, Dai Y, Zhang J, Yap MH. Aau-net: an adaptive attention u-net for breast lesions segmentation in ultrasound images. IEEE Trans Med Imag 2022.
- [47] Chen G, Dai Y, Zhang J. C-net: Cascaded convolutional neural network with global guidance and refinement residuals for breast ultrasound images segmentation. Comput Methods Programs Biomed 2022;225:107086.
- [48] Zhuang Z, Li N, Joseph Raj AN, Mahesh VG, Qiu S. An rdau-net model for lesion segmentation in breast ultrasound images. PloS One 2019;14:e0221535.
- [49] Yan Y, Liu Y, Wu Y, Zhang H, Zhang Y, Meng L. Accurate segmentation of breast tumors using ae u-net with hdc model in ultrasound images. Biomed Signal Process Control 2022;72:103299.
- [50] Lyu Y, Xu Y, Jiang X, Liu J, Zhao X, Zhu X. Ams-pan: Breast ultrasound image segmentation model combining attention mechanism and multi-scale features. Biomed Signal Process Control 2023;81:104425.
- [51] Shao H, Zhang Y, Xian M, Cheng HD, Xu F, Ding J. A saliency model for automated tumor detection in breast ultrasound images. In: 2015 IEEE International Conference on Image Processing (ICIP). IEEE; 2015. p. 1424-8.
- [52] Vakanski A, Xian M, Freer PE. Attention-enriched deep learning model for breast tumor segmentation in ultrasound images. Ultrasound Med Biol 2020;46:2819-33.
- [53] Ning Z, Zhong S, Feng Q, Chen W, Zhang Y. Smu-net: Saliency-guided morphology-aware u-net for breast lesion segmentation in ultrasound image. IEEE Trans Med Imag 2021;41:476-90.
- [54] Valanarasu JMJ, Oza P, Hacihaliloglu I, Patel VM. Medical transformer: Gated axial-attention for medical image segmentation. In: International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer; 2021. p. 36-46.
- [55] Wang H, Cao P, Wang J, Zaiane OR. Uctransnet: rethinking the skip connections in u-net from a channel-wise perspective with transformer. In: Proceedings of the AAAI Conference on Artificial Intelligence; 2022. pp. 2441-2449.
- [56] Liu Z, Lin Y, Cao Y, Hu H, Wei Y, Zhang Z, et al. Swin transformer: Hierarchical vision transformer using shifted windows. In: Proceedings of the IEEE/CVF International Conference on Computer Vision; 2021. pp. 10012-10022.
- [57] Chen Q, Wu Q, Wang J, Hu Q, Hu T, Ding E, et al. Mixformer: Mixing features across windows and dimensions. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition; 2022. pp. 5249-5259.
- [58] Zhou D, Yu Z, Xie E, Xiao C, Anandkumar A, Feng J, et al. Understanding the robustness in vision transformers. In: International Conference on Machine Learning. PMLR; 2022. p. 27378-94.
Notes
PL
Record created with funding from the Polish Ministry of Science and Higher Education (MNiSW), agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science", module: Popularisation of Science and Promotion of Sport (2024).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-9b8c6b4c-747f-4c0a-9a5b-0472d9b43416