Article title

Multi-stage fully convolutional network for precise prostate segmentation in ultrasound images

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Prostate cancer is one of the most commonly diagnosed non-cutaneous malignant tumors and the sixth leading cause of cancer-related death among men worldwide. Automatic segmentation of prostate regions has a wide range of applications in prostate cancer diagnosis and treatment. Extracting powerful spatial features for precise prostate segmentation is challenging because of the wide variation in prostate size and shape and the histopathologic heterogeneity among patients. Most existing CNN-based architectures produce unsatisfactory results and inaccurate boundaries in prostate segmentation, owing to inadequate discriminative feature maps and limited spatial information. To address these issues, we propose a novel deep learning technique, a Multi-Stage FCN architecture for 2D prostate segmentation, that captures more precise spatial information and more accurate prostate boundaries. In addition, a new prostate ultrasound image dataset, CCH-TRUSPS, was collected from Chongqing University Cancer Hospital, including prostate ultrasound images of various prostate cancer architectural patterns. We evaluate our method on the CCH-TRUSPS dataset and on the publicly available Multi-site T2-weighted MRI dataset using five metrics commonly used in medical image analysis. Compared with other CNN-based methods on the CCH-TRUSPS test set, our Multi-Stage FCN achieves the best results: a binary accuracy of 99.15%, a DSC of 94.90%, an IoU of 89.80%, a precision of 94.67%, and a recall of 96.49%. The statistical and visual results demonstrate that our approach outperforms previous CNN-based techniques on all metrics and can be used in the clinical diagnosis of prostate cancer.
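The five reported metrics follow the standard definitions for binary segmentation masks. For reference only, below is a minimal sketch of those per-image definitions in Python, assuming binary prediction and ground-truth masks; it illustrates the standard formulas, not the authors' evaluation code (the function name and the eps smoothing term are illustrative choices).

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7):
    """Standard binary-segmentation metrics computed from two boolean masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()     # true positives
    tn = np.logical_and(~pred, ~gt).sum()   # true negatives
    fp = np.logical_and(pred, ~gt).sum()    # false positives
    fn = np.logical_and(~pred, gt).sum()    # false negatives
    accuracy = (tp + tn) / (tp + tn + fp + fn + eps)  # binary (pixel) accuracy
    dsc = 2 * tp / (2 * tp + fp + fn + eps)           # Dice similarity coefficient
    iou = tp / (tp + fp + fn + eps)                   # intersection over union
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    return accuracy, dsc, iou, precision, recall

# Example on a toy 2x2 mask pair:
# pred = np.array([[1, 1], [0, 0]]); gt = np.array([[1, 0], [0, 0]])
# segmentation_metrics(pred, gt) -> (0.75, 0.667, 0.5, 0.5, 1.0) approximately
```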
Authors
author
  • Department of Ultrasound, Chongqing Key Laboratory for Intelligent Oncology in Breast Cancer (iCQBC), Chongqing University Cancer Hospital, Chongqing, China
  • School of Microelectronics and Communication Engineering, Chongqing University, Chongqing, China
author
  • School of Microelectronics and Communication Engineering, Chongqing University, Chongqing, China
author
  • School of Microelectronics and Communication Engineering, Chongqing University, Chongqing, China
author
  • Department of Ultrasound, Chongqing Key Laboratory for Intelligent Oncology in Breast Cancer (iCQBC), Chongqing University Cancer Hospital, Chongqing, China
author
  • Department of Ultrasound, Chongqing Key Laboratory for Intelligent Oncology in Breast Cancer (iCQBC), Chongqing University Cancer Hospital, Chongqing, China
author
  • Department of Ultrasound, Chongqing Key Laboratory for Intelligent Oncology in Breast Cancer (iCQBC), Chongqing University Cancer Hospital, Chongqing, China
author
  • Department of Ultrasound, Chongqing Key Laboratory for Intelligent Oncology in Breast Cancer (iCQBC), Chongqing University Cancer Hospital, Chongqing 400030, China
author
  • School of Microelectronics and Communication Engineering, Chongqing University, Chongqing, China
Notes
A correction to this article appears in: Biocybernetics and Biomedical Engineering, 2023, Vol. 43, No. 4, p. 776.
Document type
YADDA identifier
bwmeta1.element.baztech-0b723a28-a2ab-411d-9ffe-06ee328e51ea