Article title
Identifiers
Title variants
Publication languages
Abstracts
We propose a novel approach to breast mass classification based on deep learning models that utilize raw radio-frequency (RF) ultrasound (US) signals. US images, typically displayed by US scanners and used to develop computer-aided diagnosis systems, are reconstructed from raw RF data. However, the information about the physical properties of tissues that is present in RF signals is partially lost due to the irreversible compression necessary to make the raw data readable to the human eye. To utilize the information present in raw US data, we develop deep learning models that can automatically process small 2D patches of RF signals and their amplitude samples. We compare our approach with a classification method based on the Nakagami parameter, a widely used quantitative US technique utilizing the amplitude samples of RF data. Our better-performing deep learning model, trained using RF signals and their envelope samples, achieved good classification performance, attaining an area under the receiver operating characteristic curve (AUC) and a balanced accuracy of 0.772 and 0.710, respectively. The proposed method significantly outperformed the Nakagami parameter-based classifier, which achieved an AUC and accuracy of 0.64 and 0.611, respectively. The developed deep learning models were used to generate parametric maps illustrating the level of mass malignancy. Our study demonstrates the feasibility of using RF data for the development of deep learning breast mass classification models.
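As a rough illustration of the two ingredients mentioned in the abstract, the sketch below (assuming NumPy and the TensorFlow/Keras stack cited in the bibliography [48,49]) shows a moment-based estimate of the Nakagami shape parameter from envelope samples and a small CNN operating on 2D RF patches. The patch shape, layer configuration, and function names are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers


def nakagami_m(envelope):
    """Moment-based Nakagami shape parameter m for envelope samples R:
    m = (E[R^2])^2 / (E[R^4] - (E[R^2])^2), i.e. the inverse normalized
    variance of the intensity R^2."""
    r = np.asarray(envelope, dtype=np.float64).ravel()
    e2 = np.mean(r ** 2)
    e4 = np.mean(r ** 4)
    return e2 ** 2 / (e4 - e2 ** 2)


def build_rf_patch_classifier(patch_shape=(128, 16, 2)):
    """Small CNN for 2D RF patches; the two input channels could hold the
    raw RF signal and its envelope. Shapes and layers are hypothetical."""
    inputs = keras.Input(shape=patch_shape)
    x = layers.Conv2D(16, (7, 3), activation="relu", padding="same")(inputs)
    x = layers.BatchNormalization()(x)
    x = layers.MaxPooling2D((2, 2))(x)
    x = layers.Conv2D(32, (5, 3), activation="relu", padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # malignancy probability
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[keras.metrics.AUC()])
    return model
```

In a setup of this kind, patch-level malignancy probabilities predicted by such a network could be aggregated over a mass to obtain a lesion-level score or rendered as a parametric map, as the abstract describes.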
Publisher
Journal
Year
Volume
Pages
977–986
Physical description
Bibliography: 52 items, figures, tables
Authors
author
- Department of Information and Computational Science, Institute of Fundamental Technological Research, Polish Academy of Sciences, Pawinskiego 5B, Warsaw, Poland
author
- Department of Ultrasound, Institute of Fundamental Technological Research, Polish Academy of Sciences, Warsaw, Poland
author
- Laboratory of Professional Electronics, Institute of Fundamental Technological Research, Polish Academy of Sciences, Warsaw, Poland
author
- Department of Ultrasound, Institute of Fundamental Technological Research, Polish Academy of Sciences, Warsaw, Poland
Bibliography
- [1] Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, Jemal A. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin 2018;68:394–424.
- [2] Cheng H-D, Shan J, Ju W, Guo Y, Zhang L. Automated breast cancer detection and classification using ultrasound images: a survey. Pattern Recognit 2010;43:299–317.
- [3] Flores WG, de Albuquerque Pereira WC, Infantosi AFC. Improving classification performance of breast lesions on ultrasonography. Pattern Recognit 2015;48:1125–36.
- [4] Wu G-G, Zhou L-Q, Xu J-W, Wang J-Y, Wei Q, Deng Y-B, et al. Artificial intelligence in breast ultrasound. World J Radiol 2019;11:19.
- [5] Virmani J, Agarwal R, et al. Assessment of despeckle filtering algorithms for segmentation of breast tumours from ultrasound images. Biocybern Biomed Eng 2019;39:100–21.
- [6] Yu X, Guo Y, Huang S-M, Li M-L, Lee W-N. Beamforming effects on generalized Nakagami imaging. Phys Med Biol 2015;60:7513.
- [7] Virmani J, Agarwal R, et al. Effect of despeckle filtering on classification of breast tumors using ultrasound images. Biocybern Biomed Eng 2019;39:536–60.
- [8] Oelze ML, Mamou J. Review of quantitative ultrasound: envelope statistics and backscatter coefficient imaging and contributions to diagnostic ultrasound. IEEE Trans Ultrason Ferroelectr Freq Control 2016;63:336–51.
- [9] Tsui P-H, Chang C-C. Imaging local scatterer concentrations by the Nakagami statistical model. Ultrasound Med Biol 2007;33:608–19.
- [10] Liao Y-Y, Tsui P-H, Li C-H, Chang K-J, Kuo W-H, Chang C-C, et al. Classification of scattering media within benign and malignant breast tumors based on ultrasound texture-feature-based and Nakagami-parameter images. Med Phys 2011;38:2198–207.
- [11] Trop I, Destrempes F, El Khoury M, Robidoux A, Gaboury L, Allard L, et al. The added value of statistical modeling of backscatter properties in the management of breast lesions at US. Radiology 2014;275:666–74.
- [12] Larrue A, Noble JA. Modeling of errors in Nakagami imaging: illustration on breast mass characterization. Ultrasound Med Biol 2014;40:917–30.
- [13] Byra M, Nowicki A, Wróblewska-Piotrzkowska H, Dobruch-Sobczak K. Classification of breast lesions using segmented quantitative ultrasound maps of homodyned K distribution parameters. Med Phys 2016;43:5561–9.
- [14] Uniyal N, Eskandari H, Abolmaesumi P, Sojoudi S, Gordon P, Warren L, et al. Ultrasound RF time series for classification of breast lesions. IEEE Trans Med Imaging 2015;34:652–61.
- [15] Ouyang Y, Tsui P-H, Wu S, Wu W, Zhou Z. Classification of benign and malignant breast tumors using H-scan ultrasound imaging. Diagnostics 2019;9:182.
- [16] Goodfellow I, Bengio Y, Courville A. Deep learning. MIT Press; 2016.
- [17] Lang KJ, Waibel AH, Hinton GE. A time-delay neural network architecture for isolated word recognition. Neural Netw 1990;3:23–43.
- [18] LeCun Y, Boser B, Denker JS, Henderson D, Howard RE, Hubbard W, et al. Backpropagation applied to handwritten zip code recognition. Neural Comput 1989;1:541–51.
- [19] Antropova N, Huynh BQ, Giger ML. A deep feature fusion methodology for breast cancer diagnosis demonstrated on three imaging modality datasets. Med Phys 2017;44:5162–71.
- [20] Han S, Kang H-K, Jeong J-Y, Park M-H, Kim W, Bang W-C, et al. A deep learning framework for supporting the classification of breast lesions in ultrasound images. Phys Med Biol 2017;62:7714.
- [21] Byra M. Discriminant analysis of neural style representations for breast lesion classification in ultrasound. Biocybern Biomed Eng 2018;38:684–90.
- [22] Yap MH, Pons G, Martí J, Ganau S, Sentís M, Zwiggelaar R, et al. Automated breast ultrasound lesions detection using convolutional neural networks. IEEE J Biomed Health Inform 2018;22:1218–26.
- [23] Yap MH, Goyal M, Osman FM, Martí R, Denton E, Juette A, et al. Breast ultrasound lesions recognition: end-to-end deep learning approaches. J Med Imaging 2018;6:011007.
- [24] Byra M, Galperin M, Ojeda-Fournier H, Olson L, O'Boyle M, Comstock C, et al. Breast mass classification in sonography with transfer learning using a deep convolutional neural network and color conversion. Med Phys 2019;46:746–55.
- [25] Qi X, Zhang L, Chen Y, Pi Y, Chen Y, Lv Q, et al. Automated diagnosis of breast ultrasonography images using deep neural networks. Med Image Anal 2019;52:185–98.
- [26] Russakovsky O, Deng J, Su H, Krause J, Satheesh S, Ma S, et al. ImageNet large scale visual recognition challenge. Int J Comput Vis (IJCV) 2015;115:211–52. http://dx.doi.org/10.1007/s11263-015-0816-y.
- [27] Bardou D, Zhang K, Ahmad SM. Classification of breast cancer based on histology images using convolutional neural networks. IEEE Access 2018;6:24680–93.
- [28] Mullooly M, Bejnordi BE, Pfeiffer RM, Fan S, Palakal M, Hada M, et al. Application of convolutional neural networks to breast biopsies to delineate tissue correlates of mammographic breast density. NPJ Breast Cancer 2019;5:1–11.
- [29] Destounis SV, Santacroce A, Arieno A. Update on breast density, risk estimation, and supplemental screening. Am J Roentgenol 2020;214:296–305.
- [30] Rebolj M, Blyuss O, Chia KS, Duffy SW. Long-term excess risk of breast cancer after a single breast density measurement. Eur J Cancer 2019;117:41–7.
- [31] Diniz JOB, Diniz PHB, Valente TLA, Silva AC, de Paiva AC, Gattass M. Detection of mass regions in mammograms by bilateral analysis adapted to breast density using similarity indexes and convolutional neural networks. Comput Methods Programs Biomed 2018;156:191–207.
- [32] Al-Masni MA, Al-Antari MA, Park J-M, Gi G, Kim T-Y, Rivera P, et al. Simultaneous detection and classification of breast masses in digital mammograms via a deep learning YOLO-based CAD system. Comput Methods Programs Biomed 2018;157:85–94.
- [33] Piotrzkowska-Wróblewska H, Dobruch-Sobczak K, Byra M, Nowicki A. Open access database of raw ultrasonic signals acquired from malignant and benign breast lesions. Med Phys 2017;44:6105–9.
- [34] LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. Proc IEEE 1998;86:2278–324.
- [35] Glorot X, Bordes A, Bengio Y. Deep sparse rectifier neural networks. Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics; 2011. p. 315–23.
- [36] Hoshen Y, Weiss RJ, Wilson KW. Speech acoustic modeling from raw multichannel waveforms. 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 2015. pp. 4624–8.
- [37] Golik P, Tüske Z, Schlüter R, Ney H. Convolutional neural networks for acoustic modeling of raw time signal in LVCSR. Sixteenth Annual Conference of the International Speech Communication Association; 2015.
- [38] Sainath TN, Weiss RJ, Senior A, Wilson KW, Vinyals O. Learning the speech front-end with raw waveform CLDNNs. Sixteenth Annual Conference of the International Speech Communication Association; 2015.
- [39] Schirrmeister RT, Springenberg JT, Fiederer LDJ, Glasstetter M, Eggensperger K, Tangermann M, et al. Deep learning with convolutional neural networks for EEG decoding and visualization. Hum Brain Mapp 2017;38:5391–420.
- [40] Nam K, Zagzebski JA, Hall TJ. Quantitative assessment of in vivo breast masses using ultrasound attenuation and backscatter. Ultrason Imaging 2013;35:146–61.
- [41] Zhang W, Li R, Deng H, Wang L, Lin W, Ji S, et al. Deep convolutional neural networks for multi-modality isointense infant brain image segmentation. NeuroImage 2015;108:214–24.
- [42] Ioffe S, Szegedy C. Batch normalization: Accelerating deep network training by reducing internal covariate shift; 2015, arXiv:1502.03167 (arXiv preprint).
- [43] Tsui P-H, Yeh C-K, Chang C-C, Liao Y-Y. Classification of breast masses by ultrasonic Nakagami imaging: a feasibility study. Phys Med Biol 2008;53:6027.
- [44] Shankar PM. A general statistical model for ultrasonic backscattering from tissues. IEEE Trans Ultrason Ferroelectr Freq Control 2000;47:727–36.
- [45] Lin J-J, Cheng J-Y, Huang L-F, Lin Y-H, Wan Y-L, Tsui P-H. Detecting changes in ultrasound backscattered statistics by using Nakagami parameters: comparisons of moment-based and maximum likelihood estimators. Ultrasonics 2017;77:133–43.
- [46] LeCun YA, Bottou L, Orr GB, Müller K-R. Efficient backprop. Neural networks: tricks of the trade. Springer; 2012. p. 9–48.
- [47] Glorot X, Bengio Y. Understanding the difficulty of training deep feedforward neural networks. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics; 2010. p. 249–56.
- [48] Chollet F. Keras; 2018, https://github.com/fchollet/keras.
- [49] Abadi M, Barham P, Chen J, Chen Z, Davis A, Dean J, et al. TensorFlow: a system for large-scale machine learning. 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16); 2016. pp. 265–83.
- [50] Byra M, Piotrzkowska-Wróblewska H, Dobruch-Sobczak K, Nowicki A. Combining Nakagami imaging and convolutional neural network for breast lesion classification. 2017 IEEE International Ultrasonics Symposium (IUS); 2017. pp. 1–4.
- [51] Zhang Q, Xiao Y, Dai W, Suo J, Wang C, Shi J, et al. Deep learning based classification of breast tumors with shear-wave elastography. Ultrasonics 2016;72:150–7.
- [52] Jensen JA. Field: a program for simulating ultrasound systems. 10th Nordic-Baltic Conference on Biomedical Imaging, Vol. 4, Supplement 1, Part 1; 1996. pp. 351–3.
Notes
PL
Record prepared with funds from the Ministry of Science and Higher Education (MNiSW), agreement No. 461252, under the "Społeczna odpowiedzialność nauki" (Social Responsibility of Science) programme, module: popularization of science and promotion of sport (2020).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-dcfd30b7-5d7e-419b-83a3-db2d2e708391