B-mode ultrasonography and sonoelastography are both used in the clinical diagnosis of prostate cancer (PCa), and combining the two ultrasound (US) modalities with computer assistance may improve diagnostic performance. A technique for computer-aided diagnosis (CAD) of PCa based on multimodal US is presented. First, quantitative features are extracted from both the B-mode US images and the sonoelastograms, including intensity statistics, regional percentile features, gray-level co-occurrence matrix (GLCM) texture features and binary texture features. Second, a deep network named PGBM-RBM2, composed of a point-wise gated Boltzmann machine (PGBM) followed by two layers of restricted Boltzmann machines (RBMs), is proposed to learn and fuse the multimodal features. Finally, a support vector machine (SVM) is used for prostatic disease classification. Experimental evaluation was conducted on 313 multimodal US images of the prostate from 103 patients with prostatic diseases (47 malignant and 56 benign). Under five-fold cross-validation, the classification sensitivity, specificity, accuracy, Youden’s index and area under the receiver operating characteristic (ROC) curve with the PGBM-RBM2 were 87.0%, 88.8%, 87.9%, 75.8% and 0.851, respectively. These results demonstrate that multimodal feature learning and fusion with the PGBM-RBM2 can assist in the diagnosis of PCa, and the deep network is expected to be useful in clinical practice.
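Two steps of the pipeline above can be illustrated with a minimal sketch (not the authors' implementation): computing one GLCM texture feature, and computing the reported diagnostic metrics (sensitivity, specificity and Youden's index). The function names and the toy data are assumptions for illustration; the PGBM-RBM2 fusion and SVM stages are omitted.

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one pixel offset.

    `image` is a 2-D list of integer gray levels in [0, levels).
    """
    m = [[0.0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in m]

def glcm_contrast(m):
    """GLCM contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    return sum((i - j) ** 2 * p
               for i, row in enumerate(m)
               for j, p in enumerate(row))

def diagnostic_metrics(y_true, y_pred):
    """Sensitivity, specificity and Youden's index (label 1 = malignant)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens + spec - 1.0

# Hypothetical 3x3 image with 4 gray levels.
toy_image = [[0, 0, 1],
             [0, 1, 2],
             [1, 2, 3]]
contrast = glcm_contrast(glcm(toy_image))
```

In practice such features would be extracted from both modalities and concatenated before the learning stage; Youden's index equals sensitivity + specificity − 1, which matches the values reported above (0.870 + 0.888 − 1 ≈ 0.758).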
Funding source: National Natural Science Foundation of China
Award Identifier / Grant number: 61671281
Award Identifier / Grant number: 61911530249
Award Identifier / Grant number: 81571693
Award Identifier / Grant number: 81871361
Award Identifier / Grant number: 81627804
Funding statement: This study was funded by the National Natural Science Foundation of China (Nos. 61671281, 61911530249, 81571693, 81871361, and 81627804, Funder Id: http://dx.doi.org/10.13039/501100001809), the Important Weak Subject Construction Project of Pudong Health and Family Planning Commission of Shanghai (No. PWZbr2017-09) and Shanghai East Hospital “Leading Talent Project” (No. DFRC2018024).
Conflict of interest: The authors declare that they have no conflict of interest.
Informed consent: Informed consent is not applicable.
Ethical approval: The conducted research did not involve the use of humans or animals.
©2020 Walter de Gruyter GmbH, Berlin/Boston