The Role of Artificial Intelligence in Endoscopic Ultrasound for Pancreatic Disorders

Ryosuke Tonozuka, Shuntaro Mukai, Takao Itoi

Abstract

The use of artificial intelligence (AI) in medical imaging has expanded remarkably, and several reports have applied it to endoscopic ultrasound (EUS) images of the pancreas. This review first outlines the fundamentals of AI needed to understand these studies and then briefly summarizes each report, with the aim of helping endoscopists understand and make use of this rapidly developing technology. Early work used conventional computer-aided diagnosis (CAD), in which features are extracted and selected from imaging data by various methods and then fed into machine learning algorithms as inputs. More recently, deep learning-based CAD using convolutional neural networks has been adopted; in these approaches the images themselves serve as inputs, allowing more information to be analyzed in less time and with higher accuracy. Although AI in EUS imaging is still in its infancy, further research and development is expected to support optical biopsy as an alternative to EUS-guided tissue sampling, improve diagnostic accuracy through double reading with human readers, and contribute to EUS education.
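
As a concrete illustration of the conventional CAD workflow described above, the following Python sketch trains a soft-margin support vector machine on hand-crafted texture features. It is only a minimal sketch: the `images` and `labels` variables, the particular features, and the train/test split are hypothetical stand-ins for the image data and feature-selection methods used in the actual studies.

```python
# Minimal sketch of a conventional CAD pipeline: hand-crafted features -> SVM.
# Assumes `images` is a list of 2-D grayscale EUS patches (NumPy arrays) and
# `labels` marks each patch as 1 (cancer) or 0 (non-cancer); both are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def extract_features(patch: np.ndarray) -> np.ndarray:
    """Toy feature vector: first-order statistics plus a simple gradient measure."""
    gy, gx = np.gradient(patch.astype(float))
    return np.array([
        patch.mean(),                            # mean echo intensity
        patch.std(),                             # contrast
        np.median(patch),                        # robust central tendency
        np.abs(gx).mean() + np.abs(gy).mean(),   # edge strength
    ])

X = np.stack([extract_features(p) for p in images])
y = np.asarray(labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Soft-margin SVM with an RBF kernel, one of the classifiers used in the studies reviewed.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In deep learning-based CAD, this explicit feature-extraction step is dropped and the raw image is passed directly to a convolutional network, as sketched after Figure 4 below.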

Keywords: artificial intelligence; computer-aided diagnosis; convolutional neural network; deep learning; deep neural network; endoscopic ultrasound; machine learning; pancreas; pancreatic cancer; support vector machine.

Conflict of interest statement

T.I. has received a teaching fee from Olympus Medical Corporation.

Figures

Figure 1
Overview of the development of artificial intelligence.
Figure 2
Artificial neural network. (A) Simple perceptron. (B) Neural network. (C) Deep neural network (DNN). Black dots: multiple hidden layers.
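
To make the architectures in Figure 2 concrete, the following minimal NumPy sketch implements the simple perceptron of panel (A) and a ReLU-activated fully connected layer of the kind stacked to form the networks in panels (B) and (C). All weights, biases, and inputs are illustrative values, not parameters from any study discussed here.

```python
# Simple perceptron (Figure 2A): a weighted sum of inputs passed through a step activation.
import numpy as np

def perceptron(x, w, b):
    """Return 1 if the weighted sum exceeds the threshold, else 0."""
    return int(np.dot(w, x) + b > 0)

def dense_layer(x, W, b):
    """ReLU-activated fully connected layer; stacking such layers gives (B) and, with
    multiple hidden layers, the DNN of (C)."""
    return np.maximum(0.0, W @ x + b)

x = np.array([0.2, 0.7, 0.1])
print(perceptron(x, w=np.array([0.5, -0.3, 0.8]), b=-0.1))
print(dense_layer(x, W=np.array([[0.5, -0.3, 0.8], [0.1, 0.4, -0.2]]), b=np.zeros(2)))
```
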
Figure 3
Support vector machine (SVM). (A) Basic structure of SVM. (B) Soft-margin SVM. (C) Kernel method.
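
The scikit-learn sketch below maps the panels of Figure 3 onto code: the regularization parameter `C` controls the soft margin of panel (B), and the `kernel` argument selects the kernel method of panel (C). The synthetic `make_circles` data set is used only so the example runs; it has no relation to EUS data.

```python
# Soft-margin and kernel SVMs (Figure 3B and 3C) on a synthetic, non-linearly separable data set.
from sklearn.svm import SVC
from sklearn.datasets import make_circles

X, y = make_circles(n_samples=200, factor=0.4, noise=0.1, random_state=0)

linear_svm = SVC(kernel="linear", C=0.1).fit(X, y)          # soft margin: smaller C tolerates more violations
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)  # kernel method: implicit non-linear mapping

print("linear accuracy:", linear_svm.score(X, y))
print("RBF kernel accuracy:", rbf_svm.score(X, y))
```
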
Figure 4
An example convolutional neural network (CNN). (A) Translation of an input image into a feature map in a convolution layer. (B) Layout of a deep convolutional neural network.
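
As a minimal sketch of the CNN layout in Figure 4B, the following PyTorch snippet stacks convolution and pooling layers that translate an input image into feature maps (panel A) and ends with a fully connected classifier. The layer sizes, the 64 × 64 input, and the two-class output are illustrative choices, not the architecture of any specific study reviewed here.

```python
# Small CNN: convolution/pooling layers produce feature maps, a linear layer classifies them.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolution layer -> feature maps (Figure 4A)
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 2),                  # e.g., cancer vs. non-cancer
)

dummy_eus_image = torch.randn(1, 1, 64, 64)      # batch of one 64x64 grayscale image
logits = model(dummy_eus_image)
print(logits.shape)                              # torch.Size([1, 2])
```
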
Figure 5
Gradient-weighted class activation mapping (Grad-CAM). The left image is a representative original endoscopic ultrasound image; the right is a Grad-CAM image highlighting the regions recognized as important.
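
The following PyTorch sketch outlines how a Grad-CAM heatmap such as the one in Figure 5 can be produced: gradients of the predicted class score with respect to the last convolutional feature maps are averaged to weight those maps, and the weighted sum is upsampled to the input size. The tiny model and random input are hypothetical stand-ins for a trained EUS classifier and a real image.

```python
# Grad-CAM sketch: class-score gradients weight the last convolutional feature maps.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Linear(16, 2)

    def forward(self, x):
        fmap = self.features(x)                          # last convolutional feature maps
        pooled = F.adaptive_avg_pool2d(fmap, 1).flatten(1)
        return self.classifier(pooled), fmap

model = TinyCNN().eval()
x = torch.randn(1, 1, 64, 64)                            # stand-in for an EUS image

logits, fmap = model(x)
fmap.retain_grad()
logits[0, logits.argmax()].backward()                    # gradient of the top class score

weights = fmap.grad.mean(dim=(2, 3), keepdim=True)       # global-average-pooled gradients
cam = F.relu((weights * fmap).sum(dim=1, keepdim=True))  # weighted sum of feature maps
cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)
print(cam.shape)  # heatmap at the input resolution, as overlaid in the right panel of Figure 5
```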


Source: PubMed
