Artificial intelligence in radiology: 100 commercially available products and their scientific evidence

Kicky G van Leeuwen, Steven Schalekamp, Matthieu J C M Rutten, Bram van Ginneken, Maarten de Rooij

Abstract

Objectives: To map the current landscape of commercially available artificial intelligence (AI) software products for radiology and to review the availability of their scientific evidence.

Methods: We created an online overview of CE-marked AI software products for clinical radiology based on vendor-supplied product specifications (www.aiforradiology.com). Characteristics such as modality, subspeciality, main task, regulatory information, deployment, and pricing model were retrieved. We conducted an extensive literature search for the available scientific evidence on these products. Articles were classified according to a hierarchical model of efficacy (see the illustrative sketch following the keywords).

Results: The overview included 100 CE-marked AI products from 54 different vendors. For 64/100 products, there was no peer-reviewed evidence of their efficacy. We observed large heterogeneity in deployment methods, pricing models, and regulatory classes. The evidence for the remaining 36/100 products comprised 237 papers that predominantly (65%) focused on diagnostic accuracy (efficacy level 2). Of the 100 products, 18 had evidence at level 3 or higher, validating the (potential) impact on diagnostic thinking, patient outcome, or costs. Half of the available studies (116/237) were independent, i.e., not (co-)funded or (co-)authored by the vendor.

Conclusions: Even though the commercial supply of AI software in radiology already comprises 100 CE-marked products, we conclude that the sector is still in its infancy. For 64/100 products, peer-reviewed evidence of their efficacy is lacking. Only 18/100 AI products have demonstrated (potential) clinical impact.

Key points:
• Artificial intelligence in radiology is still in its infancy, even though 100 CE-marked AI products are already commercially available.
• Only 36 out of 100 products have peer-reviewed evidence, and most of these studies demonstrate lower levels of efficacy.
• There is a wide variety in deployment strategies, pricing models, and CE marking classes of AI products for radiology.

Keywords: Artificial intelligence; Device approval; Evidence-based practice; Radiology.
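
The Methods describe classifying each included article according to a hierarchical model of efficacy. As a purely illustrative aid, the sketch below shows one way such a classification could be represented and the highest level aggregated per product in Python, assuming a six-level hierarchy in the style of Fryback and Thornbury; the enum names, the Paper fields, and the example product are hypothetical and are not taken from the study's data or code.

```python
from __future__ import annotations

from dataclasses import dataclass
from enum import IntEnum


class EfficacyLevel(IntEnum):
    # Assumed six-level hierarchy (Fryback & Thornbury style); the paper's own
    # adaptation may use different wording.
    TECHNICAL = 1             # standalone technical performance
    DIAGNOSTIC_ACCURACY = 2   # e.g., sensitivity, specificity, AUC
    DIAGNOSTIC_THINKING = 3   # impact on the reader's diagnosis
    THERAPEUTIC = 4           # impact on patient management
    PATIENT_OUTCOME = 5       # impact on patient outcomes
    SOCIETAL = 6              # impact on costs and society


@dataclass
class Paper:
    product: str
    levels: set[EfficacyLevel]   # a single paper can address multiple levels
    vendor_involved: bool        # (co-)funded or (co-)authored by the vendor


def highest_level(papers: list[Paper], product: str) -> EfficacyLevel | None:
    """Return the highest efficacy level demonstrated for a product, if any."""
    found = [lvl for p in papers if p.product == product for lvl in p.levels]
    return max(found) if found else None


# Hypothetical example record, not actual study data.
papers = [Paper("ExampleAI Chest CT", {EfficacyLevel.DIAGNOSTIC_ACCURACY}, vendor_involved=True)]
print(highest_level(papers, "ExampleAI Chest CT"))  # highest demonstrated level: 2
```

Aggregating the maximum level per product in this way corresponds to the kind of summary reported in the Results (e.g., 18/100 products with evidence at level 3 or higher).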

Conflict of interest statement

KGvL, SS, MJCMR, and MdR declare no relationships with any companies whose products or services may be related to the subject matter of the article.

BvG is co-founder of, shareholder in, and receives royalties from Thirona, and receives royalties from Delft Imaging and MeVis Medical Solutions.

Figures

Fig. 1
Characteristics of 100 CE-marked AI products based on organ-based subspeciality, modality, and main functionality. MSK, musculoskeletal
Fig. 2
Distribution of CE class, FDA class, pricing model, and deployment strategies of 100 CE-marked AI products. CE, European Conformity Marking; FDA, Food and Drug Administration
Fig. 3
Visualization of the timeline for the 100 CE-marked AI products. Yellow circles denote the year the company was founded, red circles the year the product was brought to market, and blue circles the dates of peer-reviewed papers. The larger the circle, the more papers were published in that year. Products listed in gray text have specifications that were not verified by the vendor
Fig. 4
Peer-reviewed articles were present for 36 out of the 100 commercially available AI products. For these 36 products, the three pie charts on the right show the characteristics of the validation data aggregated over all included papers per product (i.e., the number of scanner manufacturers, centers, and countries)
Fig. 5
The levels of efficacy of the included papers. The search strategy yielded 239 peer-reviewed publications on the efficacy of 36 out of 100 commercially available AI products. A single paper could address multiple levels


Source: PubMed
