Foot & Ankle Orthopaedics (Dec 2023)

Automated AI Detection Tool for Ankle Fractures Using X-Rays and Smart Devices

  • Nour Nassour MD,
  • Jose M. Acitores MS,
  • Caleb Jang BS,
  • Noopur Ranganathan BS,
  • John Wolf BS,
  • John Kwon MD,
  • Christopher W. DiGiovanni MD,
  • Soheil Ashkani-Esfahani

DOI
https://doi.org/10.1177/2473011423S00026
Journal volume & issue
Vol. 8

Abstract


Category: Ankle; Trauma

Introduction/Purpose: The use of artificial intelligence (AI) is particularly salient to visually oriented medical professions, especially orthopedics, where its most prominent application is in medical imaging examinations. AI has great potential to support diagnosis by acting as a second pair of eyes: prior results have suggested a very high level of agreement between AI models' and clinicians' assessments of radiographs, and that the sensitivity and specificity of emergency medicine physicians in detecting some pathologies improve significantly when they are aided by an AI tool. In view of these observations, our study aimed to create an AI-based ankle fracture detection tool that can be used on smart devices for X-ray interpretation.

Methods: We examined the charts of 2,193 patients from two academic hospitals and one community hospital in Boston and retrieved the anteroposterior (AP), oblique, and lateral ankle X-rays of each patient. Adults (18 years or older) with ankle fractures met our inclusion criteria; we excluded patients younger than 18 years and those whose X-rays contained artifacts such as casts or screws. The study comprised 579 ankle fracture patients and 352 healthy controls. In addition to the digital images obtained from electronic patient records (EPR), we used two smart devices, a cellphone and a tablet, to capture images from the monitor screen. Using machine learning, we developed a fracture detection model trained on all three types of imaging, which we named the "combination model". We subsequently tested the combination model on digital X-rays, on smart-device images, and on both datasets together (Table 1).

Results: We extracted the X-rays of a total of 931 patients in this study.
Following the development and testing of our AI models, we observed that all performed well, with AUCs above 0.85 and accuracies above 0.86 (Table 1). The best performance was achieved when the combination model was tested on images captured with the cameras of our smart devices, with an AUC of 0.88, a sensitivity of 0.86, and an accuracy of 0.89. This was closely followed by the model tested on a mix of smart-device and original digital images, with an AUC of 0.88, a sensitivity of 0.86, and an accuracy of 0.88.

Conclusion: Our AI-based tool showed promising performance in detecting ankle fractures from smart-device images captured off the monitor screen. The diagnostic accuracy on smart-device-captured images was comparable to that on the original digital X-rays. The outcome of this study can help providers who lack sufficient experience in detecting fractures, and it can also serve educational purposes for trainees in this field.
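For readers unfamiliar with how the reported metrics relate to a model's per-image outputs, the following is a minimal sketch of computing AUC, sensitivity, and accuracy from predicted fracture probabilities. The toy labels and probabilities here are illustrative only, not the study's data, and the abstract does not specify how the authors computed their metrics.

```python
def sensitivity_accuracy(y_true, y_prob, threshold=0.5):
    """Sensitivity (recall on fracture cases) and overall accuracy
    after thresholding predicted probabilities into 0/1 labels."""
    y_pred = [int(p >= threshold) for p in y_prob]
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    n_pos = sum(y_true)
    return tp / n_pos, (tp + tn) / len(y_true)


def auc(y_true, y_prob):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U)
    formulation: the probability that a randomly chosen fracture case
    is scored higher than a randomly chosen control (ties count 0.5)."""
    pos = [p for t, p in zip(y_true, y_prob) if t == 1]
    neg = [p for t, p in zip(y_true, y_prob) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


# Toy example: 3 fracture cases (label 1) and 3 controls (label 0)
y_true = [1, 1, 1, 0, 0, 0]
y_prob = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6]
sens, acc = sensitivity_accuracy(y_true, y_prob)
```

Threshold-free AUC and thresholded sensitivity/accuracy can diverge, which is why the abstract reports all three side by side.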