  • 2023-11-24

New Epochs in AI Supervision: Design and Implementation of an Autonomous Radiology AI Monitoring System

With the increasingly widespread adoption of AI in healthcare, maintaining the accuracy and reliability of AI models in clinical practice has become crucial. In this context, we introduce novel methods for monitoring the performance of radiology AI classification models in practice, addressing the challenge of obtaining real-time ground truth for performance monitoring. We propose two metrics, predictive divergence and temporal stability, that provide preemptive alerts of changes in AI performance. Predictive divergence, measured using Kullback-Leibler and Jensen-Shannon divergences, evaluates model accuracy by comparing the model's predictions with those of two supplementary models. Temporal stability is assessed by comparing current predictions against historical moving averages, identifying potential model decay or data drift. The approach was retrospectively validated on chest X-ray data from a single-center imaging clinic, demonstrating its effectiveness in maintaining AI model reliability. By providing continuous, real-time insights into model performance, the system supports the safe and effective use of AI in clinical decision-making, paving the way for more robust AI integration in healthcare.
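Below is a minimal Python sketch of how the two proposed metrics could be computed, assuming each model emits per-study class-probability vectors for the chest X-ray classifier. The function names, the 30-study moving-average window, and the k-sigma drift threshold are illustrative assumptions, not details taken from the publication.

```python
# Illustrative sketch (not the authors' implementation) of the two
# monitoring metrics described above: predictive divergence and
# temporal stability.

import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy  # entropy(p, q) gives KL(p || q)


def predictive_divergence(primary_probs, supplementary_probs, eps=1e-12):
    """Compare the primary model's predictions with a supplementary model's.

    Both inputs are arrays of shape (n_studies, n_classes).
    Returns the mean Kullback-Leibler and Jensen-Shannon divergences
    across the batch of studies.
    """
    p = np.clip(primary_probs, eps, 1.0)
    q = np.clip(supplementary_probs, eps, 1.0)
    p = p / p.sum(axis=1, keepdims=True)
    q = q / q.sum(axis=1, keepdims=True)
    kl = entropy(p, q, axis=1)             # KL divergence per study
    js = jensenshannon(p, q, axis=1) ** 2  # JS divergence per study
    return kl.mean(), js.mean()


def temporal_stability(current_scores, historical_batch_means,
                       window=30, k=3.0):
    """Flag potential model decay or data drift.

    Compares the mean prediction score of the current batch against a
    moving average (and spread) of recent historical batch means.
    The window size and k-sigma threshold are illustrative choices.
    """
    history = np.asarray(historical_batch_means)[-window:]
    mu, sigma = history.mean(), history.std(ddof=1)
    current_mean = float(np.mean(current_scores))
    drifted = abs(current_mean - mu) > k * sigma
    return current_mean, mu, drifted


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 4-class probability outputs for 100 studies.
    primary = rng.dirichlet(np.ones(4), size=100)
    supplementary = rng.dirichlet(np.ones(4), size=100)
    print(predictive_divergence(primary, supplementary))
    print(temporal_stability(primary[:, 1],
                             rng.normal(0.25, 0.02, size=60)))
```

In this sketch, a sustained rise in the divergence metrics or a drift flag from the temporal-stability check would trigger the preemptive alert described in the abstract, prompting review before degraded predictions reach clinical decision-making.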

Link to complete publication here: https://scholar.google.com/citations?view_op=view_citation&hl=en&user=dqpMNRUAAAAJ&cstart=20&pagesize=80&citation_for_view=dqpMNRUAAAAJ:vV6vV6tmYwMC

Unlock the potential of the CARPL platform for optimizing radiology workflows

Talk to a Clinical Solutions Architect