Active Monitoring

AI Radiology False Negative Diagnosis

Category: Medical AI & Robotic Surgery

Hazard Definition

AI radiology false negative diagnosis refers to instances where artificial intelligence systems used to assist in medical imaging interpretation fail to detect clinically significant findings—such as tumors, fractures, or other abnormalities—that are present in the images. These missed detections can result in delayed diagnosis, disease progression, and adverse patient outcomes.

Mechanism of Harm

AI radiology tools are typically deployed as clinical decision support systems, intended to flag potential abnormalities for radiologist review. Harm may occur through several documented pathways:

Automation bias: When clinicians over-rely on AI outputs, a negative AI result may reduce scrutiny of images that would otherwise receive closer examination.

Training data limitations: AI systems trained on datasets that underrepresent certain demographics, cancer subtypes, or imaging equipment may exhibit reduced sensitivity for those populations or conditions.

Workflow integration failures: In some reported cases, AI flags were not surfaced to interpreting physicians due to interface design issues or integration failures with existing PACS systems.
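The training-data pathway above is typically surfaced through subgroup audits: computing the false negative rate separately for each demographic or imaging subgroup rather than only in aggregate. A minimal sketch, using entirely hypothetical audit data and made-up subgroup labels:

```python
from collections import defaultdict

def false_negative_rates(cases):
    """Compute the false negative rate per subgroup.

    `cases` is an iterable of (subgroup, truth, ai_flagged) tuples, where
    `truth` is True when a clinically significant finding is present and
    `ai_flagged` is True when the AI marked the study as abnormal.
    """
    positives = defaultdict(int)  # positive cases per subgroup
    misses = defaultdict(int)     # false negatives per subgroup
    for subgroup, truth, ai_flagged in cases:
        if truth:
            positives[subgroup] += 1
            if not ai_flagged:
                misses[subgroup] += 1
    return {g: misses[g] / positives[g] for g in positives}

# Hypothetical data: the underrepresented subgroup shows a higher miss rate.
cases = [
    ("dense_tissue", True, False), ("dense_tissue", True, True),
    ("dense_tissue", True, False), ("dense_tissue", True, True),
    ("typical", True, True), ("typical", True, True),
    ("typical", True, True), ("typical", True, False),
]
print(false_negative_rates(cases))  # {'dense_tissue': 0.5, 'typical': 0.25}
```

An aggregate false negative rate over these same cases would be 0.375, masking the twofold gap between subgroups; this is why stratified evaluation is a recurring recommendation in the audit literature.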

Documented Incident Patterns

The following patterns have been identified through FDA adverse event reports, peer-reviewed literature, and publicly reported cases. Adverse events involving AI radiology devices can be searched in the FDA MAUDE database.
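MAUDE reports are also mirrored through the openFDA device adverse event API, which can be queried programmatically. A minimal URL-building sketch; the endpoint and `search`/`limit` parameters follow openFDA's documented conventions, but the specific field value searched here is a hypothetical example, not a vetted query for AI radiology devices:

```python
from urllib.parse import urlencode

# openFDA mirrors MAUDE device adverse event reports as a JSON API.
OPENFDA_DEVICE_EVENT = "https://api.fda.gov/device/event.json"

def maude_query_url(search_terms, limit=10):
    """Build an openFDA device-event query URL.

    `search_terms` maps openFDA field names to values; terms are ANDed
    using openFDA's query syntax.
    """
    search = "+AND+".join(f'{field}:"{value}"' for field, value in search_terms.items())
    # urlencode would escape the search operators, so the search clause
    # is appended manually and only `limit` goes through urlencode.
    return f"{OPENFDA_DEVICE_EVENT}?search={search}&" + urlencode({"limit": limit})

# Hypothetical search term for illustration only.
url = maude_query_url({"device.generic_name": "lung computed tomography"}, limit=5)
print(url)
```

In practice a real query would need the correct product code or device name for the system of interest, and results would still undercount false negatives for the reporting-gap reasons described under Regulatory Status below.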

Mammography screening: Multiple studies have documented AI mammography systems missing interval cancers—malignancies that develop between screening rounds.

Lung nodule detection: FDA MAUDE database entries include reports of AI chest CT systems failing to flag pulmonary nodules later confirmed as malignant.

Fracture identification: Emergency department AI triage systems have been documented to miss subtle fractures, particularly in pediatric patients.

Regulatory Status

AI radiology devices are regulated by the FDA under the 510(k) clearance pathway for most diagnostic applications. The FDA maintains a list of authorized AI/ML-enabled medical devices, with radiology representing the largest category.

No mandatory incident reporting requirement exists for AI diagnostic errors that do not involve device malfunction. This creates a documentation gap for false negative events.

Known Data Gaps

  • Comparative false negative rates across commercial AI radiology products
  • Demographic breakdown of missed diagnoses by patient age, race, and sex
  • Institutional variation in the rates at which radiologists override AI findings
  • Long-term patient outcome data following AI-assisted missed diagnoses

Report an Incident

If you have direct knowledge of an incident involving an AI radiology false negative or other misdiagnosis, you may submit a confidential report for documentation and potential investigation.
