New AI System Prioritizes Chest X-Rays Containing Critical Findings

By HospiMedica International staff writers

07 Feb 2019
Image: Examples of correctly and incorrectly prioritized radiographs. (a) Radiograph was reported as showing large right pleural effusion (arrow). This was correctly prioritized as urgent. (b) Radiograph reported as showing “lucency at the left apex suspicious for pneumothorax.” This was prioritized as normal. On review by three independent radiologists, the radiograph was unanimously considered to be normal. (c) Radiograph reported as showing consolidation projected behind heart (arrow). The finding was missed by the artificial intelligence system, and the study was incorrectly prioritized as normal (Photo courtesy of RSNA).

A team of UK researchers has trained an artificial intelligence (AI) system to interpret and prioritize abnormal chest X-rays with critical findings, creating the potential to reduce exam backlogs and bring urgently needed care to patients more quickly.

Globally, chest X-rays account for 40% of all diagnostic imaging, and the volume of exams can create significant backlogs at health care facilities. Deep learning (DL), a type of AI that can be trained to recognize subtle patterns in medical images, is seen as an automated means of reducing this backlog and identifying exams that warrant immediate attention, particularly in publicly funded health care systems.

In their study, the researchers used 470,388 adult chest X-rays to develop an AI system that could identify key findings. The radiology reports were pre-processed with natural language processing (NLP), the component of the AI system that extracts labels from written text. For each X-ray, the researchers' in-house NLP system produced a list of labels indicating which specific abnormalities were visible on the image.
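To illustrate the idea of extracting abnormality labels from free-text reports, the minimal Python sketch below uses a simple keyword lookup with a crude negation check. This is purely illustrative: the phrase list, label names, and negation handling are assumptions, and the study's in-house NLP system is far more sophisticated than this.

```python
import re

# Hypothetical phrase-to-label map (illustrative only; not the study's label set).
FINDING_PATTERNS = {
    "pleural_effusion": r"pleural effusion",
    "pneumothorax": r"pneumothorax",
    "consolidation": r"consolidation",
}

# Very crude negation cues; real report-labeling systems handle negation,
# uncertainty, and context far more carefully.
NEGATIONS = ("no ", "without ", "resolution of ")

def extract_labels(report_text: str) -> set[str]:
    """Return abnormality labels mentioned, and not negated, in a report."""
    labels = set()
    for sentence in report_text.lower().split("."):
        negated = any(cue in sentence for cue in NEGATIONS)
        for label, pattern in FINDING_PATTERNS.items():
            if re.search(pattern, sentence) and not negated:
                labels.add(label)
    return labels

print(extract_labels("Large right pleural effusion. No pneumothorax."))
# {'pleural_effusion'}
```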

The NLP system analyzed each radiology report to assign the corresponding image a priority of critical, urgent, non-urgent or normal. A computer vision AI system was then trained on the labeled X-ray images to predict the clinical priority from image appearance alone. The researchers tested the system's prioritization performance in a simulation using an independent set of 15,887 images. The AI system distinguished abnormal from normal chest X-rays with high accuracy. In the simulation, critical findings received an expert radiologist opinion in an average of 2.7 days with the AI approach, significantly sooner than the 11.2-day average in actual practice.
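The sketch below shows how predicted findings might be mapped to one of the four priority categories and used to reorder a reporting worklist instead of reading studies first-come, first-served. The category names follow the article; the triage rules, finding names, and study IDs are assumptions for illustration, not the study's actual logic.

```python
import heapq
from dataclasses import dataclass, field

PRIORITY = {"critical": 0, "urgent": 1, "non-urgent": 2, "normal": 3}

# Hypothetical rule set mapping predicted findings to a clinical priority.
CRITICAL_FINDINGS = {"pneumothorax"}
URGENT_FINDINGS = {"pleural_effusion", "consolidation"}

def triage(predicted_findings: set[str]) -> str:
    """Assign one of the article's four priority categories to a study."""
    if predicted_findings & CRITICAL_FINDINGS:
        return "critical"
    if predicted_findings & URGENT_FINDINGS:
        return "urgent"
    if predicted_findings:
        return "non-urgent"
    return "normal"

@dataclass(order=True)
class WorklistItem:
    rank: int                              # priority rank (lower = read sooner)
    arrival_order: int                     # tie-break by arrival time
    study_id: str = field(compare=False)

# Build a priority-ordered worklist from AI-predicted findings.
worklist = []
studies = [
    ("CXR-001", set()),                    # normal
    ("CXR-002", {"pneumothorax"}),         # critical
    ("CXR-003", {"pleural_effusion"}),     # urgent
]
for i, (study_id, findings) in enumerate(studies):
    heapq.heappush(worklist, WorklistItem(PRIORITY[triage(findings)], i, study_id))

while worklist:
    print(heapq.heappop(worklist).study_id)  # CXR-002, CXR-003, CXR-001
```

In a simulation built on this kind of reordering, critical studies rise to the front of the queue rather than waiting behind routine exams, which is the mechanism behind the shorter reporting delays described above.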

"The initial results reported here are exciting as they demonstrate that an AI system can be successfully trained using a very large database of routinely acquired radiologic data," said study co-author Giovanni Montana, Ph.D., formerly of King's College London in London and currently at the University of Warwick in Coventry, England. "With further clinical validation, this technology is expected to reduce a radiologist's workload by a significant amount by detecting all the normal exams so more time can be spent on those requiring more attention."