AI-Aided Interpretation of Chest X-Ray Improves Reader Performance and Efficiency
With the rise of deep learning and artificial intelligence (AI) applications in medical imaging, there has been increasing interest in developing chest radiograph AI algorithms that can help clinicians detect key radiographic findings accurately and efficiently. Research shows that AI algorithms can improve reader performance when used in a concurrent manner. However, concerns remain about the real-world impact of AI, given that most research has been conducted in simulated settings without an observer performance tool that mimics real-world workflow. There is also a lack of evidence on the impact of AI on reader efficiency, especially the time readers take to complete their reports. Now, a new study that explored the impact of AI on reader performance, in terms of both accuracy and efficiency, has found that an AI algorithm can improve reader performance and efficiency in interpreting chest radiograph abnormalities.
Researchers at the Massachusetts General Hospital (Boston, MA, USA) conducted a multicenter cohort study from April to November 2021 in which radiologists, including attending radiologists, thoracic radiology fellows, and residents, independently participated in two observer performance test sessions. The study involved a total of 497 frontal chest radiographs from adult patients with and without four target findings (pneumonia, nodule, pneumothorax, and pleural effusion). A commercially available AI algorithm (Lunit INSIGHT CXR, version 3.1.2.0) was used to process the chest radiograph images. The sessions comprised one reading session with AI and one without, in a randomized crossover manner with a four-week washout period in between. The AI produced a heat map and an image-level probability of the presence of a referable lesion.
The ground-truth labels were created via consensus reading by two thoracic radiologists. Each reader documented their findings in a customized report template, in which the four target chest radiograph findings and the reader's confidence in the presence of each finding were recorded. The time taken to report each chest radiograph was also recorded. Sensitivity, specificity, and area under the receiver operating characteristic curve (AUROC) were calculated for each target finding. The target findings were present in 351 of the 497 chest radiographs. The AI was associated with higher sensitivity for all findings compared with the readers. AI-aided interpretation was associated with significantly improved reader sensitivities for all target findings, without negative impacts on specificity. Overall, the readers' AUROCs improved for all four target findings, with significant improvements in the detection of pneumothorax and nodule. The reporting time with AI was 10% lower than without AI.
In conclusion, the use of an AI algorithm was associated with improved sensitivity for the detection of four target chest radiograph findings (pneumonia, lung nodules, pleural effusion, and pneumothorax) for attending radiologists, thoracic imaging fellows, and radiology residents, while maintaining specificity. These findings suggest that an AI algorithm can improve reader performance and efficiency in interpreting chest radiograph abnormalities.