WE-G-204-09: Medical Physics 2.0 in Practice: Automated QC Assessment of Clinical Chest Images.
PURPOSE: To determine whether a proposed suite of objective image quality metrics for digital chest radiographs is useful for monitoring image quality in our clinical operation.

METHODS: Seventeen gridless AP chest radiographs from a GE Optima portable digital radiography (DR) unit (Group 1), seventeen routine PA chest radiographs from a GE Discovery DR unit (Group 2), and sixteen gridless (non-routine) PA chest radiographs from the same Discovery DR unit (Group 3) were chosen for analysis. The groups were selected to represent "sub-standard" images (Group 1), "standard-of-care" images (Group 2), and images with a gross technical error (Group 3). Group 1 images were acquired at lower kVp (90 vs. 125) and a shorter source-to-image distance (127 cm vs. 183 cm) and were expected to have lower quality than the images in Group 2. Group 3 was expected to show degraded contrast relative to Group 2. This evaluation was approved by the institutional Quality Improvement Assurance Board (QIAB). Images were anonymized and securely transferred to the Duke University Clinical Imaging Physics Group for analysis using software previously described (1) and validated (2). Image quality for individual images was reported in terms of lung grey level (Lgl); lung noise (Ln); rib-lung contrast (RLc); rib sharpness (Rs); mediastinum detail (Md), noise (Mn), and alignment (Ma); subdiaphragm-lung contrast (SLc); and subdiaphragm area (Sa). Metrics were compared across groups.

RESULTS: Metrics agreed with published Quality Consistency Ranges with three exceptions: higher Lgl, and lower RLc and SLc. The higher bit depth of our images (16 vs. 12) accounted for the higher Lgl values. Values were most internally consistent for Group 2. The most sensitive metric for distinguishing between groups was Mn, followed closely by Ln; the least sensitive metrics were Md and RLc.

CONCLUSION: The software appears promising for objectively and automatically identifying substandard images in our operation. The results can be used to establish local Quality Consistency Ranges and action limits per facility preferences.
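The metric definitions and their computation belong to the software cited in references (1) and (2). As a rough illustration of the action-limit idea raised in the CONCLUSION, the sketch below (with hypothetical metric ranges and values, not the actual Duke implementation) flags an image whenever any reported metric falls outside a locally established Quality Consistency Range.

```python
# Hypothetical QC flagging step: each image yields a set of metric values
# (named per the abstract's abbreviations), and any value outside a locally
# set Quality Consistency Range (QCR) marks the image for physicist review.
# The ranges and example values below are placeholders, not published limits.

QCR = {
    "Lgl": (0.30, 0.60),   # lung grey level (placeholder range)
    "Ln":  (0.00, 0.05),   # lung noise
    "RLc": (0.10, 0.40),   # rib-lung contrast
    "Mn":  (0.00, 0.04),   # mediastinum noise
    "SLc": (0.15, 0.50),   # subdiaphragm-lung contrast
}

def flag_image(metrics: dict) -> list:
    """Return the names of metrics falling outside their consistency range."""
    out_of_range = []
    for name, value in metrics.items():
        low, high = QCR.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            out_of_range.append(name)
    return out_of_range

# Example: a gridless portable image (Group 1 style) with elevated lung noise
example = {"Lgl": 0.52, "Ln": 0.07, "RLc": 0.22, "Mn": 0.03, "SLc": 0.28}
print(flag_image(example))  # -> ['Ln']
```

In practice the ranges would be tightened or loosened per facility preference, which is what "establishing local quality consistency ranges and action limits" amounts to.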
Related Subject Headings
- Nuclear Medicine & Medical Imaging
- 5105 Medical and biological physics
- 4003 Biomedical engineering
- 1112 Oncology and Carcinogenesis
- 0903 Biomedical Engineering
- 0299 Other Physical Sciences