Open Journal of Medical Imaging

Volume 5, Issue 3 (September 2015)

ISSN Print: 2164-2788   ISSN Online: 2164-2796

Google-based Impact Factor: 0.15

Inter-Observer Variability in the Detection and Interpretation of Chest X-Ray Anomalies in Adults in an Endemic Tuberculosis Area

PP. 143-149
DOI: 10.4236/ojmi.2015.53018

ABSTRACT

Purpose: To assess the inter-observer agreement in the reading of adult chest radiographs (CXR) and to determine the effectiveness of observers in the radiographic diagnosis of pulmonary tuberculosis (PTB) in a tuberculosis-endemic area. Methods: A quasi-observational study was conducted in the Pneumology Department of Yaounde Jamot Hospital (Cameroon) from January to March 2014. The study included six observers (two chest physicians, two radiologists, and two final-year residents in medical imaging) and 47 frontal CXRs (4 of diffuse interstitial lung disease, 6 normal, 7 of lung cancer, 7 of bacterial pneumonia, 23 of PTB). The sample size was calculated on the basis of an expected kappa of 0.47 with a spread of 0.13 (α = 5%, 95% CI) for six observers and five diagnostic items. The concordance analysis focused on the detection of nodules, cavitary lesions, pleural effusion and adenomegaly, and on the diagnosis of PTB and lung cancer. The following kappa coefficient intervals were used: discordance (<0), poor agreement (0.0 - 0.20), fair (0.21 - 0.40), moderate (0.41 - 0.60), good (0.61 - 0.80), excellent (0.81 - 1.00). Results: The average score was highest for the detection of cavitary lesions (58.3%), followed by the correct diagnosis of tuberculosis (49.3%). The chest physicians had the highest proportions of correct tuberculosis diagnoses (69.6% and 73.9%) and the best inter-observer agreement (k = 0.71) for PTB diagnosis. Observers agreed more on the detection of nodules (k = 0.32 - 0.74) and adenomegalies (k = 0.43 - 0.69), and on the diagnosis of cancer (k = 0.22 - 1.00), than on the diagnosis of tuberculosis (k = 0.19 - 0.71). Disagreement was most frequent for the detection of pleural effusions (k = 0.08 - 0.73). Conclusion: Inter-observer agreement varies with the type of lesion and the diagnosis. Chest physicians were the most effective for the diagnosis of pulmonary tuberculosis. Observers agreed more on the detection of nodules and the diagnosis of cancer than on the diagnosis of pulmonary tuberculosis.
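
For readers who want to see how the agreement statistic reported above is computed, the following is a minimal sketch in Python (not the authors' analysis code) of pairwise Cohen's kappa for two observers, together with the interpretation bands quoted in the abstract. The two reading lists are hypothetical illustrative data, not study data.

# Minimal sketch: pairwise Cohen's kappa for two observers reading the same
# set of chest radiographs, with the interpretation bands from the abstract.
# The example ratings are hypothetical, not taken from the study.

from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two observers rating the same cases."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)

    # Observed agreement: proportion of cases on which the observers agree.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance-expected agreement from each observer's marginal frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (observed - expected) / (1 - expected)

def interpret_kappa(k):
    """Interpretation bands used in the abstract."""
    if k < 0.0:
        return "discordance"
    if k <= 0.20:
        return "poor agreement"
    if k <= 0.40:
        return "fair"
    if k <= 0.60:
        return "moderate"
    if k <= 0.80:
        return "good"
    return "excellent"

# Hypothetical binary readings (1 = PTB diagnosed, 0 = not) by two observers.
obs_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
obs_2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 0]

k = cohen_kappa(obs_1, obs_2)
print(f"kappa = {k:.2f} ({interpret_kappa(k)})")  # kappa = 0.58 (moderate)

In the study, such pairwise kappas were computed for each lesion type and diagnosis across the six observers, which is why the results are reported as ranges (e.g., k = 0.19 - 0.71 for the diagnosis of tuberculosis).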

Share and Cite:

Moifo, B., Pefura-Yone, E., Nguefack-Tsague, G., Gharingam, M., Tapouh, J., Kengne, A. and Amvene, S. (2015) Inter-Observer Variability in the Detection and Interpretation of Chest X-Ray Anomalies in Adults in an Endemic Tuberculosis Area. Open Journal of Medical Imaging, 5, 143-149. doi: 10.4236/ojmi.2015.53018.

Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.