
Assessing intra, inter and total agreement with replicated readings.

Publication, Journal Article
Barnhart, HX; Song, J; Haber, MJ
Published in: Stat Med
May 15, 2005

In clinical studies, assessing the agreement of multiple readings on the same subject plays an important role in the evaluation of a continuous measurement scale. The multiple readings within a subject may be replicated readings obtained with the same method and/or readings obtained with several methods (e.g. different technologies or several raters). Traditional agreement data for a given subject often consist of either replicated readings from only one method or multiple readings from several methods where only one reading is taken by each method. In the first case, only intra-method agreement can be evaluated. In the second case, traditional agreement indices such as the intra-class correlation coefficient (ICC) or the concordance correlation coefficient (CCC) are often reported as inter-method agreement. We argue that these indices are in fact measures of total agreement that contain both inter- and intra-method agreement. Only if there are replicated readings from several methods for a given subject can one assess intra-, inter- and total agreement simultaneously. In this paper, we present a new inter-method agreement index, the inter-CCC, and a new total agreement index, the total-CCC, for agreement data with replicated readings from several methods, where the ICCs within methods are used to assess intra-method agreement for each of the several methods. The relationship of the total-CCC with the inter-CCC and the ICCs is investigated. We propose a generalized estimating equations approach for estimation and inference. Simulation studies are conducted to assess the performance of the proposed approach, and data from a carotid stenosis screening study are used for illustration.
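For orientation only (this formula is standard background, not taken from the paper): the traditional CCC mentioned in the abstract is Lin's concordance correlation coefficient. For readings from two methods with means \mu_1, \mu_2, variances \sigma_1^2, \sigma_2^2, and covariance \sigma_{12}, it can be written as

\rho_c = \frac{2\,\sigma_{12}}{\sigma_1^2 + \sigma_2^2 + (\mu_1 - \mu_2)^2}.

The inter-CCC and total-CCC proposed in the paper generalize this idea to data with replicated readings from several methods, under the model defined there.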


Published In

Stat Med

DOI

10.1002/sim.2006

ISSN

0277-6715

Publication Date

May 15, 2005

Volume

24

Issue

9

Start / End Page

1371 / 1384

Location

England

Related Subject Headings

  • Statistics & Probability
  • Reproducibility of Results
  • Observer Variation
  • Magnetic Resonance Angiography
  • Humans
  • Diagnostic Tests, Routine
  • Coronary Angiography
  • Computer Simulation
  • Clinical Trials as Topic
  • Carotid Stenosis
 

Citation

APA: Barnhart, H. X., Song, J., & Haber, M. J. (2005). Assessing intra, inter and total agreement with replicated readings. Stat Med, 24(9), 1371–1384. https://doi.org/10.1002/sim.2006
Chicago: Barnhart, Huiman X., Jingli Song, and Michael J. Haber. “Assessing intra, inter and total agreement with replicated readings.” Stat Med 24, no. 9 (May 15, 2005): 1371–84. https://doi.org/10.1002/sim.2006.
ICMJE: Barnhart HX, Song J, Haber MJ. Assessing intra, inter and total agreement with replicated readings. Stat Med. 2005 May 15;24(9):1371–84.
MLA: Barnhart, Huiman X., et al. “Assessing intra, inter and total agreement with replicated readings.” Stat Med, vol. 24, no. 9, May 2005, pp. 1371–84. Pubmed, doi:10.1002/sim.2006.
NLM: Barnhart HX, Song J, Haber MJ. Assessing intra, inter and total agreement with replicated readings. Stat Med. 2005 May 15;24(9):1371–1384.