Multirater agreement of arthroscopic meniscal lesions.
Background: Establishing the validity of classification schemes is a crucial preparatory step that should precede multicenter studies. No studies have investigated the reproducibility of arthroscopic classification of meniscal pathology among multiple surgeons at different institutions.

Hypothesis: Arthroscopic classification of meniscal pathology is reliable, reproducible, and suitable for multicenter studies involving multiple surgeons.

Study Design: Multirater agreement study.

Methods: Seven surgeons reviewed a video of 18 meniscal tears and completed a meniscal classification questionnaire. Multirater agreement was calculated using the proportion of agreement, the kappa coefficient, and the intraclass correlation coefficient.

Results: Agreement was 46% on the central/peripheral location of tears (kappa = 0.30), 80% on the depth of tears (kappa = 0.46), 72% on the presence of a degenerative component (kappa = 0.44), 71% on whether lateral tears were central to the popliteal hiatus (kappa = 0.42), 73% on the type of tear (kappa = 0.63), 87% on the location of the tear (kappa = 0.61), and 84% on the treatment of tears (kappa = 0.66). There was considerable agreement among surgeons on tear length (intraclass correlation coefficient, 0.78; 95% confidence interval, 0.57-0.92; P < .001).

Conclusion: Arthroscopic grading of meniscal pathology is reliable and reproducible.

Clinical Relevance: Surgeons can reliably classify meniscal pathology and agree on treatment, which is important for multicenter trials.
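The abstract does not specify which multirater kappa variant was used; Fleiss' kappa is a common choice when several raters assign categorical labels to the same items. As an illustrative sketch only (the rating counts below are hypothetical, not the study's data), the statistic can be computed from an items-by-categories count table:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table of item-by-category rating counts.

    ratings: one row per item; each row holds the number of raters who
    assigned that item to each category. Every item must be rated by
    the same number of raters.
    """
    N = len(ratings)        # number of items rated
    n = sum(ratings[0])     # raters per item
    k = len(ratings[0])     # number of categories

    # Mean observed agreement P-bar: for each item, the proportion of
    # agreeing rater pairs, averaged over items.
    p_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings
    ) / N

    # Chance agreement P_e from the marginal category proportions.
    p_j = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)


# Hypothetical example: 7 raters classify 3 tears into 2 categories.
# Two tears are rated unanimously; on the third the raters split 4/3.
print(fleiss_kappa([[7, 0], [0, 7], [4, 3]]))  # → 0.618..., "substantial" agreement
```

Values in the 0.4-0.6 range are conventionally read as moderate agreement and 0.6-0.8 as substantial, which is the scale against which the kappas reported above are usually interpreted.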
Dunn, WR; Wolf, BR; Amendola, A; Andrish, JT; Kaeding, C; Marx, RG; McCarty, EC; Parker, RD; Wright, RW; Spindler, KP