Interrater reliability of the NIH stroke scale.


Journal Article

The interobserver reliability of a rating scale employed in several multicenter stroke trials was investigated. Twenty patients who had a stroke were rated with this scale by four clinical stroke fellows. Each patient was independently evaluated by one pair of observers. The degree of interrater agreement for each item on the scale was determined by calculation of the kappa statistic. Interobserver agreement was moderate to substantial for 9 of 13 items. This rating system compares favorably with other scales for which such comparisons can be made. However, the validity of this system must be established.
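The abstract's agreement measure, the kappa statistic, corrects raw percent agreement for agreement expected by chance. A minimal sketch of unweighted Cohen's kappa for one pair of raters follows; the ratings and the 0–2 item scores are hypothetical, not data from the paper.

```python
# Illustrative sketch (not from the paper): unweighted Cohen's kappa for two
# raters scoring the same patients on a single scale item.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters' categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of patients both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical item scores (0 = normal, 1 = mild, 2 = severe) for 10 patients.
a = [0, 1, 2, 1, 0, 0, 2, 1, 1, 0]
b = [0, 1, 2, 0, 0, 1, 2, 1, 1, 0]
print(round(cohens_kappa(a, b), 3))  # 0.688
```

Here the raters agree on 8 of 10 patients (p_o = 0.8) but chance alone predicts p_e = 0.36, giving kappa ≈ 0.69; on the commonly used Landis–Koch benchmarks, values of 0.41–0.60 are read as "moderate" and 0.61–0.80 as "substantial" agreement, the range the abstract reports for most items.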

Cited Authors

  • Goldstein, LB; Bertels, C; Davis, JN

Published Date

  • June 1989

Published In

  • Archives of Neurology
Volume / Issue

  • 46 / 6

Start / End Page

  • 660 - 662

PubMed ID

  • 2730378

International Standard Serial Number (ISSN)

  • 0003-9942

Digital Object Identifier (DOI)

  • 10.1001/archneur.1989.00520420080026


Language

  • eng

Location

  • United States