
Validating Parallel-Forms Tests for Assessing Anesthesia Resident Knowledge.

Publication: Journal Article
Lee, AJ; Goodman, SR; Bauer, MEB; Minehart, RD; Banks, S; Chen, Y; Landau, RL; Chatterji, M
Published in: J Med Educ Curric Dev
2024

We created a serious game to teach first-year anesthesiology (CA-1) residents to perform general anesthesia for cesarean delivery. We aimed to investigate resident knowledge gains after playing the game and receiving one of two debriefing modalities. We report on the development and validation of scores from parallel test forms for criterion-referenced interpretations of resident knowledge. The test forms were intended for use as the pre- and posttests for the experiment. Validating the instruments measuring the study's primary outcome was considered essential for adding rigor to the planned experiment and for trusting its results. Development of the parallel multiple-choice test forms included: (1) specifying the assessment purpose and population; (2) specifying the content domain and writing/selecting items; (3) expert content validation of items paired by topic and cognitive level; and (4) empirical validation of scores from the parallel test forms using Classical Test Theory (CTT) techniques. Field testing involved online administration of 52 shuffled items from both test forms to 24 CA-1s, 21 second-year anesthesiology (CA-2) residents, 2 fellows, 1 attending anesthesiologist, and 1 respondent of unknown rank at 3 US institutions. Items from each form yielded near-normal score distributions with similar medians, ranges, and standard deviations. Evaluations of CTT item difficulty (item p values) and discrimination (D) indices indicated that most items met the assumptions of criterion-referenced test design, separating experienced from novice residents. Experienced residents outperformed novices on overall domain scores (P < .05). Kuder-Richardson Formula 20 (KR-20) reliability estimates for both test forms exceeded the .70 acceptability cutoff, and the parallel-forms reliability estimate was high at .86, consistent with theoretical expectations.
Total scores from the parallel test forms demonstrated item-level validity, strong internal consistency, and parallel-forms reliability, suggesting sufficient robustness for knowledge-outcome assessments of CA-1 residents.
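The CTT statistics named in the abstract — item difficulty (p values), upper-lower discrimination (D) indices, and KR-20 internal consistency — can be sketched as below. The response matrix is simulated for illustration only; it is not the study's data, and the ability/difficulty model is an assumption made purely to generate plausible 0/1 responses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scored response matrix: 50 examinees x 10 items,
# 1 = correct, 0 = incorrect (illustrative only -- not the study's data).
n, k = 50, 10
ability = rng.normal(size=n)
difficulty = np.linspace(-1.0, 1.0, k)
prob_correct = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty)))
responses = (rng.random((n, k)) < prob_correct).astype(int)

totals = responses.sum(axis=1)

# Item difficulty p: proportion of examinees answering each item correctly.
p = responses.mean(axis=0)

# Discrimination index D: p in the upper 27% minus p in the lower 27%,
# with groups formed by total score.
order = np.argsort(totals)
g = max(1, int(round(0.27 * n)))
D = responses[order[-g:]].mean(axis=0) - responses[order[:g]].mean(axis=0)

# KR-20 reliability: (k / (k-1)) * (1 - sum(p*q) / variance of total scores).
q = 1.0 - p
kr20 = (k / (k - 1)) * (1.0 - (p * q).sum() / totals.var(ddof=1))
```

In this framing, a D index near zero or negative flags an item that fails to separate high from low scorers, and KR-20 above roughly .70 is the conventional acceptability cutoff the abstract cites.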


Published In

J Med Educ Curric Dev

DOI

10.1177/23821205241229778

ISSN

2382-1205

Publication Date

2024

Volume

11

Start / End Page

23821205241229778

Location

United States

Related Subject Headings

  • 3901 Curriculum and pedagogy
 

Citation

Lee, A. J., Goodman, S. R., Bauer, M. E. B., Minehart, R. D., Banks, S., Chen, Y., … Chatterji, M. (2024). Validating Parallel-Forms Tests for Assessing Anesthesia Resident Knowledge. J Med Educ Curric Dev, 11, 23821205241229778. https://doi.org/10.1177/23821205241229778
