Current practices in evaluating radiology residents, faculty, and programs: results of a survey of radiology residency program directors.

Published

Journal Article

RATIONALE AND OBJECTIVES: We surveyed program directors to determine current radiology program practices in evaluating their residents, faculty, and program.

MATERIALS AND METHODS: In January 2003, a 52-item Web-based survey was made available to program directors of accredited core radiology programs. Responses were tabulated to determine relative frequency distributions. Two-tailed Pearson chi-square tests were used to compare proportions and assess associations between variables.

RESULTS: A total of 99 (52%) of 192 program directors responded. Programs were largely in compliance with Accreditation Council for Graduate Medical Education (ACGME) requirements. Noncompliance involved the requirement to evaluate residents at least four times per year (22 [22.2%] of 99 programs) and the requirement to evaluate the program annually (20 [20.2%] of 99 programs). New program directors (<1 year of tenure) were less likely than those with ≥1 year of tenure to be using the Association of Program Directors in Radiology Education Committee global rating form (41.2% versus 68.8%, P = .03). Programs that used this form, compared with those that did not, were more likely to evaluate resident competence in systems-based practice (88.5% versus 44.0%, P = .001). Being a program director for 1 or more years, versus less than 1 year, was associated with use of a computerized evaluation system (35.8% versus 11.8%, P = .05).

CONCLUSION: In general, radiology programs show a high degree of compliance with ACGME evaluation requirements. However, some programs do not comply with requirements for frequency of resident evaluation or for annual program evaluation. The proportion of new program directors is high, and new directors are less likely to use or know about useful evaluation resources. Use of computerized evaluation systems, which have the potential to decrease the work associated with evaluations and to provide more dependable data, is minimal.

Cited Authors

  • Collins, J; Herring, W; Kwakwa, F; Tarver, RD; Blinder, RA; Gray-Leithe, L; Wood, B

Published Date

  • July 1, 2004

Published In

Volume / Issue

  • 11 / 7

Start / End Page

  • 787 - 794

PubMed ID

  • 15217596

International Standard Serial Number (ISSN)

  • 1076-6332

Digital Object Identifier (DOI)

  • 10.1016/j.acra.2004.04.005

Language

  • eng

Location

  • United States