A comparative study of measures to evaluate medical students' performances

Ronald A. Edelstein, Helen M. Reid, Richard Usatine, Michael S. Wilkes

Research output: Contribution to journal › Article

32 Citations (Scopus)

Abstract

Purpose. To assess how new National Board of Medical Examiners (NBME) performance examinations - computer-based case simulations (CBX) and standardized patient exams (SPX) - compare with each other and with traditional internal and external measures of medical students' performances. Secondary objectives examined attitudes of students toward new and traditional evaluation modalities. Method. Fourth-year students (n = 155) at the University of California, Los Angeles, School of Medicine (including joint programs at Charles R. Drew University of Medicine and Science and University of California, Riverside) were assigned two days of performance examinations (eight SPXs and ten CBXs) and a self-administered attitudinal survey. The CBX was scored by the NBME and the SPX by an NBME/Macy consortium. Scores were linked to the survey and correlated with archival student data, including traditional performance indicators (licensing board scores, grade-point averages, etc.). Results. Of the 155 students, 95% completed the testing. The CBX and the SPX had low to moderate statistically significant correlations with each other and with traditional measures of performance. Traditional measures were intercorrelated at higher levels than with the CBX or SPX. Students' perceptions of the various evaluation methods varied based on the assessment. These findings are consistent with the theoretical construct for development of performance examinations. For example, to assess clinical decision making, students rated the CBX best, while they rated multiple-choice examinations best to assess knowledge. Conclusion. Examination results and student perception studies provide converging evidence that performance examinations measure different physician competency domains and support using multipronged assessment approaches.

Original language: English (US)
Pages (from-to): 825-833
Number of pages: 9
Journal: Academic Medicine
Volume: 75
Issue number: 8
State: Published - 2000
Externally published: Yes


ASJC Scopus subject areas

  • Nursing(all)
  • Public Health, Environmental and Occupational Health
  • Education

Cite this

Edelstein, R. A., Reid, H. M., Usatine, R., & Wilkes, M. S. (2000). A comparative study of measures to evaluate medical students' performances. Academic Medicine, 75(8), 825-833.
@article{9ced5e7b2e8646268cf3c5f1249772e4,
title = "A comparative study of measures to evaluate medical students' performances",
author = "Edelstein, {Ronald A.} and Reid, {Helen M.} and Richard Usatine and Wilkes, {Michael S.}",
year = "2000",
language = "English (US)",
volume = "75",
pages = "825--833",
journal = "Academic Medicine",
issn = "1040-2446",
publisher = "Lippincott Williams and Wilkins",
number = "8",
}

TY - JOUR

T1 - A comparative study of measures to evaluate medical students' performances

AU - Edelstein, Ronald A.

AU - Reid, Helen M.

AU - Usatine, Richard

AU - Wilkes, Michael S.

PY - 2000

Y1 - 2000


UR - http://www.scopus.com/inward/record.url?scp=0033840494&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033840494&partnerID=8YFLogxK

M3 - Article

C2 - 10965862

AN - SCOPUS:0033840494

VL - 75

SP - 825

EP - 833

JO - Academic Medicine

JF - Academic Medicine

SN - 1040-2446

IS - 8

ER -