Determining the influence of different linking patterns on the stability of students' score adjustments produced using Video-based Examiner Score Comparison and Adjustment (VESCA)
Affiliation
School of Medicine, David Weatherall Building, Keele University, Keele, Staffordshire, ST5 5BG, UK. p.yeates@keele.ac.uk. Fairfield General Hospital, Northern Care Alliance NHS Foundation Trust, Rochdale Old Road, Bury, BL9 7TD, Lancashire, UK. p.yeates@keele.ac.uk. School of Medicine, David Weatherall Building, Keele University, Keele, Staffordshire, ST5 5BG, UK. Christie Education, Christie Hospitals NHS Foundation Trust, Wilmslow Rd, Manchester, M20 4BX, UK.
Issue Date
2022
Abstract
Background: Ensuring equivalence of examiners' judgements across different groups of examiners is a priority for large-scale performance assessments in clinical education, both to enhance fairness and to reassure the public. This study extends insight into an innovation called Video-based Examiner Score Comparison and Adjustment (VESCA), which uses video scoring to link otherwise unlinked groups of examiners. This linkage enables comparison of the influence of different examiner groups within a common frame of reference and provision of adjusted "fair" scores to students. Whilst this innovation promises substantial benefit to quality assurance of distributed Objective Structured Clinical Exams (OSCEs), questions remain about how the resulting score adjustments might be influenced by the specific parameters used to operationalise VESCA.

Research questions: How similar are estimates of students' score adjustments when the model is run with either (1) fewer comparison videos per participating examiner, or (2) reduced numbers of participating examiners?

Methods: Using secondary analysis of recent research which used VESCA to compare the scoring tendencies of different examiner groups, we made numerous copies of the original data and then selectively deleted video scores to reduce either (1) the number of linking videos per examiner (4 versus several permutations of 3, 2, or 1 videos) or (2) examiner participation rates (all participating examiners (76%) versus several permutations of 70%, 60%, or 50% participation). After analysing all resulting datasets with Many Facet Rasch Modelling (MFRM), we calculated students' score adjustments for each dataset and compared these with the score adjustments in the original data using Spearman's correlations.

Results: Students' score adjustments derived from 3 videos per examiner correlated highly with score adjustments derived from 4 linking videos (median Rho = 0.93, IQR 0.90–0.95, p < 0.001), with 2 linking videos (median Rho = 0.85, IQR 0.81–0.87, p < 0.001) and 1 linking video (median Rho = 0.52, IQR 0.46–0.64, p < 0.001) producing progressively smaller correlations. Score adjustments were similar for 76% examiner participation compared with 70% (median Rho = 0.97, IQR 0.95–0.98, p < 0.001) and 60% (median Rho = 0.95, IQR 0.94–0.98, p < 0.001) participation, but were lower and more variable for 50% examiner participation (median Rho = 0.78, IQR 0.65–0.83, some correlations non-significant).

Conclusions: Whilst VESCA showed some sensitivity to the examined parameters, modest reductions in examiner participation rates or numbers of linking videos produced highly similar results. Employing VESCA in distributed or national exams could enhance quality assurance or exam fairness.
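To illustrate the deletion-and-comparison procedure described in the Methods, the sketch below reproduces its outline in Python: copies of an illustrative dataset have linking-video scores selectively removed, examiner-based score adjustments are recalculated, and the reduced-design adjustments are compared with the full-design adjustments using Spearman's correlation. All data, column names, and the simplified mean-based adjustment (which stands in for full Many Facet Rasch Modelling) are assumptions for demonstration, not the study's actual code or model.

```python
# Illustrative sketch of the deletion-and-comparison procedure: a simplified
# mean-based examiner adjustment stands in for full Many Facet Rasch Modelling,
# and all data and column names are invented for demonstration.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Simulated scores: each row is one examiner scoring either a live student
# encounter (video is None) or one of the shared linking videos (student is None).
n_examiners, n_students, n_videos = 30, 120, 4
examiners = [f"ex{i}" for i in range(n_examiners)]
stringency_true = dict(zip(examiners, rng.normal(0, 5, n_examiners)))

rows = []
for s in range(n_students):
    ex = rng.choice(examiners)
    rows.append({"examiner": ex, "student": f"st{s}", "video": None,
                 "score": 60 + rng.normal(0, 8) + stringency_true[ex]})
for ex in examiners:  # in the full design, every examiner scores all 4 linking videos
    for v in range(n_videos):
        rows.append({"examiner": ex, "student": None, "video": f"vid{v}",
                     "score": 55 + 3 * v + rng.normal(0, 4) + stringency_true[ex]})
data = pd.DataFrame(rows)

def score_adjustments(df):
    """Per-student adjustment: subtract the examiner's estimated stringency,
    where stringency is the examiner's mean linking-video score relative to
    the overall linking-video mean (a stand-in for MFRM examiner measures)."""
    videos = df.dropna(subset=["video"])
    stringency = videos.groupby("examiner")["score"].mean() - videos["score"].mean()
    students = df.dropna(subset=["student"])
    adj = -students["examiner"].map(stringency).fillna(0)
    adj.index = students["student"].values
    return adj

original = score_adjustments(data)

# Reduced designs: keep only a random subset of linking videos per examiner,
# then compare the recalculated adjustments with the full-design adjustments.
for n_keep in (3, 2, 1):
    reduced_videos = (data.dropna(subset=["video"])
                          .groupby("examiner", group_keys=False)
                          .apply(lambda g: g.sample(n_keep, random_state=0)))
    reduced = pd.concat([data.dropna(subset=["student"]), reduced_videos])
    rho, p = spearmanr(original, score_adjustments(reduced).reindex(original.index))
    print(f"{n_keep} linking video(s) per examiner: Spearman rho = {rho:.2f} (p = {p:.3g})")
```

The same loop structure would apply to the participation-rate question: instead of sampling videos within each examiner, one would drop all linking-video scores for a random subset of examiners before recalculating the adjustments.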
Citation
Yeates P, McCray G, Moult A, Cope N, Fuller R, McKinley R. Determining the influence of different linking patterns on the stability of students' score adjustments produced using Video-based Examiner Score Comparison and Adjustment (VESCA). Vol. 22, BMC Medical Education. Springer Science and Business Media LLC; 2022.
Journal
BMC Medical Education
DOI
10.1186/s12909-022-03115-1
PubMed ID
35039023
Additional Links
https://dx.doi.org/10.1186/s12909-022-03115-1
Type
Article
Language
en