
Experience of Clinical Skills Assessment in the Busan-Gyeongnam Consortium

Overview
Specialty Medical Education
Date 2015 Mar 26
PMID 25804967
Citations 1
Abstract

Purpose: The purpose of this study is to evaluate the quality of clinical skills assessment in the Busan-Gyeongnam Consortium.

Methods: Fourth-year medical students (n=350 in 2012 and n=419 in 2013) in the Busan-Gyeongnam Consortium were included in the study. The examination consisted of 6 clinical performance examination (CPX) stations and 6 objective structured clinical examination (OSCE) stations. The students were divided into groups to take the exam at 4 sites over 3 days. Overall reliability was estimated by the Cronbach alpha coefficient across stations, and case reliability by alpha across the checklist items within each station. Analysis of variance and between-group variation were used to evaluate the variation in examinee performance across different days and sites.
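The study's reliability estimates rest on Cronbach's alpha, computed once across station scores and once across checklist items within a station. As a minimal sketch of that computation (the score matrix below is made up for illustration; the study's actual data are not shown here):

```python
from statistics import variance


def cronbach_alpha(scores):
    """Cronbach's alpha for an examinee-by-station score matrix.

    scores: one row per examinee; each row holds that examinee's
    score at each station (or on each checklist item).
    """
    k = len(scores[0])  # number of stations (or items)
    # Variance of each station's scores across examinees.
    station_vars = [variance(col) for col in zip(*scores)]
    # Variance of examinees' total scores.
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(station_vars) / total_var)


# Hypothetical scores for 5 examinees at 4 stations.
demo = [
    [60, 70, 65, 72],
    [55, 66, 60, 68],
    [80, 85, 78, 88],
    [62, 72, 70, 75],
    [70, 75, 74, 80],
]
print(round(cronbach_alpha(demo), 2))
```

The same function applies at either level of analysis: pass station totals to estimate the across-station alpha, or one station's checklist-item scores to estimate that case's alpha.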

Results: The mean total CPX/OSCE score was 67.0 points. The overall alpha across stations was 0.66 in 2012 and 0.61 in 2013. The alpha across items within a station ranged from 0.54 to 0.86 for CPX and from 0.51 to 0.92 for OSCE. There was no significant increase in scores across the different days. The mean scores differed across sites in 30 of 48 stations, but the between-group variance was under 30% except in 2 cases.
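The "between-group variance under 30%" criterion compares the variance attributable to exam sites against the total variance, i.e., the between-group sum of squares divided by the total sum of squares. A minimal sketch, with hypothetical site scores (not the study's data):

```python
def between_group_share(groups):
    """Fraction of total variance explained by group membership:
    SS_between / SS_total, computed from per-site score lists.
    """
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    # Total sum of squares around the grand mean.
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    # Between-group sum of squares: each site's mean vs the grand mean.
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    return ss_between / ss_total


# Hypothetical station scores at three exam sites.
sites = [
    [62, 65, 70, 68],
    [60, 63, 64, 66],
    [70, 72, 75, 71],
]
print(round(between_group_share(sites), 2))
```

In the study's terms, a station where this share exceeded 30% would suggest that site effects, rather than examinee ability, drove a substantial part of the score spread.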

Conclusion: The overall reliability was below 0.70, and standardization across exam sites was unclear. To improve the quality of the exam, case development, item design, training of standardized patients and assessors, and standardization of sites are necessary. Above all, a well-organized matrix for measuring the quality of the exam needs to be developed.

Citing Articles

Changes in medical students' patient-centeredness attitudes by implementation of clinical performance examination.

Hur Y, Kim S, Park J, Cho A, Choi C. Korean J Med Educ. 2015; 26(2):99-106.

PMID: 25805196 PMC: 8813429. DOI: 10.3946/kjme.2014.26.2.99.
