
Usability and Preference of Electronic Vs. Paper and Pencil OSCE Checklists by Examiners and Influence of Checklist Type on Missed Ratings in the Swiss Federal Licensing Exam

Overview
Journal: GMS J Med Educ
Date: 2022 Jun 13
PMID: 35692359
Abstract

Background: Only a few studies with small sample sizes have compared electronic Objective Structured Clinical Examination (OSCE) rating checklists with traditional paper-based OSCE rating checklists. In this study, the examiner-perceived usability of and preference for the type of OSCE checklist (electronic vs. paper-based) were compared, and the influence of checklist type on missed ratings was determined, for the Swiss Federal Licensing Examination in clinical skills for human medicine.

Methods: All examiners in the Swiss Federal Licensing Examination in clinical skills for human medicine were invited in two consecutive years to evaluate the OSCE checklist type they had worked with during the examination, using a questionnaire with 14 closed questions (demographics, experience with the checklist type, perceived usability, and checklist-type preference). In addition, the number of missed ratings on the paper-based checklists was recorded.

Results: The data from the examiners (n=377) with experience of both OSCE checklist types were analyzed. The electronic OSCE checklist was rated significantly higher on all usability aspects (ease of use, candidate rating and error correction, clarity, distraction while using the checklist, overall satisfaction), except for the speed of registering comments, where no significant difference was found. The majority of examiners in both years (2014: 54.5%, n=60; 2015: 89.8%, n=230) reported a preference for working with the electronic OSCE checklist in the future. Missed ratings occurred on 14.2% of the paper-based OSCE checklists, whereas the electronic OSCE checklists prevented missed ratings.

Conclusions: Electronic OSCE checklists were rated as significantly more user-friendly than, and were preferred over, paper-based OSCE checklists by a broad national sample of examiners, supporting previous results from faculty-level examinations. Furthermore, missed ratings were prevented with the electronic OSCE checklists. Overall, the use of electronic OSCE checklists is therefore advisable.

Citing Articles

Nurse educators' satisfaction with online Objective Structured Clinical Examination scoring system.

Haris F, Indarwati F, Primanda Y, Sutrisno R, Irawati K, Shih Y. Rev Lat Am Enfermagem. 2024; 32:e4344.

PMID: 39570192. PMC: 11653744. DOI: 10.1590/1518-8345.6816.4344.


Objective structured clinical examination versus traditional written examinations: a prospective observational study.

Lebdai S, Bouvard B, Martin L, Annweiler C, Lerolle N, Rineau E. BMC Med Educ. 2023; 23(1):69.

PMID: 36707797. PMC: 9883896. DOI: 10.1186/s12909-023-04050-5.
