
Comparison of the Validity of Bookmark and Angoff Standard Setting Methods in Medical Performance Tests

Overview
Journal BMC Med Educ
Publisher BioMed Central
Specialty Medical Education
Date 2021 Jan 3
PMID 33388043
Citations 2
Abstract

Background: Standard setting is one of the main processes for determining the ability level at which a student should pass an assessment. The current study aimed to compare the validity of the Angoff and bookmark standard-setting methods.

Method: A total of 190 individuals with an M.Sc. degree in laboratory science participated in the study. A 32-item test, designed by a group of experts, was used to assess the participants' laboratory skills. In addition, two groups, each containing 12 content specialists in laboratory sciences, voluntarily participated in applying the Angoff and bookmark methods. To assess process validity, a 5-item questionnaire was administered to the two groups of panelists. To investigate internal validity, classification agreement was calculated using the kappa and Fleiss's kappa coefficients. External validity was assessed using five indices: correlation with the criterion score, sensitivity, specificity, and positive and negative predictive values relative to the criterion score.
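
The five external validity indices just listed all follow from cross-tabulating the pass/fail decision implied by a cut score against a binary criterion classification. The study does not publish analysis code, so the following is only a minimal illustrative sketch: the helper external_validity and every score and criterion value below are hypothetical, with the reported bookmark cut score (18.8) plugged in for illustration.

import numpy as np

# Hypothetical data for illustration only (not from the study):
# scores on the 32-item test and a binary criterion classification
# (e.g., employer rating: 1 = competent, 0 = not competent).
scores = np.array([15, 18, 22, 30, 12, 25, 19, 28])
criterion = np.array([0, 1, 1, 1, 0, 1, 0, 1])

def external_validity(scores, criterion, cut_score):
    """Dichotomize scores at the cut score and cross-tabulate against the criterion."""
    passed = (scores >= cut_score).astype(int)
    tp = int(np.sum((passed == 1) & (criterion == 1)))  # pass, judged competent
    tn = int(np.sum((passed == 0) & (criterion == 0)))  # fail, judged not competent
    fp = int(np.sum((passed == 1) & (criterion == 0)))
    fn = int(np.sum((passed == 0) & (criterion == 1)))
    return {
        "pass_rate_%": 100 * passed.mean(),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        # phi coefficient between the pass/fail decision and the criterion
        "correlation_with_criterion": float(np.corrcoef(passed, criterion)[0, 1]),
    }

# Bookmark cut score reported in the study; swap in 17.67 for the Angoff method.
print(external_validity(scores, criterion, cut_score=18.8))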

Results: The obtained cut scores were 17.67 for the Angoff method and 18.8 for the bookmark method. The average score on the items assessing the quality of the execution process was 4.25 for the Angoff group and 4.79 for the bookmark group. Pass rates for the Angoff and bookmark methods were 55.78% and 41.36%, respectively. Correlations of pass/fail classifications between employer ratings and test scores were 0.69 and 0.88 for the Angoff and bookmark methods, respectively.
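
As a worked illustration of how the two cut scores diverge, an examinee scoring 18 out of 32 would pass under the Angoff cut score (17.67) but fail under the bookmark cut score (18.8); the higher bookmark cut is what drives its lower pass rate (41.36% versus 55.78%).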

Conclusion: Based on the results, it can be concluded that the process and internal validities of the bookmark method were higher than those of the Angoff method. In the evaluation of external validity (concordance of the cut score with the criterion score), all five external validity indices supported the bookmark method.

Citing Articles

Intention to use eLearning-based continuing professional development and its predictors among healthcare professionals in Amhara region referral hospitals, Ethiopia, 2023: using modified UTAUT-2 model.

Kelkay J, Maru Wubante S, Sendek Anteneh D, Takilo M, Gebeyehu C, Alameraw T. BMC Health Serv Res. 2025; 25(1):178.

PMID: 39885532 PMC: 11780820. DOI: 10.1186/s12913-025-12317-4.


A Unique Simulation Methodology for Practicing Clinical Decision Making.

Amar S, Bitan Y. J Med Educ Curric Dev. 2025; 12:23821205241310077.

PMID: 39872541 PMC: 11770706. DOI: 10.1177/23821205241310077.


Assessment of attitude towards e-professionalism: Students' perspectives from a Private Medical College in Lahore, Pakistan.

Khan H, Rabbani M, Ikram F. Pak J Med Sci. 2025; 41(1):163-170.

PMID: 39867789 PMC: 11755296. DOI: 10.12669/pjms.41.1.8644.


Research involvement among undergraduate medical students in Bangladesh: a multicenter cross-sectional study.

Hasan M, Islam S, Sujon H, Chowdhury F, Islam M, Ahmed M. BMC Med Educ. 2025; 25(1):126.

PMID: 39863841 PMC: 11762105. DOI: 10.1186/s12909-024-06566-w.


Health literacy and influencing factors in university students across diverse educational fields in Kazakhstan.

Dauletkaliyeva Z, Bolatova Z, Yerdessov N, Nukeshtayeva K, Zhamantayev O, Takuadina A. Sci Rep. 2025; 15(1):3197.

PMID: 39863762 PMC: 11762303. DOI: 10.1038/s41598-025-87049-w.

