
Multiple-Choice Item Distractor Development Using Topic Modeling Approaches

Overview
Journal: Front Psychol
Date: 2019 May 29
PMID: 31133911
Citations: 10
Abstract

Writing a high-quality multiple-choice test item is a complex process. Creating plausible but incorrect options for each item poses significant challenges for the content specialist because this task is often undertaken without a systematic method. In the current study, we describe and demonstrate a systematic method for creating plausible but incorrect options, also called distractors, based on students' misconceptions. These misconceptions are extracted from labeled written responses. A total of 1,515 written responses to an existing constructed-response Biology item from Grade 10 students were used to demonstrate the method. Using latent Dirichlet allocation, a topic modeling procedure commonly used in machine learning and natural language processing, 22 plausible misconceptions were identified in the students' written responses and used to produce a list of plausible distractors. These distractors, in turn, were used as part of new multiple-choice items. Implications for item development are discussed.

Citing Articles

Automatic distractor generation in multiple-choice questions: a systematic literature review.

Awalurahman H, Budi I PeerJ Comput Sci. 2024; 10:e2441.

PMID: 39650367 PMC: 11623049. DOI: 10.7717/peerj-cs.2441.


Item Analysis of Multiple-Choice Question (MCQ)-Based Exam Efficiency Among Postgraduate Pediatric Medical Students: An Observational, Cross-Sectional Study From Saudi Arabia.

Shahat K Cureus. 2024; 16(9):e69151.

PMID: 39262935 PMC: 11388033. DOI: 10.7759/cureus.69151.


Evaluation of the Open-Ended Green Chemistry Generic Comparison (GC) Prompt for Probing Student Conceptions about the Greenness of a Chemical Reaction.

Grieger K, Leontyev A J Chem Educ. 2024; 101(7):2644-2655.

PMID: 39007074 PMC: 11238543. DOI: 10.1021/acs.jchemed.4c00258.


The impact of repeated item development training on the prediction of medical faculty members' item difficulty index.

Lee H, Yune S, Lee S, Im S, Kam B BMC Med Educ. 2024; 24(1):599.

PMID: 38816855 PMC: 11140961. DOI: 10.1186/s12909-024-05577-x.


The Effect of a One-Day Workshop on the Quality of Framing Multiple Choice Questions in Physiology in a Medical College in India.

Dhanvijay A, Dhokane N, Balgote S, Kumari A, Juhi A, Mondal H Cureus. 2023; 15(8):e44049.

PMID: 37746478 PMC: 10517710. DOI: 10.7759/cureus.44049.


References
1. Griffiths T, Steyvers M. Finding scientific topics. Proc Natl Acad Sci U S A. 2004; 101 Suppl 1:5228-35. PMC: 387300. DOI: 10.1073/pnas.0307752101.

2. Collins J. Education techniques for lifelong learning: writing multiple-choice questions for continuing medical education activities and self-assessment modules. Radiographics. 2006; 26(2):543-51. DOI: 10.1148/rg.262055145.

3. Tarrant M, Ware J, Mohammed A. An assessment of functioning and non-functioning distractors in multiple-choice questions: a descriptive analysis. BMC Med Educ. 2009; 9:40. PMC: 2713226. DOI: 10.1186/1472-6920-9-40.

4. Moreno R, Martinez R, Muniz J. Guidelines based on validity criteria for the development of multiple choice items. Psicothema. 2015; 27(4):388-94. DOI: 10.7334/psicothema2015.110.

5. Lai H, Gierl M, Touchie C, Pugh D, Boulais A, De Champlain A. Using Automatic Item Generation to Improve the Quality of MCQ Distractors. Teach Learn Med. 2016; 28(2):166-73. DOI: 10.1080/10401334.2016.1146608.