Multiple-Choice Item Distractor Development Using Topic Modeling Approaches
Overview
Writing a high-quality multiple-choice test item is a complex process. Creating plausible but incorrect options for each item poses a significant challenge for the content specialist because this task is often undertaken without a systematic method. In the current study, we describe and demonstrate a systematic method for creating plausible but incorrect options, also called distractors, based on students' misconceptions extracted from labeled written responses. One thousand five hundred and fifteen written responses to an existing constructed-response Biology item from Grade 10 students were used to demonstrate the method. Using latent Dirichlet allocation, a topic modeling procedure commonly used in machine learning and natural language processing, 22 plausible misconceptions were identified in the students' written responses and used to produce a list of plausible distractors grounded in those responses. These distractors, in turn, were used as part of new multiple-choice items. Implications for item development are discussed.
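To illustrate how such a pipeline might be assembled, the sketch below applies latent Dirichlet allocation to a handful of invented student responses using scikit-learn. The library choice, the toy responses, and the bag-of-words preprocessing are assumptions for illustration only, not the study's actual implementation, and the number of topics is reduced to suit the tiny corpus.

```python
# A minimal, illustrative sketch of the topic-modeling step described above.
# Assumptions (not from the paper): scikit-learn as the LDA implementation,
# a toy list of student responses, and simple bag-of-words preprocessing.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical written responses (the study used 1,515 Grade 10 Biology
# responses; these short strings are placeholders only).
responses = [
    "plants get their food from the soil through the roots",
    "plants eat sunlight and turn it directly into food",
    "photosynthesis happens only at night when it is cool",
    "plants breathe in oxygen and breathe out carbon dioxide",
    "the soil gives plants all the energy they need to grow",
]

# Convert responses to a document-term matrix (bag of words).
vectorizer = CountVectorizer(stop_words="english")
doc_term_matrix = vectorizer.fit_transform(responses)

# Fit LDA; the paper reports 22 misconception topics, but with a toy corpus
# a much smaller number of components is used so the model can actually fit.
n_topics = 3  # would be larger (e.g., 22) with the full response set
lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
lda.fit(doc_term_matrix)

# Inspect the top words per topic; a content specialist would read these
# to name the underlying misconception and phrase it as a distractor.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")
```

In practice, the fitted topics would be reviewed by content specialists, who interpret each coherent misconception theme and rewrite it as a plausible distractor for a new multiple-choice item.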