Document Type : Review article
Authors
1 Anglia Ruskin University, Department of Medical Education, Cambridge, United Kingdom
2 Tunbridge Wells NHS Trust, United Kingdom
3 West Middlesex NHS Foundation Trust, United Kingdom
4 Addenbrookes NHS Foundation Trust, Cambridge, United Kingdom
5 Ashford and St Peters NHS Trust, United Kingdom
Abstract
Introduction: This systematic review and meta-analysis investigated the impact of utilising peer-generated multiplechoice question (MCQ) banks on the summative performance of undergraduate students studying medicine and allied subjects. Answering and writing peer-made MCQ questions are hypothesised to enhance learning through achievement of the
domains of Bloom’s taxonomy and thus summative examination performance.
Methods: A random-effects meta-analysis of correlation coefficients was conducted on six studies (n = 1,571) published between 2011 and 2021, drawn from MEDLINE, Scopus, Web of Science, PubMed, CENTRAL, and ERIC. The studies included undergraduate medical students from four countries. The risk of bias was assessed using the ROBINS-I tool.
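The pooling step described in the Methods can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code: it pools correlation coefficients on the Fisher z scale with a DerSimonian-Laird random-effects model, and the study correlations and sample sizes below are hypothetical placeholders.

```python
import math

def random_effects_corr(rs, ns):
    """Pool correlations with a DerSimonian-Laird random-effects model
    on the Fisher z scale; returns (pooled r, 95% CI low, 95% CI high)."""
    # Fisher z-transform each correlation; sampling variance is 1/(n - 3)
    zs = [math.atanh(r) for r in rs]
    vs = [1.0 / (n - 3) for n in ns]
    ws = [1.0 / v for v in vs]
    # Fixed-effect pooled estimate and Cochran's Q heterogeneity statistic
    z_fe = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - z_fe) ** 2 for w, z in zip(ws, zs))
    df = len(rs) - 1
    c = sum(ws) - sum(w ** 2 for w in ws) / sum(ws)
    tau2 = max(0.0, (q - df) / c)  # between-study variance (DL estimator)
    # Random-effects weights add tau^2 to each within-study variance
    ws_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    se = math.sqrt(1.0 / sum(ws_re))
    # Back-transform the pooled z and its 95% CI to the correlation scale
    return (math.tanh(z_re),
            math.tanh(z_re - 1.96 * se),
            math.tanh(z_re + 1.96 * se))

# Hypothetical per-study correlations and sample sizes (illustrative only)
r, lo, hi = random_effects_corr([0.18, 0.25, 0.22, 0.30, 0.15, 0.20],
                                [200, 300, 250, 400, 221, 200])
```

Working on the z scale is standard because Fisher's transform stabilises the variance of a correlation; the pooled estimate and interval are back-transformed with tanh at the end.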
Results: A weak positive correlation was found between answering peer-made MCQs and summative performance (Spearman's ρ = 0.22, 95% CI: 0.15 to 0.29, p < 0.0001), with a prediction interval of 0.00 to 0.42, indicating that in future studies the effect of answering peer-made questions is likely to be beneficial or, at worst, neutral. A similar weak positive correlation was observed for writing peer-made MCQs (Spearman's ρ = 0.21, 95% CI: 0.09 to 0.32, p < 0.0004), although the prediction interval (-0.27 to 0.61) cannot exclude a negative correlation between writing questions and summative performance in future studies. These findings suggest that answering and creating peer-generated MCQs positively influence examination performance. The modest correlations likely reflect confounding factors, such as prior academic performance and socio-economic background, which complicate isolating the impact of MCQ banks and may understate their true effect.
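The prediction intervals quoted above are wider than the confidence intervals because they account for between-study heterogeneity as well as estimation error. A minimal sketch of the usual Higgins-Thompson-Spiegelhalter calculation on the Fisher z scale is shown below; the pooled standard error, tau-squared, and t critical value are assumed illustrative numbers, not values from the paper.

```python
import math

def prediction_interval(z_pooled, se_pooled, tau2, t_crit):
    """95% prediction interval for a new study's correlation:
    z_pooled +/- t_{k-2} * sqrt(tau^2 + SE^2), back-transformed with tanh."""
    half = t_crit * math.sqrt(tau2 + se_pooled ** 2)
    return math.tanh(z_pooled - half), math.tanh(z_pooled + half)

# Illustrative inputs only: pooled z = atanh(0.22); assumed SE and tau^2;
# t_crit = 2.776 is the 97.5th percentile of t with k - 2 = 4 df (k = 6 studies)
z = math.atanh(0.22)
lo, hi = prediction_interval(z, se_pooled=0.036, tau2=0.008, t_crit=2.776)
```

Because the half-width includes tau-squared, even a modest amount of heterogeneity (as for the question-writing analysis) can push the lower bound below zero while the confidence interval stays positive.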
Conclusion: This study advocates the integration of peer-generated MCQ banks into medical curricula, highlighting their potential as a cost-effective method of improving summative performance. Future research should focus on large-scale observational studies that control for confounding factors to better quantify these effects. The study underscores the value of peer engagement in learning and the utility of peer-made MCQ banks as educational tools.