Authors

1 Section of Urology, Department of Surgery, The Aga Khan University Hospital, Karachi, Pakistan

2 Section of Urology, Department of Surgery, The Aga Khan University Hospital, Karachi, Pakistan

3 Department for Educational Development, The Aga Khan University Hospital, Karachi, Pakistan

4 Section of Urology, Department of Surgery, The Aga Khan University Hospital, Karachi, Pakistan

5 Foundation for Advancement of International Medical Education and Research (FAIMER), Philadelphia, Pennsylvania, United States of America

Abstract

Introduction: Clinical reasoning is the core of medical competence. Commonly used assessment methods for medical competence have limited ability to evaluate critical thinking and reasoning skills. The Script Concordance Test (SCT) and Extended Matching Questions (EMQs) are evolving tests considered valid and reliable tools for assessing clinical reasoning and judgment. We performed this pilot study to determine whether the SCT and EMQs can differentiate clinical reasoning ability among urology residents, interns and medical students.
Methods: This was a cross-sectional study in which an examination comprising 48 SCT items on eleven clinical scenarios and four themed EMQs with 21 items was administered to 27 learners at three levels of experience: 9 urology residents, 6 interns and 12 fifth-year medical students. Non-probability convenience sampling was used. The SCTs and EMQs were developed from clinical situations representative of urological practice by 5 content experts (urologists) and assessed by a medical education expert. Learners' responses were scored using both the standard and the graduated key. A one-way analysis of variance (ANOVA) was conducted to compare mean scores across levels of experience; a p-value of <0.05 was considered statistically significant. Test reliability was estimated with Cronbach's α. A focus group discussion with the candidates was held to assess their perceptions of the test.
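As an aside on the reliability estimate mentioned above, Cronbach's α can be computed directly from a learners-by-items score matrix. The sketch below uses made-up scores, not the study's data, purely to illustrate the formula α = k/(k−1) · (1 − Σ item variances / total-score variance).

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of rows (one per learner),
    each row holding that learner's per-item scores."""
    k = len(scores[0])  # number of items
    def var(xs):        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Illustrative (fabricated) responses of 4 learners to 3 items:
alpha = cronbach_alpha([[1, 1, 1], [0, 0, 0], [1, 0, 1], [1, 1, 0]])
```

Values in the 0.5–0.6 range, as reported in the Results, indicate modest internal consistency, which is common for short pilot instruments.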
Results: Both the SCT and EMQs successfully differentiated residents from interns and students. A statistically significant difference in mean scores was found among the three groups for both the SCT and EMQs using both the standard and the graduated key. Mean scores were higher for all groups with the graduated key than with the standard key. Internal consistency (Cronbach's α) was 0.53 for the EMQs and 0.6 for the SCT. The majority of participants were satisfied with the time, environment, instructions provided and content covered, and nearly all felt that the test aided their thinking process, particularly clinical reasoning.
Conclusion: Our data suggest that both the SCT and EMQs can discriminate between learners according to their clinical experience in urology. Given their wide acceptability among candidates, these tests could be used to assess and enhance clinical reasoning skills. More research is needed to establish the validity of these tests.
