Document Type : Original Article
Authors
Department of Physiology, Government Institute of Medical Sciences, Greater Noida - 201310, Gautam Buddha Nagar, Uttar Pradesh, India
Abstract
Introduction: Concept mapping is a multidimensional tool that has been put to little use in India. We designed this study to check its applicability for assessing higher-order thinking in the subject of Physiology.
Methods: This interventional analytical study was carried out among 65 Phase I MBBS students in the year 2021. The students were sensitized to the technique and given a practice session. On a pre-informed date, a topic already taught to them was assessed using both concept mapping and a multiple-choice question (MCQ) based test. Feedback on the technique was taken from the students. The statistical tests used were the Kolmogorov-Smirnov test for normality, the Wilcoxon signed-rank test for the significance of the difference between paired scores, Spearman's correlation for correlation, and Bland-Altman analysis for agreement. The discrimination index was calculated separately for the concept mapping and MCQ-based tests. Percentages were calculated for the feedback questionnaire items. The data were analysed using Microsoft Excel (2019) and an online calculator. P values <0.05 were considered statistically significant.
Results: Students scored higher in concept mapping. There was a significant difference between the students' scores on the two tests (Wilcoxon signed-rank test, Z=-2.66, P=0.008) and a weak, non-significant positive correlation between them (Spearman's correlation coefficient, rs=0.07, P=0.60). Bland-Altman analysis showed agreement between the students' scores in the two tests; however, as the mean score in the two tests increased, so did the difference between the scores. The discrimination index of concept mapping (0.28) was higher than that of the MCQ-based test (0.18). In their feedback, most of the students agreed on the advantages of concept mapping.
Conclusion: The assessment results of concept mapping were better than those of the MCQ-based test, and concept mapping may be included as a teaching/learning and assessment strategy in the subject of Physiology in the context of Indian medical education.
Keywords
Introduction
Formative assessments are integral to curriculum implementation and make teaching/learning meaningful. They are the means to identify deficiencies in learning and provide scope for feedback to both students and teachers while there is still time to remedy those deficiencies ( 1 ). They are also useful for differentiating high-ability students from lower-ability ones, to direct the further course of individualized teaching/learning strategies for them ( 2 ).
With respect to medical education, assessment of, and feedback on, critical thinking ability, called 'higher-order thinking skills (HOTS)', is imperative. It is important to ascertain that the student is able to bring their learning into rational practice ( 3 , 4 ). The usual battery of assessment tools, including those for HOTS, in the Indian curriculum in the subject of Physiology includes Modified Essay Questions (MEQs), Extended Matching Questions (EMQs), Multiple Choice Questions (MCQs), Problem Based Learning (PBL), and viva-voce ( 5 - 7 ). While each of these has its own merits, each also has its own set of limitations and places a heavy demand on resources ( 7 ). An alternative tool that both evokes critical thinking and assesses it at the same time would be more than welcome. Concept mapping is a good potential choice in this regard ( 8 ).
Concept maps are diagrammatic representations of the different components, called concepts or nodes, of a particular focus question. The concepts are presented hierarchically, with linking words and phrases indicated on the connecting arrows, and with cross-links between different subsets of concepts ( 9 ). Constructing a good concept map requires a thorough knowledge of the concepts involved and a clear understanding of the relationships between them. Consequently, the exercise of creating concept maps evokes critical thinking as the student explores and analyses concepts and their relationships, thereby leading to deeper learning ( 10 ). Therefore, a concept map that contains an exhaustive number of relevant concepts, identifies multiple levels of hierarchy, and establishes many cross-links across them reflects the higher-order thinking ability of the student who made it. It corresponds to the highest of Bloom's levels of learning, synthesis and evaluation, in both the cognitive and metacognitive domains ( 11 ). Also, as the student progresses through the course, their learning is enhanced, and in parallel they may expand and improve upon their earlier concept map. So, there is continuous improvement in the quality of the concept map that a student makes, parallel to their learning, and the same is available for concurrent assessment ( 12 , 13 ). A concept map is also handy for revision at a glance.
Concept mapping has been in use in medical education in many parts of the world, both as a teaching/learning tool ( 13 - 16 ) and as a tool for assessment ( 17 , 18 ). However, it has seen only limited use ( 19 - 23 ) in the context of Indian medical education. Our students have not had sufficient exposure to this multi-pronged tool and remain largely unfamiliar with its utility. Given the merits of concept mapping, namely building insight into a topic and its application, integrating knowledge, and improving formative assessment and its outcome, we expect that it should be well accepted. This requires that we check the acceptability of concept mapping with students and evaluate how its results, as a tool of assessment, compare with those of another popular assessment tool, such as MCQs. It is against this background that we designed our study.
Therefore, our aim was to familiarize the students of MBBS Phase I in the subject of Physiology with 'concept mapping' as another tool for both teaching/learning and assessment.
Methods
This interventional analytical study was carried out among Phase I MBBS students of the Government Institute of Medical Sciences, Greater Noida, Uttar Pradesh, India in the subject of Physiology in 2021. Approval was taken from the Institutional Ethics Committee (GIMS/IEC/HR2021/37 dated 18.10.2021). We included all the students present in the class on the day of intervention with their consent to participate in the study. Out of a class strength of 100, only 65 students appeared in both concept mapping and MCQ tests, and also gave their feedback. Their data was used for all calculations. The data for the students who did not give either of the tests or did not give feedback were excluded.
Our overall methodology is summarized in Figure 1.
MCQ-based test: One point was given for each question answered correctly. The test had 20 questions, making a maximum score of 20.
Concept map-based test: Two teachers independently assessed the concept maps drawn by the students using the assessment criteria given by West et al. in 2000: each valid concept link – 0.2 points, each level of hierarchy – 0.5 points, each cross-link – 0.1 points, and each example – 0.1 points ( 24 ). The average of the two teachers' totals was taken as the student's score. The maximum score a student could get in the concept map was not defined, the activity being open-ended. Therefore, for the purpose of testing the comparability of the assessment techniques, each student's concept map-based test score was expressed in proportion to 20, the maximum MCQ-based test score.
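To make the scoring and scaling concrete, here is a minimal sketch (in Python, not the authors' actual procedure) of how a raw rubric score could be computed and expressed in proportion to the 20-mark MCQ maximum; the concept counts and the choice to scale against the highest raw score awarded in the class are illustrative assumptions, as the paper does not specify the denominator of the proportion.

```python
# Minimal sketch (not the authors' actual procedure) of scoring a concept map
# with the West et al. (2000) rubric used in this study, and expressing the
# raw score in proportion to the 20-mark maximum of the MCQ-based test.

def concept_map_raw_score(valid_links, hierarchy_levels, cross_links, examples):
    """Rubric: 0.2 per valid concept link, 0.5 per level of hierarchy,
    0.1 per cross-link, 0.1 per example."""
    return 0.2 * valid_links + 0.5 * hierarchy_levels + 0.1 * cross_links + 0.1 * examples

def scale_to_mcq_maximum(raw_score, highest_raw_score_awarded, mcq_max=20):
    """Express the open-ended raw score out of 20. Scaling against the highest
    raw score awarded in the class is an assumption; the paper only states that
    scores were taken 'in proportion to 20'."""
    return raw_score * mcq_max / highest_raw_score_awarded

# Hypothetical counts for one student (averaged over the two assessors)
raw = concept_map_raw_score(valid_links=30, hierarchy_levels=4, cross_links=5, examples=6)
scaled = scale_to_mcq_maximum(raw, highest_raw_score_awarded=11.5)
print(f"raw rubric score = {raw:.1f}, scaled score (out of 20) = {scaled:.2f}")
```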
Feedback: A pre-validated feedback questionnaire was administered to the students, requiring responses to be rated on a 5-point Likert scale from 1 to 5, with 1 meaning 'strongly disagree', 2 'disagree', 3 'neutral (neither agree nor disagree)', 4 'agree', and 5 'strongly agree' (Table 1) (Cutrer, et al. 2011; consent of the creators was obtained via email) ( 25 ).
Table 1. Students' feedback on concept mapping; values are number of students (%)
No. | Item | Strongly disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly agree |
---|---|---|---|---|---|---|
1 | Concept mapping is an easy skill to learn. | 0 (0) | 1 (2) | 10 (15) | 22 (34) | 32 (49) |
2 | Concept maps are easy to complete. | 0 (0) | 1 (2) | 9 (14) | 31 (48) | 24 (37) |
3 | Concept maps take too long to complete for a given topic. | 5 (8) | 10 (15) | 22 (34) | 20 (31) | 8 (12) |
4 | I find concept mapping to be a useful tool in helping me gain a deeper understanding of a given topic. | 0 (0) | 1 (2) | 9 (14) | 35 (54) | 20 (31) |
5 | I feel comfortable with my ability to complete a concept map. | 0 (0) | 2 (3) | 13 (20) | 31 (48) | 19 (29) |
6 | Concept mapping is a difficult skill to learn. | 11 (17) | 16 (25) | 14 (22) | 15 (23) | 9 (14) |
7 | Concept mapping helps learners see the big picture. | 0 (0) | 1 (2) | 14 (22) | 26 (40) | 24 (37) |
8 | Concept mapping helped me to make connections that I had not previously made. | 0 (0) | 3 (5) | 7 (11) | 31 (48) | 24 (37) |
9 | I did NOT find concept mapping to be a useful tool in helping me gain a deeper understanding of a given topic. | 26 (40) | 17 (26) | 7 (11) | 11 (17) | 4 (6) |
10 | Concept mapping was beneficial in my learning about immunity. | 0 (0) | 2 (3) | 10 (15) | 29 (45) | 24 (37) |
 | | Yes | No | | | |
11 | Should concept maps be used to teach/learn/assess other physiology topics as well? | 64 (98) | 1 (2) | | | |
12 | Should concept maps be used in other subjects as well? | 63 (97) | 2 (3) | | | |
Statistical Analysis
Correlation of scores of the students on the concept map-based and MCQ-based tests: Assuming the null hypothesis, H0, that the difference in the means/medians of the students' scores on the two tests is zero, at an α (level of significance) of 5%, we calculated the means, medians, and standard deviations of the two scores. The scores in the two tests were subjected to the Kolmogorov-Smirnov test of normality (K-S test). The significance of the difference and the correlation coefficient were then calculated. For the significance of the difference, Student's t-test was used if both data sets were found to be normally distributed, and the Wilcoxon signed-rank test was used if either or both data sets were found not to be normally distributed. For the correlation coefficient, Pearson's correlation was calculated if both the MCQ-based test scores and the concept map-based test scores were normally distributed, and Spearman's correlation if either or both data sets were not normally distributed.
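A minimal sketch of this test-selection logic, using SciPy; the score arrays are placeholders rather than the study data, and the study's own calculations were done in Microsoft Excel and an online calculator rather than in code.

```python
# Sketch of the test-selection logic described above, using SciPy.
# The score arrays are placeholders, not the study data.
import numpy as np
from scipy import stats

mcq = np.array([14, 15, 13, 16, 12, 14, 17, 15, 13, 16], dtype=float)          # MCQ scores out of 20
cmap = np.array([15.5, 17.0, 12.5, 18.0, 14.0, 16.5, 19.0, 15.0, 13.5, 17.5])  # scaled concept map scores

# Kolmogorov-Smirnov test of normality for each set of scores
ks_mcq = stats.kstest(mcq, 'norm', args=(mcq.mean(), mcq.std(ddof=1)))
ks_cmap = stats.kstest(cmap, 'norm', args=(cmap.mean(), cmap.std(ddof=1)))
both_normal = ks_mcq.pvalue > 0.05 and ks_cmap.pvalue > 0.05

# Paired test of difference and correlation, chosen according to normality
if both_normal:
    diff_test = stats.ttest_rel(mcq, cmap)     # paired Student's t-test
    corr = stats.pearsonr(mcq, cmap)           # Pearson's correlation
else:
    diff_test = stats.wilcoxon(mcq, cmap)      # Wilcoxon signed-rank test
    corr = stats.spearmanr(mcq, cmap)          # Spearman's correlation

print(diff_test)
print(corr)
```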
Analysis of agreement of the students' scores on the concept map-based and MCQ-based tests: Agreement between the two scores was analysed using Bland-Altman analysis ( 26 ), by calculating the difference between the concept map and MCQ scores after checking whether the distribution of the differences was normal.
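The quantities behind a Bland-Altman analysis can be sketched as follows, again with placeholder arrays rather than the study data: the per-student mean of the two scores, the difference between them, the mean difference (bias), and the 95% limits of agreement.

```python
# Rough sketch of the Bland-Altman quantities used in this analysis
# (illustrative arrays, not the study data).
import numpy as np
from scipy import stats

mcq = np.array([14, 15, 13, 16, 12, 14, 17, 15, 13, 16], dtype=float)
cmap = np.array([15.5, 17.0, 12.5, 18.0, 14.0, 16.5, 19.0, 15.0, 13.5, 17.5])

mean_scores = (cmap + mcq) / 2                 # x-axis of the Bland-Altman plot
diff_scores = cmap - mcq                       # y-axis: difference in scores

# Check that the differences are plausibly normal before using ±1.96 SD limits
ks_diff = stats.kstest(diff_scores, 'norm',
                       args=(diff_scores.mean(), diff_scores.std(ddof=1)))

bias = diff_scores.mean()                      # mean difference (bias)
sd = diff_scores.std(ddof=1)
limits_of_agreement = (bias - 1.96 * sd, bias + 1.96 * sd)

print(f"KS p-value for differences: {ks_diff.pvalue:.2f}")
print(f"bias = {bias:.2f}, limits of agreement = "
      f"({limits_of_agreement[0]:.2f}, {limits_of_agreement[1]:.2f})")
```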
Comparison of the potential of the two tests to discriminate higher-ability students from lower-ability students: We calculated the discrimination index (DI) for both the MCQ-based test and the concept map-based test. The following steps and formula were used ( 27 ):
We assumed that the concept mapping and MCQ-based tests were two items of a single assessment. The scores of the students in the two tests were added, and the students were ranked in descending order of merit based on the total score. The students whose total scores were in the top 25% (N = 16) were called the Higher Ability Group (HAG) and the students whose total scores were in the bottom 25% (N = 16) were called the Lower Ability Group (LAG). The total scores of the HAG students (ΣH) and the LAG students (ΣL) in each of the two tests were calculated separately. The maximum (ScoreMax) and minimum (ScoreMin) scores for each of the two tests were 20 and 0, respectively. The discrimination index was then calculated as DI = (ΣH − ΣL) / [N × (ScoreMax − ScoreMin)], where N is the number of students in each group.
Interpretation of the results was based on the following criteria: DI negative – defective item; DI 0–0.19 – poor discrimination; DI 0.2–0.29 – acceptable discrimination; DI 0.3–0.39 – good discrimination; DI 0.4 – very good discrimination; DI >0.4 – excellent discrimination.
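Assuming the formula above, a short sketch of the DI computation with simulated placeholder scores (not the study data) would look like this; with 65 students, the top and bottom 25% groups contain 16 students each, as in the study.

```python
# Sketch of the discrimination index (DI) calculation described above,
# treating the two tests as items of one assessment. Scores are simulated
# placeholders, not the study data.
import numpy as np

def discrimination_index(item_scores, total_scores, score_max=20, score_min=0):
    """DI = (ΣH − ΣL) / (N × (ScoreMax − ScoreMin)), where ΣH and ΣL are the
    item-score totals of the top and bottom 25% of students ranked on the
    combined total, and N is the number of students in each of those groups."""
    n_group = len(total_scores) // 4                 # 25% of the class (16 of 65)
    order = np.argsort(total_scores)[::-1]           # descending order of merit
    hag, lag = order[:n_group], order[-n_group:]
    sum_h, sum_l = item_scores[hag].sum(), item_scores[lag].sum()
    return (sum_h - sum_l) / (n_group * (score_max - score_min))

rng = np.random.default_rng(0)
mcq = rng.integers(10, 20, size=65).astype(float)        # placeholder MCQ scores
cmap = np.clip(rng.normal(15.5, 3.1, size=65), 0, 20)    # placeholder concept map scores
total = mcq + cmap

print("DI (MCQ):", round(discrimination_index(mcq, total), 2))
print("DI (concept map):", round(discrimination_index(cmap, total), 2))
```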
Feedback: We analysed the responses to feedback questionnaire items by calculating percentages from ratings for each of them. We summarized and tabulated the responses to the open-ended questions.
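For completeness, a small sketch (with made-up ratings) of how the n (%) figures for a single questionnaire item, as reported in Table 1, can be tabulated:

```python
# Tabulating one Likert item as n (%) of respondents, as in Table 1.
# The ratings below are made up for illustration; 1 = strongly disagree ... 5 = strongly agree.
from collections import Counter

labels = {1: "Strongly disagree", 2: "Disagree", 3: "Neither agree nor disagree",
          4: "Agree", 5: "Strongly agree"}
ratings = [4, 5, 5, 3, 4, 4, 5, 2, 4, 3]   # one response per student

counts = Counter(ratings)
n = len(ratings)
for value, label in labels.items():
    k = counts.get(value, 0)
    print(f"{label}: {k} ({round(100 * k / n)})")
```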
Statistical analysis software: All the data were analysed using Microsoft Excel (2019) and an online calculator (https://www.socscistatistics.com).
Results
Correlation of scores of students in the Concept Map and MCQ tests
Figures 2 and 3 show histograms of the students' scores in the two tests. The distributions of scores do not deviate markedly from a normal, bell-shaped distribution in either test. Table 2 shows the mean, median, and standard deviation of the students' scores in the MCQ-based test and the concept map-based test, along with the Wilcoxon signed-rank and Spearman's correlation statistics. Students scored relatively higher in the concept map-based test, and the standard deviation of the concept map-based test scores was also higher.
Table 2. Students' scores in the MCQ-based and concept map-based tests: descriptive statistics, test of difference, and correlation
 | MCQ-based test score (maximum marks 20) | Concept map score (original score expressed in proportion to 20) | Wilcoxon signed-rank test, Z | p | Spearman's correlation coefficient, rs | p |
---|---|---|---|---|---|---|
Mean | 14.28 | 15.54 | -2.66 | 0.008 | 0.066 | 0.60 |
Median | 14 | 15.5 | ||||
Standard Deviation | 1.93 | 3.07 |
For further statistical analysis, it was first determined if the scores in the tests were normally distributed.
The MCQ-based test scores were found to be normally distributed (K-S test statistic D=0.15, p-value=0.10), while the concept map-based test scores were not normally distributed (D=0.18, p-value=0.03).
There was a significant difference in the performance of the students in the two tests (Wilcoxon signed-rank test, p<0.05) and a weakly positive, non-significant correlation between the scores in the two tests (Spearman's correlation). Figure 4 depicts the same result with a scatter plot and trendline: two points show considerably higher scores in concept mapping than in the MCQ test, and three points show much better scores in the MCQ test than in concept mapping, while the rest of the points are clustered more closely together. The trendline shows a slight upward slope to the right: students who scored higher in the MCQ test also generally scored higher in concept mapping, and vice versa.
Agreement of the students’ scores in the Concept Map and MCQ tests
Table 3 shows the mean and the standard deviation of the average of the students' scores in the two tests and of the difference between their scores in the two tests, the correlation coefficient of the two, and the p-value. The difference in scores is normally distributed (Kolmogorov-Smirnov test of normality: D=0.07, p-value=0.93), as is the mean score (D=0.08, p-value=0.75). There is a moderately positive and significant correlation (Pearson's correlation coefficient) between the average score and the difference in scores in the two tests.
Table 3. Average of, and difference between, the students' scores in the two tests
 | Average of the MCQ test and concept map scores | Difference between the scores (concept map − MCQ test) | Pearson's correlation coefficient, r | p |
---|---|---|---|---|
Mean±SD | 14.91±1.86 | 1.26±3.58 | 0.44 | 0.0003 |
Figure 5 shows the Bland-Altman plot of the two scores, the MCQ test and the concept map, and Table 4 shows the related statistics (Giavarina 2015) ( 26 ). In Figure 5, it can be observed that most of the points fall within ±1.96 SD of the mean difference (the limits of agreement). However, the line of no difference lies outside the 95% confidence limits of the mean difference, indicating a bias in the scores. The trendline of the difference in scores against the mean score shows that as the average score increases, the difference in scores also increases.
Table 4. Bland-Altman analysis statistics for the difference between the concept map and MCQ test scores
Parameter | Value | SE formula | Standard error, SE | t value for 64 degrees of freedom | Confidence, SE * t | Confidence interval, from | Confidence interval, to |
---|---|---|---|---|---|---|---|
Number, n | 65 | ||||||
Degrees of freedom, n-1 | 64 | ||||||
Mean difference, d | 1.261538 | √(s²/n) | 0.443648 | 1.9977 | 0.886275 | 0.375263 | 2.147814 |
Standard deviation of the differences, s | 3.576803 |
d – 1.96 * s | -5.748995377 | √(3s²/n) | 0.768421 | 1.9977 | 1.535074 | -7.28407 | -4.21392 |
d + 1.96 * s | 8.2720723 | √(3s²/n) | 0.768421 | 1.9977 | 1.535074 | 6.736999 | 9.807146 |
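For readers wishing to reproduce Table 4, the standard errors and confidence intervals follow from n, the mean difference d, and the standard deviation of the differences s, as described by Giavarina (2015); the sketch below uses the rounded values reported in Table 3 and is not the authors' spreadsheet.

```python
# How the standard errors and confidence intervals in Table 4 follow from n,
# the mean difference d, and the standard deviation of the differences s,
# per Giavarina (2015). The inputs are the rounded values reported in Table 3.
import math
from scipy import stats

n = 65
d = 1.26     # mean difference between the concept map and MCQ scores (bias)
s = 3.58     # standard deviation of the differences

t = stats.t.ppf(0.975, df=n - 1)      # ≈ 1.998 for 64 degrees of freedom

se_bias = math.sqrt(s ** 2 / n)       # standard error of the mean difference
se_loa = math.sqrt(3 * s ** 2 / n)    # standard error of each limit of agreement

ci_bias = (d - t * se_bias, d + t * se_bias)
lower_loa, upper_loa = d - 1.96 * s, d + 1.96 * s
ci_lower_loa = (lower_loa - t * se_loa, lower_loa + t * se_loa)
ci_upper_loa = (upper_loa - t * se_loa, upper_loa + t * se_loa)

print(f"95% CI of the bias: ({ci_bias[0]:.2f}, {ci_bias[1]:.2f})")
print(f"95% CI of the lower limit of agreement: ({ci_lower_loa[0]:.2f}, {ci_lower_loa[1]:.2f})")
print(f"95% CI of the upper limit of agreement: ({ci_upper_loa[0]:.2f}, {ci_upper_loa[1]:.2f})")
```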
Comparison of discriminating potential of higher ability group students from lower ability group students in the two tests
The discrimination index of the MCQ-based test was calculated to be 0.18, which was 'poor', and that of the concept map was 0.28, which was 'acceptable'.
Students’ perception and acceptance of Concept Mapping
Table 1 shows the feedback of students on the Concept Map. Most of them had a favourable opinion towards the use of concept maps.
Discussion
Concept mapping
During concept mapping, when the student reflects to identify and define the relationships between different nodes in the concept map and how different concepts may be integrated to address a common problem, their metacognitive skills are enhanced ( 8 , 14 , 16 , 28 - 30 ). It also gives the student an opportunity to express all the related and relevant details that they may know about a topic or problem. In an MCQ test, on the other hand, even when the questions are constructed to assess HOTS ( 31 ), the student's reflection is limited to the given question and its options. Therefore, with usual assessment tools like MCQ-based tests, the student's expression of their knowledge is restricted by the scope of the questions asked in the test, even though they may know much more, both in quantity and in relevance ( 32 ).
Overall performance of students in the two tests
In our study, the students scored better in concept mapping than in the MCQ test (Table 2). As explained above, concept mapping allows students to be more expressive, giving them an opportunity to score more, while with MCQs the student is limited by what is asked and how much is asked, and may not know the answers to the particular questions asked.
Further, while the scores obtained by the students in the MCQ-based test were normally distributed, the scores obtained in the concept map-based test were not. This may be because of the subjective element in the assessment of concept maps, even though a rubric was followed to keep the assessment as objective as possible.
Correlation and agreement of the scores obtained by the students in the two tests
There was a non-significant positive correlation between the scores in the concept map and the MCQ tests (figure 4): those who performed well in the MCQ-based test also tended to perform better in concept mapping. Bland and Altman's analysis (tables 3 & 4 and figure 5) also showed agreement between the two test scores within 1.96 standard deviations of the mean difference. At the same time, the two scores were significantly different (table 2). The latter finding rejects the null hypothesis, indicating that the medians of the two scores, the MCQ test and the concept map score, are significantly different. In Bland and Altman's analysis (tables 3 & 4 and figure 5), a definite positive bias of 1.26 units in the difference in scores was also observed, and the line of equality lies outside the confidence interval of the mean difference (figure 5). Moreover, as the mean score in the two tests increased (figure 5), the difference between the scores of the two tests also increased. These findings may be suggestive of an inherent difference in how the two tools assess learning and the level of learning: concept mapping gave students more opportunity to express their cognitive and metacognitive abilities, thereby resulting in better scores.
How the results of our study weigh with the results of other studies
Fonseca, et al. (2020) conducted exhaustive research on teaching pathophysiology using concept maps over a period of two years ( 33 ). The students' scores were better in the summative tests that used concept maps for evaluation than in the final MCQ-based quiz. The two scores were significantly different in both years, and there was a positive correlation between the two scores in each year. Their results were similar to those of our study: the concept mapping scores were significantly different from, and better than, those of the MCQ-based test, and the two correlated as well. As the two assessment techniques are fundamentally different, as discussed in the introduction, there may be a significant difference in the performance of the students in tests based on the two techniques. However, concept mapping provides more scope for the students to express their knowledge and analytical ability, and a higher-ability student is likely to perform better in either type of assessment. Therefore, the results of concept mapping may be higher than, and also correlate with, the results of an MCQ-based test.
Gamboa, et al. conducted a comparison study in 2012 ( 34 ) in which they used concept maps to teach pathophysiology in a slightly different manner: students were to fill in the blanks in the concept map from among the given choices. They compared the MCQ-embedded concept map score with that of the traditional MCQ test and concluded that the performance of students was better in the concept map. This highlights the same point as our study: concept mapping provides a better opportunity for the students to present their knowledge in an assessment.
Ekin, et al. conducted a comparison study in Turkey in 2016 among sixth-standard students ( 35 ). They found a significant positive relationship between the students' scores in the concept map-based and MCQ-based tests, whereas in our study the positive relationship was non-significant. In their study, students scored better in the traditional assessment using the MCQ test, which they attributed to the students' unfamiliarity with concept maps and inability to form cross-links. However, given the much younger students in their study and the professional course students in ours, we refrain from drawing any conclusions in this regard.
The ability of the two types of tests, as individual items of an assessment tool, to discriminate between high and low-ability students
The ability to differentiate between high- and low-ability students is an important characteristic of a good assessment item. To weigh this ability of concept mapping against that of the MCQ-based test as a whole, we calculated their discrimination indices, assuming that the two tests were items of one assessment. We observed that the discrimination index of the concept map-based test was higher and 'acceptable', compared to that of the MCQ-based test, which was 'poor'. This may suggest that concept mapping could distinguish high-ability students from low-ability ones better than the MCQ-based test could. At the same time, in the Bland-Altman analysis, it was observed that as the average score of a student in the two tests increased, so did the difference between the two scores (Figure 5). We may take these two results as corroborative evidence that, in the present study, concept mapping was able to differentiate higher-ability students from lower-ability ones better than the MCQ test. However, we could not find studies that have used these parameters to evaluate concept mapping, which limits our discussion of this aspect of the technique.
Students’ perception of Concept Mapping
In their feedback (Table 1), most of the students either 'strongly agreed' or 'agreed' that concept mapping was an easy skill to learn, that it helped them gain insight into the topic, and that they were comfortable with their ability to complete the concept maps. They either 'disagreed' or 'strongly disagreed' that concept maps took too long to complete, that it was a difficult skill to learn, or that they did not find it useful in helping them towards deeper learning and understanding. Students recommended that concept mapping may be used to teach other topics and other subjects, too. The students themselves identified the advantages of concept mapping in their free remarks on the technique (Table 5). Studies conducted over time in different parts of the world have affirmed the same positive attitude of students towards concept mapping: Loizou, et al. (2022) introduced concept mapping to 1st year students in the Medical School, University of Nicosia in Cyprus ( 36 ); Baliga, et al. (2021) used concept maps as a teaching/learning tool in a group activity among 86 students of the 3rd year of MBBS at J. N. Medical College, Belagavi, India, and took students' feedback, which showed the effectiveness and the students' acceptance of the technique ( 21 ); Choudhari, et al. (2021) introduced the technique of visual mapping, including both concept mapping and mind mapping, to 200 final year medical students during their community-based teaching program at Datta Meghe University of Medical Sciences, India ( 22 ); Addae, et al. (2012) used concept mapping along with a modified form of problem-based learning for 50 1st and 2nd year medical students at the University of the West Indies, Trinidad and Tobago, and noted their feedback ( 37 ); and Torre, et al. (2007) introduced concept mapping to 136 third-year medical students of the Medical College of Wisconsin, USA in 2005 and analyzed the students' feedback qualitatively ( 38 ).
Table 5. Students' suggestions and remarks on concept mapping
No. | Suggestion/ remark |
---|---|
1 | Good for revising any topic in a short time. |
2 | Useful for compiling/ summarizing details of a topic after studying it from different sources. |
3 | Comprehending a concept map is easier than comprehending from paragraphs. |
4 | Concept Maps should be put up as posters in department/classrooms/notice boards. |
5 | Making concept maps requires a deeper understanding. |
6 | A good way of learning in less time. |
7 | Helps in retaining knowledge for a longer time. |
8 | Easy representation of a topic. |
9 | Improves cognitive skill. |
10 | Improves visualization power. |
11 | Requires attention and focus. |
12 | It is the best way of learning. |
13 | Should be provided for important topics. |
14 | Should be given more time in class. |
Limitations
Our study lacked a comparison group. We therefore cannot comment on whether concept mapping is a better teaching/learning and assessment strategy than the usual ones in vogue. Also, we cannot attribute the higher-order thinking skills inculcated in the students during the activity to concept mapping alone.
Most of the students were enthusiastic about concept maps, but some struggled with the technique; perhaps they needed more practice to become comfortable with it.
The scope of the current paper is limited to the response to a one-time intervention of concept mapping. Whether reinforcement of the technique would further improve the students' performance or change their perception of it cannot be commented upon at present.
A Hawthorne effect, where students perform better and respond more affirmatively when they know they are under observation, is unavoidable in a study like ours. There is therefore an inherent bias in our observations which, unfortunately, cannot be completely eliminated.
We did not include an analysis of the teachers' attitudes towards concept mapping in our strategy, as the number of faculty members in the Department of Physiology at the institute is very limited. However, given the time it takes to assess concept maps, and that the assessment of concept maps may suffer from subjectivity despite the use of objective rubrics, an in-depth evaluation of teachers' responses may shed more light on the applicability and utility of the tool.
Conclusion
In our study, we found that the results of concept mapping as an assessment tool were better than those of a usual MCQ-based test, and the response of the students to the technique was encouraging. Overall, the findings of our study conform to the broader picture of concept mapping found in studies conducted from time to time, both in places where it has been an established teaching/learning strategy in the curriculum and in places where, as in ours, it has been tried experimentally. We recommend that concept mapping be incorporated as a teaching/learning and assessment strategy in the subject of Physiology in the context of Indian medical education.
Acknowledgments
We extend our thanks to the participants of the study for their enthusiasm and cooperation.
Authors’ contribution
P.A, B.B were involved in the original conception and design of this project and the drafting of its proposal. V.G, A.P, A.D helped in drafting the final proposal of the project. Data analysis was done by P.A, B.B. P.A mainly wrote the paper. Analysis and results were further refined by all authors. The other authors then revised it critically for intellectual content, and all of them approved the paper. All authors agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
References
- Ferris HA, O’ Flynn D. Assessment in Medical Education; What Are We Trying to Achieve?. International Journal of Higher Education. 2015; 4(2):139-44.
- Kibble JD. Best practices in summative assessment. Adv Physiol Educ. 2017; 41(1):110-9.
- Morrissey B, Heilbrun ME. Teaching Critical Thinking in Graduate Medical Education: Lessons Learned in Diagnostic Radiology. J Med Educ Curric Dev. 2017; 4:1-5.
- Hobbins JO, Murrant CL, Snook LA, Tishinsky JM, Ritchie KL. Incorporating higher order thinking and deep learning in a large, lecture-based human physiology course: can we do it?. Adv Physiol Educ. 2020; 44(4):670-8.
- Palmer EJ, Devitt PG. Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions?. Research paper. BMC Med Educ. 2007; 7:49.
- Tabish SA. Assessment methods in medical education. Int J Health Sci (Qassim). 2008; 2(2):3-7.
- Sood R, Singh T. Assessment in medical education: evolving perspectives and contemporary trends. Natl Med J India. 2012; 25(6):357-64.
- Khine A, Adefuye A, Busari J. Utility of concept mapping as a tool to enhance metacognitive teaching and learning of complex concepts in undergraduate medical education. Archives of Medicine and Health Sciences. 2019; 7(2):267-72.
- Novak JD. Concept mapping: A useful tool for science education. J Res Sci Teach. 1990; 27(10):937-49.
- Novak JD, Cañas AJ. The Theory Underlying Concept Maps and How to Construct Them [Internet]. Technical Report IHMC Cmap Tools 2006-01, Florida Institute for Human and Machine Cognition. 2006 [Cited 2 May 2006]. Available from: http://cmap.ihmc.us/Publications/ResearchPapers/TheoryUnderlyingConceptMaps.pdf.
- Novak JD. Concept maps and Vee diagrams: two metacognitive tools to facilitate meaningful learning. Instr Sci. 1990; 19:29-52.
- Pinto AJ, Zeitz HJ. Concept mapping: A strategy for promoting meaningful learning in medical education. Med Teach. 1997; 19(2):114-21.
- Gomes AP, Dias-Coelho UC, Cavalheiro PO, Siqueira-Batist R. The Role of Concept Maps in the Medical Education. Revista Brasileira de Educação Médica. 2011; 35(2):275-82.
- Aslami M, Dehghani MR, Shakurnia A, Ramezani Gh, Kojuri J. Effect of Concept Mapping Education on Critical Thinking Skills of Medical Students: A Quasi-experimental Study. Ethiop J Health Sci. 2021; 31(2):409-18.
- Mlika M, Ben Khelil M, Charfi R, Dziri C, Mezni F. Evaluation and acceptability of concept maps as a learning tool in medical studies. Tunis Med. 2019; 97(5):606-12.
- Daley BJ, Durning SJ, Torre DM. Using Concept Maps to Create Meaningful Learning in Medical Education [version 1]. MedEdPublish. 2016; 5:19.
- Torre DM, Durning SJ, Daley BJ. Twelve tips for teaching with concept maps in medical education. Med Teach. 2013; 35(3):201-8.
- Mok CK, Whitehill TL, Dodd BJ. Concept map analysis in the assessment of speech-language pathology students' learning in a problem-based learning curriculum: a longitudinal study. Clin Linguist Phon. 2014; 28(1-2):83-101.
- Bhusnurmath SR, Bhusnurmath B, Goyal S, Hafeez S, Abugroun A, Okpe J. Concept map as an adjunct tool to teach pathology. Indian J Pathol Microbiol. 2017; 60(2):226-31.
- Joshi U, Vyas S. Assessment of Perception and Effectiveness of Concept Mapping in Learning Epidemiology. Indian J Community Med. 2018; 43(1):37-9.
- Baliga SS, Walvekar PR, Mahantshetti GJ. Concept map as a teaching and learning tool for medical students. J Educ Health Promot. 2021; 10:35.
- Choudhari SG, Gaidhane AM, Desai P, Srivastava T, Mishra V, Zahiruddin SQ. Applying visual mapping techniques to promote learning in community-based medical education activities. BMC Med Educ. 2021; 21(1):210.
- Nath S, Bhattacharyya S, Preetinanda P. Perception of Students and Faculties towards Implementation of Concept Mapping in Pharmacology: A Cross-sectional Interventional Study. J Clin Diagn Res. 2021; 15(4):FC08-FC13.
- West DC, Pomeroy JR, Park JK, Gerstenberger EA, Sandoval J. Critical Thinking in Graduate Medical Education: A Role for Concept Mapping Assessment?. JAMA. 2000; 284(9):1105-10.
- Cutrer WB, Castro D, Roy KM, Turner TL. Use of an expert concept map as an advance organizer to improve understanding of respiratory failure. Med Teach. 2011; 33(12):1018-26.
- Giavarina D. Understanding Bland Altman analysis. Biochem Med (Zagreb). 2015; 25(2):141-51.
- Azzopardi M, Azzopardi C. Relationship between Item Difficulty Level and Item Discrimination in Biology Final Examinations. Education and New Developments 2019 [Internet]. 2019 [Cited 13 June 2019]. Available from: http://end-educationconference.org/2019/wp-content/uploads/2020/05/2019v2end001.pdf.
- Machado CT, Carvalho AA. Concept Mapping: Benefits and Challenges in Higher Education. The Journal of Continuing Higher Education. 2020; 68(1):38-53.
- Ortega-Tudela JM, Lechuga MT, Gómez-Ariza CJ. A specific benefit of retrieval-based concept mapping to enhance learning from texts. Instr Sci. 2019; 47:239-55.
- Chevron MP. A metacognitive tool: Theoretical and operational analysis of skills exercised in structured concept maps. Perspectives in Science. 2014; 2(1-4):46-54.
- Ho VW, Harris PG, Kumar RK, Velan GM. Knowledge maps: a tool for online assessment with automated feedback. Med Educ Online. 2018; 23(1):1457394.
- The Pros and Cons of Multiple-Choice Questions used as a Means for Evaluating the Child’s Knowledge - The Knowledge Hub [Internet]. The Knowledge Hub. 2021 [cited 2022 Aug 6]. Available from: http://end-educationconference.org/2019/wp-content/uploads/2020/05/2019v2end001.pdf.
- Fonseca M, Oliveira B, Carreiro-Martins P, Neuparth N, Rendas A. Revisiting the role of concept mapping in teaching and learning pathophysiology for medical students. Adv Physiol Educ. 2020; 44(3):475-81.
- Gamboa T, Rosado-Pinto P, Rendas A. Concept Maps: Theory, Methodology, Technology Proc. of the Fifth Int. Conference on Concept Mapping. Valletta, Malta; 2012.
- Ekin B, Uluçınar Sağır Ş, Saltan F. The Comparison on Evaluation of Concept Map and Structured Grid with Multiple-Choice Test. Participatory Educational Research. 2016; 3(5):100-11.
- Loizou S, Nicolaou N, Pincus BA, Papageorgiou A, McCrorie P. Concept maps as a novel assessment tool in medical education [version 1]. MedEdPublish. 2022; 12:21.
- Addae JI, Wilson JI, Carrington C. Students' perception of a modified form of PBL using concept mapping. Med Teach. 2012; 34(11):e756-62.
- Torre DM, Daley B, Stark-Schweitzer T, Siddartha S, Petkova J, Ziebert M. A qualitative evaluation of medical student learning with concept maps. Med Teach. 2007; 29(9):949-55.