Document Type: Review article

Authors

1 Medical Oncology Department, St. Vincent University Hospital, Dublin, IRL

2 Radiography Department, Independent Researcher, Dublin, IRL

3 Medical Education Department, Qassim University, Qassim, KSA

4 Diagnostic Imaging Department, University of Sharjah, Sharjah, UAE

DOI: 10.30476/jamp.2024.102078.1955

Abstract

Introduction: The landscape of medical education has witnessed significant transformations over the past decades, particularly with the advent of active teaching methodologies. However, despite these advancements, the traditional theoretical assessment methods have remained largely unchanged. This lack of evolution in assessment systems poses a challenge as it is crucial for assessment methods to evolve in tandem with teaching approaches to ensure a comprehensive and effective learning process in medical education. This paper reviews the integration and effectiveness of open-book examinations (OBEs) in medical education, reflecting their growing significance.
Methods: An integrative review of the literature was conducted, drawing from a range of relevant publications over the last decade, sourced from databases such as PubMed, Web of Science, Scopus, and ERIC. The inclusion criteria focused on full-text articles in English, with search terms including “medicine,” “assessment,” “open book examination,” “open book exam,” and “open book assessment,” combined using Boolean operators. Thirteen publications were selected and critically appraised using the Critical Appraisal Skills Programme (CASP) checklist.
Results: The analysis identified three primary thematic categories: “Teaching Strategy for Pandemic and Challenging Conditions,” “Tool of Learning & Educational Impact,” and “Operational Challenges & Future Directions.” These themes were explored to understand the role and impact of open-book examinations in medical education.
Conclusion: The findings indicate that open-book examinations are a crucial component in the evolving landscape of medical education. While certain reservations remain, open-book examinations have shown significant potential in fostering critical thinking, argumentation skills, and lifelong learning among medical students. They reflect the ongoing evolution of knowledge in the medical field and contribute to the development of professionals adept at navigating and applying complex information. Further research is recommended to solidify these findings and expand the understanding of open-book examinations in medical education.

Introduction

The landscape of medical education is in a state of continuous evolution, adapting to the ever-changing demands of healthcare and the advances in educational theory. A critical aspect of this transformation is the assessment strategies employed to evaluate medical students. Traditional examinations, predominantly closed-book in nature, have long been the cornerstone of academic evaluation ( 1 ). However, in recent decades, there has been a paradigm shift towards more innovative assessment methods, one of which is the open-book examination ( 2 ), whose goals reflect the values of the system or institution.

An open-book exam is a type of assessment in which students can refer to authorized materials while taking the test ( 3 ). The approved resources could include class notes, textbooks, primary or secondary readings, and/or access to the Internet when answering questions. The inception of open-book examinations in medical education marks a significant departure from conventional memorization-based assessments. Unlike closed-book exams, which often prioritize rote learning and recall, open-book exams encourage students to understand, analyze, and apply knowledge, mirroring the realities of clinical practice where resources and references are readily available ( 4 ). This shift acknowledges the immense body of medical knowledge, continuously expanding and evolving, making it impractical and unnecessary for students to memorize vast quantities of information ( 1 ).

Open-book examinations critically assess a student's ability to efficiently find and use information. These skills are vital for practicing physicians who navigate vast amounts of clinical data and evidence-based guidelines daily ( 2 , 5 , 6 ). This approach aligns assessment methods with the practical demands of modern healthcare, emphasizing the application of knowledge rather than its mere recall ( 6 ). It also reflects a broader educational philosophy that values lifelong learning and adaptability, skills that are indispensable in a profession characterized by rapid advancements and constant change ( 7 ).

However, the implementation of open-book examinations in medical education is not without challenges. Questions arise regarding their ability to rigorously assess the students’ competence, the potential for academic dishonesty, and the need for carefully crafted questions that probe deeper levels of understanding. Additionally, there is a debate on how well open-book examinations prepare students for high-stakes, closed-book examinations like licensing and board certification tests, which remain a reality in the medical profession ( 8 ).

The effectiveness of open-book examinations depends heavily on the pedagogical context in which they are used. They demand a curriculum that fosters independent learning, critical thinking, and efficient information management. Instructors are crucial in designing questions that test knowledge application and problem-solving skills, discourage superficial learning, and encourage a thorough understanding of core concepts ( 9 ).

In exploring the role of open-book examinations, it is imperative to consider their impact on students' learning experiences, study behaviors, and academic performance. Research suggests that open-book examinations can reduce examination-related anxiety, promote a deeper engagement with the material, and encourage a more strategic approach to learning ( 10 ). However, there is also a need to understand the perspectives of both students and educators on the effectiveness of this assessment method, its impact on learning outcomes, and the challenges encountered in its implementation.

As the medical field continues to advance, so must the methods we use to educate and assess future physicians. Open-book examinations represent a progressive step in aligning medical education with the realities of clinical practice and the principles of adult learning theory. By embracing this innovative assessment approach, medical education can foster a generation of doctors who are not only knowledgeable but also adept at navigating the vast landscape of medical information, which is critical in making informed, evidence-based decisions in patient care ( 11 ).

The exploration of open-book examinations in medical education offers valuable insights into how assessment strategies can evolve to better prepare students for the demands of modern medical practice. This exploration is not just about changing how we test but also about redefining what and how we teach, ensuring that medical education remains relevant, effective, and responsive to the needs of students and the healthcare systems they will serve.

Assessment in medical schools during times of crisis, such as pandemics like COVID-19, war, and natural disasters, poses unique challenges that can significantly impact the quality and effectiveness of medical education. More recently, the COVID-19 pandemic has had significant impacts on both the delivery of education and the implementation of fair, effective evaluation methods, highlighting the need to evaluate our students with the utmost quality and reliability ( 12 , 13 ). Social distancing requirements and the challenges associated with ensuring rigorous examination conditions have, in numerous instances, made the administration of conventional closed-book exams impossible. The experience gained during this period should be built upon and grounded scientifically so that it can be applied more effectively, with greater confidence and knowledge.

This review highlights the discrepancy between contemporary teaching methods and traditional assessments in medical education, emphasizing the limitations of closed-book exams that may not prepare students for real-world practice where information retrieval is essential. The study is important as it explores open-book examinations, which promote deeper knowledge application rather than simple recall, addressing the vast and continually expanding field of medical information. By synthesizing current insights on OBEs and identifying areas for further research, this study offers a foundational framework for future investigations into their long-term impacts and practical applications in medical disciplines, providing a nuanced understanding of how OBEs can assess higher-order cognitive skills in an evolving medical landscape.

Thus, this study aimed to systematically investigate the integration of open-book examinations as an educational assessment method in the medical curriculum, focusing on understanding the pedagogical approaches, challenges, and learning outcomes associated with their implementation.

Methods

Design

This is an integrative review, reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The following research question was posed in this review: How do open-book examinations impact the learning process and academic performance of medical students compared to traditional closed-book examinations?

To structure this review and answer this research question, we used Whittemore and Knafl's framework for integrative reviews ( 14 ). An integrative review is a type of research that evaluates, analyzes, and combines relevant literature on a subject in a cohesive manner, leading to the development of new frameworks and insights related to the topic. It presents a unique approach by combining and synthesizing data from diverse research designs, including qualitative, quantitative, and mixed methods, to gain a more comprehensive understanding of the phenomenon of interest ( 15 , 16 ).

Eligibility criteria

The PICOS (population, intervention, comparison, outcome, study design) framework was used to develop search terms for the systematic review and was the foundation for study inclusion and exclusion criteria (Table 1).

Population: Health care professionals from any medical education context, without restriction to specific levels of education or specialties. Studies that focused exclusively on students in nursing, dentistry, or other specialized healthcare fields were excluded.
Intervention: Studies that evaluated open-book examinations or assessments as the primary educational or evaluative method being studied.
Comparison: Comparisons between open-book and closed-book examinations, or between different implementations of open-book examinations. Studies could also have no comparison group or comparator.
Outcome: Measures related to the effectiveness, student performance, perceptions, challenges, or advantages of open-book examinations.
Study design: Experimental and non-experimental studies published in English between 2013 and 2023 in peer-reviewed journals, including qualitative, quantitative, and mixed-methods studies. Commentary articles, letters to the editor, editorials, theses, dissertations, reports, book chapters, non-peer-reviewed publications, and publications not in English were excluded.
Table 1. Eligibility criteria according to the PICOS framework.

Search strategies

A comprehensive systematic literature search was conducted using four electronic databases: PubMed, Web of Science, Scopus, and ERIC. The databases were searched for English-language studies published from 2013 to 2023. Two reviewers (Y.A., S.K.) independently conducted a systematic search for studies evaluating open-book examination in medical education. We utilized a group of descriptors, combined using the Boolean operators "AND" and "OR". The following key search terms were used across the databases: “medicine,” “assessment,” “open book examination,” “open book exam,” “open book test,” and “open book assessment.” Using these 6 descriptors, we performed a total of 30 search combinations.
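
The review does not specify how the 30 combinations were assembled from the 6 descriptors. One arrangement consistent with the reported count is the set of ordered pairs of distinct descriptors joined with "AND" (6 × 5 = 30); the following illustrative Python sketch enumerates them under that assumption, which is ours rather than a procedure described by the authors.

from itertools import permutations

# The six descriptors listed above.
descriptors = [
    "medicine",
    "assessment",
    "open book examination",
    "open book exam",
    "open book test",
    "open book assessment",
]

# Assumed pairing scheme: every ordered pair of distinct descriptors
# joined with AND yields 6 * 5 = 30 search strings.
queries = ['"{}" AND "{}"'.format(a, b) for a, b in permutations(descriptors, 2)]

print(len(queries))  # 30
print(queries[0])    # "medicine" AND "assessment"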

Selection process

The process began by defining the theme, in this case, open-book examinations in medical education. This allowed us to identify specific descriptors or keywords relevant to the subject.

Data management for this systematic review was done using Zotero version 6.0.30 (Corporation for Digital Scholarship, Vienna, VA, USA). Titles and abstracts were initially screened separately by 2 researchers (YA, SK) to identify potentially eligible articles. Subsequently, the full articles were thoroughly reviewed by both reviewers against the eligibility criteria, and the research team reached a consensus on the final list of articles to be included. Any disagreements between the 2 reviewers over study selection were resolved by the other 2 researchers (MW, MA). Given the diverse range of research methodologies, the Critical Appraisal Skills Programme (CASP) tool was adapted; this adaptation closely followed the well-documented modifications introduced by Halcomb et al. ( 17 , 18 ). Finally, to avoid missing relevant studies, we manually screened the reference lists of the included articles.

Study selection

A total of 6,720 articles were identified from the electronic databases. Initially, 249 duplicate records were identified and removed from the dataset to ensure data integrity. The search thus yielded 6,471 records relevant to the research question, which underwent the initial screening process. After careful screening based on titles and abstracts, 6,267 records did not meet the content-based inclusion criteria and were excluded from further consideration. The remaining 204 full-text articles were assessed for eligibility, and 191 of these were excluded for not meeting the inclusion criteria of the research. Subsequently, 13 articles were deemed eligible and included in the final analysis for this systematic review (Figure 1).

Figure 1. Literature search and selection flow

Ethical Consideration

This is a literature-based study; therefore, neither approval from the institutional review board nor informed consent was required.

Results

The search strategy for this integrative review is shown in Figure 1 as per PRISMA guidelines ( 19 ). This review identified 13 publications that met the eligibility criteria (Table 2).

Study Authors Design Focus of study
“Comparing Open-Book and Closed-Book Examinations: A Systematic Review”. Durning et al. 2016 ( 11 ) Qualitative Analyze and synthesize existing research on the utility of OBEs and CBEs.
“Remote E-exams during Covid-19 pandemic: A cross-sectional study of students' preferences and academic dishonesty in faculties of medical sciences”. Elsalem et al. 2021 ( 20 ) Quantitative Evaluate the experiences and preferences of students regarding remote E-exams during the COVID-19 pandemic.
“Adaptation to Open-Book Online Examination During the Covid-19 Pandemic”. Eurboonyanun et al. 2021 ( 21 ) Quantitative Compare the performance of students in online OBE to traditional written examinations.
“Online open-book examination of undergraduate medical students: a pilot study of a novel assessment method used during the coronavirus disease 2019 pandemic”. Sarkar et al. 2021 ( 22 ) Quantitative Check the feasibility and acceptability of an online OBE for medical students.
“Medical teaching and assessment in the era of Covid-19”. Monaghan 2020 ( 23 ) Quantitative Describe adaptations to medical teaching and assessment during the COVID-19 pandemic.
“Assessment during Covid-19: quality assurance of an online open book formative examination for undergraduate medical students”. Rehman et al. 2022 ( 24 ) Quantitative Evaluate the quality of an online OBE administered to first-year medical students during the COVID-19 pandemic.
“Test-enhanced” focused self-directed learning after the teaching modules in biochemistry. Bobby et al. 2018 ( 25 ) Mixed methods Evaluate the effectiveness of OBE and self-study in promoting learning among medical students.
“A systematic review to compare open and closed book examinations in medicine and dentistry”. Dave et al. 2021 ( 26 ) Qualitative Summarize and critically evaluate the existing literature regarding OBEs in medicine.
“Perceptions of clinical years' medical students and interns towards assessment methods used in King Abdulaziz University, Jeddah”. Ibrahim et al. 2015 ( 27 ) Quantitative Assess students’ perceptions of different assessment methods.
“Comparison of Open-Book and Closed-Book Formats for Medical Certification Exams: A Controlled Study”. Brossman et al. 2017 ( 28 ) Quantitative Assess the viability of open-book formats for medical certification exams, with a focus on logistical and psychometric aspects.
“Assessment of factual recall and higher-order cognitive domains in an open-book medical school examination”. Davies et al. 2022 ( 29 ) Quantitative Quantify the effect of open-book resources on student performance in different cognitive domains based on Bloom's taxonomy.
“Because Life is Open Book: An Open Internet Family Medicine Clerkship Exam”.  Erlich et al. 2017 ( 30 ) Quantitative Assess medical students' information mastery competency.
“Medical Student Assessment in the Time of COVID-19”. Prigoff et al. 2021 ( 31 ) Quantitative comparative observational research Examine the impact of changes made in response to the pandemic, including the use of OBEs.
OBE=open-book examination; CBE=closed-book examination
Table 2. Descriptive list of the 13 studies included in the review

Study characteristics

The findings from the 13 publications reviewed are categorized into three primary categories, providing a structured framework for discussing the central evidence identified in the publications. These categories, ‘Teaching Strategy for Pandemic and Challenging Conditions’, ‘Tool of Learning and Educational Impact in Medical Education’, and ‘Operational Challenges and Future Directions’, will be used to guide our discussion of the findings. Table 3 summarizes these categories and the key aspects of findings associated with each.

Category Description Key Aspects
Teaching Strategy for Pandemic and Challenging Conditions This category examines the role and effectiveness of OBEs as a teaching and assessment method during the COVID-19 pandemic and other challenging conditions. Response to digital learning and the pandemic.
Efficacy of OBEs in remote/online settings.
Student perspectives on OBEs.
Academic integrity.
Tool of Learning & Educational Impact This category explores the utility of OBEs as an ongoing educational tool within medical education, beyond the context of the pandemic and challenging conditions. Educational transformations.
Depth and accessibility of resources.
Test-enhanced learning benefits.
Comparative efficacy and student performance.
Operational Challenges & Future Directions This category discusses the implications of OBEs on exam duration and student anxiety, including the balance between comprehensive assessment and time constraints. Impact on exam preparation and study strategies.
Time efficiency considerations.
Influence on students' anxiety levels.
Discrimination power of OBEs.
Table 3. Summary of primary categories in open-book examination studies

This table provides a structured framework for understanding the three primary categories identified in the study, each encompassing various aspects related to open-book examinations in medical education.

Discussion

This discussion commences by underscoring the adaptability and effectiveness of open-book examinations during the unprecedented challenges of the COVID-19 pandemic, highlighting their role in maintaining educational continuity and integrity. Conditions like the COVID-19 pandemic that would rationalize the use of open-book exams include any situation where traditional, in-person examination methods are impractical or pose health, safety, or logistical issues. These conditions often necessitate a shift towards more flexible, inclusive, and accessible assessment methods that can be administered remotely; they include public health crises, natural disasters, political or civil unrest, and technological disruptions. As we delve deeper, we explore the enduring utility of open-book examinations beyond the pandemic context, examining their impact on student learning strategies, performance, and anxiety levels.

Teaching strategy for pandemic and challenging conditions

Since the beginning of the year 2020, we have witnessed significant challenges brought about by the COVID-19 pandemic. Its global impacts have rippled across all sectors and activities, including education. Social distancing measures became imperative to contain viral transmission, resulting in the abrupt suspension of in-person teaching activities. To prevent potential disruptions, adaptations were made to teaching and assessment methods ( 32 ). This period of remote learning extended far beyond initial expectations. The same considerations apply to any scenario that disrupts traditional examination methodologies (natural disasters, public health crises, political or civil unrest, etc.). In the following sections, we discuss the studies of assessments conducted within this unique context.

Response to the Digital Era and the Pandemic Challenge

Utilization of open-book examinations in distance learning has emerged as a promising approach to adapt to the evolving landscape of education, especially considering the COVID-19 pandemic. This assessment method aligns well with the principles of remote and online education, where students have access to vast digital resources ( 20 , 21 , 23 ). Open-book exams acknowledge the reality that in the era of information abundance, memorization of facts and figures holds less importance compared to the ability to locate, synthesize, and apply knowledge. As such, these assessments emphasize critical thinking, problem-solving, and the practical application of information - skills that are highly relevant in the digital age and future medical practice ( 21 , 22 ).

Additionally, the use of open-book exams in distance education offers flexibility and adaptability. This approach enhances the learning experience, promoting active engagement with the content ( 24 ). Moreover, open-book exams often necessitate continuous engagement with the course material, reinforcing the principles of active learning. In a remote learning environment, where self-directed learning is crucial, open-book exams support students in becoming more independent and resourceful learners ( 22 ). This is in agreement with Bobby et al. ( 25 ), who assessed the effectiveness of "test-enhanced learning" via open-book examination as a formative assessment tool in medical education. Most of the students believed that the open-book examination increased their self-directed focused learning process. They also felt that open-book examination was more advantageous than “self-study” in augmenting the learning concepts after regular didactic lectures.

Efficacy in a remote online setting

Despite the challenges it posed, the COVID-19 pandemic presented an excellent opportunity for medical educators to carefully explore the utilization of open-book assessments in an online environment ( 32 ). This mode of assessment can evaluate the students' ability to efficiently research and translate information, a crucial skill for future clinical practice. Furthermore, the same authors advocate for hybrid assessment strategies, including an initial section without access to reference materials to assess the learning of fundamental concepts that students should possess without external aids. Subsequently, a second section permits access to resources, evaluating the students' ability to research, synthesize, correlate, construct arguments, and apply clinical reasoning to specific topics.

Eurboonyanun et al. ( 21 ) compared final-year medical students' online surgery clerkship assessment scores to the traditional written examinations in the previous three rotations, using the same question bank. Students who took the online open-book examination scored higher on average in multiple-choice and essay questions but lower on short-answer questions. This result is important as it highlights the need to establish comparable pass rates and minimum passing grades for assessments with and without open-book access, procedures that should be replicated in other institutions wishing to change their assessment methodologies. Similarly, Sarkar et al. ( 22 ) conducted a study on an online open-book assessment in the otolaryngology discipline, also due to adaptations to remote teaching during the pandemic. The authors compared these results with previous traditional in-person assessments without open-book access and demonstrated similar pass rates in both methodologies.

Student perspectives

In a study conducted by Elsalem et al. ( 20 ), the findings revealed that only one-third of the students preferred online open-book exams as an assessment modality during the COVID-19 pandemic. The authors attributed this low level of acceptance among students toward open-book assessments to several factors: the need for more effort and time to prepare for online open-book exams, challenges encountered during pre-examination preparations, and perceived disparities between the examination questions and the study materials provided. These findings are valuable for planning academic strategies aimed at effectively addressing the challenges associated with remote open-book assessments. Such strategies may include enhancements in distance learning methods, reorganization of assessment strategies, and review of academic curricula to ensure alignment with the evolving educational landscape.

Academic integrity

Researchers have raised questions about the occurrence of academic dishonesty or cheating in remote open-book assessments. Elsalem et al. ( 20 ) investigated the occurrence of cheating and dishonesty during these exams and showed that 55.07% of students reported no exam dishonesty or misconduct, while 20.41% mentioned seeking help from friends, and 24.52% used other unauthorized sources of information. Additionally, it is noteworthy that Sarkar et al. ( 22 ) obtained similar results; 72.2% of students did not consult with their friends during the examination and answered independently. Monaghan ( 23 ) argues that to safeguard against cheating or collusion, strategies like randomizing the order of questions for each student were employed, rendering communication between them ineffective. This demonstrates that dishonesty and cheating in remote open-book assessments do not appear to be frequent, and that there are methods to discourage such behavior. However, the current literature lacks studies comparing the frequency of these dishonest actions between in-person and remote assessment methods, as well as comparisons between open-book and closed-book assessments, to determine whether permitting reference materials or open-book access inhibits or reduces the occurrence of cheating.

Tool of learning and educational impact in medical education

Assessment stands as a fundamental facet of any educational process, a crucial element that demands the serious consideration of all stakeholders, particularly in the context of medical education ( 33 ). Therefore, in this section, we shall delve into studies that have examined the integration of open-book assessment as an educational tool within medical education.

Educational Transformations

The shifts in teaching and assessment methods during the COVID-19 pandemic provided the opportunity to implement, test, and better understand open-book assessments in medical curricula, whether conducted remotely or in traditional settings ( 22 , 23 , 32 ). While most of these educational transformations occurred out of urgency, many will probably remain, in a more refined form, as preferred methods of teaching and assessment in the future ( 23 ). Furthermore, Dave et al. ( 26 ) assert that the window of opportunity afforded during this period should be exploited to advance our comprehension of holistic student assessment.

It is noteworthy that even before the COVID-19 pandemic, Bobby et al. ( 25 ) pointed out the need for such changes, advocating for an era of open books. Their argument posits that this change in assessment philosophy could benefit students by exposing them to deeper and more enjoyable learning methods. This perspective resonates with Ibrahim et al. ( 27 ) who emphasize the necessity of adopting more innovative assessment methods such as open-book assessment, self-assessment, and peer assessment.

A notable challenge in the adoption of open-book assessments, highlighted in the consensus among researchers, is the necessity of persuading medical educators who may exhibit resistance to change and a preference for adhering to established traditional assessment systems ( 22 ). Similarly, other authors concur that the inherent challenge resides in educators' reluctance to embrace change and their resulting tendency to perpetuate conventional assessment methodologies ( 23 ). To overcome these challenges, a paradigm shift in assessment philosophy is needed, one that can lead students to engage with knowledge more deeply and thoroughly and open the way for innovative teaching and assessment methods.

Depth and Resource Accessibility

Regarding the depth of the topics covered in assessments with and without access to reference materials, it was observed that written clinical exams were suitable for open-book assessment formats. This is because the questions require a differentiated synthesis of information from the provided clinical scenarios, and, therefore, the answers cannot be simply looked up on the Internet ( 11 ).

However, Davies et al. ( 29 ) emphasize that questions in which students must show understanding of a concept or apply knowledge to new information are less influenced by access to open-book resources. This is because the information needed to answer such questions is more complex and more difficult to find online, which may be particularly limiting in a time-pressured examination.

Test-enhanced learning

It is noteworthy that one of the advantages highlighted in open-book assessments is that they not only discourage students from temporarily memorizing superficial information for regurgitation during evaluations but also closely mirror real clinical practice where such information is readily available from resources such as evidence-based digital medical libraries ( 11 , 22 ). It becomes evident that utilizing this type of assessment also confers the advantage of exposing students to situations more closely aligned with the professional practice they will encounter in the future.

Bobby et al. ( 25 ) indicate that open-book examinations present a valuable method for promoting higher-order cognitive competencies. This stands in contrast to traditional closed-book exams, which rely heavily on the candidates’ ability to memorize. Within the framework of open-book exams, questions can be framed to align with the higher levels of Bloom's Taxonomy. This format empowers educators to appraise the students' proficiency in critical thinking, knowledge analysis, synthesis, and evaluation, characteristics of vital importance in the medical field. As a research gap, Eurboonyanun et al. ( 21 ) point out the need for further studies to assess the long-term effects of online open-book exams on knowledge retention and application.

Comparative efficacy and student performance

An important initial point to note is that Erlich ( 30 ) and Durning et al. ( 11 ) compared open-book and closed-book examinations to determine whether student performance differs between these modalities, and both found similar results. Additionally, Durning et al. ( 11 ) demonstrated that the performance of students in open-book examinations could be enhanced by the implementation of practical preparatory tests and instructions on this assessment modality, as students often have limited experience with this type of evaluation. Equally important, Erlich ( 30 ) reported that the majority of the students who perform below average in open-book examinations also have low scores in clinical assessments by preceptors (family physicians hosting students in their clinical practices), specifically in the domain of information mastery. This suggests that neither assessment modality is superior to the other, that both can be used depending on the assessment objectives, and that both can identify students with low performance.

Operational challenges and future directions

Exam preparation and study strategies

Regarding the influence of assessment format on exam preparation and study methods, researchers have different opinions. Durning et al. ( 11 ) showed that students did not change their study tactics for open-book examinations. Similarly, Davies et al. ( 29 ) found no difference in preparation time or study tactics for open-book examinations.

In contrast, Sarkar et al. ( 22 ) evaluated student feedback after an online open-book examination and reported that students spent more time understanding the subject rather than just memorizing it. They also realized that they would not be able to write answers to the examination questions if the subjects had not been read and studied beforehand. This may indicate that open-book examinations do not hinder the method of study and preparation - or, even better, that students study more deeply, focusing less on memorizing concepts and more on higher functions such as correlation, argumentation, and synthesis. This view is shared by Erlich ( 30 ), who emphasizes that in an era of Internet-based knowledge evolution, medical professionals and medical students must be competent in quickly accessing, synthesizing, and applying continually updated information for decision-making.

Time efficiency

In a study conducted by Durning et al. ( 11 ), students took 10% to 60% more time to complete open-book examinations when compared to similar closed-book assessments. These findings are in line with the results of studies carried out in the United States of America: Brossman et al. ( 28 ) reported the need for 40% more time to complete open-book examinations. The studies that evaluated the time required to complete the assessments agree, showing an increase in the time needed for open-book examinations; by these estimates, for instance, a two-hour closed-book paper would require roughly 2.2 to 3.2 hours in an open-book format. This finding has direct implications for the implementation of open-book assessments, as the additional time required for completion should be taken into account to ensure that it does not become a factor that negatively influences the assessment outcome ( 28 ).

Anxiety levels

Other researchers assessed the level of anxiety related to open-book assessments. Sarkar et al. ( 22 ) analyzed the students’ feedback and found a lower level of stress during open-book assessments, in agreement with Prigoff et al. ( 31 ).

Davies et al. ( 29 ) suggest that anxiety may have played a role in exam performance, although they did not measure the examinees' anxiety directly. They considered it unlikely that those sitting the open-book exam felt less anxious, given that they had no prior experience of open-book exams within the course and faced the uncertainty and disruption caused by COVID-19. Furthermore, while examinees may expect to be less anxious in an open-book exam, there is evidence that their experienced anxiety is similar.

In Durning et al.’s study ( 11 ), however, students associated open-book assessments with lower anxiety levels, yet only a few of them reported lower anxiety when they actually performed this type of assessment. Thus, there does not seem to be a consensus on whether students experience reduced anxiety with open-book assessments. Notably, none of the studies demonstrated an increase in stress or anxiety related to open-book assessments.

Discrimination power

The power of discrimination of a test item is an index indicating how well the question separates high-scoring from low-scoring examinees ( 32 ). To assess discriminative power, Brossman et al. ( 28 ) employed Item Response Theory (IRT), which considers three characteristics of test items: their discrimination (how well they distinguish students who have the necessary knowledge from those who do not), their level of difficulty, and the likelihood of guessing the correct answer by chance. Using this methodology, they demonstrated that open-book assessments have greater discriminative power than similar closed-book assessments. This means that the questions in an open-book assessment have a greater capacity to differentiate high-performing candidates from low-performing ones. This appears to be linked to the depth and complexity of the questions in open-book assessments, which tend to require higher-order thinking skills. At this point, it is worth questioning whether the improved discriminative results are attributed solely to the use of external reference materials or whether they are driven by the formulation of questions demanding higher levels of clinical reasoning.
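
The exact IRT model used by Brossman et al. is not reproduced in the studies reviewed here; for orientation, the standard three-parameter logistic (3PL) model, which formalizes exactly these three item characteristics, gives the probability that an examinee of ability \theta answers item i correctly as (in LaTeX notation):

P_i(\theta) = c_i + (1 - c_i) \cdot \frac{1}{1 + e^{-a_i(\theta - b_i)}}

where a_i is the item's discrimination, b_i its difficulty, and c_i its pseudo-guessing parameter; a higher a_i corresponds to the greater discriminative power described above. Whether Brossman et al. used this exact parameterization is not stated in the reviewed material.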

Additionally, it is noteworthy that Rehman et al. ( 24 ) obtained similar results. The open-book exam showed significant differences in performance between top and struggling students, with most questions exhibiting moderate (index values between 16 and 30) to high (above 30) discrimination indices. These results were further corroborated by clearly articulated, straightforward test questions of moderate difficulty, which enhance assessment reliability.
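
Rehman et al. do not report how their discrimination indices were calculated; a common classical formulation, assumed here purely for illustration, is the upper-lower group index: the proportion of the top-scoring group answering an item correctly minus that of the bottom-scoring group, often scaled by 100, which would match the 16-30 range quoted above. A minimal Python sketch of this assumed calculation follows (the 27% group fraction and the toy data are illustrative, not taken from the paper):

def discrimination_index(item_correct, total_scores, fraction=0.27):
    # Pair each examinee's item result (1 = correct, 0 = incorrect)
    # with their total test score, then sort by total score.
    paired = sorted(zip(total_scores, item_correct))
    n = max(1, int(len(paired) * fraction))
    lower = paired[:n]   # bottom-scoring examinees
    upper = paired[-n:]  # top-scoring examinees
    p_upper = sum(correct for _, correct in upper) / n
    p_lower = sum(correct for _, correct in lower) / n
    return p_upper - p_lower  # ranges from -1 to +1

# Toy example: 10 examinees; the item is answered correctly mostly by
# high scorers, so the index comes out strongly positive.
item = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]
scores = [40, 45, 50, 55, 60, 65, 70, 80, 85, 90]
print(100 * discrimination_index(item, scores))  # 100.0 on a percent scale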

Limitations and Future research direction

Like any other study, our research had some limitations. One limitation is the relatively limited body of research literature available concerning open-book assessments in medical education, resulting in a small pool of studies included in this review. However, it is worth noting that among these are robust and technically sound studies that allow important conclusions to be drawn on this subject and provide a foundation for future research initiatives. Therefore, it is anticipated that the insight and knowledge generated in this review will not only stimulate more in-depth discussions on open-book assessments but also serve as a basis for further studies in the future.

Conclusions

There is clearly a growing need for appropriate assessment tools, particularly in the biomedical field, to keep pace with the rapid expansion and accessibility of knowledge. These tools should be dynamic, adaptable, and reflective of the evolving landscape of biomedical knowledge, and they should aim to evaluate not only factual knowledge but also higher-order cognitive skills, critical thinking, and the ability to apply knowledge in practical scenarios.

This review comprehensively examined the current applications of open-book examinations and their integration into medical education. It has presented robust evidence supporting the utilization of remote online open-book assessments, demonstrating their effectiveness, reliability, and compatibility with active teaching methodologies that emphasize student engagement. These assessments enable thorough and high-quality evaluations without compromising their reliability or their ability to distinguish student performance. Consequently, they emerge as highly valuable tools for assessing medical students. Although their implementation within medical curricula increased during the COVID-19 pandemic, particularly through remote examinations, it remains somewhat limited overall.

We have highlighted several potential advantages of open-book assessments. Studies indicate that they foster the development of professionals equipped with critical thinking and problem-solving skills, prepared for lifelong learning and aware of the constant evolution of medical knowledge. However, there are also various challenges associated with this assessment method, including the need to redefine passing scores and grading criteria, the operational aspects of conducting open-book exams, and the training of both educators and students in using this assessment format.

While open-book examinations in medical education show promise for fostering critical thinking, problem-solving skills, and an appreciation for the evolving nature of medical knowledge, some challenges warrant attention. These include redefining assessment criteria, addressing concerns about academic integrity, and ensuring that such examinations are used complementarily with traditional assessment methods. Therefore, while open-book examinations are a valuable addition to medical curricula, their implementation should be carefully balanced with other forms of assessments to ensure a comprehensive evaluation of student competence.

Authors’ Contribution

Y.A. and S.K. designed the study; then, M.W. wrote the search strategy and performed the literature search. Data acquisition and analysis were done by Y.A. and then reviewed by M.A. The authors finally categorized the articles and prepared the manuscript. All authors contributed to the discussion, read and approved the manuscript, and agreed to be accountable for all aspects of the work.

Conflict of Interest

The authors declare no conflicts of interest.

References

  1. Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, et al. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Med Teach. 2021; 43(1):58-67.
  2. Pangaro L, Ten Cate O. Frameworks for learner assessment in medicine: AMEE Guide No. 78. Med Teach. 2013; 35(6):e1197-210.
  3. Myyry L, Joutsenvirta T. Open-book, open-web online examinations: Developing examination practices to support university students’ learning and self-efficacy. Active Learning in Higher Education. 2015; 16(2):119-32.
  4. Epstein RM. Assessment in Medical Education. New England Journal of Medicine. 2007; 356(4):387-96.
  5. GMC. Assessment in undergraduate medical education-Tomorrow’s Doctors. General Medical Council-UK: UK; 2018.
  6. Ben-David MF. AMEE Guide No. 18: Standard setting in student assessment. Med Teach. 2000; 22(2):120-30.
  7. Mahajan R, Saiyad S, Virk A, Joshi A, Singh T. Blended programmatic assessment for competency based curricula. J Postgrad Med. 2021; 67(1):18-23.
  8. Schuwirth LWT, van der Vleuten CPM. How ‘Testing’ Has Become ‘Programmatic Assessment for Learning’. Health Professions Education. 2019; 5(3):177-84.
  9. van der Vleuten C, Lindemann I, Schmidt L. Programmatic assessment: the process, rationale and evidence for modern evaluation approaches in medical education. Medical Journal of Australia. 2018; 209(9):386-8.
  10. Hauer KE, Boscardin C, Brenner JM, van Schaik SM, Papp KK. Twelve tips for assessing medical knowledge with open-ended questions: Designing constructed response examinations in medical education. Med Teach. 2020; 42(8):880-5.
  11. Durning SJ, Dong T, Ratcliffe T, Schuwirth L, Artino ARJ, Boulet JR, et al. Comparing Open-Book and Closed-Book Examinations: A Systematic Review. Acad Med. 2016; 91(4):583-99.
  12. Sam AH, Reid MD, Amin A. High‐stakes, remote‐access, open‐book examinations. Med Educ. 2020; 54(8):767-8.
  13. Greene J, Mullally W, Ahmed Y, Khan M, Calvert P, Horgan A, et al. Maintaining a Medical Oncology Service during the Covid-19 Pandemic. Irish Medical Journal. 2020; 11:77.
  14. Whittemore R. Combining Evidence in Nursing Research: Methods and Implications. Nursing Research. 2005; 54(1):56.
  15. Whittemore R, Knafl K. The integrative review: updated methodology. Journal of Advanced Nursing. 2005; 52(5):546-53.
  16. Xiao Y, Watson M. Guidance on Conducting a Systematic Literature Review. Journal of Planning Education and Research. 2019; 39(1):93-112.
  17. Halcomb E, Stephens M, Bryce J, Foley E, Ashley C. Nursing competency standards in primary health care: an integrative review. Journal of Clinical Nursing. 2016; 25(9–10):1193-205.
  18. CASP - Critical Appraisal Skills Programme [Internet]. CASP Checklists - Critical Appraisal Skills Programme; 2023 [Cited 2023 Oct 6]. Available from: https://casp-uk.net/casp-tools-checklists/.
  19. Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews. 2015; 4(1):1.
  20. Elsalem L, Al-Azzam N, Jum’ah AA, Obeidat N. Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Annals of Medicine and Surgery. 2021; 62:326-33.
  21. Eurboonyanun C, Wittayapairoch J, Aphinives P, Petrusa E, Gee DW, Phitayakorn R. Adaptation to Open-Book Online Examination During the COVID-19 Pandemic. J Surg Educ. 2021; 78(3):737-9.
  22. Sarkar S, Mishra P, Nayak A. Online open-book examination of undergraduate medical students - a pilot study of a novel assessment method used during the coronavirus disease 2019 pandemic. J Laryngol Otol. 2021; 135(4):288-92.
  23. Monaghan AM. Medical Teaching and Assessment in the Era of COVID-19. J Med Educ Curric Dev. 2020; 7:2382120520965255.
  24. Rehman J, Ali R, Afzal A, Shakil S, Sultan AS, Idrees R, et al. Assessment during Covid-19: quality assurance of an online open book formative examination for undergraduate medical students. BMC Medical Education. 2022; 22(1):792.
  25. Bobby Z, Meiyappan K. “Test-enhanced” focused self-directed learning after the teaching modules in biochemistry. Biochemistry and Molecular Biology Education. 2018; 46(5):472-7.
  26. Dave M, Patel K, Patel N. A systematic review to compare open and closed book examinations in medicine and dentistry. Faculty Dental Journal [Internet]. 2021 Oct 10 [Cited 2023 Oct 10]. Available from: https://publishing.rcseng.ac.uk/doi/10.1308/rcsfdj.2021.41.
  27. Ibrahim NK, Al-Sharabi BM, Al-Asiri RA, Alotaibi NA, Al-Husaini WI, Al-Khajah HA, et al. Perceptions of clinical years’ medical students and interns towards assessment methods used in King Abdulaziz University, Jeddah. Pak J Med Sci. 2015; 31(4):757-62.
  28. Brossman BG, Samonte K, Herrschaft B, Lipner RS. A Comparison of Open-Book and Closed-Book Formats for Medical Certification Exams: A Controlled Study. American Educational Research Association: San Antonio, TX; 2017.
  29. Davies DJ, McLean PF, Kemp PR, Liddle AD, Morrell MJ, Halse O, et al. Assessment of factual recall and higher-order cognitive domains in an open-book medical school examination. Adv Health Sci Educ Theory Pract. 2022; 27(1):147-65.
  30. Erlich D. Because Life Is Open Book: An Open Internet Family Medicine Clerkship Exam. PRiMER. 2017; 1:7.
  31. Prigoff J, Hunter M, Nowygrod R. Medical Student Assessment in the Time of COVID-19. J Surg Educ. 2021; 78(2):370-4.
  32. Zagury-Orly I, Durning SJ. Assessing open-book examination in medical education: The time is now. Med Teach. 2021; 43(8):972-3.
  33. Taha MH, Ahmed Y, El Hassan YAM, Ali NA, Wadi M. Internal Medicine Residents’ perceptions of learning environment in postgraduate training in Sudan. Future of Medical Education Journal. 2019; 9(4):3-9.