Document Type : Original Article

Authors

1 Department of Restorative Dentistry, Faculty of Dentistry, Tabriz University of Medical Sciences, Tabriz, Iran

2 Department of Oral and Maxillofacial Medicine, Faculty of Dentistry, Tabriz University of Medical Sciences, Tabriz, Iran

3 Department of Restorative Dentistry, School of Dentistry, Ardabil University of Medical Sciences, Ardabil, Iran

4 Faculty of Dentistry, Tabriz University of Medical Sciences, Tabriz, Iran

Abstract

Introduction: The Direct Observation of Procedural Skills (DOPS) test is a valuable method for clinical assessment. This study aimed to implement the DOPS test to assess procedural skills in
community dentistry courses and to evaluate its effects on mastery learning and on the satisfaction of professors and students at Tabriz Faculty of Dentistry in 2021-2022.
Methods: In this quasi-experimental study, 60 dentistry students from one class were assigned to a case group (n=30) and a control group (n=30) by permuted block randomization. In the case group,
skills related to fluoride therapy, pit and fissure sealant therapy, and health education were evaluated by DOPS. In the control group, the same skills were evaluated by traditional evaluation methods. Each assessment was repeated three times. Finally, the satisfaction of the students in the case group was assessed with a questionnaire. The chi-square test was used to compare qualitative variables, and repeated measures ANOVA was used to compare mean scores across the three stages and between the two groups. A P-value of less than 0.05 was considered significant. Data were analyzed using SPSS 16.
Results: A significant difference in the mean scores of fluoride therapy, pit and fissure sealant therapy, and health education was seen between the case and control groups (P<0.001).
A significant increase in these skills was also observed at the third assessment stage in the case group (P<0.001). Professors' and students' satisfaction with the DOPS test was considerably high.
Conclusion: The DOPS method had a greater impact on dentistry students' learning of fluoride therapy, pit and fissure sealant therapy, and health education than conventional
evaluation. Professors' and students' satisfaction with DOPS was high. The advantages of the DOPS method are student-centeredness, objectivity, and appropriate feedback.

Introduction

Assessment is one of the essential dimensions of educational activity, transforming education from a static state into a dynamic process. Educational performance assessment aims to improve productivity and quality ( 1 ). Evaluation of clinical skills accounts for more than half of the total assessment of medical students, including dentistry students, and it is one of the most critical and challenging tasks of the academic staff of medical universities ( 2 , 3 ). This type of assessment ensures the student's clinical competence in dealing with patients and mastery of the skills needed for standard patient care ( 4 ). In dentistry, clinical assessment of students through direct observation in practical, real situations helps ensure their ability to handle clinical events in various conditions.

Currently, various checklists are used in most dental schools to evaluate students' clinical performance; based on the evidence, most of them lack sufficient validity and reliability, causing instability and uncertainty in the clinical assessment of students ( 5 ). For this reason, clinical tests such as the Mini-CEX (mini-clinical evaluation exercise), the OSCE (Objective Structured Clinical Examination), and DOPS (Direct Observation of Procedural Skills) have been suggested ( 6 ). DOPS and Mini-CEX are both workplace-based assessments, with the difference that the Mini-CEX usually assesses non-procedural skills ( 7 ).

As a clinical assessment method, DOPS is suitable for providing constructive feedback to students ( 8 ). This method requires direct observation of the learner while performing an actual procedure in a real environment, so the learner's practical skill can be evaluated in an objective and structured manner ( 9 ). Observation of students in a real clinical environment is the main advantage of DOPS over the OSCE. The OSCE evaluates clinical competence based on objective, direct observation of students' performance in simulated clinical scenarios, with the advantage of standardized and predictable conditions ( 10 ). DOPS is a student-centered assessment method that promotes self-directed learning and provides opportunities for learning and feedback. Because of its special features, such as valuable educational effects and the possibility of immediate feedback, DOPS can be used at all levels of clinical education ( 11 ).

Many studies have been conducted in different countries, including Iran, on the validity and reliability of DOPS, and this tool has been used to evaluate medical students in pediatrics, anesthesia and surgery, as well as in dentistry ( 4 , 12 ). These studies have mainly examined the validity and reliability of the DOPS test and the level of students' satisfaction, but few have examined its effectiveness on students' learning. They have shown that students' clinical competence has a direct and significant relationship with the number of DOPS tests passed, and that the feedback provided during this assessment is highly educational ( 12 , 13 ).

This study aimed to evaluate the effect of DOPS on students' learning and on professors' and students' satisfaction in the Department of Oral Health and Community Dentistry of Tabriz Faculty of Dentistry in 2021-2022.

Methods

This quasi-experimental study was conducted in the Department of Oral Health and Community Dentistry of Tabriz Faculty of Dentistry, Iran. All students who were taking the second practical course of Oral Health and Community Dentistry in the second semester of 2021-2022 were included in the study after completing a written informed consent form. Participation in this study was voluntary and confidential. To ensure fairness between participants and non-participants, it was explicitly stated that partaking in the study would not affect their grades. The study protocol was approved by the Ethics Committee of Tabriz University of Medical Sciences under the code IR.TBZMED.REC.1398.002.

Of the 68 students in the course, 61 agreed to participate in the study. Sixty dentistry students were divided by permuted block randomization into two groups: 30 in the DOPS group and 30 in the traditional logbook group. The permuted block randomization was stratified by the students' last-semester grades to ensure that both groups were similar at baseline: the students were divided into two subgroups with last-semester grades above or below 15 and were then allocated to the case and control groups.
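
The stratified permuted block randomization described above can be illustrated with a minimal sketch. The block size of four, the student identifiers and grades, and the handling of a grade of exactly 15 are assumptions made for illustration and are not details reported in the study.

```python
import random

def permuted_block_randomization(ids, block_size=4, groups=("DOPS", "Control")):
    """Assign participant ids to the two groups using permuted blocks."""
    per_group = block_size // len(groups)
    allocation = {}
    for start in range(0, len(ids), block_size):
        block = ids[start:start + block_size]
        labels = list(groups) * per_group   # e.g. ["DOPS", "Control", "DOPS", "Control"]
        random.shuffle(labels)              # permute the group labels within the block
        allocation.update(dict(zip(block, labels)))
    return allocation

# Stratify by last-semester grade (here assumed as >=15 vs <15), then randomize within each stratum.
grades = {"S01": 16, "S02": 14, "S03": 17, "S04": 13, "S05": 15, "S06": 12, "S07": 18, "S08": 11}
high = [s for s, g in grades.items() if g >= 15]
low = [s for s, g in grades.items() if g < 15]
assignment = {**permuted_block_randomization(high), **permuted_block_randomization(low)}
print(assignment)
```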

To design and perform the DOPS tests, three essential skills were selected by the academic staff: fluoride therapy, pit and fissure sealant therapy, and health education. A DOPS checklist was designed for each clinical skill. The checklists included questions about how to perform each procedure, the students' communication skills, and observance of infection control. The scoring for each item was classified as weak (score 0-6), below expectation (score 6-12), acceptable (score 12-17), and excellent (score 17-20).
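
As a small illustration of this rubric, the sketch below maps a checklist total (on a 0-20 scale) to the four categories; because the reported cut-offs overlap at their boundaries, half-open intervals are assumed here.

```python
def classify_score(score: float) -> str:
    """Map a 0-20 checklist total to the rubric categories (boundary handling assumed)."""
    if score < 6:
        return "weak"
    if score < 12:
        return "below expectation"
    if score < 17:
        return "acceptable"
    return "excellent"

print(classify_score(14.5))  # -> "acceptable"
```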

DOPS checklist preparation

DOPS checklists were prepared for each skill based on the related literature and the professors' opinions. Content validity was assessed by five dentists from the Department of Oral Health and Community Dentistry and five medical education specialists. The Content Validity Index (CVI) was 0.81, and the intraclass correlation coefficient (ICC) was 0.93 by the test-retest method. A Cronbach's alpha of α=0.86 confirmed the reliability of the checklists.
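
For readers who wish to reproduce such reliability indices, a minimal sketch follows. It assumes the open-source pingouin Python library rather than the SPSS software used in the study, and the rating data frames are hypothetical examples, not the study data.

```python
import pandas as pd
import pingouin as pg

# Wide format: one row per student, one column per checklist item (hypothetical ratings).
items = pd.DataFrame({
    "item1": [2, 3, 3, 2, 3, 1],
    "item2": [2, 2, 3, 2, 3, 1],
    "item3": [3, 3, 3, 2, 3, 2],
})
alpha, ci = pg.cronbach_alpha(data=items)   # internal consistency of the checklist

# Long format for the test-retest ICC: each student assessed on two occasions.
retest = pd.DataFrame({
    "student":  ["S1", "S1", "S2", "S2", "S3", "S3", "S4", "S4", "S5", "S5"],
    "occasion": ["t1", "t2"] * 5,
    "score":    [15, 16, 12, 12, 18, 17, 10, 11, 14, 15],
})
icc = pg.intraclass_corr(data=retest, targets="student",
                         raters="occasion", ratings="score")
print(round(alpha, 2))
print(icc[["Type", "ICC"]])
```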

The checklist related to health education included nine questions, the fluoride therapy checklist had eight questions, and the pit and fissure sealant therapy checklist included 13 questions. The checklists are presented in Supplement 1. The six professors of the Department of Oral Health and Community Dentistry were given an orientation to DOPS and to the principles of giving good feedback by medical education specialists. A brief orientation on DOPS was also provided for the students.

Each student's skill assessment was repeated three times at two-week intervals ( 3 ), since in this test the students are expected to improve each time based on the feedback from the previous test. In the first stage, the intervention group was evaluated on the three clinical skills (fluoride therapy, pit and fissure sealant therapy, and health education) by the DOPS method using the valid and reliable checklists. After the checklist was completed, in a meeting attended by the student and the professor, feedback was given to the student, and the strengths and weaknesses were discussed.

In the control group, the assessment was done traditionally: the students performed the procedure in the presence of the professors, but no feedback was provided, and the students received a subjective general grade. The second and third assessments were conducted to check the students' progress with a method similar to that of stage one. The maximum general grades for the three skills were the same as in the DOPS group, so that the comparison could be made. Finally, the scores obtained from the assessments in the control and intervention groups were compared.

In the next step, the professors' and students' opinions on the effectiveness of the test in facilitating clinical learning were evaluated with a questionnaire that included nine questions on a five-point Likert scale. Two researcher-made questionnaires were used after their validity and reliability were confirmed. To prepare the initial drafts of the questionnaires, items from valid questionnaires available in the literature and the opinions of the academic members were used. To this end, a search was conducted in PubMed, Scopus, and the Scientific Information Database (SID) using the keywords DOPS, clinical performance, clinical assessment, and dentistry students.

The content validity ratio (CVR) and content validity index (CVI) were used to evaluate the content validity of the questionnaires. To calculate the CVR, five medical education specialists and dentistry professors were asked to categorize each question on a 3-point Likert scale as "necessary," "useful but unnecessary," or "unnecessary." Items with scores >0.78 were approved, and the CVR for the items of both questionnaires was in the range of 0.80-0.89. The CVI was assessed by experts rating each item for simplicity, relevance, and clarity on a scale from one to four, and was calculated as the proportion of ratings of three or four. Items with scores >0.79 were approved, and the CVI was in the range of 0.91-0.99 for the items of both questionnaires. To determine reliability, 20 dentistry students completed the final student questionnaire and 20 dentists completed the final professor questionnaire. Reliability was confirmed by a Cronbach's alpha of 0.80 for the students' questionnaire and 0.83 for the professors' questionnaire.
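
A minimal sketch of these content-validity computations is given below. The item-level CVI is computed here as the proportion of experts rating an item three or four (the usual I-CVI definition), and the example ratings are hypothetical, not the study's expert data.

```python
def cvr(necessary_count: int, n_experts: int) -> float:
    """Lawshe's content validity ratio: (ne - N/2) / (N/2)."""
    half = n_experts / 2
    return (necessary_count - half) / half

def item_cvi(ratings: list) -> float:
    """Proportion of experts who rated the item 3 or 4 for simplicity/relevance/clarity."""
    return sum(r >= 3 for r in ratings) / len(ratings)

# Example: five experts, all of whom marked the item "necessary".
print(cvr(necessary_count=5, n_experts=5))   # 1.0 (above the 0.78 threshold, item retained)
print(item_cvi([4, 3, 4, 4, 3]))             # 1.0 (above the 0.79 threshold, item retained)
```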

Questionnaire answers were set on a five-point Likert scale ranging from "completely agree" to "completely disagree," with "I have no opinion" in the middle. The frequency of each option was reported as a percentage. The questionnaires were distributed in the last session at the end of the semester; the students were given time to fill them in without the presence of the professors, and the forms were then collected by one of the department staff. The professors were also asked to fill out their questionnaires at the end of the semester, and all of them agreed to participate.

Statistical Analysis

The normality of the data distribution was checked using the Kolmogorov-Smirnov test, and descriptive indices such as skewness and kurtosis were also examined. Data are presented as mean±standard deviation for quantitative variables and as frequency (percentage) for qualitative variables. The chi-square test was used to compare qualitative variables. Sphericity was assessed using Mauchly's test. Repeated measures ANOVA was used to compare the mean scores between the two groups and across the three stages (intragroup). Post-hoc analyses were performed using the Sidak test. A P-value of less than 0.05 was considered significant. Statistical analysis was done using SPSS version 16.
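
As an illustration, the same pipeline can be sketched with the open-source pingouin Python library instead of SPSS; the function calls, the simulated long-format data, and the post-hoc call below are illustrative assumptions and do not reproduce the study data.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Simulated long-format data: one row per student per assessment stage.
rng = np.random.default_rng(0)
rows = []
for i in range(12):
    group = "DOPS" if i < 6 else "Control"
    for j, stage in enumerate(["first", "second", "third"]):
        base = 85 + (30 * j if group == "DOPS" else 0)  # DOPS scores rise across stages
        rows.append({"student": f"S{i}", "group": group, "stage": stage,
                     "score": base + rng.normal(0, 5)})
df = pd.DataFrame(rows)

# Mauchly's test of sphericity for the within-subject factor (stage).
spher = pg.sphericity(df, dv="score", within="stage", subject="student")

# Two-way mixed ANOVA: stage (within-subject) x group (between-subject), with correction.
aov = pg.mixed_anova(data=df, dv="score", within="stage", subject="student",
                     between="group", correction=True)

# Pairwise comparisons between stages with Sidak adjustment
# (pairwise_tests is named pairwise_ttests in older pingouin releases).
posthoc = pg.pairwise_tests(data=df, dv="score", within="stage", subject="student",
                            padjust="sidak")
print(spher)
print(aov[["Source", "F", "p-unc"]])
```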

Ethical considerations

The study was conducted after obtaining ethical approval from the Ethics Committee of Tabriz University of Medical Sciences under the code IR.TBZMED.REC.1398.002. Written informed consent was obtained from the participants. The questionnaires were coded to preserve the confidentiality of the participants. The students were assured that they could withdraw from the study at any time and that this would not affect their final grades.

Results

Thirty students were assessed by the DOPS method and 30 by the traditional method. The mean age of the students was 23±2.08 years, and 32 (53.3%) of the 60 students were female. The two groups had no statistically significant difference in mean age (P=0.91) or gender (P=0.98). Mauchly's test showed that for health education the assumption of sphericity was not rejected (P>0.05), whereas for pit and fissure sealant therapy and fluoride therapy the sphericity assumption was rejected (P<0.001); therefore, the Huynh-Feldt correction was applied. The scores for all three skills (fluoride therapy, pit and fissure sealant therapy, and health education) differed significantly between the intervention group assessed by DOPS and the control group (P<0.001). Based on the Sidak test, the mean grades of the procedures in the second and third stages differed significantly from the first stage, and the students' performance improved significantly (P<0.001); the scores also increased significantly in the third stage compared with the second stage (P<0.001). A comparison of the mean and standard deviation of each skill across the three stages in both groups is shown in Table 1.

Clinical skill | Maximum possible score | Stage | DOPS (n=30) | Control (n=30) | Repeated measures ANOVA: source | P
Fluoride therapy | 160 | First | 86.3±5.2 | 85.6±4.4 | Stage | <0.001
Fluoride therapy | 160 | Second | 127.4±7.5 | 83.3±6.2 | Groups | <0.001
Fluoride therapy | 160 | Third | 145.5±6.1 | 86.1±7.3 | Stage × Groups | <0.001
Fluoride therapy | 160 | Total | 119.7±25.8 | 95.2±24.2 | |
Pit and fissure sealant | 260 | First | 147.9±9.4 | 146.1±8.3 | Stage | <0.001
Pit and fissure sealant | 260 | Second | 215.3±7.8 | 144.8±4.9 | Groups | <0.001
Pit and fissure sealant | 260 | Third | 245.2±6.8 | 148.2±6.9 | Stage × Groups | <0.001
Pit and fissure sealant | 260 | Total | 158.0±44.6 | 152.1±47.7 | |
Health education | 180 | First | 112.5±7.3 | 110.6±9.6 | Stage | <0.001
Health education | 180 | Second | 151.6±4.4 | 109.0±5.9 | Groups | <0.001
Health education | 180 | Third | 168.1±5.2 | 112.0±6.6 | Stage × Groups | <0.001
Health education | 180 | Total | 144.3±23.9 | 139.2±49.8 | |
Table 1. Comparison of the scores of the study groups in each skill

The results showed that both professors and students were satisfied with DOPS. In the professors' questionnaire, many items received more than 80% "completely agree" and "agree" responses. In the students' questionnaire, all items except the second question received more than 50% "completely agree" and "agree" responses. Their opinions about DOPS are shown in Tables 2 and 3. It is worth mentioning that the "I have no opinion" option was not chosen by any of the professors or students; therefore, the related column was removed from the tables.

No. | Question | Completely agree | Agree | Disagree | Completely disagree
1 | DOPS provides an opportunity to teach clinical skills. | 1 (16.6%) | 3 (50%) | 1 (16.6%) | 1 (16.6%)
2 | DOPS facilitates the achievement of learning objectives during the course. | 2 (33.3%) | 3 (50%) | 1 (16.6%) | 0
3 | DOPS provides a friendly and stress-free atmosphere. | 1 (16.6%) | 4 (66.6%) | 1 (16.6%) | 0
4 | DOPS provides an opportunity to identify the strengths and weaknesses of students' clinical skills. | 6 (100%) | 0 | 0 | 0
5 | DOPS helps in clinical decision-making and independence in students. | 1 (16.6%) | 3 (50%) | 1 (16.6%) | 1 (16.6%)
6 | DOPS motivates students to improve their clinical skills. | 2 (33.3%) | 4 (66.6%) | 0 | 0
7 | The immediate feedback provided by professors after DOPS is helpful for students' progress. | 5 (83.3%) | 1 (16.6%) | 0 | 0
8 | DOPS provides an opportunity to communicate with the students. | 2 (33.3%) | 3 (50%) | 1 (16.6%) | 0
9 | Judgment is fair in DOPS. | 2 (33.3%) | 3 (50%) | 1 (16.6%) | 0
Data are presented as frequency (percentage).
Table 2. The professors' opinions about DOPS
No. | Question | Completely agree | Agree | Disagree | Completely disagree
1 | DOPS checklists are useful as guides on how to perform the skills correctly. | 4 (13.3%) | 13 (43.3%) | 10 (33.3%) | 3 (10%)
2 | DOPS provides a basis for self-evaluation. | 5 (16.6%) | 8 (26.6%) | 11 (36.6%) | 6 (20%)
3 | DOPS provides a friendly and stress-free atmosphere. | 5 (16.6%) | 17 (56.6%) | 8 (26.6%) | 0
4 | DOPS provides an opportunity to identify the strengths and weaknesses of students' clinical skills. | 11 (36.6%) | 12 (40%) | 7 (23.3%) | 0
5 | This method is useful in improving the clinical performance of students. | 7 (23.3%) | 14 (46.6%) | 6 (20%) | 3 (10%)
6 | DOPS motivates students to improve their clinical skills. | 6 (20%) | 15 (50%) | 7 (23.3%) | 2 (6.6%)
7 | The immediate feedback provided by professors after DOPS is helpful for students' progress. | 16 (53.3%) | 10 (33.3%) | 4 (13.3%) | 0
8 | DOPS provides an opportunity to communicate with professors. | 9 (30%) | 12 (40%) | 6 (20%) | 3 (10%)
9 | Judgment is fair in DOPS. | 7 (23.3%) | 16 (53.3%) | 5 (16.6%) | 2 (6.6%)
Data are presented as frequency (percentage).
Table 3. The students' opinions about DOPS

Discussion

Student assessment is considered a fundamental component of education. Over the years, various assessment methods have been used, the most important of which include oral tests, written tests, multiple-choice questions, the OSCE, and DOPS. DOPS is a method specifically designed to evaluate clinical skills and provide feedback; it requires direct observation of the student while performing a procedure, so the student's practical skills can be assessed in an objective and structured way ( 5 ). The present study was conducted to compare the impact of the DOPS method on the clinical skills of dentistry students. The results show that the DOPS test has a more significant impact on improving the skills of dental students than the usual assessment method.

With this method, after each test, the student's strengths and weaknesses in each skill were identified, and the student's skill was objectively evaluated based on specific criteria. In addition, with the checklist, students could evaluate their own performance accordingly. This method made providing feedback to the student easier because, instead of a general opinion, the feedback was based on objective performance. Jafarpoor et al. evaluated the impact of DOPS and the mini-clinical evaluation exercise (mini-CEX) on nursing students; their results showed that students evaluated by the DOPS and mini-CEX methods had higher clinical performance scores and higher satisfaction levels ( 14 ).

The present study confirmed the effectiveness of the DOPS test in the acquisition of clinical skills by dental students, which is in line with the study of Bagheri et al., who investigated the effect of the DOPS test on the learning of clinical skills by emergency medicine students in Mashhad; their results showed relatively good performance, and the DOPS test had a significant effect on the students' learning ( 15 ). Similarly, the study of Gholam Nejad et al. compared the effect of the DOPS method on the clinical skills of nursing students and found that the DOPS test had a greater impact on the development of students' clinical skills than the traditional assessment method; in other words, it increased the students' clinical skills ( 4 ). One possible reason for the relatively good scores in the DOPS tests is the motivation that these tests create.

The comparison of the assessment scores of the intervention and control groups in the present study indicates that the DOPS method is more effective than the usual assessment method; other studies have confirmed that providing feedback to participants during the assessment is one of the strengths of this type of assessment ( 16 ). Student satisfaction is one of the important indicators in the evaluation of clinical skills, and increasing it is another merit of this method. In addition to receiving feedback from professors, the high objectivity of the technique, preparation for entering the real work environment, and fair and just judgment are the most important strengths of this method from the students' point of view ( 17 , 18 ). However, according to Erfani et al.'s study, stressfulness, time limitation, and evaluator bias are among the most important limitations of this type of assessment ( 19 ).

The comparison of the data across the three stages of the DOPS method indicated a significant difference in scores, with an improvement in the mean scores of the procedures in the second and third stages compared with the first stage, and an overall improvement in the performance of the dental students. In contrast, in the study of Amini et al., conducted on residents and professors, the results of the first stage were relatively good, the results of the second stage increased compared with the first stage, and the results of the third stage decreased compared with the second stage ( 3 ).

In this study, most professors believed that holding DOPS tests and providing feedback to the students could significantly increase the students' skills and abilities. Various studies have also shown that DOPS tests can improve learning and empower students.

In a 2016 review by Sohrabi et al. of the DOPS test for assessing the practical skills of students in Iranian universities of medical sciences, the students' views of and satisfaction with this test were positive, and the method was considered suitable for evaluating clinical skills; the students believed that the test positively affected their learning and independence of action ( 20 ).

The professors' and students' opinions about the effectiveness of the DOPS test in facilitating students' learning and skills showed that 70% of them expressed positive views on the effect of DOPS on the improvement of clinical skills. Given that the test objectively evaluates the independent performance of a clinical procedure, this seems to be one of its main strengths.

Anxiety during the clinical procedure and the stress caused by the test may adversely affect the quality of treatment ( 21 ). Trying to create a stress-free environment during the tests can effectively improve the students' performance. Although many reports emphasize the appropriate validity of this test for evaluating the skill of performing clinical procedures ( 22 , 23 ), the possible effect of stress during the supervised clinical procedure on the validity of this test has also been reported. Nevertheless, this test is recommended for formative and even high-stakes assessment ( 24 ).

Although the implementation of clinical tests based on direct observation of clinical skills has high validity ( 25 ), training and orienting the professors to the advantages and philosophy of this clinical assessment is one of the prerequisites for implementation. To successfully implement a new assessment method in a field, it is necessary to hold educational workshops and involve all professors in its design. From this point of view, similar research in other settings can be recommended.

Limitations

The possibility of a Hawthorne effect among the students is one of the limitations of this study; also, owing to the nature of the study, conducting a pre-test was not possible.

Conclusion

It can be concluded from the results of the present study that the DOPS method has a more significant effect on learning health education, fluoride therapy, and pit and fissure sealant therapy skills than the conventional assessment method. In addition, the professors' and students' satisfaction with the DOPS test was high. Therefore, this test can be suggested for the clinical assessment of community oral health courses.

Recommendation

It is recommended that a crossover design be used in future studies so that all participants can experience both interventions.

Authors’ Contributions

All authors contributed to the discussion, read and approved the manuscript, and agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Conflict of Interest

The authors declare no conflicts of interest.

References

  1. Xin X, Shu-Jiang Y, Nan P, ChenXu D, Dan L. Review on A big data-based innovative knowledge teaching evaluation system in universities. J Innov Knowl. 2022; 7(3):100197.
  2. Shumway JM, Harden RM, Association for Medical Education in Europe. AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003; 25(6):569-84.
  3. Amini A, Shirzad F, Mohseni MA, Sadeghpour A, Elmi A. Designing Direct observation of procedural skills (dops) test for selective skills of orthopedic residents and evaluating its effects from their points of view. Res Dev Med Educ. 2015; 4(2):147-52.
  4. Gholamnejad H, Ghofrani Kelishami F, Manoochehri H, Hoseini M. Efficacy of Direct Observation of Procedural Skills (DOPS) on Practical Learning of Nursing Students in Intense Care Unit. Educ Strategy Med Sci. 2017; 10(1):9-14.
  5. Ramlogan S, Raman V. An educational approach for early student self-assessment in clinical periodontology. BMC Med Educ. 2022; 22:1.
  6. Luo P, Shen J, Yu T, Zhang X, Zheng B, Yang J. Formative objective structured clinical examination with immediate feedback improves surgical clerks’ self-confidence and clinical competence. Med Teach. 2023; 45(2):212-8.
  7. Jasemi M, Ahangarzadeh Rezaie S, Hemmati Maslakpak M, Parizad N. Are workplace-based assessment methods (DOPS and Mini-CEX) effective in nursing students’ clinical skills? A single-blind randomized, parallel group, controlled trial. Contemp Nurse. 2019; 55(6):565-75.
  8. Bendiab NT, Henaoui L. The perception by 4th year medical students of the mini-CEX/DOPS, as a tool for evaluating the practical internship in cardiology. Arch Cardiovasc Dis Suppl. 2023; 15(1):175-6.
  9. Rela M, Price T. Review of the validity of DOPS as an assessment tool for the procedural skills of surgical trainees. Ann R Coll Surg Engl. 2023; 105(7):599-606.
  10. Malau-Aduli BS, Jones K, Saad Sh, Richmond C. Has the OSCE Met Its Final Demise? Rebalancing Clinical Assessment Approaches in the Peri-Pandemic World. Front Med. 2022; 9:825502.
  11. Farajpour A, Amini M, Pishbin E, Mostafavian Z, Akbari Farmad S. Using Modified Direct Observation of Procedural Skills (DOPS) to assess undergraduate medical students. J Adv Med Educ Prof. 2018; 6(3):130-6.
  12. Qureshi MA, Latif MZ. Experience of Dops (Direct Observation Practical Skills) By Postgraduate General Surgical Residents of Lahore. Esculapio - JSIMS. 2023; 19(1):91-5.
  13. Yan CC, Choo K. Designing direct observation of procedural skills (DOPS) for core competency skills at the primary care setting. Sci Talk. 2023; 5:100159.
  14. Jafarpoor H, Hosseini M, Sohrabi M, Mehmannavazan M. The effect of direct observation of procedural skills/mini-clinical evaluation exercise on the satisfaction and clinical skills of nursing students in dialysis. J Educ Health Promot. 2021; 10:74.
  15. Bagheri M, Sadeghnezhad M, Sayyadee T, Hajiabadi F. The Effect of Direct Observation of Procedural Skills (DOPS) Evaluation Method on Learning Clinical Skills among Emergency Medicine Students. Iran J Med Educ. 2014; 13(12):1073-81.
  16. Lagoo JY, Joshi ShB. Introduction of direct observation of procedural skills (DOPS) as a formative assessment tool during postgraduate training in anaesthesiology: Exploration of perceptions. Indian J Anaesth. 2021; 65(3):202-9.
  17. Profanter C, Perathoner A. DOPS (Direct Observation of Procedural Skills) in undergraduate skill lab: Does it work? Analysis of skills-performance and curricular side effects. GMS Zeitschrift für Medizinische Ausbildung. 2015; 32(4):1-14.
  18. Nooreddini A, Sedaghat S, Sanagu A, Hoshyari H, Cheraghian B. Effect of Clinical Skills Evaluation Applied by Direct Observation Clinical Skills (DOPS) on the Clinical Performance of Junior Nursing Students. J Res Dev Nurs Midw. 2015; 12(1):8-16.
  19. Erfani Khanghahi M, Ebadi Fard Azar F. Direct observation of procedural skills (DOPS) evaluation method: Systematic review of evidence. Med J Islam Repub Iran. 2018; 32:45.
  20. Sohrabi Z, Salehi K, Rezaie H, Haghani F. The Implementation of Direct Observation of Procedural Skills (DOPS) in Iran’s Universities of Medical Sciences: A Systematic Review. Iran J Med Educ. 2016; 16:407-17.
  21. Akbari M, Mahavelati Shamsabadi R. Direct Observation of Procedural Skills (DOPS) in Restorative Dentistry: Advantages and Disadvantages in Students' Point of View. Iran J Med Educ. 2023; 13(3):212-20.
  22. Nayyeri S, Bozorgvar A, Barzanouni S, Masoumain Hosseini T. Effect of Clinical Skills Evaluation by Direct Observation of Procedural Skills (DOPS) Method on Clinical Performance of Operating Room Students. J Med Edu. 2021; 20(3):e119342.
  23. Siau K, Crossley J, Dunckley P, Johnson G, Feeney M, Hawkes ND, et al. Direct observation of procedural skills (DOPS) assessment in diagnostic gastroscopy: nationwide evidence of validity and competency development during training. Surg Endosc. 2020; 34:105-14.
  24. Happy D, Aditya A, Hadge P, Devkar N, Vibhute A. Introduction and Comparison of Direct Observation of Procedural Skills (DOPS) with Conventional Method of Skill Assessment in Dental Students. J Clin Diagnostic Res. 2019; 13(2):ZC13-ZC17.
  25. Kamat C, Todakar M, Patil M, Teli A. Changing trends in assessment: Effectiveness of Direct observation of procedural skills (DOPS) as an assessment tool in anesthesiology postgraduate students. J Anaesthesiol Clin Pharmacol. 2022; 38(2):275.