The effect of direct observation of procedural skills method on learning clinical skills of midwifery students of medical sciences
Soheila Mohamadirizi1, Farahnaz Mardanian2, Fatemeh Torabi1
1 Nursing and Midwifery Care Research Center, Isfahan University of Medical Sciences, Isfahan, Iran
2 Department of Obstetrics and Gynecology, School of Medicine, Isfahan University of Medical Sciences, Isfahan, Iran
Date of Submission: 05-Nov-2019
Date of Acceptance: 12-Dec-2019
Date of Web Publication: 28-Apr-2020
Miss Fatemeh Torabi, Nursing and Midwifery Care Research Center, Isfahan University of Medical Sciences, Isfahan, Iran
Source of Support: None, Conflict of Interest: None
INTRODUCTION: Clinical education is one of the most important parts of medical students' education and a major component of training qualified, professional practitioners. This study was therefore conducted to determine the effect of applying Direct Observation of Procedural Skills (DOPS) on midwifery students' clinical skills.
MATERIALS AND METHODS: This quasi-experimental, two-group study was conducted as a pre- and post-test study on midwifery students in 2017–2018, using cluster random sampling. The procedures involved were three main skills: vaginal examination, pelvic examination, and vaginal delivery. In the intervention group, the DOPS method was used to assess practical skills at three points: on the day of the procedure, 1 day later, and at least 1 week later; the usual logbook method was used in the control group. Both groups were evaluated at the end of the midwifery course with the Comprehensive Final Midwifery checklist. The tools were checked for validity and reliability, and data were analyzed using descriptive and analytical statistics.
RESULTS: There was no statistically significant difference between the two groups in demographic variables such as age, grade point average, marital status, and initial assessment score (P > 0.05). The mean final scores in normal delivery, vaginal examination, and pelvimetry were significantly higher in the intervention group (P < 0.001). The functional domain scores of students in the intervention group improved significantly in normal delivery and pelvimetry (P < 0.05), whereas this difference was not significant for vaginal examination. In addition, the mean scores of students before and after the DOPS method differed significantly for every skill in the Comprehensive Final Midwifery checklist (P < 0.05).
CONCLUSIONS: The DOPS assessment method is not only a useful tool for clinical evaluation but also an effective tool for students' clinical learning. It is therefore suggested that midwifery faculty members allocate sufficient time to designing the DOPS method for each procedure.
Keywords: Clinical evaluation, direct observation of procedural skills, learning, vaginal delivery
How to cite this article: Mohamadirizi S, Mardanian F, Torabi F. The effect of direct observation of procedural skills method on learning clinical skills of midwifery students of medical sciences. J Edu Health Promot 2020;9:91.
Introduction
Clinical education is one of the most important parts of education for students of nursing and midwifery sciences and is a vital part of training competent and professional people. The value of ideal clinical education is such that its role in individual and professional development, as well as in students' clinical skills, is undeniable. To this end, promoting clinical competence and education has always been one of the major concerns in medical sciences education. On the other hand, the results of many studies show that evaluation is the most important measure for improving effectiveness in this field. The real purpose of evaluation is monitoring and gathering data to improve the educational status. With proper evaluation, the strengths and weaknesses of education can be identified, and steps can be taken to develop and reform the educational system by reinforcing the positive aspects and eliminating inadequacies. Effective evaluation not only enhances students' motivation but also helps teachers evaluate their own activities; if this evaluation is accompanied by appropriate feedback, it can also enhance the learner's skills.
Due to the increasing changes in clinical education approaches, the necessity of using appropriate new evaluation methods is becoming increasingly evident. A study conducted in nursing colleges in the southern United States found that 45%, 35%, 17%, and 3% of colleges had had no revision of their clinical evaluation methods for 5, 6–10, 11–15, and over 15 years, respectively. In addition, a study conducted in the Tehran School of Nursing and Midwifery showed that 62% of students believed that the conditions and cases of clinical evaluation were not uniform and satisfactory for all students. Specialists have been searching for validated methods for years to evaluate students' clinical performance effectively. Clinical evaluation methods that come with feedback, in addition to covering cases that are difficult to assess with traditional methods, also promote learning.
Currently, the use of performance-based tests (such as Direct Observation of Procedural Skills [DOPS], the Objective Structured Clinical Examination, and the Mini-Clinical Evaluation Exercise) is emphasized for the measurement of clinical and practical skills. As midwifery is a practical profession, direct observational assessment in practical and real situations gives examiners confidence in the student's ability to anticipate and predict clinical changes and events in a particular patient's condition and helps determine her ability. Therefore, evaluation through direct observational assessment of practical skills seems necessary. Moreover, DOPS is both a new method of clinical education and a good way to provide constructive feedback, focusing the student's attention on what is needed to accomplish the desired skill, because effective evaluation requires timely and specific feedback to improve performance.
Considering the researcher's experience of the problems in current methods of clinical evaluation of midwifery students, and the future working conditions of these students in maternity hospitals that demand high skill and speed in performing techniques, an accurate evaluation method to ensure sound clinical competence seems essential. In addition, midwifery is a stressful job; midwifery students need to be proficient in a variety of areas, including childbirth and physical examinations, and their clinical competence leads to greater self-efficacy. The purpose of this study was therefore to determine the effect of the DOPS evaluation method on midwifery students' learning of clinical skills.
Materials and Methods
This quasi-experimental, two-group study was conducted as a pre- and post-test study in 2017–2018. After obtaining permission from the Ethics Committee of Isfahan University of Medical Sciences, cluster random sampling was performed: the purpose and methodology were explained to final-semester midwifery students undergoing the pregnancy and childbirth internship, and their consent was obtained. Each internship group was considered a cluster and randomly assigned to the experimental (n = 30) or control (n = 30) group. Inclusion criteria were completion of the theoretical pregnancy and childbirth courses and the related practical training.
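The cluster allocation described above can be sketched as follows; the cluster names and the number of clusters are hypothetical placeholders for illustration, not details taken from the study.

```python
# Minimal sketch of cluster randomization: whole internship groups
# (clusters), not individual students, are assigned to study arms.
# Cluster names and count below are hypothetical.
import random

random.seed(42)  # fixed seed so the illustrative allocation is reproducible

clusters = ["internship_A", "internship_B", "internship_C", "internship_D"]
random.shuffle(clusters)

# Assign half of the shuffled clusters to each arm.
half = len(clusters) // 2
experimental = clusters[:half]
control = clusters[half:]

# Every cluster ends up in exactly one arm.
assert set(experimental).isdisjoint(control)
```

Randomizing at the cluster level keeps each internship group intact, which matches the study's design of treating each internship group as one unit.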
The procedures used in this study were three basic skills: vaginal examination, pelvic examination, and vaginal delivery. The DOPS and logbook (the university's routine) evaluation methods were used for the intervention and control groups, respectively. Students in both groups were evaluated against a checklist of the relevant clinical skills. In the conventional method used in the control group, students' skills during the internship were measured with a logbook: on the first day of the internship, the instructor read the logbook and explained it to the students, and scoring was then done in a single phase. In the experimental group, the intervention proceeded as follows:
- Step 1: Observing the desired skill within the prescribed time (1 min for vaginal examination, 15 min for delivery, and 15 min for episiotomy repair) and giving feedback in 5 min (while reviewing the items in the evaluation)
- Step 2: Repeating the desired skill after 1 day and emphasizing the strengths and weaknesses of the student in that skill
- Step 3: Repeating the desired skill after at least 1 week and emphasizing the strengths and weaknesses of the student.
The student training tool in the experimental group was a skill-based checklist; for the vaginal examination, pelvic examination, and delivery skills, it included eight domains, namely communication, pre-examination preparation, maintaining sterile conditions, technical ability in the examination, judgment and reporting skills, and overall skill in the technique. In the study of Kuhpayehzade et al., the reliability and validity of the checklist were reported to be 0.98 and 0.95, respectively. In the present study, the content validity was also confirmed by four faculty members of Isfahan University of Medical Sciences.
Both groups were then assessed at the end of the eighth semester with the Midwifery Comprehensive Examination Evaluation Checklist on the three skills listed. Each skill was scored at one of four levels: poor (improper performance, needing full guidance and supervision); moderate (needing relative guidance and supervision); good (needing minimal guidance for proper performance); and proper performance without the need for the slightest guidance. The checklist was standardized and approved by the national Midwifery Board. The content validity of the instrument was confirmed by the Midwifery and Reproductive Health Board and the faculty members of the Midwifery Department, and it had a reliability coefficient of 0.99. Data were then analyzed using SPSS-22 software (IBM, United States) with descriptive statistics, the Chi-square test, the independent t-test, the paired t-test, and Pearson's correlation.
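The analyses named above can be sketched with SciPy in place of SPSS-22. All scores and counts below are synthetic placeholders generated for illustration; they are not the study's data.

```python
# Sketch of the tests reported in the paper, on synthetic data (NOT study data):
# independent t-test, paired t-test, Chi-square, and Pearson's correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)  # fixed seed for reproducibility

# Synthetic final-skill scores for two groups of 30 students each.
control = rng.normal(loc=14.0, scale=2.0, size=30)
experimental = rng.normal(loc=16.0, scale=2.0, size=30)

# Independent t-test: compare final scores between the two groups.
t_ind, p_ind = stats.ttest_ind(experimental, control)

# Paired t-test: compare the same students before vs. after DOPS.
before = rng.normal(loc=12.0, scale=2.0, size=30)
after = before + rng.normal(loc=1.5, scale=1.0, size=30)
t_rel, p_rel = stats.ttest_rel(after, before)

# Chi-square test of a 2x2 table for a categorical variable
# (e.g., marital status by group; counts are hypothetical).
table = np.array([[18, 12],
                  [16, 14]])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

# Pearson's correlation between two score sets.
r, p_r = stats.pearsonr(before, after)
```

For a 2x2 table the Chi-square test has one degree of freedom, and `ttest_ind` assumes equal variances by default, matching the usual SPSS "equal variances assumed" row.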
Results
Independent t-tests showed no significant differences between the two groups in age and grade point average, and Chi-square tests showed homogeneity in pre-intervention skill and marital status [Table 1].
Table 1: Frequency distribution of demographic quantitative variables by study group
At Kirkpatrick's results level, there was a significant difference between the final evaluation scores of the control and experimental groups in all three skills of vaginal delivery, vaginal examination, and pelvic examination [Table 2].
Table 2: Comparison of final skill scores in the control and experimental groups after evaluation
In addition, each student's performance was categorized according to the domains available in the instrument. Students in the experimental group performed better in the delivery and pelvimetry skills than those in the control group, and this improvement was statistically significant. However, the difference was not significant for vaginal examination skills [Table 3].
Table 3: Comparison of skill performance domains in the control and experimental groups after evaluation
In the control group, by contrast, there was no statistically significant difference at the end of the semester in the scores obtained in delivery skills (P = 0.1), vaginal examination (P = 0.08), and pelvic examination (P = 0.2) in the comprehensive midwifery test.
There was also a statistically significant difference between the mean scores of the control and experimental groups, both in the evaluation during the internship itself and at the end of the eighth semester in the comprehensive midwifery test, indicating the students' proficiency and certifying their clinical competence (P < 0.05).
At Kirkpatrick's performance level, students' clinical competence was assessed by student self-assessment as well as by evaluation from the relevant personnel [Table 4].
Table 4: Comparison of students' clinical performance scores in the control and intervention groups, as rated by students and personnel
Discussion
This study was conducted to determine the effect of the DOPS method on midwifery students' learning of clinical skills. The results showed that the DOPS evaluation method is not only a sound way to evaluate students' practical skills but also an acceptable way to identify and overcome students' weaknesses in those skills [Table 5].
Table 5: Delivery, vaginal examination, and pelvic examination skill scores before and after direct observation of procedural skills in the experimental group
As stated in the study by Habibi et al., the DOPS evaluation method significantly enhances students' clinical skills. Furthermore, in a study conducted by Bagheri et al. on emergency medicine students, this method was found to be appropriate for learning emergency skills. In addition, the study carried out by Nazari et al. acknowledged that the DOPS method was highly effective in the intensive care unit nursing group.
On the other hand, in the study conducted by Nooreddini et al., applying the DOPS method in the control group did not change the students' mean scores, and no improvement was observed in their clinical performance. In the present study, however, the mean scores before and after the DOPS method differed significantly. Moreover, students' skill domains improved significantly even in an urgent procedure such as natural childbirth. Studies also show that students' satisfaction with workplace-based education is considerably higher. These results are in line with those of Tsui et al. in Taiwan, who concluded that using DOPS evaluation with teacher feedback can enhance the skills of medical students. It is recommended to apply this method in other clinical procedures and review the results; further studies can also be carried out in other clinical groups.
Conclusions
According to the present results and the studies mentioned, the DOPS method appears to have a substantial impact on learning and on identifying students' weaknesses in both short- and long-term practical and clinical procedures. Therefore, it is recommended that the administrators of this test pay special attention to the time required for each practical skill. It is advisable for faculty and staff at each institution to devote time to evaluation, and this should be included in the human resources development program. An important point is that little attention is paid to training the evaluator or observer, possibly because of cost, time constraints, or lack of awareness. However, evaluators need to be adequately trained to distinguish between different levels of student performance. Although evaluation training may seem costly and time-consuming at first glance, its benefits for improving the quality of education are considerable. Based on these results, teachers are encouraged to use this method to evaluate students' performance in clinical practice.
We acknowledge the Nursing and Midwifery Care Research Center of Isfahan for helping conduct this study and for its financial support. The scientific code of the project is 295249, and the ethics code is IR.MUI.REC.1395.2.249.
Financial support and sponsorship
Conflicts of interest
There are no conflicts of interest.
References
Nasr-Esfahani M, Yazdannik A, Mohamadiriz S. Development of nursing students' performance in advanced cardiopulmonary resuscitation through role-playing learning model. J Educ Health Promot 2019;8:151.
Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ 2002;36:800-4.
Mohamadirizi S, Kohan S, Shafei F, Mohamadirizi S. The relationship between clinical competence and clinical self-efficacy among nursing and midwifery students. Int J Pediatr 2015;3:1117-23.
Jalili M, Imanipour M, Nayeri DN, Mirzazadeh A. Evaluation of the nursing students' skills by DOPS. J Med Educ 2015;14:13-9.
Grauer GF, Forrester SD, Shuman C, Sanderson MW. Comparison of student performance after lecture-based and case-based/problem-based teaching in a large group. J Vet Med Educ 2008;35:310-7.
Smith-Strøm H, Nortvedt MW. Evaluation of evidence-based methods used to teach nursing students to critically appraise evidence. J Nurs Educ 2008;47:372-5.
Franko DL, Cousineau TM, Trant M, Green TC, Rancourt D, Thompson D, et al. Motivation, self-efficacy, physical activity and nutrition in college students: Randomized controlled trial of an internet-based education program. Prev Med 2008;47:369-77.
Bari V. Direct observation of procedural skills in radiology. AJR Am J Roentgenol 2010;195:W14-8.
Sohrabi Z, Salehi K, Rezaie H, Haghani F. The implementation of direct observation of procedural skills (DOPS) in Iran's universities of medical sciences: A systematic review. Iran J Med Educ 2016;16:8-14.
Chehrzad M, Sohail SZ, Mirzaee M, Kazemnejad E. Comparison of OSCE and traditional clinical evaluation methods on nursing students' satisfaction. J Med Faculty Guilan Univ Med Sci 2007;13:8-12.
Noohi E, Motasedi M, Haghdoost A. Clinical teachers' viewpoints towards objective structured clinical examination in Kerman University of Medical Science. Iran J Med Educ 2008;8:113-9.
Kariman N, Moafi F. Effect of portfolio assessment on student learning in prenatal training for midwives. J Educ Eval Health Prof 2011;8:2.
Habibi H, Khaghanizadeh M, Mahmoudi H, Ebadi A, SeyedMazhari M. Comparison of the effects of modern assessment methods (DOPS and mini-CEX) with traditional method on nursing students' clinical skills: A randomized trial. Iran J Med Educ 2013;13:16-22.
Mitchell C, Bhat S, Herbert A, Baker P. Workplace-based assessments of junior doctors: Do scores predict training difficulties? Med Educ 2011;45:1190-8.
Kuhpayehzade J, Hemmati A, Baradaran H, Mirhosseini F, Akbari H, Sarvieh M. Validity and reliability of direct observation of procedural skills in evaluating clinical skills of midwifery students of Kashan nursing and midwifery school. Sabzevar Med Univ J 2014;21:12-16.
Bagheri M, Sadeghnezhad M, Sayyadee T, Hajiabadi F. The effect of direct observation of procedural skills (DOPS) evaluation method on learning clinical skills among emergency medicine students. Iran J Med Educ 2014;13:1073-81.
Nazari R, Hajihosseini F, Sharifnia H, Hojjati H. The effect of formative evaluation using “direct observation of procedural skills” (DOPS) method on the extent of learning practical skills among nursing students in the ICU. Iran J Nurs Midwifery Res 2013;18:290-3.
Nooreddini A, Sedaghat S, Sanagu A, Hoshyari H, Cheraghian B. Effect of clinical skills evaluation applied by direct observation clinical skills (DOPS) on the clinical performance of junior nursing students. J Res Dev Nurs Midwifery 2015;12:8-16.
Bhugra D, Malik A, Brown N. Workplace-based Assessment in Psychiatry. London: Royal College of Psychiatrists; 2007. p. 1-13.
Tsui KH, Liu CY, Lui JM, Lee ST, Tan RP, Chang PL. Direct observation of procedural skills to improve validity of students' measurement of prostate volume in predicting treatment outcomes. Urol Sci 2013;24:84-8.
Shumway JM, Harden RM; Association for Medical Education in Europe. AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach 2003;25:569-84.
van der Vleuten CP, Schuwirth LW. Assessing professional competence: From methods to programmes. Med Educ 2005;39:309-17.