Abstract
Introduction: The objective structured clinical examination (OSCE) is regarded as the gold standard for evaluating students' skills and competencies. Aim: This study aimed to assess the perceptions of postgraduate dental residents toward the OSCE. Materials and Methods: A cross-sectional survey was conducted among fifty postgraduate dental resident doctors (senior and junior residents) during one of their update courses in February 2015, using a self-administered questionnaire to probe their perceptions of the OSCE. The collected data were analyzed using SPSS version 17.0 statistical software. Results: A total of fifty questionnaires were administered to consenting residents, and all were completed and returned, giving a 100% response rate. There was a male preponderance, with a male:female ratio of 1.5:1. The respondents' ages varied, with 58% belonging to the 31–35 years age group. More than half (56%) of the respondents agreed that they understood the aim and objectives of the OSCE. With regard to the OSCE being a valid assessment tool for clinical competence, 46% agreed and 6% strongly agreed that it was. More than half (52%) of the respondents had favorable perceptions of the OSCE. There was a statistically significant relationship between the status of respondents and their perception with regard to preparation for the OSCE and its impact on knowledge. Multiple choice questions were reported to be the most objective and the most reliable form of examination by 56% and 58% of the respondents, respectively. Conclusion: Perceptions of postgraduate dental residents about the OSCE were favorable. However, there is a need to adjust the postgraduate curriculum to accommodate specific training modules for the OSCE. Keywords: Objective structured clinical examination, perception, postgraduate dental residents
How to cite this article: Omo JO, Enabulele JE. Perception of postgraduate dental resident doctors towards the objective structured clinical examination. J Educ Ethics Dent 2016;6:14-9
Introduction
The objective structured clinical examination (OSCE) is an assessment tool consisting of a series of standardized stations testing both clinical and technical skills.[1] It was first introduced by Harden et al. in 1975.[2] Conventionally, clinical skills have been evaluated by means of multiple choice questions, oral examinations, progressive assessment, written essay tests, long and short cases, viva (interview), and other forms of written examination.[1],[2],[3] These methods mainly focus on the cognitive abilities of the trainees[4] and have been shown to be lacking in either reliability or validity.[2],[5],[6]
The OSCE enables the same clinical scenarios to be presented to many trainees[7] and, among the different examination methods, is regarded as the gold standard for evaluating students' skills and competencies.[8],[9]
The OSCE was introduced into the postgraduate dental examination about 3 years before this study and has been in use since. The aim of this study was to assess the perception of postgraduate dental residents toward the OSCE.
Materials and Methods
A cross-sectional survey assessing the perception of resident doctors toward the OSCE was conducted among postgraduate dental resident doctors (senior and junior residents) who attended the Faculty of Dentistry, West African College of Surgeons revision course in February 2015. All residents recruited for the study gave informed consent to participate.
Data were collected by means of a self-administered questionnaire, which was initially developed from reviewed articles.[10],[11],[12],[13] It was then pretested on ten volunteering dental residents who were not part of the revision course. Based on the analysis from the pilot study, the questionnaire was further developed into its final form. The respondents had no prior notice of the topic, and the questionnaire carried no identifiers.
The questionnaire consisted of two sections [Appendix 1]. The first section collected sociodemographic data, while the second comprised 15 questions assessing the respondents' perception of the OSCE. The responses in Section 2 were recorded on a five-point Likert scale, and points were allotted to each response.
A score of 1 was allotted to a “strongly disagree” response, 2 to “disagree,” 3 to “neutral,” 4 to “agree,” and 5 to “strongly agree” responses. The highest possible score was 75 while the lowest possible score was 15. For the purpose of analysis, the scores were graded as follows: good perception 56–75, fair perception 36–55, and poor perception 15–35.
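As a minimal sketch, the scoring and grading rules above can be expressed in code. The point values and cut-offs come from the text; the sample respondent's answers are hypothetical:

```python
# Likert points per response, as defined in the study.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def perception_score(responses):
    """Total score over the 15 perception items (possible range 15-75)."""
    return sum(LIKERT[r] for r in responses)

def perception_grade(score):
    """Map a total score to the study's grading: good, fair, or poor."""
    if score >= 56:   # 56-75
        return "good"
    if score >= 36:   # 36-55
        return "fair"
    return "poor"     # 15-35

# Hypothetical respondent: 10 items "agree", 5 items "neutral".
score = perception_score(["agree"] * 10 + ["neutral"] * 5)
print(score, perception_grade(score))  # 55 fair
```

With 10 "agree" (4 points each) and 5 "neutral" (3 points each) responses, the total of 55 falls just inside the "fair" band; one more point would move the respondent into "good."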
The collected data were analyzed statistically using SPSS version 17.0 (SPSS Inc., Chicago, Illinois, USA). Descriptive analysis in the form of mean, mode, and standard deviation was used to analyze continuous variables. Frequencies, cross-tabulations, and the Chi-square test to determine associations between variables were carried out, considering P < 0.05 as statistically significant.
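For illustration, the Pearson Chi-square test of association used in the analysis can be sketched in pure Python. The contingency counts below are hypothetical, not the study's data; the statistic would then be compared against the Chi-square distribution (as SPSS does) to obtain the P value, with P < 0.05 declared significant:

```python
def chi_square_statistic(table):
    """Pearson Chi-square statistic for a 2-D contingency table of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under the null hypothesis of no association.
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical cross-tabulation: rows = junior/senior residents,
# columns = agree / neutral / disagree.
table = [[24, 8, 5],
         [4, 6, 3]]
dof = (len(table) - 1) * (len(table[0]) - 1)  # degrees of freedom = 2
print(round(chi_square_statistic(table), 2), dof)
```

When the rows are exactly proportional (no association at all), every observed count equals its expected count and the statistic is zero; larger values indicate a stronger departure from independence.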
Results
A total of fifty questionnaires were administered to consenting residents, and all were completed and returned, giving a 100% response rate. The male:female ratio was 1.5:1. The respondents' ages varied, with 58% belonging to the 31–35 years age group. Most of the respondents were junior residents, accounting for 74%. The specialty with the highest representation was restorative dentistry (34%), followed by child oral health (22%) [Table 1].
More than half (56%) of the respondents agreed that they understood the aim and objectives of the OSCE, while 16% strongly agreed. With regard to the OSCE being a valid assessment tool for clinical competence, 46% agreed and 6% strongly agreed. More than half (56%) of the respondents agreed that the OSCE gives a chance to test the ability to apply knowledge to real situations face to face. Less than half (40%) of the respondents agreed that the OSCE is designed fairly. Less than half (44%) agreed and 20% strongly agreed that the OSCE reduces bias related to the type of clinical case; in the same vein, 34% agreed and 20% strongly agreed that it reduces examiner-related bias. Forty-eight percent felt the OSCE was a worthwhile experience, while 34% were undecided on whether the residency training program adequately prepares dental residents for the OSCE [Table 2]. However, there was no statistically significant relationship between the residents' perception of the adequacy of residency training in preparing candidates for the OSCE and the gender, status, or specialty of the respondents [Table 3]. More than half (52%) of the respondents had a good perception of the OSCE. There was a statistically significant relationship between the status of respondents and their perception with regard to preparation for the OSCE and its impact on knowledge: more junior residents (64.9%) agreed that preparing for the OSCE improved their knowledge, while 30.8% of senior residents were neutral [Table 4].

Table 3: Adequacy of residency training in preparing candidates for objective structured clinical examination by gender, status, and specialty of respondents

Table 4: Relationship between status of respondents and their perception with regard to preparation for objective structured clinical examination and impact on knowledge
Discussion
Assessment plays a crucial role in the educational process. Not only does it check that learning has occurred, but it also exerts a powerful influence on future learning.[14],[15] The current emphasis in education is moving away from “assessment of learning” to “assessment for learning,”[16] a type of formative assessment in which assessment information is used by teachers to adjust their teaching strategies and by students to adjust their learning process.[17] The OSCE has become one of the most widely used methods of assessing clinical competency in health-care education.[18] It was introduced into the postgraduate dental examination in Nigeria about 3 years before this study and has been shown to be a reliable and valid measure of basic clinical and technical competence.[1]
Male dominance of the dental profession was observed in this study, as previously reported.[19] Most of the residents fell within the 31–35 years age group. Given the length of undergraduate medical/dental education, the fourth decade of life appears to be when postgraduate/residency training is undertaken, as depicted in previous studies[20],[21] and also observed here. Junior residents made up the bulk of the residents, since progression to senior resident status depends on success in the postgraduate examination; this pattern was also observed in other studies.[22],[23]
It is important that candidates undergoing this form of assessment have a good perception of the qualities of the OSCE, as this helps them identify gaps and weaknesses in their competence. The findings of this study, like those of previous studies,[10],[24],[25] revealed that respondents had a good perception of the OSCE, suggesting that it is acceptable to residents undergoing residency training.
The OSCE has evolved into a reliable and valid means of assessing clinical skills.[26] Reliability of an assessment tool is a measure of its reproducibility: the degree to which it yields consistent results on repeated measurement.[14] About half of the residents felt that the OSCE was a valid assessment tool for clinical competence. This supports a previous report on final year medical students[11] and contradicts a study in which most residents felt the OSCE was not an adequate measure of either clinical or technical skill.[1] The validity of an assessment tool is determined by its ability to actually measure what it is intended to measure; an OSCE is considered valid if it succeeds in measuring the competencies it was originally designed to test.[16] The findings of this study suggest that the OSCEs at the postgraduate level were perceived as valid.
The ultimate educational objective of the OSCE is to yield a competent performer, which requires assessment of the cognitive, psychomotor, and affective skills of the learner.[15] The OSCE aims to enhance the validity of clinical assessment by simulating realistic clinical scenarios that reflect real-life professional tasks.[26] It is important that candidates have a good understanding of this, as they are the ones being assessed with the tool. The majority of the respondents in this study had a good understanding of the aims and objectives of the OSCE, which is positive feedback on the OSCE as an assessment tool.
The OSCE is an assessment of competence carried out in a well-planned, structured, and objective way.[27] Simulated examples or scenarios of rare or unusual cases, which are often hard to come by in clinical settings, can give junior doctors realistic exposure and help them deal with unanticipated medical events.[28] More than half of the respondents agreed that the OSCE gives a chance to test the ability to apply knowledge to real-life situations, substantiating a previous study.[29]
The philosophy of OSCEs is that all candidates are presented with the same clinical tasks, to be completed in the same time frame, and are scored using structured marking schemes.[30] In this study, a substantial proportion of respondents perceived the OSCE as an assessment tool that is designed fairly, an attribute of the OSCE also reported by other studies.[31],[32]
OSCE was developed to reduce bias in examiners and assessment of competence.[11],[33] Less than half of the respondents agreed that OSCE reduces bias related to the type of clinical case and examiner's bias.
In a previous study[34] conducted on nursing students, a statistically significant association was reported between the students' year of study and their perception of the preparation and information provided by their lecturers; 1st year students were more positive toward the OSCE than 3rd year students. This corroborates our study, where more than half of the junior residents agreed that preparing for the OSCE improved their knowledge, while a little above a quarter of the senior residents were neutral about it.
There is a strong indication that additional clinical skills training and assessment are needed during residency training[35] to prepare residents for the OSCE. In this study, over a quarter of the respondents were undecided on whether the residency training program adequately prepares dental residents for the OSCE, suggesting a need for structured training programs to this end.
The OSCE was perceived to be a worthwhile experience by about half of the respondents, similar to findings in previous studies.[36],[37] Since all candidates go through the same scope and criteria of assessment, this tends to create fairness and eliminate prejudice.[37]
Conclusion
The perception of postgraduate dental residents about the OSCE was favorable. However, there is a need to adjust the postgraduate curriculum to accommodate specific training modules for OSCE.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
1. Zyromski NJ, Staren ED, Merrick HW. Surgery residents' perception of the objective structured clinical examination (OSCE). Curr Surg 2003;60:533-7.
2. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
3. Cohen R, Reznick RK, Taylor BR, Provan J, Rothman A. Reliability and validity of the objective structured clinical examination in assessing surgical residents. Am J Surg 1990;160:302-5.
4. Labaf A, Eftekhar H, Majlesi F, Anvari P, Sheybaee-Moghaddam F, Jan D, et al. Students' concerns about the pre-internship objective structured clinical examination in medical education. Educ Health (Abingdon) 2014;27:188-92.
5. Sloan DA, Donnelly MB, Schwartz RW, Strodel WE. The objective structured clinical examination (OSCE): The new gold standard for evaluating resident performance. Ann Surg 1995;222:735-42.
6. Sloan DA, Donnelly MB, Schwartz RW, Felts JL, Blue AV, Strodel WE. The use of objective structured clinical examination (OSCE) for evaluation and instruction in graduate medical education. J Surg Res 1996;63:225-30.
7. Sloan DA, Plymale MA, Donnelly MB, Schwartz RW, Edwards MJ, Bland KI. Enhancing the clinical skills of surgical residents through structured cancer education. Ann Surg 2004;239:561-6.
8. Turner JL, Dankoski ME. Objective structured clinical exams: A critical review. Fam Med 2008;40:574-8.
9. Carraccio C, Englander R. The objective structured clinical examination: A step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med 2000;154:736-41.
10. Swanson DB. A measurement framework for performance based tests. In: Hart I, Harden R, editors. Further Developments in Assessing Clinical Competence. Montreal: Can-Heal; 1987. p. 13-45.
11. van der Vleuten CP, Schuwirth LW. Assessing professional competence: From methods to programmes. Med Educ 2005;39:309-17.
12. Gormley G. Summative OSCEs in undergraduate medical education. Ulster Med J 2011;80:127-32.
13. Newble D. Techniques for measuring clinical competence: Objective structured clinical examinations. Med Educ 2004;38:199-203.
14. McKay JC, Quiñonez CR. The feminization of dentistry: Implications for the profession. J Can Dent Assoc 2012;78:c1.
15. Okonkwo CA, Aziken ME. Resident doctors' perceptions of the postgraduate examinations in obstetrics and gynaecology. Ann Biomed Sci 2007;6:22-7.
16. Akinsola OJ, James O, Ibikunle AA, Adeyemo WL. Understanding biostatistics: A study of Nigerian dental resident doctors. Niger J Exp Clin Biosci 2014;2:100-4.
17. Stiggins R. From formative assessment to assessment for learning: A path to success in student-based schools. Phi Delta Kappan 2005;87:324-8.
18. Enabulele J, Ibhawoh L. Knowledge of Nigerian dentists about drug safety and oral health practices during pregnancy. Indian J Oral Sci 2015;6:55-9.
19. Onyeaso CO, Dosumu EB, Obuekwe O. Postgraduate dental education in Nigeria: Professional knowledge self assessed in relationship to skills among resident dental surgeons in Nigerian teaching hospitals. Niger J Med 2004;13:18-25.
20. Ameh N, Abdul MA, Adesiyun GA, Avidime S. Objective structured clinical examination vs. traditional clinical examination: An evaluation of students' perception and preference in a Nigerian medical school. Niger Med J 2014;55:310-3.
21. Sadia S, Sultana S, Fareesa Waqar F. OSCE as an assessment tool: Perception of undergraduate medical students. Anaesth Pain Intensive Care 2009;13:65-7.
22. Nasir AA, Yusuf AS, Abdur-Rahman LO, Babalola OM, Adeyeye AA, Popoola AA, et al. Medical students' perception of objective structured clinical examination: A feedback for process improvement. J Surg Educ 2014;71:701-6.
23. Naumann FL, Moore KM. Developing an objective structured clinical examination to assess clinical competence. Springvale South, Victoria: Australian Collaborative Education Network (ACEN) Incorporated; 2012. p. 223-9.
24. Idris SA, Hamza AA, Elhaj MA, Elsiddig KE, Hafiz MM, Adam ME. Students' perception of surgical objective structured clinical examination (OSCE) at final year MBBS, University of Khartoum, Sudan. Med J 2014;1:17-20.
25. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13:39-54.
26. Lateef F. Simulation-based learning: Just like the real thing. J Emerg Trauma Shock 2010;3:348-52.
27. Jabbani N, Zeinali A, Ageshteh S. Assessment of radiology technology students' internship with objective structured clinical examination. FMEJ 2012;2:19-23.
28. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination. Med Educ 1979;13:41-54.
29. Eswi A, Badawy AS, Shaliabe H. OSCE in maternity and community health nursing: Saudi nursing students perspective. Am J Res Commun 2013;1:143-62.
30. Dadgar SR, Saleh A, Bahador H, Baradaran HR. OSCE as a tool for evaluation of practical semiology in comparison to MCQ and oral examination. J Pak Med Assoc 2008;58:506-7.
31. Rushforth HE. Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Educ Today 2007;27:481-90.
32. Small LF, Pretorius L, Walters A, Ackerman M, Tshifugula P. Students' perception regarding the objective, structured, clinical evaluation as an assessment approach. Health SA Gesondheid 2013;18:629-37.
33. Ramani S, Ring BN, Lowe R, Hunter D. A pilot study assessing knowledge of clinical signs and physical examination skills in incoming medicine residents. J Grad Med Educ 2010;2:232-5.
34. Fidment S. The objective structured clinical exam (OSCE): A qualitative study exploring the healthcare student's experience. Stud Engagem Exp J 2012;1:1-18.
35. Zayyan M. Objective structured clinical examination: The assessment of choice. Oman Med J 2011;26:219-22.
36. Abdulla MA. Students' perception of objective structured clinical examination (OSCE) in surgery at Basrah College of Medicine. Basrah J Surg 2012;18:1-6.
37. Siddiqui FG. Final year MBBS students' perception for observed structured clinical examination. J Coll Physicians Surg Pak 2013;23:20-4.

Correspondence Address: Julie Omole Omo, University of Benin/University of Benin Teaching Hospital, Benin City, Nigeria
DOI: 10.4103/jeed.jeed_31_15
