ORIGINAL RESEARCH PAPER
Year : 2009  |  Volume : 22  |  Issue : 1  |  Page : 209

Revising an Objective Structured Clinical Examination in a Resource-limited Pakistani Medical School


Shifa College of Medicine, Islamabad, Pakistan

Date of Submission: 23-Apr-2008
Date of Acceptance: 04-Mar-2009
Date of Web Publication: 08-May-2009

Correspondence Address:
M Iqbal
H8/4, Islamabad
Pakistan

Source of Support: None, Conflict of Interest: None


PMID: 19953439

  Abstract 

Introduction: The objective structured clinical examination (OSCE) has not been used extensively in undergraduate medical education in resource-constrained locations, including Pakistan. The Shifa College of Medicine (SCM) in Islamabad modified an end-of-clerkship OSCE assessment in internal medicine for final year medical students from a previous static, pattern-recognition format to an interactive, clinical reasoning and skill-based format.
Methods: We modified the OSCE to be more dynamic and effective by creating a customized clinical skills laboratory, using standardized patients, developing competency checklists for OSCE stations, and encouraging more active participation by faculty members. Students were surveyed at the end of their medicine clerkship about the OSCE's organization, content, perceived utility and validity, and stressfulness. Faculty involved in the modified format also reported their perceptions in an open-ended survey.
Results: The modified format was generally received positively by students and faculty. Twenty-eight percent of students found the OSCE to be stressful, which is a lower proportion than reported in the literature in other settings. Students suggested that OSCEs should be given more frequently and come with clearer instructions, and they indicated a need for better training in counseling skills. Responses from faculty were generally positive even though the modified format was regarded as more labor-intensive and time-consuming.
Conclusion: The OSCE, in its true sense, can be created and successfully implemented to assess the clinical skills of medical students in a resource-limited setting in the developing world.

Keywords: OSCE, competencies, assessment, validity


How to cite this article:
Iqbal M, Khizar B, Zaidi Z. Revising an Objective Structured Clinical Examination in a Resource-limited Pakistani Medical School. Educ Health 2009;22:209

How to cite this URL:
Iqbal M, Khizar B, Zaidi Z. Revising an Objective Structured Clinical Examination in a Resource-limited Pakistani Medical School. Educ Health [serial online] 2009 [cited 2020 Oct 28];22:209. Available from: https://www.educationforhealth.net/text.asp?2009/22/1/209/101562

Introduction



The objective structured clinical examination (OSCE) is a valid and reliable tool for both formative and summative assessment of the clinical skills learned by health sciences students (Newble, 1992; Harden & Gleeson, 1979; Cohen et al., 1990). OSCEs can assess students’ clinical competencies in a comprehensive, consistent and standardized manner. Studies have shown that OSCEs help students develop procedural, communication and physical examination skills (Cohen et al., 1990; Carraccio & Englander, 2000). The main objective of OSCEs is to evaluate students’ skills and attitudes at a higher level of integrated learning than is possible with traditional evaluation approaches that rely principally on written examinations.



Undergraduate medical education in Pakistan has been based on the traditional model of learning with a hierarchical relationship between the teacher and student. The most common teaching modalities are lectures and small-group clinical rotations. Students’ knowledge is principally assessed through oral exams and essay-type questions, while their skills are assessed in short and long case formats. The limitations of these assessment approaches include an emphasis on the simple recall of facts and the limited sampling of clinical domains. In addition, the assessment of students’ clinical skills with non-standardized patients and settings compromises reliability and validity.



Over the past five years the Higher Education Commission and Pakistan Medical and Dental Council (PMDC) have undertaken an initiative to promote student-centered, small-group and self-directed learning—and a patient-oriented style of care—to promote better critical reasoning and clinical problem solving among students. This has resulted in several innovations in curricular design, namely well-defined objectives, learner-centeredness, use of small-group learning environments and more reliable and valid assessment tools. Private medical schools are taking the lead in implementing these innovations while public sector schools have been slower in their progress.



In the early 1990s the OSCE was introduced in Pakistan by the College of Physicians and Surgeons to evaluate postgraduate students’ competencies for the fellowship examinations of various specialties. It was subsequently used as a formative and summative assessment tool in undergraduate medical education.



Drawing on the PMDC’s guiding principles, the faculty of the Shifa College of Medicine undertook curricular reforms in 2003. These reforms included the vertical and horizontal integration of the curriculum, an emphasis on small-group, self-directed learning, early clinical exposure, and a clerkship model of training during the clinical years. Faculty used problem-based learning on a sporadic basis to integrate basic science and clinical concepts in solving patient care problems.



This report describes our experiences in revising the OSCE at our institution as part of our curriculum reform. In our institution, OSCEs are routinely used for final year undergraduate students at the end of their medicine clinical clerkship as part of a summative assessment. Unfortunately, the lack of a clinical skills laboratory, trained faculty and standardized patient encounters had made the OSCE a tool that simply evaluated students’ ability to report back medical facts without assessing their deeper understanding of the material or ability to apply it in their care of patients. Until 2005, our OSCEs focused on data interpretation and pattern recognition for certain laboratory, X-ray, and EKG findings. In other words, the OSCE was used as a fancy multiple-choice question (MCQ) examination and to evaluate a few basic procedural skills (e.g. intravenous line insertion, urinary bladder catheterization). In addition to a lack of logistical support, the main reasons for the weak OSCE format were a lack of institutional capacity, motivation, leadership and understanding of the valid assessment tools available for the competencies targeted.



To overcome the weaknesses of our summative OSCE examination, we restructured it to focus more on skills and performance evaluation. We introduced standardized patients to evaluate students’ physical examination skills, models through which to assess their procedural skills, and interactive stations to assess their counseling skills. This report describes the experiences and perceptions of our faculty and students with these changes, within our resource-limited environment.



Methods



Shifa College of Medicine (SCM), Islamabad is a private medical school that admits 50 students each year. The school offers a traditional subject-based curriculum with a sharp demarcation between the pre-clinical years (years 1 & 2) and clinical years (years 3, 4 & 5), with sporadic problem-based learning that integrates basic and clinical concepts in solving patient care problems.



The department of medicine’s faculty has been at the forefront of trying new techniques in teaching and assessment in our school, and the chair has been supportive of such innovations. In 2006, the department recognized that a clinical skills laboratory was needed to make skills evaluation more systematic than the opportunistic learning then in place. The major hindrance was a shortage of resources. After discussions with the dean and a thorough needs assessment, a cost-conscious plan for developing a skills laboratory was approved, as has been previously reported (Quadri et al., 2008). We used monthly faculty development seminars within our institution to familiarize faculty with valid and reliable tools for clinical skills assessment, including OSCEs. Two faculty members, trained in medical education by the Foundation for Advancement of International Medical Education and Research (FAIMER), played a major role in these seminars. The department of medicine established a model for the revised format of the OSCE, which placed greater emphasis on assessing students’ communication skills, physical examination skills, ability to perform routine ward procedures, and problem-solving skills.



The previous OSCE format mainly assessed data interpretation skills at the “knows/knows how” level of Miller’s hierarchy of clinical competence (Miller, 1990). The OSCE was previously held in a room that had eight stations: two each for EKG reading, image interpretation, and body fluid and arterial blood gas analysis, and one station each for reading a hematology slide and demonstrating the use of medical instruments used in the wards (lumbar puncture needle, nasogastric tube, and intravenous cannula). Students were given three minutes at each station. The entire examination required only one or two faculty members, who were present only to observe students’ interpretation of data.



The modified format was extensively discussed by the department chair and faculty and implemented in 2006. The new format included eight stations with students devoting eight minutes to each station. Two stations were dedicated to physical examinations with standardized patients and another two stations addressed communication skills with faculty acting as standardized patients. In addition, there was one station each for demonstrating procedural skills, interpreting simulated computer-based physical examination findings, interpreting X-rays, and interpreting and proposing management for EKG tracings.



Standardized patients were trained to follow students’ commands for various aspects of the physical examination, and a faculty observer used a checklist to rate each student’s performance. The standardized patients were healthy individuals from our housekeeping department who were briefed beforehand on appropriate mannerisms and how to respond to students’ commands during the OSCE.



Faculty members, acting as standardized patients, were trained locally by two other faculty members. Common counseling situations were identified based on faculty experience. The skills that were required of students included patient counseling around nutrition, diabetic teaching, and asthma management; breaking bad news; and obtaining informed consent. Checklists were developed based on competencies targeted during such encounters. After the encounter, the faculty members who acted as standardized patients used a checklist to assess the students’ performance.



Procedural skills stations were designed to assess students’ attitudes and skills related to particular clinical procedures. A computer station presented either a simulated heart murmur or respiratory sounds through headphones, together with the relevant physical examination findings of a simulated patient. Checklists, developed by the faculty after several revisions, covered appropriately obtaining informed consent, explaining the procedure, behaving with courtesy, and performance of the skill itself. Content validity for each checklist was established through detailed discussion with the faculty involved. Each item on the checklist was scored from 0 to 3, with 0 indicating the skill was “not done at all”, 1 that it was “poorly done”, 2 that it “could have been done better” and 3 that it was “done well”. Each station was staffed by a faculty member for observation and rating. At the end of the session, all faculty members gave feedback to students for their respective stations in a group format. Students were encouraged to give their immediate feedback on any technical issues encountered, such as problems with audiovisual aids or lack of clarity in instructions, and on any remaining academic questions, such as the concepts underlying cardiopulmonary auscultation.
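As a purely illustrative sketch (the paper does not state how item ratings were aggregated into a station score), the following Python snippet shows one plausible way to convert the 0-3 item ratings into a percentage score for a station; the function name and the percentage convention are assumptions, not the authors' method.

```python
# Hypothetical sketch: aggregate 0-3 checklist item ratings into a station score.
# The aggregation rule (simple percentage of the maximum) is an assumption,
# not taken from the paper.

def station_score(item_ratings):
    """item_ratings: list of integers in {0, 1, 2, 3}, one per checklist item."""
    if not all(0 <= r <= 3 for r in item_ratings):
        raise ValueError("Each item must be rated from 0 to 3")
    return 100.0 * sum(item_ratings) / (3 * len(item_ratings))

# Example: a hypothetical 6-item procedural skills checklist
print(round(station_score([3, 2, 3, 1, 2, 3]), 1))  # 77.8 (% of maximum)
```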



The OSCE was administered throughout one year, with four groups of students (8 to 10 in each group) undergoing an OSCE at the end of their 8-week medicine clerkship. After each OSCE, students were surveyed about their perceptions of the revised format. The survey instrument consisted of 15 statements to be rated on a 5-point Likert scale, with responses ranging from “strongly agree” to “strongly disagree”, and a section encouraging comments and suggestions. Survey questions addressed the OSCE’s organization, ease of examination, utility, stressfulness for the student, and academic value.



Eight faculty members involved in the OSCE stations also responded to an open-ended survey about their perceptions of the modified OSCE as an assessment tool and its value in identifying knowledge and skills gaps to guide subsequent needed educational interventions for students.



Data were entered in spreadsheets and response value frequencies, means, and standard deviations were calculated. Content analysis of the comments from students and faculty was carried out independently by two faculty members who had been involved in conducting the OSCEs. Common themes were identified and any coding differences between coders were resolved by consensus.
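For readers who prefer a scripted version of the descriptive analysis described above, a minimal sketch follows; the study processed its data in spreadsheets, so this is not the authors' actual workflow, and the file name and column layout are hypothetical.

```python
# Illustrative only: descriptive statistics for 5-point Likert survey items.
# File name and column layout are hypothetical; the study used spreadsheets.
import pandas as pd

responses = pd.read_csv("osce_student_survey.csv")  # one column per survey statement, coded 1-5

for item in responses.columns:
    freqs = responses[item].value_counts().sort_index()  # frequency of each response value
    print(item)
    print(freqs.to_string())
    print(f"mean = {responses[item].mean():.2f}, sd = {responses[item].std(ddof=1):.2f}")
```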



The study was approved by the institutional review board of Shifa College of Medicine.



Results



Twenty-nine out of 34 students (85%) completed the questionnaire. About half of the students (48.3%) agreed that they were aware of the format and two-thirds (67.8%) agreed that the instructions were adequate and clear (see Table 1).



Table 1:  Perceptions of Students about the Revised OSCE Format







Regarding the content of the OSCE, 48.3% of students agreed that the level of information required of them was adequate, and 55.7% agreed that the tasks in the OSCE were clinically relevant to their clerkships.



A strong majority of the students (82.7%) thought that the OSCE was a useful and practical experience and the same proportion agreed that it provided opportunities to learn. More than half (58.6%) agreed that it was helpful in pointing out their areas of weakness.



One quarter of students (27.6%) agreed that the stress of the OSCE was significant, and 48% of the students found the OSCE to be intimidating.



Seventeen students wrote comments or suggestions. Common themes are grouped in Table 2 under perceptions (positive and negative) and suggestions. Students liked the objectivity of the OSCE and appreciated the post-OSCE feedback, finding it valuable to their learning. They thought that repeated exposure to OSCEs would help them hone their clinical skills. The OSCE helped students identify their deficiencies in communication skills, and they suggested more coaching during clerkships.



Table 2:  Sample of Students’ Open-ended Comments about their Perceptions of the OSCE







All eight participating faculty members responded to the survey. The OSCE was perceived to be an excellent tool for both summative and formative assessment that enabled them to identify gaps in the students’ knowledge and skills to be addressed in subsequent teaching and curricular reforms (see Table 3). The faculty generally agreed that OSCEs, if done properly, would provide evidence with which to accurately assess the students’ physical examination and procedural skills. The use of standardized patients, rather than faculty members, at counseling stations was also preferred. Faculty members also thought that successful implementation of the new format could have an impact on other staff members' attitudes towards innovations in medical education.



Table 3:  Advantages and Disadvantages of the Revised OSCE as Perceived by Faculty







Discussion



The revival of the OSCE in its full form within a resource-limited setting was well received. In the process, a skills laboratory suited to our needs was also established. The use of standardized patients and faculty observation of student performance at the various OSCE stations was generally perceived positively by both faculty and students. The group feedback session at the end of each OSCE was particularly well received and appreciated by students. In addition, students provided useful feedback on the process and their educational needs, such as requesting better instructions prior to the OSCE and more training in counseling skills and with the computer-simulated stations. Our findings are consistent with the literature showing that OSCEs are well accepted by both students and faculty (Duerson et al., 2000; Newble, 1988). Most of our students found it to be a fair and unbiased examination.



The OSCE is well known to be stressful to students (Van der Vleuten et al., 2000; Allen et al., 1998). In our case only 28% of the students thought it was stressful, which is a lower proportion than that previously found in both developed and developing countries (Allen et al., 1998; Dadgar et al., 2008). The reasons for stress may include receiving inadequate prior instructions, the newness of the format to students and their inexperience with it, and the presence of faculty-observers at each station. The relatively low proportion of students who found the OSCE stressful in our school may be because the faculty-observers were familiar to the students and because of the less interactive nature of the tasks in the OSCE.



Our students generally regarded the focused tasks that they performed at most stations of the OSCE as clinically relevant and close to real-life situations. The question remains whether such an assessment tool, with its fragmented tasks, can inculcate the holistic approach towards patients that is desired in real-life practice (Mavis et al., 1996; Troncon, 2004).



Our students found the OSCE to be an objective tool that gave them an opportunity to demonstrate their procedural skills. The students appreciated the timely feedback that helped them better target their learning in the future. They suggested that there should be better instructions for OSCE stations and that they needed more training in counseling skills. This student feedback was used to improve communication skills coaching during various clerkships for our students, and two small courses were incorporated in the foundation modules for the first and fourth year students to give them an early exposure to the basics of these skills.



Comments from the faculty members indicated that they realized the importance of well-trained faculty and standardized patients for successful student-centered learning in future OSCEs. Since then, faculty members from various disciplines have attended workshops designed within the institution addressing the need for introducing OSCEs, designing OSCE tasks, and gathering reliability data (Cronbach’s alpha) for each station for future improvement. The faculty members have developed greater awareness that a properly designed OSCE is a reliable and valid tool for identifying gaps in the clinical years that can be addressed with curricular reforms that improve student learning in clinical settings (Duffield & Spencer, 2002; Cohen et al., 1990). This is in contrast to the old OSCE format, which was limited to knowledge assessment. These new understandings helped in designing subsequent assessment strategies for the newly implemented system-based integrated curriculum.
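The paper mentions gathering Cronbach's alpha for each station but does not show the computation. A minimal sketch of the standard formula follows, applied to a matrix of checklist item scores for one station; the function name, data layout and example ratings are illustrative only.

```python
# Minimal sketch of Cronbach's alpha for one OSCE station (standard formula,
# not the authors' code). Rows = students, columns = checklist items.
import numpy as np

def cronbach_alpha(item_scores):
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                                # number of checklist items
    item_variances = scores.var(axis=0, ddof=1)        # variance of each item across students
    total_variance = scores.sum(axis=1).var(ddof=1)    # variance of students' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example with hypothetical 0-3 ratings for five students on a 4-item checklist
print(round(cronbach_alpha([[3, 2, 3, 2],
                            [2, 2, 2, 1],
                            [3, 3, 3, 3],
                            [1, 1, 2, 1],
                            [2, 3, 2, 2]]), 2))  # about 0.89
```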



The successful implementation of the OSCE also brought the role of a clinical skills laboratory to light. Since its inception, the laboratory has added more space, an informatics section, dedicated space for standardized patient encounters, new manikins, two full-time faculty members and a part-time clinical skills director.



The laboratory has triggered further faculty development in medical education in our school. Two members have completed the FAIMER fellowship and a third is currently enrolled. Several members have attended workshops on curriculum design, problem-based learning, the use of standardized patients, and ethics and professionalism. Faculty members have realized the importance of scholarship in medical education, and several have made presentations at national and international conferences.



In other schools the assessment of clinical skills using OSCEs has required extensive resources in the form of facilities, funding, trained faculty and administrative support (Heard et al., 1998; Harden, 1990; Huang et al., 2007). Our experience has differed; we introduced the OSCE successfully with few resources. The cost of setting up an OSCE has been variously reported as US$496 to US$870 per student (Reznick et al., 1993). Though we have not calculated our actual costs per student, they have clearly been lower, judging by the principal cost drivers: the expenses of standardized patients, honoraria for the faculty, the duration of the examination and the number of stations. The faculty members involved were already paid their full-time or part-time salary. The standardized patients were paid per diem. The two full-time senior instructors, maintenance of equipment, and consumables were the major contributors to operating costs. The main expense incurred was for the multi-purpose skills laboratory. The equipment used for procedural skills (manikins and web-enabled computer workstations) was customized to our limited budget. We were able to get some help from a local non-governmental organization that was already running training courses on advanced cardiac life support (ACLS) using a designated area in our school. The joint venture benefited the institution by expanding the ACLS laboratory to incorporate informatics, a video repository, and computer simulations. Subsequently, as more funding became available, the laboratory was expanded to incorporate two full-time staff members. A separate area has now been designated for training simulated patients and for video-recording student encounters with these simulated patients. The recordings will be used to provide students with formative feedback.



Conclusion



The process of revising the OSCE raised school-wide faculty awareness of the need for more objective clinical skills assessment. In this day and age, it seems to us that depriving students of valid and reliable assessment tools, like an OSCE, is unethical. National regulatory bodies like the Pakistan Medical & Dental Council and the Higher Education Commission should define competencies for undergraduate medical education and define the precise role of performance assessment tools such as the OSCE and their use in both formative and summative assessments, including the high-stakes, end-of-year professional examination. Careful planning, maximizing available resources, exploring programs for faculty development in developing countries and assistance from non-profit organizations can help overcome resource limitations. Our experience served as a catalyst for several other reforms in our school, which have eventually led to the implementation of a system-based integrated curriculum, more student-centered learning, the use of better assessment tools, and the use of ongoing program evaluation to provide constant feedback for program improvement.



References:



Allen, R., Heard, J., Savidge, M., Bittengle, J., Cantrell, M., & Huffmaster, T. (1998). Surveying students' attitudes during the OSCE. Advances in Health Sciences Education: Theory and Practice, 3, 197-206.



Carraccio, C., & Englander, R. (2000). The objective structured clinical examination, a step in the direction of competency-based evaluation. Archives of Pediatrics & Adolescent Medicine, 154, 736-741.



Cohen, R., Reznick, R.K., Taylor, B.R., Provan, J., & Rothman, A. (1990). Reliability and validity of the objective structured clinical examination in assessing surgical residents. American Journal of Surgery, 160(3), 302-5.



Dadgar, S.R., Saleh, A., Bahador, H., & Baradaran, H.R. (2008). OSCE as a tool for evaluation of practical semiology in comparison to MCQ & oral examination. Journal of Pakistan Medical Association, 58, 506-507.



Duerson, M.C., Romrell, L.J., & Stevens, C.B. (2000). Impacting faculty teaching and student performance: nine years' experience with the objective structured clinical examination. Teaching and Learning in Medicine, 12, 176-182.



Duffield, K.E., & Spencer, J.A. (2002). A survey of medical students’ views about the purposes and fairness of assessment. Medical Education, 36(9), 879-86.



Harden, R.M. (1990). Twelve tips for organizing an objective structured clinical examination (OSCE). Medical Teacher, 12, 259-264.



Harden, R.M., & Gleeson, F.A. (1979). Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 13(1), 41-54.



Heard, J.K., Allen, R.N., & Cason, G.H. (1998). Practical issues in developing a program for the objective assessment of clinical skills. Medical Teacher, 20, 15-21.



Huang, Y.S., Liu, M., Huang, C.H., & Liu, K.M. (2007). Implementation of an OSCE at Kaohsiung Medical University. Kaohsiung Journal of Medical Sciences, 23, 161-169.



Mavis, B.E., Henry, R.C., Ogle, K.S., & Hoppe, R.B. (1996). The emperor’s new clothes: the OSCE reassessed. Academic Medicine, 71(5), 447-53.



Miller, G.E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(Suppl.), S63-S67.



Newble, D.I. (1992). Assessing clinical competence at the undergraduate level. Medical Education, 26(6), 504-11.



Newble, D.I. (1988). Eight years experience with a structured clinical examination. Medical Education, 22, 200-204.



Quadri, K.H.M., Rahim, M.F., Alam, A.Y., Jaffery, T., Zaidi, Z., & Iqbal, M. (2008). The structure and function of a new Clinical Skills and Medical Informatics Laboratory (SCIL) in a developing country — a two year institutional experience. Journal of Pakistan Medical Association, 58(11), 612-615.



Reznick, R.K., Smee, S., Baumber, J.S., Cohen, R., Rothman, A., Blackmore, D., & Bérard, M. (1993). Guidelines for estimating the real cost of an objective structured clinical examination. Academic Medicine, 68(7), 513-517.



Troncon, L.E. (2004). Clinical skills assessment: limitations to the introduction of an "OSCE" (Objective Structured Clinical Examination) in a traditional Brazilian medical school. Sao Paulo Medical Journal, 122(1), 12-17.



Van der Vleuten, C.P.M., Scherpbier, A.J.J.A., Dolmans, D.H.J.M., Schuwirth, L.W.T., Verwijnen, G.M., & Wolfhagen, H.A.P. (2000). Clerkship assessment assessed. Medical Teacher, 22(6), 592-600.




 
