ORIGINAL RESEARCH ARTICLE
Year : 2014  |  Volume : 27  |  Issue : 2  |  Page : 188-192

Students' concerns about the pre-internship objective structured clinical examination in medical education


1 Department of Emergency Medicine, Imam Khomeini Hospital; Clinical Skills Development Center, Medical Faculty, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
2 Department of Health Promotion and Education, School of Public Health; Health Center for Community Based Participatory Research, Tehran University of Medical Sciences, Tehran, Iran
3 Department of Health Management and Economic, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
4 Students' Scientific Research Center (SSRC), Tehran University of Medical Sciences, Tehran, Iran

Date of Web Publication: 31-Oct-2014

Correspondence Address:
Dr. Ali Labaf
Department of Emergency Medicine, Imam Khomeini Hospital Clinical Skills Development Center, Medical Faculty, Tehran University of Medical Sciences, Tehran
Iran

Source of Support: This study was part of an MD-MPH thesis supported by Tehran University of Medical Sciences. Conflict of Interest: Ali Labaf is the director of the Clinical Skills Development Center, Medical Faculty, Tehran University of Medical Sciences, Tehran, Iran.


DOI: 10.4103/1357-6283.143787

  Abstract 

Background: Despite several studies on the implementation, reliability and validity of the Objective Structured Clinical Examination (OSCE), examinees' perceptions of this evaluation tool remain unclear. The aim of the current study was to assess students' perceptions of the OSCE. Methods: All students in their final year of studies who participated in the pre-internship OSCE in September 2010 were included in the study. A 16-item questionnaire was designed to assess: Characteristics of respondents; organization, content and structure of the OSCE; and perceptions of the validity and reliability of the OSCE and its rating with respect to other assessment methods. Questionnaires were administered immediately after all students had finished the OSCE and before they left the examination venue. Results: The response rate was 86.2%, with 77% of the students rating the OSCE as a useful learning experience. A majority of the students (62%) agreed that a wide range of clinical skills was covered in the exam. However, 66% had concerns about the breadth of knowledge assessed. A total of 81% of students did not prefer the OSCE to multiple choice question exams, and 88% found the OSCE intimidating and more stressful than other forms of assessment. Discussion: Our study demonstrates that although the majority of students believe in the reliability and validity of the OSCE, they have concerns that lead to poor acceptance of the exam. Further studies are necessary to assess the students' most important concerns and the effectiveness of interventions in improving the acceptability of the OSCE.

Keywords: Anxiety, OSCE, student acceptability


How to cite this article:
Labaf A, Eftekhar H, Majlesi F, Anvari P, Sheybaee-Moghaddam F, Jan D, Jamali A. Students' concerns about the pre-internship objective structured clinical examination in medical education. Educ Health 2014;27:188-92

How to cite this URL:
Labaf A, Eftekhar H, Majlesi F, Anvari P, Sheybaee-Moghaddam F, Jan D, Jamali A. Students' concerns about the pre-internship objective structured clinical examination in medical education. Educ Health [serial online] 2014 [cited 2019 Nov 11];27:188-92. Available from: http://www.educationforhealth.net/text.asp?2014/27/2/188/143787


Background


Among the various examination methods, the Objective Structured Clinical Examination (OSCE) is regarded as the gold standard for evaluating students' skills and competencies. [1] Previous studies have shown that the OSCE has considerable divergent validity and evaluates capacities that other forms of assessment fail to adequately address; it has therefore been suggested that the OSCE can serve as an appropriate complement to traditional evaluations, which mainly focus on the cognitive abilities of trainees. Since the description of the OSCE by Harden et al. in 1975, [2] its implementation in countries with traditional education systems has produced differing opinions among students and faculty members. The acceptability of the OSCE to students is often overlooked in the present era of complex statistical analysis and reliance on experts' opinions in validating examinations. Nevertheless, this concept is vitally important when implementing the OSCE as a key tool for measuring students' clinical capabilities, since any discrepancy between students' perceptions and experts' expectations may lead to invalid results.

Tehran University of Medical Sciences (TUMS) was one of the leading universities in Iran in implementing the OSCE to ensure that interns are capable of carrying out their duties in patient care. Medical education in Iranian medical schools is divided into four stages: basic sciences, physiopathology of internal medicine diseases, clinical clerkships and internship. [3] Students are evaluated via traditional multiple choice question (MCQ) examinations in the first two stages of their education; in clerkships, students are assessed via written examinations, faculty observations, or mini-OSCEs, which constitute their first exposure to clinical skills assessment. In 2009, TUMS began administering a pre-internship OSCE in order to assess students' clinical capabilities. Students were informed that the national comprehensive pre-internship examination, which mainly covers knowledge of diseases, their etiologies and management, would make up 90% of the total ranking score used for choosing internship rotations and hospitals, while the OSCE would make up the remaining 10%. Furthermore, students were informed that, in contrast to the national comprehensive examination, there was no pass/fail cut-off for the OSCE. To prepare for the OSCE, students were allowed to practice with their peers on examination models and simulators one week prior to the exam.

While several studies have examined the implementation of the OSCE and its reliability and validity, [4],[5],[6],[7],[8] examinees' perceptions of this evaluation tool remain unclear. To support the development of a more rigorous, practical, reliable and valid examination, in this study we assessed students' acceptance of the recently introduced tool.


Methods


Participants

All students in their final year of studies who participated in the pre-internship OSCE in September 2010 were included in the study.

OSCE description

The OSCE consisted of twelve stations, with five minutes allowed for each one except the history-taking station, which allowed up to ten minutes. Stations were categorized into five components: history-taking; patient management; paraclinical data interpretation; physical examination; and skills in medical procedures. Expert observers were selected from second-year residents by contacting the relevant departments. The selected observers were trained in a one-hour session and were subsequently given written instructions. These observers scored nine process-centered stations using a pre-established structured checklist, while the other three stations were designed without checklists and were scored after the examination using a structured answer sheet.

OSCE procedure

One hundred and thirty students were divided into four tracks (A, B, C and D) running through the same stations. Students drew sealed envelopes containing their track label and the station at which they were to begin, which ensured random assignment; a simple simulation of this draw is sketched below.
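As an illustration only (this is not the study's software; the cycling of tracks and stations and all names below are our own assumptions), the envelope draw described above can be thought of as shuffling a stack of pre-printed (track, starting-station) cards and handing one to each student:

import random

TRACKS = ["A", "B", "C", "D"]
N_STATIONS = 12
N_STUDENTS = 130

# One "envelope" per student: a (track, starting station) pair. Cycling through
# tracks and stations is an assumed scheme that keeps the four tracks running
# in parallel; the actual distribution of envelopes is not described in the text.
envelopes = [(TRACKS[i % len(TRACKS)], i % N_STATIONS + 1) for i in range(N_STUDENTS)]

random.seed(2010)          # fixed seed, for a reproducible illustration only
random.shuffle(envelopes)  # students then draw envelopes in arbitrary order

for student_id, (track, start_station) in enumerate(envelopes[:3], start=1):
    print(f"Student {student_id}: track {track}, starts at station {start_station}")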

Within the 5 minutes, students were required to perform a medical procedure, provide a differential diagnosis and appropriate management for a medical image, perform a physical examination and manage the patient. Five standardized patients (SPs) who had been trained for their roles participated in this study. SPs were trained in three 45-minute sessions and received written instructions one week prior to the examination. Nine expert observers were instructed to score examinees' performance using the checklists. At the end of the examination, students were asked to respond to a questionnaire about their perceptions of the OSCE.

Study instrument and procedure

A 16-item questionnaire covering several domains was designed to assess the characteristics of participants; the organization, content and structure of the examination; perceptions of OSCE validity and reliability; and the rating of the OSCE with respect to other assessment methods. A 5-point Likert scale representing degrees of agreement was used for most of the dimensions included in the questionnaire. Six faculty members assessed the content validity of the questionnaire. To assess its reliability, twenty-five volunteers who had previously passed the pre-internship OSCE completed the questionnaire twice, two weeks apart. The intraclass correlation coefficients of the questions ranged from 0.82 to 1.0, with a median of 0.91. Questionnaires were administered immediately after all students had finished the OSCE and before they left the examination venue. Students were asked to complete the questionnaire anonymously and on a voluntary basis. The study was approved by the ethics committee of Tehran University of Medical Sciences.
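For readers who want to reproduce this kind of test-retest check outside SPSS, the sketch below computes a single-measure, two-way random-effects intraclass correlation coefficient (Shrout and Fleiss ICC(2,1)) for one questionnaire item administered twice. It is a minimal illustration with made-up data; the function name and the toy responses are our own assumptions, not the study's data.

import numpy as np

def icc_2_1(scores):
    """ICC(2,1): single-measure, two-way random-effects, absolute agreement.
    `scores` is an (n_subjects x k_administrations) array for one item."""
    n, k = scores.shape
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    sess_means = scores.mean(axis=0)
    ss_subj = k * np.sum((subj_means - grand) ** 2)   # between-subject sum of squares
    ss_sess = n * np.sum((sess_means - grand) ** 2)   # between-administration sum of squares
    ss_err = np.sum((scores - grand) ** 2) - ss_subj - ss_sess
    ms_subj = ss_subj / (n - 1)
    ms_sess = ss_sess / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err + k * (ms_sess - ms_err) / n)

# Hypothetical data: 25 volunteers answer the same 5-point Likert item twice, two weeks apart.
rng = np.random.default_rng(0)
first = rng.integers(1, 6, size=25)
second = np.clip(first + rng.integers(-1, 2, size=25), 1, 5)
print(f"ICC(2,1) for this item: {icc_2_1(np.column_stack([first, second]).astype(float)):.2f}")

Applied item by item, a routine like this yields the kind of per-question coefficients (0.82-1.0 here) reported above.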

Statistical analysis

Descriptive statistics were used to examine the data. All statistical analyses were performed using the Statistical Package for the Social Sciences (SPSS) for Windows, version 11 (SPSS Inc., Chicago, IL, USA).
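The descriptive summaries behind Tables 1 and 2 are essentially per-item response frequencies. A minimal pandas sketch of that tabulation is shown below; the original analysis used SPSS, and the item names and responses here are hypothetical.

import pandas as pd

# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree),
# one row per respondent, one column per questionnaire item.
responses = pd.DataFrame({
    "osce_useful_learning": [5, 4, 4, 2, 5, 3, 4, 5, 1, 4],
    "prefer_osce_to_mcq":   [2, 1, 2, 3, 1, 2, 4, 1, 2, 1],
})

levels = [1, 2, 3, 4, 5]
summary = pd.DataFrame({
    item: responses[item].value_counts(normalize=True).reindex(levels, fill_value=0) * 100
    for item in responses.columns
}).round(1)
summary.index.name = "likert_level"
print(summary)  # percentage of respondents choosing each level, per item

Collapsing levels 4 and 5 into "agree" gives the kind of agreement percentages quoted in the Results.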


Results


Of the 130 students who participated in the OSCE, 112 returned the questionnaire, giving a response rate of 86.2%. The remaining 18 students declined to participate in the study for personal reasons. No significant difference was observed in age or gender between students who declined to participate and students who were enrolled in the study (P = 0.67 and 0.46, respectively). Of the 112 participants, 54 (41.5%) were male; 33 (29.5%) were enrolled in track A, 29 (25.9%) in track B, and 25 (22.3%) each in tracks C and D.

Quality of the OSCE

Data regarding students' perceptions about the quality of the OSCE are summarized in [Table 1]. About 60% of the students agreed that the sequence of stations was logical and appropriate. In addition, more than 65% of the students believed that the instructions were clear, and they felt appropriately informed about the exam. Over three-quarters of the students considered the OSCE a useful learning experience; and about one-half were satisfied with the organization and administration of the OSCE. However, more than 64% were not satisfied with the time allocation for each station.
Table 1: Students' perceptions of quality of the objective structured clinical examination (n=112)



Perceptions of validity and reliability

A significant proportion of students (61.6%) viewed the OSCE as an accurate measure of clinical skills. Further, 48.3% of the students felt that the required tasks were consistent with the curriculum they had actually been taught [Table 2]. A total of 61.6% agreed that a wide range of clinical skills was covered in the exam; however, approximately 67% of the students were concerned about the breadth of knowledge assessed. Approximately 60% of the students were satisfied with the SPs' role-playing. In contrast, more than one-half (55.4%) were concerned about fair judgment by the observers, and 58% of the students raised concerns that personality, ethnicity or gender may have affected their scores.
Table 2: Students' perceptions of objective structured clinical examination validity and reliability (n=112)



Pre-Internship OSCE versus multiple choice exams

A total of 81.4% of the students did not prefer the OSCE to the MCQ exams and 88.4% found the OSCE to be intimidating and more stressful than other assessment formats.


Discussion


A competent clinician must have both medical knowledge and clinical skills. For many years, medical schools relied on evaluating students solely on the basis of clinical knowledge, and clinical skills were rather neglected. Numerous studies attest to the lack of correspondence between the performance of high achievers in the classroom and their performance in the clinical setting. [9] These findings point to the need to incorporate examinations that also evaluate clinical skills into contemporary educational systems. The OSCE, as a form of performance-based assessment, has become a promising instrument for evaluating the clinical capabilities of undergraduate medical students as well as residents. [10],[11],[12],[13] Moreover, the OSCE has been shown to be a reliable and valid tool for assessing clinical competency in different settings. [4],[5],[6],[7],[8]

In this study, we assessed students' attitudes regarding the OSCE in detail. Our results suggest that students believe the OSCE provides a positive learning experience and helpful faculty feedback, and that they learned much in the process. Surprisingly, although the majority of students found the OSCE to be an accurate measure of clinical skills, they did not prefer it to MCQ exams. This may be due to the greater anxiety elicited by the OSCE, as documented in previous studies. [14],[15],[16] Although the students' clinical capabilities had been assessed via mini-OSCEs in some of their clinical clerkships, this OSCE was their first exposure to a comprehensive standardized practical assessment in their clinical education. Interestingly, previous studies have shown that the level of anxiety lessens slightly as the student progresses through the examination, further emphasizing the role of experience and exposure to the exam. [14],[15] Adding clinical skills courses to the medical school curricula and exposing students to this type of assessment throughout their education may serve to decrease anxiety about the OSCE and increase its acceptance among students. [14],[15],[17]

Further, serious concerns about observers' fairness, [18],[19] reflected in reservations about whether OSCE scores are influenced by the observer's point of view, may also play an important role in how students judge the assessment. Notably, a significant proportion of students felt that personality, ethnicity and gender affected their scores in this examination. In line with students' concerns, inter-rater and inter-SP variability can be major sources of bias, as documented by previous studies. [17],[19],[20],[21] Addressing these issues could increase students' acceptance of the exam; conceivably, these concerns contributed to the students' preference against the OSCE.

A majority of the students asserted that the expected tasks were fair and reasonable. The finding that an overwhelming proportion of the students agreed that the OSCE provided a useful and practical learning experience is consistent with previous reports from various biomedical education settings. [17],[18],[22],[23],[24],[25] Notably, it has been suggested that students gain the most valuable learning experience through the OSCE when it provides them with the opportunity to feel like a doctor rather than a student, to receive constructive feedback from faculty members, and to control their anxiety. [26] The content of an assessment can strongly influence students' learning strategies and their profile of strengths and weaknesses. [27] Communication and interpersonal skills, as well as ethical problem identification and resolution skills, may be assessed more effectively through a well-planned OSCE than through other testing methods. [12] Therefore, the OSCE could be adapted and used as a diagnostic tool to guide student learning.

Although Pierre et al. suggested that ambiguity of the questions or tasks may contribute to lower acceptance of the OSCE, [24] this did not appear to be the case in our study, since most students stated that the instructions had been clearly presented and the orientation was sufficient. Attending review classes and orientation sessions prior to the actual exam appeared to benefit the students by familiarizing them with the regulations and with what the examiners and examination staff expected of them.

Students did not believe that the sequence of stations or the different tracks might have affected their scores, in contrast to previous studies. [24] This suggests that the randomized allocation of students to tracks and starting stations made them feel more secure about the exam structure.

Students indicated that the time allocated for completing each task was insufficient. The large number of examinees and the time limitations of the exam required the assignment of 5 minutes to each station. Concerns about time allocation may have arisen from students' anxiety, since previous studies have demonstrated that the OSCE can be a strongly anxiety-producing experience. [14],[15] Alternatively, concerns about the time allocated per station and the degree of stress expressed by the students may have been due in part to inadequate preparation for the examination, particularly in competencies not previously assessed in more traditional examinations. [17],[24],[25]

Just over two-fifths of the students (40.2%) did not find the SPs realistic. [28],[29],[30] To improve this aspect of the exam, faculty can conduct additional training sessions for the SPs, and the entire OSCE could be videotaped and reviewed to identify pitfalls in the role-playing that may need to be corrected in future exams. [31] In this regard, the use of peer examiners or peer SPs may also be helpful, since previous studies indicate that involving students as examiners or SPs engages them in a useful activity and gives them more self-confidence in their clinical skills. [32],[33]

Comparing the results of the OSCE with the MCQ-based national comprehensive pre-internship examination and with students' grade point averages showed a moderate to high correlation between students' knowledge, as judged by written examinations, and their clinical competencies. [34] This suggests that the expected tasks in the OSCE are in line with the curriculum. Given the students' perceptions of discrepancies between the expected tasks and the curriculum, it would be beneficial to better inform students about the skills they are expected to learn during clerkships. It also seems paramount to give faculty members feedback on this subject and encourage them to place more emphasis on the competencies that students are expected to gain during their clinical education.

Our study has several limitations. First, it was conducted at a single center. Second, students' perceptions of the OSCE may have been influenced by the lack of confidence associated with the introduction of a new assessment, and responses may also have been affected by the timing of the inquiry (immediately after the examination); hence, students' stress and fatigue should be taken into consideration. However, the high response rate ensured that the views obtained were a reasonable representation of the students' perceptions.

Our study demonstrates that although the majority of students viewed the OSCE as a valuable tool for measuring clinical competencies, there were still concerns leading to its poor acceptance. These findings encourage policy-makers to increase students' early exposure to clinical evaluation settings so that they become familiar with such examinations. Furthermore, interventions such as improving examiners' scoring training and analyzing and reporting the internal consistency and validity of the scoring checklists might help increase the reliability of the OSCE and decrease students' concerns about inter-rater variability. Further studies are necessary to assess the effectiveness of these interventions in improving the acceptability of the OSCE.


Acknowledgments


The authors wish to acknowledge the staff of the Clinical Skills Development Center, who kindly facilitated this study, which was part of an MD-MPH thesis supported by Tehran University of Medical Sciences.

 
References

1. Turner JL, Dankoski ME. Objective structured clinical exams: A critical review. Fam Med 2008;40:574-8.
2. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
3. Jamali A, Tofangchiha S, Jamali R, Nedjat S, Jan D, Narimani A, et al. Medical students' health-related quality of life: Roles of social and behavioural factors. Med Educ 2013;47:1001-12.
4. Cohen R, Reznick RK, Taylor BR, Provan J, Rothman A. Reliability and validity of the objective structured clinical examination in assessing surgical residents. Am J Surg 1990;160:302-5.
5. Petrusa ER, Blackwell TA, Ainsworth MA. Reliability and validity of an objective structured clinical examination for assessing the clinical performance of residents. Arch Intern Med 1990;150:573-7.
6. Woodburn J, Nick S. The reliability, validity, and evaluation of the objective structured clinical examination in podiatry (chiropody). Assess Eval Higher Educ 1996;21:131-46.
7. Hodges B, Regehr G, Hanson M, McNaughton N. Validation of an objective structured clinical examination in psychiatry. Acad Med 1998;73:910-2.
8. Prislin MD, Fitzpatrick CF, Lie D, Giglio M, Radecki S, Lewis E. Use of an objective structured clinical examination in evaluating student performance. Fam Med 1998;30:338-44.
9. Gardner SF, Stowe CD, Hopkins DD. Comparison of traditional testing methods and standardized patient examinations for therapeutics. Am J Pharm Educ 2001;65:236-40.
10. Dupras DM, Li JT. Use of an objective structured clinical examination to determine clinical competence. Acad Med 1995;70:1029-34.
11. Sloan DA, Donnelly MB, Schwartz RW, Strodel WE. The Objective Structured Clinical Examination. The new gold standard for evaluating postgraduate clinical performance. Ann Surg 1995;222:735-42.
12. Sloan DA, Donnelly MB, Schwartz RW, Felts JL, Blue AV, Strodel WE. The use of objective structured clinical examination (OSCE) for evaluation and instruction in graduate medical education. J Surg Res 1996;63:225-30.
13. Jain SS, DeLisa JA, Nadler S, Kirshblum S, Banerjee SN, Eyles M, et al. One program's experience of OSCE vs. written board certification results: A pilot study. Am J Phys Med Rehabil 2000;79:462-7.
14. Brand HS, Schoonheim-Klein M. Is the OSCE more stressful? Examination anxiety and its consequences in different assessment methods in dental education. Eur J Dent Educ 2009;13:147-53.
15. Marshall G, NJ. A pilot study into the anxiety induced by various assessment methods. Radiography 2003;9:185-91.
16. Delavar MA, Salmalian H, Faramarzi M, Pasha H, Bakhtiari A, Nikpour M, et al. Using the objective structured clinical examinations in undergraduate midwifery students. J Med Life 2013;6:76-9.
17. Awaisu A, Mohamed MH, Al-Efan QA. Perception of pharmacy students in Malaysia on the use of objective structured clinical examinations to evaluate competence. Am J Pharm Educ 2007;71:118.
18. Duffield KE, Spencer JA. A survey of medical students' views about the purposes and fairness of assessment. Med Educ 2002;36:879-86.
19. Martin JA, Reznick RK, Rothman A, Tamblyn RM, Regehr G. Who should rate candidates in an objective structured clinical examination? Acad Med 1996;71:170-5.
20. LaMantia J, Rennie W, Risucci DA, Cydulka R, Spillane L, Graff L, et al. Interobserver variability among faculty in evaluations of residents' clinical skills. Acad Emerg Med 1999;6:38-44.
21. Furman G, Colliver JA, Galofre A. Effects of student gender and standardized-patient gender in a single case using a male and a female standardized patient. Acad Med 1993;68:301-3.
22. Hammad M, Oweis Y, Taha S, Hattar S, Madarati A, Kadim F. Students' opinions and attitudes after performing a dental OSCE for the first time: A Jordanian experience. J Dent Educ 2013;77:99-104.
23. Smith LJ, Price DA, Houston IB. Objective structured clinical examination compared with other forms of student assessment. Arch Dis Child 1984;59:1173-6.
24. Pierre RB, Wierenga A, Barton M, Branday JM, Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ 2004;4:22.
25. Awaisu A, Abd Rahman NS, Nik Mohamed MH, Bux Rahman Bux SH, Mohamed Nazar NI. Malaysian pharmacy students' assessment of an objective structured clinical examination (OSCE). Am J Pharm Educ 2010;74:34.
26. Allen R, Heard J, Savidge M, Bittergle J, Cantrell M, Huffmaster T. Surveying students' attitudes during the OSCE. Adv Health Sci Educ Theory Pract 1998;3:197-206.
27. Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ 2002;36:800-4.
28. Barrows HS. An overview of the uses of standardized patients for teaching and evaluating clinical skills. AAMC. Acad Med 1993;68:443-51.
29. Colliver JA, Swartz MH. Assessing clinical performance with standardized patients. JAMA 1997;278:790-1.
30. Adamo G. Simulated and standardized patients in OSCEs: Achievements and challenges 1992-2003. Med Teach 2003;25:262-70.
31. McCormick DP, Rassin GM, Stroup-Benham CA, Baldwin CD, Levine HG, Persaud DI, et al. Use of videotaping to evaluate pediatric resident performance of health supervision examinations of infants. Pediatrics 1993;92:116-20.
32. Burgess A, Clark T, Chapman R, Mellis C. Senior medical students as peer examiners in an OSCE. Med Teach 2013;35:58-62.
33. Burgess A, Clark T, Chapman R, Mellis C. Medical student experience as simulated patients in the OSCE. Clin Teach 2013;10:246-50.
34. Eftekhar H, Labaf A, Anvari P, Jamali A, Sheybaee-Moghaddam F. Association of the pre-internship objective structured clinical examination in final year medical students with comprehensive written examinations. Med Educ Online 2012;17.



 
 