LETTER TO THE EDITOR
Year : 2010  |  Volume : 23  |  Issue : 2  |  Page : 434

Rationale for Using OSCEs to Assess Student Competency in Evidence-based Medicine


Monash Institute of Health Services Research, Monash Medical Centre, Clayton, Victoria, Australia

Date of Submission: 15-Dec-2009
Date of Acceptance: 19-Jun-2010
Date of Web Publication: 16-Aug-2010

Correspondence Address:
D Ilic
Monash Institute of Health Services Research, 43-51 Kanooka Grove, Monash Medical Centre, Clayton VIC 3168
Australia

Source of Support: None, Conflict of Interest: None


PMID: 20853244


How to cite this article:
Ilic D. Rationale for Using OSCEs to Assess Student Competency in Evidence-based Medicine. Educ Health 2010;23:434

How to cite this URL:
Ilic D. Rationale for Using OSCEs to Assess Student Competency in Evidence-based Medicine. Educ Health [serial online] 2010 [cited 2020 Sep 30];23:434. Available from: http://www.educationforhealth.net/text.asp?2010/23/2/434/101486

Dear Editor,



Interest in Evidence-based Medicine (EBM) has grown significantly since its formulation in the early 1990s. University courses that integrate EBM now require medical students to be competent in its principles and to translate that competency into the clinical environment. Each EBM step taught requires mastery of a specific skill set to achieve competence (Table 1). The current literature focuses on written tools for assessing EBM competency, with little research into the effectiveness of the Objective Structured Clinical Examination (OSCE) for this purpose.



Assessing EBM competency requires a flexible assessment tool that can evaluate components of knowledge, skills and communication. The OSCE is an assessment format suited to testing practical skills that cannot otherwise be examined in a written format. It also provides an opportunity to assess such skills in an environment that best approximates the 'real-life' settings medical students will encounter in future clinical contexts.



Findings from several pilot studies have demonstrated the potential for using the OSCE to assess the EBM competency of medical students, with all reporting very good to excellent inter-rater reliability and construct validity for an EBM-specific OSCE station [1-5]. The OSCE provides sufficient time and facilities for students to demonstrate their ability to effectively search the medical literature on a specific medical topic (Steps 1 and 2 of EBM competency) [1]. Similarly, it has been used to assess the ability of medical students to appraise a journal abstract and communicate it to a simulated patient (Steps 3 and 4 of EBM competency) [2].



With usual OSCE station times ranging between 6 and 8 minutes, any EBM-specific OSCE should be run as a 'double' station to provide sufficient time to assess all aspects of EBM competency (Table 1). Training and resources are important issues to consider when designing and implementing OSCEs in assessments. The proposed EBM OSCE station would require both human and technological elements. A simulated patient would be required to provide the student with the clinical scenario (to assess Step 1 of the EBM process) and to interact with the student when discussing the implications of the evidence in practice (Step 4 of the EBM process).



Table 1:  Rationale for Assessing Student Competency in Evidence-based Medicine







A reliable computer would also be required to allow the student to search the medical literature (Step 2 of the EBM process). Access to online databases would further require a 'live' internet connection, which could be problematic. Use of 'static' databases, such as The Cochrane Library, would overcome such barriers, since it is available in both online and CD format. The same skill set (i.e. literature searching) can be assessed whether students search MEDLINE on the internet or the Cochrane Library on CD. Examiners need not be content experts, particularly for a topic such as EBM, since answers and processes are typically dichotomous in their assessment (i.e. either students can perform or communicate the task correctly or they cannot). However, assessors would require training in how to interact with students and simulated patients, and in how to maintain consistent evaluations of student performance across the examination period.



Assessing competency in EBM can be difficult because of the range of knowledge and cognitive skills that must be demonstrated. The OSCE provides examiners with an instrument that can assess students' EBM competency in an environment that best mimics their future clinical working environment. Using the OSCE as the assessment tool encourages students to adopt a deep, rather than surface, approach to their EBM learning, as the OSCE permits any aspect of EBM to be assessed. Further research is warranted to validate the use of OSCEs in assessing EBM competency with medical students.



Dragan Ilic, PhD

School of Public Health & Preventive Medicine

Monash Institute of Health Services Research

Clayton, Australia.



References




1. Burrows SC, Tylman V. Evaluating medical student searches of MEDLINE for evidence-based information: process and application of results. Bulletin of the Medical Library Association. 1999;87(4):471-476.

2. Bradley P, Humphris G. Assessing the ability of medical students to apply evidence in practice: the potential of the OSCE. Medical Education. 1999;33(11):815-817.

3. Fliegel JE, Frohna JG, Mangrulkar RS. A computer-based OSCE station to measure competence in evidence-based medicine skills in medical students. Academic Medicine. 2002;77(11):1157-1158.

4. Frohna JG, Gruppen LD, Fliegel JE, Mangrulkar RS. Development of an evaluation of medical student competence in evidence-based medicine using a computer-based OSCE station. Teaching and Learning in Medicine. 2006;18(3):267-272.

5. Tudiver F, Rose D, Banks B, Pfortmiller D. Reliability and validity of testing an evidence-based medicine OSCE station. Family Medicine. 2009;41:89-91.



