ORIGINAL ARTICLE
Year : 2007  |  Volume : 20  |  Issue : 3  |  Page : 125

Clinical Skills Training in a Skills Lab Compared with Skills Training in Internships: Comparison of Skills Development Curricula


1 Universiteit Antwerpen, Wilrijk, Belgium
2 Universiteit Maastricht, Maastricht, The Netherlands

Date of Submission: 28-Sep-2007
Date of Web Publication: 23-Nov-2007

Correspondence Address:
G Peeraer
Universiteitsplein 1 S, 2610 Wilrijk
Belgium

Source of Support: None, Conflict of Interest: None


PMID: 18080964

  Abstract 

Context: The necessity of learning skills through "integrated skills training" at the undergraduate level has been supported by several studies. The University of Antwerp implemented undergraduate skills training in its renewed curriculum in 1998, after it was demonstrated that Flemish students did not master their medical skills as well as Dutch students who received "integrated skills training" as part of their undergraduate medical course.
Aim: The aim of this study was to compare the skill outcome levels of two different student populations: students who had been trained in basic clinical skills mainly through clinical internships in year 7, and students who had learned these skills through an integrated longitudinal programme in a special learning environment in years 1-5, prior to their internship experience.
Study sample: Students of the traditional curriculum learned skills through a 75-hour programme of plenary sessions in years 4 and 5, followed by a 12-month period of internships during which skills could be further practised. We tested this group right after completion of their internships. Students from the renewed curriculum followed a 200-hour intensive small-group skills training programme offered in years 1-5. This group was tested before starting their internships.
Results: On global OSCE scores, renewed curriculum students had significantly higher overall scores (p<0.001) and they scored significantly higher at six of 15 stations. There was no significant difference at eight stations, while traditional curriculum students scored better at one station.
Discussion: Five years and 200 hours of integrated undergraduate skills training, measured by means of an OSCE, is a more effective method of learning basic clinical skills than 75 hours of traditional skills training reinforced during 12-month clinical internships.

Keywords: Undergraduate skills training, curriculum change


How to cite this article:
Peeraer G, Scherpbier A J, Remmen R, De Winter B Y, Hendrickx K, van Petegem P, Weyler J, Bossaert L. Clinical Skills Training in a Skills Lab Compared with Skills Training in Internships: Comparison of Skills Development Curricula. Educ Health 2007;20:125


Introduction



Curriculum changes often lead medical schools to implement skills training at the undergraduate level. One reason for this is that limited undergraduate skills training often resulted in junior doctors being required to perform skills during full-time internships for which they had not been prepared (Clack, 1994; Remmen, 2000; Sanson-Fisher, 2005). This may result in junior doctors underperforming, which can be stressful for them and is a potential risk for their patients. However, this underperformance is not always visible, as basic medical skills are only part of the entire internship experience (Williams et al., 1997; Bradley & Bligh, 2004).



Undergraduate skills training is not intended to replace clinical experience as the key mode of learning, but to prepare students for learning clinical skills in real practice (Bradley & Bligh, 1999; Bradley & Bligh, 2005). It is a medical educational reform that facilitates the learning of basic clinical skills in a setting other than clerkships, and it is understandable in view of changes in healthcare delivery and healthcare education (Bradley & Postlethwaite, 2003).



The undergraduate basic medical skills programme should be designed to support the intended learning outcomes and be integrated within the overall curriculum, including within the assessment strategy (Bradley & Postlethwaite, 2003). Simulation-based testing methods have been developed to meet the need for assessment procedures that are both authentic and well-structured (Schuwirth & van der Vleuten, 2003). At the “show how” level of Miller’s pyramid (1990), we find the objective structured clinical examination (OSCE), which combines the reality of live clinical interactions with the standardization of problems and the use of multiple observers (Harden & Gleeson, 1979; van der Vleuten & Swanson, 1990). The OSCE is considered one of the most reliable and valid measures of clinical performance ability currently available, with global ratings that are at least as reliable as checklist scores with respect to station-wise alpha (Regehr et al., 1999). In discriminating experts from novices, global ratings are more valid than checklist scores (Hodges, 2003).
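As a concrete reference point for the station-wise alpha mentioned above, the following sketch shows how Cronbach's alpha is conventionally computed from an examinees-by-stations score matrix. It is a minimal illustration with simulated scores, not code from the present study.

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for an (examinees x stations) score matrix:
        alpha = k/(k-1) * (1 - sum(per-station variances) / variance(totals))."""
        k = scores.shape[1]                         # number of stations (items)
        item_vars = scores.var(axis=0, ddof=1)      # variance of each station's scores
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of examinees' total scores
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Illustrative data only: 100 examinees, 15 stations, scores on a 1-10 scale.
    rng = np.random.default_rng(0)
    ability = rng.normal(7.0, 1.0, size=(100, 1))
    scores = np.clip(ability + rng.normal(0.0, 1.0, size=(100, 15)), 1.0, 10.0)
    print(round(cronbach_alpha(scores), 2))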



As educators, we will have to demonstrate that there actually is a pay-off from the reforms we advocate (Bradley & Postlethwaite, 2003). The educational benefit of skills training in a skills lab is considered unproven; there is no hard evidence that skills training pays off (Bradley & Bligh, 2004; Hart, 2004; Morrison, 2004; Payne, 2004; Williams & Lau, 2004). Therefore, we tested students who received intensive skills lab training embedded in a renewed curriculum and compared their test results to those of students who received only 75 hours of skills training without assessment and had to rely on clerkships to master all skills. Both groups had the same intended learning outcomes and used the same study material.



Method



Educational setting



The medical faculty of the University of Antwerp (UA) replaced its traditional lecture-based undergraduate curriculum with a new curriculum in 1998. The old curriculum lasted seven years and offered 75 hours of obligatory skills training in years 4 and 5, taught by clinical staff. The programme covered interpersonal skills, physical-diagnostic examination of heart, lungs and abdomen, technical-therapeutic skills, suturing techniques and cardiopulmonary resuscitation in year 4. Prescription of medicines, laboratory and technical tests, endotracheal intubation and defibrillation, eye and ear tests, and monitoring of general anaesthesia were taught in year 5. Dummies and audio-visual aids were used in plenary sessions. Afterwards, there was a theoretical assessment of skills, but no practical skills performance was tested. Bates’ Guide to Physical Examination and History-Taking was used as a manual for students. Before embarking on 12 months of internship in the second half of year 6, students were given a list of basic clinical skills and told that they were expected to show mastery of these skills by the end of their final year. There was no formal skills training programme during these internships; although it was the university’s aim to offer all students the opportunity to perform every skill at least once during their clerkships, students had to find opportunities to practise skills and obtain feedback from peers and senior doctors themselves. According to research data, however, clinical training did not meet these intended goals (Remmen, 1998).



The renewed curriculum, implemented in 1998, is a hybrid of traditional and integrated learning. Based on a thorough analysis of the weaknesses and strengths of the traditional programme, the clinical and communication skills programme was redesigned (Remmen et al., 1998). In their first year, students entering the new curriculum receive a list of the skills they must master upon graduation and a schedule indicating when the skills are taught and assessed. The skills list is based on “Blueprint 1994: Training of Doctors in the Netherlands”, which describes the final objectives of undergraduate medical education (Metz et al., 1994), and differs from the list used in the former curriculum only in structure.



The traditional skills programme was extended to 200 hours spanning the first five years of the curriculum. A new team of skills trainers (three full-time equivalents) was appointed to run the programme, collaborating with clinicians of different specialities. Not only does the new programme include all the basic clinical skills, but it also introduces important new features such as the integration of skills, small-group training, regular hands-on assessment (the introduction of the OSCE) and extensive feedback to students.



Table 1: Comparison between two curricula and students participating in each







Skills training is integrated into other curricular content. For instance, basic resuscitation, defibrillation, suturing techniques, first aid, immobilisation and taping are scheduled in the course on “Trauma and Resuscitation”.



Training sessions are held in the skills laboratory. Depending on the type of skill, group sizes vary from 2 to 15 students. Attendance is obligatory and recorded in students’ portfolios. Students can practise skills on each other, on models, on manikins and on standardised patients. Bates’ Guide to Physical Examination and History-Taking is used as a manual.



Assessment methods include self-reflection, assignments (included in portfolios), multiple choice tests and an OSCE in years 1, 3 and 5. In addition, students receive feedback on the early patient contacts in some of the modules.



At the end of each year, students receive feedback including comments on portfolio assignments and on OSCE-scores (including the scores and checklists of each station), as well as an official overall score. Students can make appointments with one of the trainers to receive extra information on their feedback and/or score.



Students



At the time of the study, the first cohort had completed the new five-year skills programme. We compared the skills of the cohort that had completed seven years of the old curriculum with those of the cohort that had completed five years of the new curriculum. That is, we compared the skills of students who had completed 75 hours of skills training (as described above) plus all internships with those of students who had completed the new 200-hour skills programme but had not yet started their internships (see Table 1).



The old curriculum group comprised 34 students (29.6%) out of a total of 115 students in year 7. These students were informed about the study and given the assurance that individual scores would be kept confidential. Participation was voluntary and students received a fee of €25. The sample was selected on the basis of gender, age and school grades from years 1-6; it was fully representative for age and gender, but its medical school grades were above the cohort average. As these students had no prior experience with OSCEs, we invited them to an information session one day before the test, which all but one student attended. The general nature and setting of the OSCE were explained: students were given information on the number of stations and on the participation of simulated patients, were shown one example of a station (not included in this study), and were allowed to ask questions.



The group from the new curriculum comprised 72 of the 85 regular students in year 5 (85%). The thirteen students who did not participate in the study were either ill or abroad for study-related purposes.



Instrument



To measure skills competencies, we used a 15-station OSCE (10 minutes per station) because of its validity for assessing basic clinical skills (Smee, 2003). The OSCE represented the basic disciplines of the seven-year curriculum, both new and former. In the new curriculum this OSCE is a barrier examination (Davis, 2003), which means that students have to pass it in order to progress to the clerkship year. They have to demonstrate mastery of the standards for basic clinical skills described in Bates; these are the same standards that were used in the former curriculum. The OSCE assesses nothing apart from mastery of these standards.



Students’ performance was rated by trained and experienced observers, who were not explicitly informed that test results would be used for this study. The observers assigned three global scores: completeness, systematic approach and proficiency. These global scores were given on a scale from 1 to 10 (≤5 = fail; 10 = excellent) and were described in the “instructions for observers” issued before the OSCE.



The three global scores were averaged to obtain one definite station score. The overall score was the mean score across the 15 stations. Mean group scores were calculated for the traditional and renewed curriculum groups, both overall and per station.
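Purely as an illustration of this aggregation (with hypothetical ratings, not the study's data), the calculation amounts to:

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical ratings for one student: 15 stations x 3 global criteria
    # (completeness, systematic approach, proficiency), each on the 1-10 scale.
    ratings = rng.integers(4, 11, size=(15, 3)).astype(float)

    station_scores = ratings.mean(axis=1)  # one definite score per station
    overall_score = station_scores.mean()  # mean across the 15 stations
    print(station_scores.round(1), round(overall_score, 2))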



Statistical analysis



Distributions of continuous data were tested for normality with the Kolmogorov-Smirnov test. Normally or symmetrically distributed data were summarised by means and standard deviations; skewed data were summarised by medians and inter-quartile ranges. Differences between groups were tested for statistical significance with the independent t-test when data were normally distributed, and otherwise with the Mann-Whitney U-test. We considered a p-value < 0.01 significant. SPSS 11.0 was used for the statistical analysis.
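The analysis itself was performed in SPSS 11.0. Purely to illustrate the decision rule described above, a minimal sketch in Python with SciPy, using simulated scores rather than the study data, might look as follows:

    import numpy as np
    from scipy import stats

    ALPHA = 0.01  # significance threshold used in the study

    rng = np.random.default_rng(1)
    # Simulated overall scores; group sizes match the two cohorts in the paper.
    new_curr = rng.normal(7.4, 0.8, size=72)
    old_curr = rng.normal(6.4, 0.9, size=34)

    def looks_normal(x):
        # Kolmogorov-Smirnov test against a normal distribution with the
        # sample's own mean and SD (one common way to apply the test).
        return stats.kstest(x, 'norm', args=(x.mean(), x.std(ddof=1))).pvalue > 0.05

    if looks_normal(new_curr) and looks_normal(old_curr):
        result = stats.ttest_ind(new_curr, old_curr)  # independent t-test
    else:
        result = stats.mannwhitneyu(new_curr, old_curr,
                                    alternative='two-sided')  # Mann-Whitney U

    print(f"p = {result.pvalue:.4f}; significant at p < {ALPHA}: {result.pvalue < ALPHA}")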



Results



OSCE scores and global scores



The cohort from the renewed curriculum had significantly (p<0.01) higher overall scores on completeness (new 7.4; old 6.4), systematic approach (new 7.4; old 6.3) and proficiency (new 7.5; old 6.6). There was also a significant difference in the final score (new 7.4; old 6.4) (see Table 2).



Table 2: Comparison of overall scores between students in traditional versus renewed curricula







Station scores



There were no significant differences between the two groups at nine stations: clinical examination of the nervous system, clinical examination of the abdomen, clinical examination of the vagina, communication skills, injection techniques, clinical examination of the ear, clinical examination of the eye, suturing techniques and psychiatry (see Table 3). New curriculum students had significantly higher final station scores at the following stations: basic life support, clinical examination of the locomotor system, microscopic urinary investigation, clinical cardiac examination, clinical examination of the lungs and clinical examination of the neonate.



The only station with results in favour of the old curriculum was the one on injection techniques, but the difference in final score was not statistically significant.



Table 3: Station final scores







Discussion



Despite having completed fewer years of the curriculum, the students who had completed the new integrated skills programme obtained significantly higher OSCE scores for overall basic clinical skills competence than the traditional curriculum students did. Our findings indicate that five years of integrated basic clinical skills training (without full-time internship experience) is more effective than a seven-year curriculum with only limited skills training and internships, when tested by means of an OSCE. Both groups of students had the same intended learning outcomes; they were expected to master a defined list of skills at the end of their training, according to internationally used standards.



The differences in favour of the new curriculum for basic life support, clinical examination of the locomotor system, microscopic urinary investigation, clinical examination of the lungs, clinical cardiac examination and clinical examination of the neonate could suggest that these skills receive more attention in skills lab training, resulting in higher OSCE scores than when they are trained during internships. The better results of the traditional curriculum students on injection techniques may be due to the extensive training of this skill during internships.



A plausible explanation for the better results of the new curriculum students is the expansion of the skills training programme from 75 hours in the old curriculum to 200 hours in the renewed programme; the fact that skills training is integrated into the modules; and the special learning environment created for the acquisition of basic medical skills. Another difference is the regular skills assessment in the new curriculum versus no formal skills assessment in the old one. In addition, new curriculum students are offered remediation when their skills competence is unsatisfactory. In the traditional curriculum, internships accounted for the bulk of skills training. Other studies have shown that clinical experience without training increases confidence, but not competence (Marteau, 1990; Bulstrode & Holsgrove, 1996).



A limitation of the study is that the OSCE tested only basic clinical skills. It did not include differential diagnosis, management decisions or other skills that students learn from internship experiences. Further research remains to be done on skills related to the doctor-patient encounter (Smee, 2003). A second limitation is that the new curriculum students were familiar with the OSCE, having taken one in year 3, whereas the other students knew the format only from the explanation given at the information session.



Overall, we are convinced that, compared with a combination of traditional skills training and internships, a curriculum improvement involving the integration and intensification of skills training, together with skills assessment, yielded higher basic clinical skills competence. Whether this improvement will persist during internships remains to be seen. We will have to monitor the development of students’ skills in the clinical setting, away from the skills laboratory. This is a topic of ongoing research.



Acknowledgments



This study was undertaken at the University of Antwerp, Belgium, and was funded by the Educational Committee of the Faculty of Medicine. The authors are grateful to the dean and all members of the Educational Committee for their support. The authors also wish to thank Mereke Gorsira of Maastricht University for critically reading and correcting the English manuscript.



References



BRADLEY, P. & BLIGH, J. (1999). One year’s experience with a clinical skills resource centre. Medical Education, 33, 114-120.



BRADLEY, P. & POSTLETHWAITE, K. (2003). Setting up a clinical skills learning facility. Medical Education, 37 (Suppl. 1), 6-13.



BRADLEY, P. & BLIGH, J. (2004). Setting up and running clinical skills learning programmes. Medical Education: The clinical teacher, 1(2), 53-58.



BRADLEY, P. & BLIGH, J. (2005). Clinical skills centres: where are we going? Medical Education, 39, 649-650.



BULSTRODE, C. & HOLSGROVE, G. (1996). Education for educating surgeons. British Medical Journal, 312, 326-327.



CLACK, G.B. (1994). Medical graduates evaluate the effectiveness of their education. Medical Education, 28, 418-431.



DAVIS, M. (2003). OSCE: The Dundee experience. Medical Teacher, 25(3), 255-261.



HARDEN, R.M. & GLEESON, F.A. (1979). Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 13, 41-54.



HART, J.T. (2004). Reform of undergraduate medical teaching in the United Kingdom: Unfunded reform always ends in reaction. British Medical Journal, 329, 799.



HODGES, B. (2003). Analytic global OSCE ratings are sensitive to level of training. Medical Education, 37, 1012-1016.



MARTEAU, T.M., WYNNE, G., KAYE, W., & EVANS, T.R. (1990). Resuscitation: experience without feedback increases confidence but not skill. British Medical Journal, 300, 849-850.



METZ, J.C.M., STOELINGA, G.B.A., PELS RIJCKEN-VAN ERP TAALMAN KIP, E.H., & VAN DEN BRAND-VALKENBURG, B.M.W. (1994). Blueprint 1994: Training of doctors in the Netherlands. Objectives of undergraduate medical education. Nijmegen: University Publication Office.



MILLER, G.E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65 (Suppl.), 63-67.



MORRISON, J. (2004). Reform of undergraduate medical teaching in the United Kingdom: Evidence base for problem based learning is growing. British Medical Journal, 329, 798-799.



PAYNE, J.D.R. (2004). Reform of undergraduate medical teaching in the United Kingdom: “Problem based learning” v. “traditional learning” is a false debate. British Medical Journal, 329, 799.



REGEHR, G., FREEMAN, R., ROBB, A., MISSIHA, N., & HEISEY, R. (1999). OSCE performance evaluations made by standardized patients: comparing checklist and global rating scores. Academic Medicine, 74 (10, Suppl.), S135-S137.



REMMEN, R., SCHERPBIER, A.J.J.A., DERESE, A., DENEKENS, J., VAN DER VLEUTEN, C., VAN ROYEN, P., & BOSSAERT, L. (1998). Unsatisfactory basic skills performance by students in traditional medical curricula. Medical Teacher, 20(6), 579-582.



REMMEN, R. (1998). A comparative study of the intended curriculum. In: An evaluation of clinical skills training at the medical school of the University of Antwerp. Antwerp: Remmen, 47-57.



REMMEN, R., DENEKENS, J., SCHERPBIER, A., HERMANN, I., VAN DER VLEUTEN, C., VAN ROYEN, P., & BOSSAERT, L. (2000). An evaluation study of the didactic quality of clerkships. Medical Education, 34, 460-464.



SANSON-FISHER, R.W., ROLFE, I.E., & WILLIAMS, N. (2005). Competency based teaching: the need for a new approach to teaching clinical skills in the undergraduate medical education course. Medical Teacher, 27(1), 29-36.



SCHUWIRTH, L.W.T., & VAN DER VLEUTEN, C. (2003). The use of clinical simulations in assessment. Medical Education 37 (Suppl.1), 65-71.



SMEE, S. (2003). Skill based assessment. British Medical Journal, 326, 703-706.



VAN DER VLEUTEN, C., & SWANSON, D.B. (1990). Assessment of clinical skills with standardised patients. Teaching and Learning in Medicine, 2(2), 58-76.



WILLIAMS, S., DALE, J., GLUCKSMAN, E., & WELLESLEY, A. (1997). Senior house officers’ work related stressors, psychological distress, and confidence in performing clinical tasks in accident and emergency: a questionnaire study. British Medical Journal, 314, 713-718.



WILLIAMS, G., & LAU, A. (2004). Reform of undergraduate medical teaching in the United Kingdom: a triumph of evangelism over common sense. British Medical Journal, 328, 92-94.




 
