Year: 2007 | Volume: 20 | Issue: 2 | Page: 88
J van Dalen
Associate Editor, Education for Health, The Netherlands
Date of Web Publication: 25-Jan-2013
Correspondence: J van Dalen, PO Box 616, 6200 MD Maastricht, The Netherlands
Source of Support: None, Conflict of Interest: None
How to cite this article: van Dalen J. Educ Health 2007;20:88.
Interest in the quality of communication is a constant presence in medical education. In the most recent issues of The Clinical Teacher, Medical Education and Teaching and Learning in Medicine, five articles address the subject in one way or another.
Austin et al. (2007) examined, among other things, the relation between empathy and academic performance (and found none). They did establish a small association between emotional intelligence and academic performance, which is a promising area for further research.
Van Nuland et al. (2007) compared two of the leading instruments for the assessment of communication skills, in order to assist in choosing between them.
Mazor et al. (2007) share the first findings from a new format for assessing communication skills: the video-based test of communication skills. With this instrument, live verbal reactions to patient vignettes are recorded, and the recordings are available for assessment at a later stage. The authors do not elaborate on other applications of this format, but I would assume that the instrument would also be interesting for teaching purposes.
Quilligan (2007) gave helpful suggestions about providing feedback in communication skills training. Although the application of these guidelines for feedback is certainly not limited to communication skills training, it is telling that the guidelines are described in that specific context. Apparently, communication skills training is an area of competence that is always under debate.
Finally, Silverman (2007) offers his reflections on the use of the most widely used guidelines for communication skills: the Calgary-Cambridge guides. Silverman is one of the developers of the Calgary-Cambridge guides, together with Kurtz, Benson and Draper. His reflections show wisdom, and some concern for those involved in training communication skills.
Let me borrow an analogy, addressed in two consecutive editorials of Advances in Health Sciences Education in 2003. In the first, Norman (2003) cautioned against overestimating skills training with models in a skills-lab setting: we train students in resuscitation on a plastic model and at a later stage we test them on the same model. We must be careful that students do not overestimate their resuscitation skills; they may only have mastered them on that same plastic model. Transfer of learning is very difficult to assess, but may well be very limited. In the next issue, McGaghie et al. (2003) gave a well-evidenced rebuttal, sharing the evidence available for the value of such training with simulators.
Following up on this, we can recognize Silverman's caution about occasional misuse of the Calgary-Cambridge guides for communication skills. Every now and then we must look at what we are doing in communication skills training, in order to get our feet back on the ground.
What is common practice in communication skills teaching in many places in the world? As course developers we start from the available literature - and there is an enormous amount of evidence for the value of patient-centered communication skills. We then deduce guidelines from this evidence and train students in following these guidelines. Without noticing, we have turned the world upside down. Instead of taking patients' satisfaction, compliance or recovery as outcome criteria, we focus on the marks that students reach on the instruments. Consequently, students focus on passing the test rather than on conducting a patient-centered consultation. These two are not always the same. The good instruments are obviously not designed or intended to give an absolute judgement; they should be seen and used as guides, or accumulations of examples of what a patient-centered consultation looks like. However, as so often, the instrument is likely to counteract what we want to achieve.
May I strongly recommend that everybody read the Silverman article, and use the available instruments for communication skills in the way they were originally intended to be used?
Jan van Dalen
Associate Editor Education for Health
AUSTIN, E.J., EVANS, P., MAGNUS, B. & O'HANLON, K. (2007). A preliminary study of empathy, emotional intelligence and examination performance in MBChB students. Medical Education, 41, 684-689.
MAZOR, K.M., HALEY, H-L., SULLIVAN, K. & QUIRK, M.E. (2007). The video-based test of communication skills: description, development and preliminary findings. Teaching and Learning in Medicine, 10(2), 162-167.
MCGAGHIE, W.C., ISSENBERG, S.B. & PETRUSA, E.R. (2003). Editorial: simulation - savior or Satan? A rebuttal. Advances in Health Sciences Education, 8, 97-103.
NORMAN, G. (2003). Editorial: simulation - savior or Satan? Advances in Health Sciences Education, 8, 1-3.
QUILLIGAN, S. (2007). Communication skills teaching: the challenge of giving feedback. The Clinical Teacher, 4, 100-105.
SILVERMAN, J. (2007). The Calgary-Cambridge guides: the 'teenage years'. The Clinical Teacher, 4, 87-93.
VAN NULAND, M., VAN DEN NOORTGATE, W., DEGRYSE, J. & GOEDHUYS, J. (2007). Comparison of two instruments for assessing communication skills in a general practice objective structured clinical examination. Medical Education, 41, 676-683.