ORIGINAL ARTICLE
Year : 2011  |  Volume : 24  |  Issue : 1  |  Page : 496

Bringing Explicit Insight into Cognitive Psychology Features during Clinical Reasoning Seminars: A Prospective, Controlled Study


University of Geneva, Faculty of Medicine, Geneva, Switzerland

Date of Submission: 30-May-2010
Date of Acceptance: 16-Dec-2010
Date of Web Publication: 29-Apr-2011

Correspondence Address:
M R Nendaz
Service of General Internal Medicine
Switzerland

Source of Support: None, Conflict of Interest: None


PMID: 21710417

  Abstract 

Context: The facets of reasoning competence that are influenced by explicit insight into cognitive psychology features during clinical reasoning seminars have not been specifically explored.
Objective: This prospective, controlled study, conducted at the University of Geneva Faculty of Medicine, Switzerland, assessed the impact on sixth-year medical students' patient work-up of case-based reasoning seminars, bringing them explicit insight into cognitive aspects of their reasoning.
Methods: Volunteer students registered for our three-month Internal Medicine elective were assigned to one of two training conditions: standard (control) or modified (intervention) case-based reasoning seminars. These seminars start with the patient's presenting complaint and the students must ask the tutor for additional clinical information to progress through case resolution. For this intervention, the tutors made each step explicit to students and encouraged self-reflection on their reasoning processes. At the end of their elective, students' performances were assessed through encounters with two standardized patients and chart write-ups.
Findings: Twenty-nine students participated, providing a total of 58 encounters. The overall differences in accuracy of the final diagnosis given to the patient at the end of the encounter (control 63% vs intervention 74%, p=0.53) and of the final diagnosis mentioned in the patient chart (61% vs 70%, p=0.58) were not statistically significant. The students in the intervention group significantly more often listed the correct diagnosis among the differential diagnoses in their charts (75% vs 97%, p=0.02).
Conclusion: This case-based clinical reasoning seminar intervention, designed to bring students insight into cognitive features of their reasoning, improved aspects of diagnostic competence.

Keywords: Bedside teaching, case-based learning, clinical reasoning, internal medicine, medical education, medical reasoning, precepting, problem-solving


How to cite this article:
Nendaz MR, Gut AM, Louis-Simonet M, Perrier A, Vu NV. Bringing Explicit Insight into Cognitive Psychology Features during Clinical Reasoning Seminars: A Prospective, Controlled Study. Educ Health 2011;24:496. Available from: https://www.educationforhealth.net/text.asp?2011/24/1/496/101459

Context

During the past 25 years, several studies in cognitive psychology have improved our understanding of the mechanisms involved in clinical reasoning [1-3]. In particular, they have demonstrated that clinicians' diagnostic accuracy is associated with characteristic features of a medical encounter, such as detailed inquiry about the chief complaint and frequent summarization of the collected information [4],[5]. The most important predictor of diagnostic accuracy, however, was the early generation and evaluation of relevant diagnostic hypotheses and their use to frame the collection of further relevant information from the patient [6-10].

Case-based seminars allowing for iterative hypothesis testing may be adapted as a teaching method to put into practice the understanding brought by studies in cognitive psychology [11]. This format not only trains students to evaluate different diagnostic hypotheses through an analytical approach but also allows them to take into account their initial, non-analytical, intuitive diagnostic impression, a condition reported to increase teaching efficacy [12-14]. However, further exploration is needed into which facets of clinical reasoning may be particularly influenced by the explicit use of these cognitive theories during teaching.

The purpose of this prospective, controlled study was to assess the impact on senior medical students of case-based reasoning seminars designed explicitly to bring them insight into cognitive aspects of their reasoning.

Methods

Setting and participants

The last year of the six-year medical curriculum at the University of Geneva Faculty of Medicine, Switzerland, consists of a 10-month clinical elective program. Each class comprises about 100 students. Our internal medicine elective admits successive groups of students for a two- or three-month clerkship rotation. Students registered for three-month rotations were eligible for the study.

From January 2005 to the end of April 2006, 39 students registered for four successive three-month elective rotations and were asked at the beginning of their elective to volunteer for the study. The students who declined participation attended the regular teaching provided during the clerkship, either with control groups or with students who were not eligible for the study (e.g. those who registered for a two-month elective rotation). At this point in their curriculum, all students were at the same stage of training and had attended the same mandatory clerkships. They were randomly distributed across the patient units of our in-patient department.

Intervention

During their elective in the division of internal medicine, students attend regular, weekly case-based clinical reasoning seminars conducted by faculty members [15],[16]. This learning method is derived from Kassirer's case-based seminars allowing for iterative hypothesis testing [11]. The seminars start with the patient's presenting complaint, and the students are instructed to ask the tutor for additional information about the patient's history and physical examination while justifying their requests and mentioning which diagnostic hypothesis would be tested. The tutor provides the requested information, and the process continues until a final working diagnostic hypothesis is reached. The case is frequently summarized between natural steps of the seminar, when the group moves from history to physical examination and from physical findings to ancillary tests.

For this study, we developed two types of case-based clinical reasoning seminars and trained the tutors accordingly. For the first type (control group), the tutors followed the usual steps described above. For the second type (intervention group), the tutors additionally made each step explicit to the students, encouraged them to self-reflect on their reasoning processes, related each step to the concepts described in the cognitive psychology literature and trained students to use these features. The tutors actively and explicitly provided guided feedback during case resolution by reinforcing the following processes: a) setting up a plan for the collection of information once the presenting complaint is exposed; b) characterizing each complaint (e.g. duration, characteristics, etc.); c) regularly summarizing the information at hand to enhance problem representation; d) generating early diagnostic hypotheses to be evaluated through a directed enquiry and using these hypotheses to frame the collection of further information (Figures 1 and 2). There is little evidence that the acquisition of generic thinking processes without the necessary contextual knowledge can lead to a transfer of competence from problem to problem [1],[2]. Hence, the case-related content knowledge was simultaneously addressed in all seminars.









Figure 1:  Characteristics of data collection and reasoning made explicit by the tutors to the students of the intervention group during the case-based clinical reasoning seminars







Figure 2:  Examples of verbatim interactions between tutor and students during a case-based reasoning seminar, making the process explicit to students and relating it to features of cognitive psychology.



The tutors involved in this study were all experienced physicians in general internal medicine and members of a pool of faculty members who had conducted case-based reasoning seminars for several years. Their yearly tutoring schedule had been established independently of the present study. The tutors scheduled to teach intervention groups received additional training from one investigator (MN) about the cognitive features of clinical reasoning and the way to relate them to the students' reasoning processes (Figures 1 and 2). They also received a written summary describing each step of the tutorial, with examples of questions to ask, and had a debriefing session with one investigator after the first tutorial to discuss potential difficulties. Four of them taught the intervention groups and five taught the control groups. Each tutor provided 10 to 12 seminars during each three-month elective rotation.

The four successive three-month student rotations were assigned to the control or the intervention groups according to a predefined plan designed to balance students’ acquired clinical experiences during the elective year (control-intervention-intervention-control).

Data Collection and Analysis

In the last week of their elective, students worked up two cases portrayed by standardized patients. At the end of each encounter, the students provided the patient with the final working diagnosis and proposed a management plan, wrote up a patient chart listing the differential diagnosis and patient management plan (triage, investigation and treatment), and completed a multiple-choice knowledge test about the cases they had encountered. Each encounter was videotaped and reviewed for a think-aloud stimulated recall during which the participants indicated the diagnostic hypotheses underlying their data collection. The stimulated recall sessions were audiotaped and transcribed verbatim for analysis.

Two assessment cases out of a set of four cases were randomly selected for each rotation. The presenting situations were abdominal distension, weight loss, headache and chronic diarrhoea. These complaints might have been encountered during the clerkship on the wards but were not directly addressed during the case-based reasoning seminars.

Primary outcomes consisted of the accuracy of the final diagnosis and of the differential diagnosis, respectively, during the encounter (as determined by the explanations to the patient and the stimulated recall) and in the patient chart. Secondary outcomes included the characteristics of the patient data collection that were observed on the videotapes or reported by the participant during the stimulated recall. These characteristics encompassed clarification of patient complaints, summarization of the information at hand, use of key diagnostic hypotheses to frame data collection and early testing of the final diagnosis. We assessed the relevance of the tested diagnostic hypotheses, the collected information and the management decisions, using as a gold standard their frequency of occurrence (0 to 1) in a group of expert physicians who had previously worked up the same cases [8],[17].
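The relevance scoring described above, in which each item is weighted by how often the gold-standard experts produced it, can be sketched as follows. The diagnoses and expert frequencies below are invented for illustration and are not the study's data.

```python
# Relevance scoring sketch: each diagnostic hypothesis is weighted by the
# fraction of gold-standard experts (0 to 1) who also raised it for the
# same case. All frequencies and hypothesis lists here are hypothetical.

expert_frequency = {
    "giant cell arteritis": 0.9,
    "migraine": 0.6,
    "tension headache": 0.4,
    "brain tumour": 0.2,
}

def relevance_score(student_hypotheses, expert_frequency):
    """Mean expert frequency over the student's listed hypotheses.

    Hypotheses that no expert raised contribute 0, pulling the score down."""
    if not student_hypotheses:
        return 0.0
    total = sum(expert_frequency.get(h, 0.0) for h in student_hypotheses)
    return total / len(student_hypotheses)

# A student listing mostly expert-endorsed hypotheses scores higher
print(round(relevance_score(["giant cell arteritis", "migraine"], expert_frequency), 2))  # 0.75
print(round(relevance_score(["sinusitis", "migraine"], expert_frequency), 2))             # 0.3
```

A differential containing only expert-endorsed hypotheses approaches 1, which matches the 0-to-1 relevance scale reported in Table 4.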

We used student’s t-tests with Bonferroni corrections as needed, to compare the continuous variables of the intervention and control groups and Chi-squared tests to assess categorical variables. Analyses of variance were conducted on the relevance scores of diagnoses and management decisions to take into account a potential effect of case difficulty (SPSS Inc, Chicago, version 14.0). We built models with relevance scores of diagnoses as dependent variable and case difficulty, training condition and their interaction as factors. We estimated that 14 students in each group allowed the detection of a difference in relevance scores of 8% (SD 0.08) with a power of .80 when using two-tailed t-tests with an alpha value of .05.

A complete ethical review was not required by our institution for this type of project; the study was approved by our Curriculum Committee and the Head of Internal Medicine.

Findings

Thirty-nine students were eligible for the study: 29 participated, 14 in the control group and 15 in the intervention group, providing 28 and 30 encounters, respectively. The students who declined participation (N=10) at the time of recruitment reported anticipated time constraints on the assessment day. The kappa value between two coders analyzing the characteristics of a random sample of 13 encounters was 0.90. Students' backgrounds were similar in both groups, as was their clinical knowledge related to the cases (Table 1).
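As a minimal sketch of the inter-rater agreement statistic reported here, Cohen's kappa can be computed directly from two coders' judgments. The codes below are hypothetical (binary presence/absence of a reasoning characteristic across 13 encounters, with one disagreement) and do not reproduce the study's kappa of 0.90.

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical codes:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(r1) == len(r2)
    n = len(r1)
    categories = set(r1) | set(r2)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected agreement if the two raters coded independently
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical codes for one reasoning characteristic across 13 encounters,
# with a single disagreement (index 11).
coder_a = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1]
coder_b = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1]
print(round(cohens_kappa(coder_a, coder_b), 2))  # 0.81
```

Kappa discounts the agreement expected by chance, which is why a 12/13 raw agreement yields a kappa noticeably below 0.92.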



Table 1:  Participants’ characteristics







Table 2 summarizes the primary outcomes regarding students' diagnostic performance. The differences in accuracy of the final diagnosis given to the patient (63% for the control group vs 74% for the intervention group, p=0.53) and of the final diagnosis reported in the chart (61% vs 70%, p=0.58) were not statistically significant. The students in the intervention group listed the correct diagnosis among the differential diagnostic hypotheses in the patient charts significantly more often (75% for the control group vs 97% for the intervention group, p=0.02).



Table 2:  Primary outcomes: Diagnostic performance according to the training received









As displayed in Table 3, the characteristics of the information items collected and of the diagnostic hypotheses evaluated during the encounters were similar in both groups, except that the control group summarized the available information more often during the encounters (2.68 vs 1.73 occurrences, p=0.03). The differences in relevance of the management decisions (tests, triage and treatment) recorded in the students' written charts were not statistically significant.



Table 3:  Secondary outcomes: characteristics of information items collected and diagnostic hypotheses evaluated







Case difficulty, as defined by the correctness of the final diagnosis, varied among students: the easiest cases were abdominal distension (92% correctness) and chronic diarrhoea (77%), while the most difficult were headache (59%) and weight loss (33%). We found the same pattern of difficulty among the gold-standard experts. For the more difficult cases, univariate analyses of variance showed that students in the intervention group performed better than those in the control group on the following variables: earlier exploration of the correct diagnosis during the encounter (headache case: after 6 questions vs after 42 questions, p=0.05), use of a diagnostic hypothesis to frame information collection from the patient (headache case: for 68% of the questions vs 50%, p=0.03), and the relevance of the differential diagnosis in the chart (weight loss case: relevance score 0.56 vs 0.83, p=0.04). Multivariate analyses of variance demonstrated that the relevance score of the differential diagnosis was significantly affected by case difficulty and training condition (Table 4). There was an interaction effect between case difficulty and training condition for the early exploration of the correct diagnosis (p=0.01) and for the relevance score of treatment decisions (p=0.04).



Table 4:  Relevance score (from 0 to 1) of the differential diagnosis in the chart according to training condition and cases







Discussion

The essence of our intervention was to bring students an explicit insight into their ongoing processes of clinical reasoning during case-based seminars and to encourage reflection at each step of the teaching approach described in Figures 1 and 2. This did not significantly affect their global diagnostic or decisional competencies but helped them increase the relevance of their differential diagnosis written in the post-encounter charts.

The analysis of the variables collected during the stimulated recall showed no differences between students at the time of the encounter. This reinforces the finding that the training did not improve the students' ability to test more relevant hypotheses during the encounter by collecting more relevant information. It did, however, help them select the most relevant hypotheses after the encounter, when they integrated the information gathered during the standardized patient encounter and wrote their differential diagnosis in the charts. Students did not necessarily select the correct hypothesis as their final diagnosis, which is not surprising at their stage of training and gives credence to existing assumptions that, while the selection of relevant diagnoses depends on the reasoning process, diagnostic accuracy requires further clinical experience, exposure to clinical cases and the integration of some elements of the clinical decision-making process [1],[18],[19].

The analyses by case showed that the students in the intervention group performed significantly better than the control group on the more difficult weight loss and headache cases. As the diagnoses related to these complaints (hyperthyroidism and giant cell arteritis, respectively) are more often encountered in an ambulatory setting and less frequently during our clerkship on the wards than the other cases used for this study, our intervention may prove particularly useful with more difficult cases.

There is a rational explanation for this observation. When students already have experience with certain cases, a non-analytical, more intuitive approach may take place first [2], making a teaching approach based on analytical steps less likely to influence the outcomes. On the other hand, such an intervention proved more influential with harder, ill-defined cases requiring a more elaborate analytical process.

Explicit reflective learning and the metacognition of learning processes constitute a model derived from cognitive psychology that entered medical education during the 1980s. More recently, this notion has emerged as a potentially important tool for expertise acquisition [20-22]. Its application in clinical teaching aims mainly at making doctors more aware of their underlying reasoning processes, in the hope that this may minimize errors.

In a recent study, Mamede et al. [23] showed that reflective practice had a positive effect on residents' diagnosis of complex, unusual cases. This may also explain why our intervention seemed more efficacious with more difficult cases. However, methods to enhance reflective practice among medical students still need to be further explored and validated in medical education [24]. The introduction of problem-based learning [25] has been one way to put this learning-oriented teaching model into practice at the curricular or course level. The use of portfolios is another attempt to enhance students' self-reflection and is under current investigation for learning and assessment [26],[27]. Reflective physical examination and reasoning represents a potential way of enhancing clinical competence [28],[29], but its real impact on learning outcomes remains open for further research. Contextual metacognition has been reported in a study in emergency medicine showing that experienced residents were sensitive to metacognitive approaches to understanding errors [30]. In a study aiming to teach communication and reasoning skills through an iterative reflective process, students performed better in integrating aspects of communication into their reasoning [31]. Our study adds a piece of evidence suggesting that the introduction of a contextual reflective approach leading to the metacognition of clinical reasoning processes related to specific problems may influence some aspects of clinical competence.

Students in the control group summarized the available information during the encounters more often than the students in the intervention group. Summarizing the data collected from the patient is a frequent step of the regular case-based seminars. It generally occurs between natural steps of the seminar, when the group moves from history to physical examination and from physical findings to ancillary tests. There is also usually a final summary at the end of the session. Each seminar thus provides two to three natural summary opportunities, which corresponds to the mean number of occurrences found in this study for the control group (2.68). The instructions given to the tutors of the intervention group included 'summarizing' as an explicit way to enhance problem representation, but it was not tied to a specific step of the seminar. The observed difference in the number of summarization occurrences between the groups might, therefore, not convey exactly the same meaning.

Summarizing probably reflected the natural steps of the regular seminar (two to three natural steps), while it represented a way to apply cognitive principles in the intervention group. Additionally, as the tutors had many tasks to manage at the same time, they might have focused on aspects other than systematically summarizing information at each step of the seminar.

Limitations

This study presents some limitations that may have hampered the impact of the intervention. First, the usual baseline performance of our students at the end-of-clerkship Objective Structured Clinical Examinations (OSCEs) is relatively high (mean class scores ranging from 79% to 80% from 2005 to 2007). This suggests that the baseline training program already confers a certain degree of clinical proficiency, potentially leaving little room for a dramatic change resulting from our intervention.

Second, our students come from a problem-based learning (PBL) curriculum extending into the clinical years, which may already nurture their ability to analyze content and process in group sessions. An earlier intervention in the curriculum, at the time of transition from the preclinical to the clinical years, might have produced larger effect sizes.

Third, as recently suggested by Mamede et al. [23], reflective practice may take its full effect only with more difficult clinical scenarios. Some of our cases were potentially not difficult enough, limiting the effect of our intervention; this is supported by our analysis showing a stronger effect of the intervention on more difficult cases. This finding may also be explained by the fact that a non-analytical, more intuitive approach could take place with easier cases, preventing an analytical teaching approach from showing a major effect on the outcomes.

As we did not collect data on students who declined participation in the study, a volunteer bias cannot be excluded. However, while this bias may have affected the type of students included in the study, it is unlikely to explain the differences observed between the control and intervention groups. Given the limited size of the student sample, a lack of power to detect other significant effects of the training cannot be excluded either. In addition, at the time of this study we had no data on tutor effectiveness, raising the possibility that the differences in student outcomes were due to more effective teachers in the intervention groups. However, as the tutors involved in this study belonged to a pool of faculty members who had conducted case-based reasoning seminars for several years, and as their yearly tutoring schedule had been established independently of the present study, it is unlikely that tutor effectiveness alone could explain our results. Moreover, the tutors trained for the intervention group may have applied the teaching recommendations with variable intensity, which could have lessened the impact of the approach. This limitation, however, lends even more credence to the effect found in this study.

Conclusion

In this study, increasing students' insight into the cognitive aspects and characteristics of their reasoning processes seemed to improve the relevance of the working diagnostic hypotheses selected after the encounter, at the time of case synthesis in the chart. To our knowledge, few studies, if any, have been able to capture the impact on specific components of clinical reasoning of an intervention grounded in cognitive psychology and applied in a real setting. Further research should try to better understand the reasoning process taking place at the time of patient chart writing.

Moreover, our results may have important clinical implications, since the quality of the differential diagnosis may influence the subsequent diagnostic work-up of the patient and the tests chosen to confirm or refute the diagnostic hypotheses. Whether this ability would impact the diagnostic work-up of patients, and bring better diagnostic accuracy and better management decisions with more complex cases, is open to further research.

Acknowledgements

The authors acknowledge the participation of students and tutors in this study.

The study was funded by the Swiss National Science Foundation (3200B0-102265), the Elie Safra foundation, Geneva University, the Department of Internal Medicine, University Hospitals, Geneva and the Gabriella Giorgi-Cavaglieri foundation, Geneva, Switzerland.

References

1. Eva KW. What every teacher needs to know about clinical reasoning. Medical Education. 2005; 39:98-106.

2. Norman G. Research in clinical reasoning: past history and current trends. Medical Education. 2005; 39:418-427.

3. Nendaz MR, Charlin B, LeBlanc V, Bordage G. Clinical reasoning: from research findings to applications for teaching. Pédagogie Médicale. 2005; 6:235-254.

4. Hasnain M, Bordage G, Connell KJ, Sinacore JM. History-taking behaviors associated with diagnostic competence of clerks: an exploratory study. Academic Medicine. 2001; 76:S14-17.

5. Nendaz MR, Gut AM, Perrier A, Louis-Simonet M, Reuille O, Junod AF, Vu NV. Common strategies in clinical data collection displayed by experienced clinician-teachers in internal medicine. Medical Teacher. 2005; 27:415-421.

6. Barrows HS, Norman GR, Neufeld VR, Feightner JW. The clinical reasoning of randomly selected physicians in general medical practice. Clinical and Investigative Medicine. 1982; 5:49-55.

7. Elstein AS, Shulman LS, Sprafka SA. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, MA: Harvard University Press; 1978.

8. Nendaz MR, Gut AM, Perrier A, Louis-Simonet M, Blondon-Choa K, Herrmann FR, et al. Beyond clinical experience: features of data collection and interpretation that contribute to diagnostic accuracy. Journal of General Internal Medicine. 2006; 21:1302-1305.

9. Neufeld VR, Norman GR, Barrows HS, Feightner JW. Clinical problem-solving by medical students: a longitudinal and cross-sectional analysis. Medical Education. 1981; 15:315-322.

10. Kassirer J, Gorry G. Clinical problem-solving: a behavioural analysis. Annals of Internal Medicine. 1978; 89:245-255.

11. Kassirer JP. Teaching clinical medicine by iterative hypothesis testing. Let's preach what we practice. New England Journal of Medicine. 1983; 309:921-923.

12. Ark TK, Brooks LR, Eva KW. Giving learners the best of both worlds: do clinical teachers need to guard against teaching pattern recognition to novices? Academic Medicine. 2006; 81:405-409.

13. Ark TK, Brooks LR, Eva KW. The benefits of flexibility: the pedagogical value of instructions to adopt multifaceted diagnostic reasoning strategies. Medical Education. 2007; 41:281-287.

14. Hatala RM, Brooks LR, Norman GR. Practice makes perfect: the critical role of mixed practice in the acquisition of ECG interpretation skills. Advances in Health Sciences Education. 2003; 8:17-26.

15. Chamberland M. Clinical reasoning learning (CRL) sessions. An example of a contextualized teaching activity adapted to clinical stages in medicine. Annales de Médecine Interne (Paris). 1998; 149:479-484.

16. Nendaz MR, Bordage G. Promoting diagnostic problem representation. Medical Education. 2002; 36:760-766.

17. Nendaz MR, Gut AM, Perrier A, Reuille O, Louis-Simonet M, Junod AF, Vu NV. Degree of concurrency among experts in data collection and diagnostic hypothesis generation during clinical encounters. Medical Education. 2004; 38:25-31.

18. Elstein AS. Thinking about diagnostic thinking: a 30-year perspective. Advances in Health Sciences Education. 2009; 14 Suppl 1:7-18.

19. Elstein AS, Schwarz A. Clinical problem solving and diagnostic decision making: selective review of the cognitive literature. British Medical Journal. 2002; 324:729-732.

20. Epstein RM. Mindful practice. Journal of the American Medical Association. 1999; 282:833-839.

21. ten Cate O, Snell L, Mann K, Vermunt J. Orienting teaching toward the learning process. Academic Medicine. 2004; 79:219-228.

22. Mamede S, Schmidt HG. The structure of reflective practice in medicine. Medical Education. 2004; 38:1302-1308.

23. Mamede S, Schmidt HG, Penaforte JC. Effects of reflective practice on the accuracy of medical diagnoses. Medical Education. 2008; 42:468-475.

24. Driessen E, van Tartwijk J, Dornan T. The self critical doctor: helping students become more reflective. British Medical Journal. 2008; 336:827-830.

25. Barrows HS. Problem-based, self-directed learning. Journal of the American Medical Association. 1983; 250:3077-3080.

26. Davis MH, Friedman Ben-David M, Harden RM, Howie P, Ker J, McGhee C, Pippard MJ, Snadden D. Portfolio assessment in medical students' final examinations. Medical Teacher. 2001; 23:357-366.

27. Grant AJ, Vermunt JD, Kinnersley P, Houston H. Exploring students' perceptions on the use of significant event analysis, as part of a portfolio assessment process in general practice, as a tool for learning how to use reflection in learning. BMC Medical Education. 2007; 7:5.

28. Benbassat J, Baumal R, Heyman SN, Brezis M. Viewpoint: suggestions for a shift in teaching clinical skills to medical students: the reflective clinical examination. Academic Medicine. 2005; 80:1121-1126.

29. Yudkowsky R, Otaki J, Lowenstein T, Riddle J, Nishigori H, Bordage G. A hypothesis-driven physical examination learning and assessment procedure for medical students: initial validity evidence. Medical Education. 2009; 43:729-740.

30. Bond WF, Deitrick LM, Arnold DC, Kostenbader M, Barr GC, Kimmel SR, Worrilow CC. Using simulation to instruct emergency medicine residents in cognitive forcing strategies. Academic Medicine. 2004; 79:438-446.

31. Windish DM, Price EG, Clever SL, Magaziner JL, Thomas PA. Teaching medical students the important connection between communication and clinical reasoning. Journal of General Internal Medicine. 2005; 20:1108-1113.






 
