ORIGINAL RESEARCH ARTICLE
Year: 2017 | Volume: 30 | Issue: 3 | Page: 193-197
Measuring situation awareness in medical education objective structured clinical examination guides
Margaret Frere, John Tepper, Markus Fischer, Kieran Kennedy, Thomas Kropmans
Medical Informatics and Medical Education, School of Medicine, National University of Ireland, Galway, Ireland
Correspondence Address:
Ms. Margaret Frere, School of Medicine, National University of Ireland, Galway, Ireland
Abstract
Background: Medical errors are among the most prevalent and serious adverse events in health care. Lack of situation awareness (SA) is an important factor leading to such errors. SA can be understood using Endsley's three-tier model: Level 1 is perception, Level 2 is comprehension, and Level 3 is projection. While there is extensive literature on the theory of SA, it is difficult to measure and quantify. The purpose of this pilot study was to measure, identify, and characterize SA in several medical objective structured clinical examination (OSCE) guides, including a 1st year National University of Ireland, Galway (NUIG) OSCE. Methods: Two independent observers analyzed two online OSCE guides and a 1st year OSCE examination using a self-developed tool. This tool was an inferential measure of SA. The guides were first qualitatively analyzed using NVivo and then quantitatively analyzed using Excel. Results: The results indicated strong internal validity and moderate inter-rater reliability. There was little statistically significant variance between the observers. The NUIG OSCE had the fewest relative observations of SA, and the Geeky Medics OSCE guide had the most. In all guides, Level 1 SA was observed more frequently than Level 2 or 3 SA. Discussion: SA is an important factor in clinical decision-making and patient safety. The challenging aspect is how best to teach and assess SA in medical education. Simulations, such as formative and/or summative OSCEs, are considered a valuable and safe way to do so. Inter-rater reliability can be improved using tool training sessions.
How to cite this article:
Frere M, Tepper J, Fischer M, Kennedy K, Kropmans T. Measuring situation awareness in medical education objective structured clinical examination guides.Educ Health 2017;30:193-197
Background
Medical errors are among the most prevalent and serious adverse events in health care. According to the World Health Organization, these errors account for approximately 10% of hospitalizations in Europe.[1] Lack of situation awareness (SA) is an important factor leading to poor clinical decision-making and medical errors.[2],[3] Improving SA through training and testing in simulation could potentially reduce errors, and ultimately lead to fewer deaths.[4]
In simple terms, SA is “knowing what is going on” in a situation.[5] It can be more clearly understood as “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.”[5] Using this definition, SA can be understood through Endsley's three-tier model: Level 1 is perception, Level 2 is comprehension, and Level 3 is projection.[5] Perception involves recognizing cues relevant to the environment, comprehension requires integrating cues from Level 1, and projection involves extrapolating information from Levels 1 and 2 and analyzing how this information may impact future events.[2],[6] A high degree of SA is crucial for health-care students, who will be required to make decisions in complex, unpredictable, and demanding situations.[7]
Nontechnical skills, such as decision-making and SA, are explicitly addressed in the aviation industry.[8] Health-care professional education appears to lag behind, even though these skills are essential for the provision of care and patient safety.[3],[7] Many strategies exist to develop and maintain SA, including proactively seeking and managing information, using checklists, and avoiding attention blindness.[3] Applied in health-care student education, these strategies could substantially improve students' SA and thereby reduce errors.
Students often feel unprepared entering clinical practice. Teaching and assessment of SA using clinical scenarios, such as formative objective structured clinical examinations (OSCEs), is believed to better prepare students for the transition from education to clinic.[7] Despite the benefit OSCEs provide and medical association recommendations for nontechnical skill development in health-care education, there is little literature on this topic.[7] Further, SA is difficult to measure and quantify. SA can be measured during simulation, such as OSCEs, in health-care education.[7],[9] Both direct and indirect methods may be used to measure SA: direct measurements employ in-test probes or self-rating assessments, whereas indirect measurements infer SA from test performance.[6] OSCEs evaluate many aspects of students' clinical competence, including SA, and many medical schools have incorporated OSCEs to better develop SA in their students.[9]
The purpose of this pilot study was to determine the validity and inter-rater reliability of an SA assessment tool and to characterize the degree of SA present in several medical student OSCE guides. A 1st year OSCE examination at the National University of Ireland, Galway (NUIG), was compared to two free, widely utilized OSCE study guides available online: Geeky Medics and OSCE Skills.
Methods
A 1st year OSCE examination was obtained from the NUIG School of Medicine. An internet search was conducted to identify freely available OSCE guides, specifically those that prepare medical students for their OSCE examinations, to compare to the NUIG examination. The two most comprehensive guides identified were from OSCE Skills and Geeky Medics.[10],[11] Both guides are mentioned across various online medical school forums, so we inferred that they are widely utilized by medical students internationally.[12] The guides were composed of a number of stations categorized into various medical specialties. Each station described the steps, actions, and considerations necessary for students to adequately perform the clinical scenario.
The guides and examination were uploaded into NVivo 10.2.2 (QSR International, Melbourne, Australia), and each station was qualitatively analyzed for the presence of SA using a self-developed tool. The tool was developed using Endsley's model of SA; it is constructed from specific tasks/goals assigned to each of the three levels of SA [Figure 1]. The tool is an inferential measure of SA, meaning that the presence of SA is determined based on performance in the clinical scenario: if a station guide listed a task from the tool, the guide was said to possess that level of SA. Where there were multiple instances of a particular task in one station, it was recorded only once. Further, a single observation of SA in a station was recorded as one observation of SA in the tool, regardless of the length of text coded in NVivo. Multiple observers analyzed the guides and examination using this tool.{Figure 1}
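For readers who wish to reproduce the coding rules above programmatically, the following minimal Python sketch illustrates them with hypothetical stations and tasks; the actual analysis was performed in NVivo, and none of the station names or task wordings below are drawn from the real tool.

from collections import defaultdict

# Hypothetical coded instances exported from NVivo as (station, SA level, tool task).
# Station names and tasks are illustrative only; they are not taken from the actual tool.
raw_codes = [
    ("Cardiology station 1", 1, "identifies relevant vital signs"),
    ("Cardiology station 1", 1, "identifies relevant vital signs"),  # repeated task: counted once
    ("Cardiology station 1", 2, "interprets findings in clinical context"),
    ("Respiratory station 2", 1, "identifies relevant vital signs"),
]

# Rule 1: multiple instances of the same task within one station are recorded only once.
unique_observations = {(station, level, task) for station, level, task in raw_codes}

# Rule 2: a station either exhibits a given SA level or it does not, regardless of how
# many tasks (or how much coded text) supported that level.
levels_per_station = defaultdict(set)
for station, level, _task in unique_observations:
    levels_per_station[station].add(level)

stations = sorted(levels_per_station)
for level in (1, 2, 3):
    n = sum(level in levels_per_station[s] for s in stations)
    print(f"Level {level} SA: {n}/{len(stations)} stations ({100 * n / len(stations):.1f}%)")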
The qualitative results were then quantitatively analyzed using Microsoft Excel. For descriptive statistics, we used frequency and measures of central tendency. For inferential statistics, we used Kruskal–Wallis for variance analysis, Cohen's kappa for inter-rater reliability, and Cronbach's alpha for internal validity.
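For illustration, the following minimal Python sketch shows how these inferential statistics could be computed on hypothetical binary codes (1 = SA level observed in a station, 0 = not observed) from two raters; the study's actual calculations were performed in Microsoft Excel, and the Cronbach's alpha helper uses the standard observations-by-raters formulation.

import numpy as np
from scipy.stats import kruskal
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-station codes from two raters; the real data came from the NVivo coding above.
rater_a = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
rater_b = np.array([1, 0, 1, 0, 0, 1, 1, 1, 1, 0])

# Kruskal-Wallis H test: do the raters' observations differ in their mean ranks?
h_stat, p_value = kruskal(rater_a, rater_b)
print(f"Kruskal-Wallis H = {h_stat:.3f}, p = {p_value:.3f}")

# Cohen's kappa: chance-corrected inter-rater agreement.
print(f"Cohen's kappa = {cohen_kappa_score(rater_a, rater_b):.3f}")

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (observations x raters) matrix of scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                                # number of raters treated as items
    item_variances = scores.var(axis=0, ddof=1).sum()  # sum of per-rater variances
    total_variance = scores.sum(axis=1).var(ddof=1)    # variance of the summed scores
    return k / (k - 1) * (1 - item_variances / total_variance)

print(f"Cronbach's alpha = {cronbach_alpha(np.column_stack([rater_a, rater_b])):.3f}")

Under the commonly used Landis and Koch convention, kappa values between 0.41 and 0.60 are read as moderate agreement, which is how the coefficients reported in the Results are described.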
Results
The OSCE Skills guide and the Geeky Medics guide consisted of 33 stations in nine medical specialties and 39 stations in ten medical specialties, respectively. The NUIG examination was considerably less comprehensive, consisting of only five stations, which were not categorized into any specific specialties. The specialties included in the OSCE Skills guide and the Geeky Medics guide were cardiology, endocrinology, gastroenterology, neurology, obstetrics and gynecology, orthopedics, otorhinolaryngology, pediatrics, psychiatry, pulmonology, urology, and an “other” category.
The NUIG OSCE examination had fewer observations of SA, both absolutely and relatively, than the two free OSCE guides [Figure 2]. On average, 45% of its stations exhibited Level 1 SA, 18.9% exhibited Level 2 SA, and 21.7% exhibited Level 3 SA, with standard deviations of 5%, 3.8%, and 12.6%, respectively.{Figure 2}
[Figure 2] compares the mean number of stations that demonstrated each level of SA within each OSCE guide and the examination. The Geeky Medics guide exhibited the highest degree of SA at every level: 56.5% of its stations exhibited Level 1 SA, 37.5% exhibited Level 2 SA, and 34.3% exhibited Level 3 SA, with standard deviations of 5.6%, 4.7%, and 13.4%, respectively. The OSCE Skills guide was intermediate for Levels 1 and 2 SA but had the lowest mean observations of Level 3 SA: 54.5% of its stations exhibited Level 1 SA, 24.2% exhibited Level 2 SA, and 6.2% exhibited Level 3 SA, with standard deviations of 12.1%, 11.3%, and 2.5%, respectively.
The Geeky Medics and OSCE Skills guides were further analyzed by medical specialty, comparing the mean number of stations demonstrating each level of SA [Figure 3] and [Figure 4]. In the OSCE Skills guide, all specialties demonstrated Level 1 SA, ranging between 43.8% and 75% of stations. Level 2 SA was observed less frequently than Level 1, and Level 3 was observed least frequently; no Level 3 SA was observed in four specialties: neurology, orthopedics, psychiatry, and pulmonology. Similar results were found for the Geeky Medics guide. All specialties demonstrated Level 1 SA, ranging between 42.5% and 62.5% of stations. Again, Level 2 SA was observed less frequently than Level 1, and Level 3 was observed least frequently, except in cardiology, neurology, and pulmonology, where Level 3 was observed more frequently than Level 2.{Figure 3}{Figure 4}
Cronbach's alpha was used to determine the consistency between the raters and their observations; we considered a Cronbach's alpha coefficient of ≥0.7 acceptable. The results demonstrated strong internal validity at each level of SA measured in the OSCE Skills and Geeky Medics guides [Table 1]. The Kruskal–Wallis test was used to determine whether there was a difference in the medians of each rater's observations of SA at each level; it was chosen because the assumptions of ANOVA were not met. The null hypothesis stated that there is no statistically significant difference in the mean ranks, with P < 0.05 considered statistically significant. The results showed no significant difference between the raters, except in Level 3 of the OSCE Skills guide, where a statistically significant difference was observed (P = 0.031) [Table 2]. Further, Cohen's kappa was used to determine inter-rater agreement; the results showed moderate inter-rater reliability: κ = 0.476 for the Geeky Medics guide and κ = 0.505 for the OSCE Skills guide.{Table 1}{Table 2}
Discussion
SA, or the ability to identify, comprehend, and make predictions about critical information as it affects both the health-care team and the patient, is clearly an important factor in clinical decision-making and ultimately patient safety. Decreasing medical errors by enhancing SA can improve patient outcomes. The challenging aspect is how best to teach and assess SA in health-care education. Simulations, such as formative or summative OSCEs, are considered a valuable and safe way to teach SA.[9],[13] Using simulation during teaching has been shown to increase cognitive function and reduce errors in medicine,[13] as well as improve active learning.[9] Evaluation of SA is more challenging. This study used a self-developed tool to assess SA in several OSCE guides, and a difference in the degree of SA between the guides was indeed found.
It is difficult to determine which guide was “best.” We considered the guide with the most observations of SA at the highest level to be the most successful in applying SA to its clinical scenarios. In this respect, the NUIG OSCE appears inferior to the freely available OSCE guides on the internet. However, we do not consider this an accurate representation of the teaching institution and its performance, as higher-year examinations could not be accessed at the time of study commencement. Geeky Medics was the superior OSCE guide; it possessed the most observations of SA at every level and therefore appears to prepare students best.
We believed the best OSCE guide was the one that had the greatest frequency of observations across all SA levels, because each level requires a different degree of mental processing. When information is perceived, it is held in working memory in the absence of relevant long-term memory stores or other similar mechanisms.[5] This is Level 1 SA, and it involves responding to the input of relevant data. Level 2 SA, by comparison, requires new information to be taken into working memory and combined with existing knowledge, resulting in the recognition of significant data and the generation of a holistic picture of the situation in one's mind. One step above that is Level 3 SA, which requires taking this composite picture of the new information and using higher processing centers to generate an accurate plan in a timely fashion.[5] Some consider the three SA levels hierarchical, meaning that Level 1 SA is required to develop Level 2 SA and Level 2 SA is required to develop Level 3 SA. This is not necessarily the case: Endsley stated that SA can be a linear, bottom-up process, but it can also be a top-down, goal-driven process, which is explored later.[5] In our view, it was not surprising that the most frequently observed level of SA was Level 1 and the least frequently observed was Level 3, because Level 1 SA requires only reactions to working memory inputs and is the least mentally taxing. Notwithstanding, this observation could also be explained by potential limitations of the tool in assessing higher-level SA, or by subjectivity in the raters.
It was intriguing to see certain specialties with a greater degree of SA than others. SA is often studied in anesthesiology because it is a dynamic medical specialty in which substantial and rapid changes occur.[2],[14] The specialties that exhibited the highest degree of SA in this study were neurology, otorhinolaryngology, pediatrics, and pulmonology. This was somewhat predictable, as these specialties are similarly dynamic, perhaps more so in pediatrics and otorhinolaryngology, as both involve surgical care. Operating rooms are complex environments with numerous people communicating and different tools being used at the same time, thus requiring a high degree of SA.[14] While all medical specialties require SA, it is perhaps of increased importance in the above-listed specialties, as reflected in the results of this study.
Our assessment tool demonstrated strong internal validity but moderate inter-rater reliability. This suggests that SA can indeed be assessed in OSCE guides, but there may be inherent subjectivity in the tool. It is possible to improve inter-rater reliability and increase the objectivity of the tool with training sessions to optimize its use. Another limitation of the tool is that it is an indirect measure of SA. Indirect measures infer SA, whereas direct measures are employed during simulation and are perhaps better indicators of SA because they can explore an individual's thought process through in-test probes.[13] A direct method could more accurately predict SA performance, especially at higher levels of SA. However, using a direct measure was beyond the scope of this study, as we looked only at the guides and marking schemes for simulations, not the implementation of simulations. Ultimately, there is inherent difficulty in measuring SA; one study that compared measures of SA using reliability and validity testing showed limited correlation, similar to our own study.[13] This is an exploratory study, and further training and validity testing are required to better understand to what extent the tool can accurately measure SA.
While there were limitations to the tool, there were also strengths, particularly because the tool was developed using Endsley's model of SA. The model is broad enough to be applied across multiple industries, including health care and the many fields within health care.[15] As previously mentioned, the model is not unidirectional; it can be understood as a forward mechanism in decision-making processes or a backward mechanism in goal-driven processes.[14] Finally, new situations do not have to be the same as previous situations to employ SA. SA is a nontechnical skill developed over time that allows cues to be recognized regardless of the circumstances.[5] The model represents a dynamic cycle of collecting, interpreting, and predicting information in any condition, which is why it was used to develop the tool for assessing SA in this study.
Conclusion
SA has been extensively studied over the past two decades and has recently become a focus in medicine. Literature analyzing the teaching and assessment of SA in medical education is still sparse. This study has shown that formative and summative OSCEs may be able to predict performance of Level 1 SA, but are less able to predict Levels 2 and 3 SA. Regardless, it is clear that SA is critical for sound clinical judgment and the prevention of medical errors. To improve SA in clinical settings, it is necessary to include SA training in education. SA is taught and assessed using simulations, such as the OSCE, but it is inherently difficult to measure, and many methods of SA measurement have been proposed, including the one in this study. Consensus on which method of measurement is best to appraise these clinical scenarios, and indeed medical curricula, has not yet been achieved. More research is required to better understand SA and its measurement in medical training.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
1. World Health Organization. Patient Safety: Data and Statistics. Geneva: WHO; 2014. Available from: http://www.euro.who.int/en/health-topics/Health-systems/patient-safety/data-and-statistics. [Cited on 2016 Aug 09].
2. Schulz CM, Krautheim V, Hackemann A, Kreuzer M, Kochs EF, Wagner KJ. Situation awareness errors in anesthesia and critical care in 200 cases of a critical incident reporting system. BMC Anesthesiol 2016;16:4.
3. Edozien LC. Situational awareness and its application in the delivery suite. Obstet Gynecol 2015;125:65-9.
4. Green B, Tsiroyannis C, Brennan PA. Human factors – Recognising and minimising errors in our day to day practice. Oral Dis 2016;22:19-22.
5. Endsley MR. Toward a theory of situation awareness in dynamic systems. Hum Factors 1995;37:32-64.
6. Salmon PM, Stanton NA, Walker GH, Jenkins D, Ladva D, Rafferty L, et al. Measuring situation awareness in complex systems: Comparison of measures study. Int J Ind Ergon 2009;39:490-500.
7. Sandars J, Bax N, Mayer D, Wass V, Vickers R. Educating undergraduate medical students about patient safety: Priority areas for curriculum development. Med Teach 2007;29:60-1.
8. Endsley M. Situational awareness in aviation systems. In: Garland D, Wise J, Hopkin V, editors. Handbook of Aviation Human Factors. New Jersey, USA: Lawrence Erlbaum Associates; 1999. p. 257-76.
9. Akaike M, Fukutomi M, Nagamune M, Fujimoto A, Tsuji A, Ishida K, et al. Simulation-based medical education in clinical skills laboratory. J Med Invest 2012;59:28-35.
10. OSCE Skills. OSCE Station Categories; c2016. Available from: http://www.osceskills.com. [Cited on 2016 Sep 02].
11. Geeky Medics. Clinical Skills (OSCE); c2016. Available from: http://www.geekymedics.com/category/osce/. [Cited on 2016 Sep 02].
12. Geeky Medics – Awesome Revision Site with Video Guides (Great for OSCEs). Reddit. Available from: https://www.reddit.com/r/medicalschool/comments/1m1c64/geeky_medics_awesome_revision_site_with_video/. [Last retrieved on 2017 Jun 15].
13. Lowe DJ, Ireland AJ, Ross A, Ker J. Exploring situational awareness in emergency medicine: Developing a shared mental model to enhance training and assessment. Postgrad Med J 2016. pii: postgradmedj-2015-133772.
14. Parush A, Kramer C, Foster-Hunt T, Momtahan K, Hunter A, Sohmer B. Communication and team situation awareness in the OR: Implications for augmentative information display. J Biomed Inform 2011;44:477-85.
15. Endsley MR. Situation awareness misconceptions and misunderstandings. J Cogn Eng Decis Making 2015;9:4-32.