ORIGINAL RESEARCH PAPER
Year: 2009 | Volume: 22 | Issue: 3 | Page: 325
Direct Observation of Resident-Patient Encounters in Continuity Clinic: A Controlled Study of Parent Satisfaction and Resident Perceptions
AJ Starmer1, GD Randolph2, KL Wysocki2, MJ Steiner2
1 Children's Hospital of Boston, Boston, USA
2 The University of North Carolina, North Carolina, USA
Date of Submission: 21-Mar-2009
Date of Acceptance: 28-Oct-2009
Date of Web Publication: 15-Dec-2009
Correspondence: A J Starmer, 300 Longwood Avenue, Boston, MA 02115
Source of Support: None, Conflict of Interest: None
Context: Direct observation (DO) by teaching physicians of medical care provided by resident physicians offers a method to evaluate clinical skills beyond traditional measures that focus solely on medical knowledge assessment.
Objectives: We sought to determine if the presence of the teaching physician observer affects parental satisfaction with care and to assess resident perceptions of DO in a general pediatrics residency clinic.
Methods: A cross-sectional parent survey compared visit satisfaction of parents who experienced a DO with controls in a traditional clinic visit. Additionally, a pre-post survey measured resident perceptions of direct observation before and after implementation of DO in the clinic.
Findings: Parents frequently described their overall satisfaction with care as "excellent" after both DO and traditional visits (DO 70%, 95% CI 50-86%; control 80%, 95% CI 66-89%). However, parents in DO visits were less likely to rate their satisfaction with the amount of time spent in the room as excellent (DO 78%, 95% CI 58-91%; control 95%, 95% CI 85-99%). Most resident physicians were in favor of the DO process (63%) and agreed that DO provides feedback about history-taking (94%), physical examination (94%) and interpersonal skills (91%).
Conclusions: Direct observation by attending physicians does not decrease overall parental satisfaction during clinical encounters. Additionally, residents have a generally favorable opinion of direct observation and believe that it can provide useful feedback.
Keywords: Feedback, ambulatory care, educational measurement, patient satisfaction
How to cite this article:
Starmer A J, Randolph G D, Wysocki K L, Steiner M J. Direct Observation of Resident-Patient Encounters in Continuity Clinic: A Controlled Study of Parent Satisfaction and Resident Perceptions. Educ Health 2009;22:325
Attending- or supervising-physician observation of resident physician clinical skills is a crucial component of feedback to improve performance (Epstein, 2007). However, medical students and resident physicians report low rates of being observed by attending physicians while providing care to patients (Burdick & Schoffstall, 1995; Howley & Wilson, 2004). Furthermore, traditional evaluation methods are often based on the perception of competence without actual observation and, therefore, have limited educational utility and provide little constructive feedback on performance (Howley & Wilson, 2004).
In an attempt to improve rates of observation and offer increased opportunity for feedback to residents in training, many residency programs began to implement formal programs of Direct Observation (DO), in which attending physicians shadow resident physicians and offer suggestions for ways to improve clinical skills. Preliminary research into this educational intervention suggested that DO can offer faculty new insights into resident clinical strengths and weaknesses, and make them more likely to provide useful feedback (Benenson & Pollack, 2003; Cydulka et al., 1996; Holmboe et al., 2004; Lane & Gottlieb, 2000). Based in part on these presumed benefits of DO, new residency training program requirements released by the United States Accreditation Council for Graduate Medical Education (ACGME, which is responsible for the accreditation of post-MD medical training programs within the United States) now mandate DO of residents as they perform history and physical examinations in various clinical settings (ACGME, 2007). Similar requirements for programs of observation are being implemented in many other countries in an attempt to improve evaluation and feedback (Norcini & Burch, 2007; Stern et al., 2005; Wiles et al., 2007).
Despite this preliminary research on the benefits of DO, a number of uncertainties remain about its use in medical education. For example, there have been no studies to assess the impact of DO on patients or their parents. Likewise, no published studies have examined learners' perceptions of DOs and how they should be performed.
Accordingly, we designed a study to answer the following questions: 1) Is parent satisfaction affected by DO? 2) What impact does DO have on the resident-patient-parent relationship? 3) Do residents believe that DO is a useful educational tool? 4) What preferences do residents have about how DO is performed?
We hypothesized that direct observation would not change parent satisfaction with visits or negatively impact other aspects of the clinical encounter for parents. We also hypothesized that resident perceptions of DO would improve after experiencing a DO and that resident feedback on the DO process could be used to guide its implementation.
Design and study population: We designed a two-part study to assess the impact of a formalized DO program implementation in a pediatrics resident continuity clinic at an academic children’s hospital. The first component was a cross-sectional parent survey to compare perceptions and satisfaction ratings of intervention parents (whose child had been cared for by a resident undergoing a DO) and control parents (whose child was cared for in a traditional visit). During traditional visits, a resident enters the patient room independently and completes the clinical assessment. The resident then leaves the patient room to discuss the assessment and plan of care with the attending physician. Attendings may or may not re-enter the room with the resident to directly evaluate the patient and discuss the diagnosis and plan with the family. The second component of the study was a pre-post matched-pair survey of resident perceptions before and after implementation of the DO program.
The setting for the study was the University of North Carolina (UNC) General Pediatrics Resident Continuity Clinic in the United States which is staffed by nine attending preceptors and 56 pediatrics residents. In 2007, the clinic provided 9,250 clinic visits. The patient population seen in the clinic was 38% African American, 24% Latino and 30% Caucasian, with 69% of the patients insured by government insurance programs. The majority of patients lived within a 50-mile radius that includes urban, suburban and rural areas.
The full study protocol was approved by the Health Science Institutional Review Board at the University of North Carolina.
Intervention: A DO program was initiated in the continuity clinic in the fall of 2006 using a standardized 40-item checklist and feedback form. This form (available by request from authors) was adapted for use by all attendings during resident observation sessions.
The DOs were consistently performed so that at the beginning of a DO visit, the resident physicians would ask the parents for permission to have an observer present in the room. Only parents whose primary language was English participated in DOs or in the control group due to the complexities of using an interpreter during a DO. The attending physician observer was usually introduced as a “doctor” who was “shadowing” or “observing” in clinic, and attendings were encouraged to avoid participating in the resident-patient interaction unless invited to do so by the resident. In almost all cases, attending observers provided feedback to the resident about their observations immediately after the visit in a private area of the clinic.
Survey instrument and data collection: The parent survey examined parent satisfaction with the clinic visit (Appendix A). Our parent satisfaction outcome measure was adapted from the Medical Outcomes Study instrument, a widely used, validated and reliable patient satisfaction survey (Rubin et al., 1993). This simple question asks parents to rate their overall satisfaction with the clinic visit on a 5-point Likert scale. Additionally, parents were asked to rank their satisfaction with the healthcare provided, the amount of time the doctor spent in the room and if they could recall the name of the doctor who saw their child during the visit. Four additional questions, administered only to parents who experienced a DO, assessed the perception of having an observer present in the room.
Consecutive parents who experienced a DO visit and a convenience sample of concurrent control parents were approached to complete the surveys. Study personnel attempted to approach all families presenting for care while personnel were available. The parent survey was administered anonymously, in a standardized manner, by study investigators who were not involved in the care of the patient.
We were unable to find a previously validated survey of resident preferences for an educational intervention similar to DO. Therefore, we designed a survey that assessed the perceived educational value and preferences for implementation of DO (Appendix B). Face validity was assessed by the physician members of the research team (AS, GR, MS). The baseline resident survey was administered prior to the implementation of the DO program in the late summer of 2006. The follow-up resident survey was administered after implementation in the spring of 2007. All pediatrics residents working in the continuity clinic were asked to complete the survey. Resident responses were confidential, but a tracking number was used to compare individual resident opinions before and after experiencing DO.
Statistical analysis: We performed a power calculation to determine the sample size necessary to detect a 0.5 rating difference on a 5-point Likert scale of overall parent satisfaction, using a 2:1 ratio of control (traditional visit) to intervention (DO visit) parents. The required sample size was 54 control subjects and 27 intervention subjects. We attempted to survey all 56 residents, but ultimately obtained 35 (63%) matched pre- and post-resident surveys. This sample of residents provided 80% power to detect a 0.5 difference on a 5-point Likert scale of resident perceptions and preferences regarding DO.
Differences between included and excluded resident physician subjects were compared using a two-tailed t-test (for age), a z-test of proportions (for gender) and Fisher’s exact test (for year of training). Pearson’s chi-square test was used to compare the proportion of respondents in each year of training. The survey data were analyzed using non-parametric statistical tests. For some responses, Likert scale results were dichotomized into “positive” (rank of 1 or 2) or other (rank of 3, 4 or 5). The proportion of parents rating overall satisfaction as “excellent” (rank of 1) was compared between intervention and control groups using a z-test of proportions. The Wilcoxon rank-sum test was used to compare central tendency between DO and control groups. Matched resident Likert scale rankings before and after implementation were analyzed using the Wilcoxon matched-pairs signed-rank test. McNemar’s test was used to compare matched dichotomous agreement scores. All confidence intervals (CI) reported represent the 95% CI range. Data analysis was performed using Stata 8.1 and 10.0 (StataCorp, College Station, TX).
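To illustrate the primary comparison, the pooled two-proportion z-test described above can be sketched in a few lines. The paper does not report raw counts or its exact implementation; the counts below are inferred from the reported percentages and denominators (roughly 19 of 27 DO parents and 43 of 54 control parents rating overall satisfaction as "excellent"), and the pooled-variance formulation is one common convention, so this is a minimal sketch rather than a reproduction of the authors' Stata analysis.

```python
import math

def two_proportion_ztest(successes1, n1, successes2, n2):
    """Two-tailed z-test comparing two independent proportions,
    using the pooled proportion to estimate the standard error."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-tailed p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Counts inferred from the reported percentages:
# DO group 19/27 (~70%) vs. control group 43/54 (~80%)
z, p = two_proportion_ztest(19, 27, 43, 54)
print(f"z = {z:.2f}, p = {p:.2f}")  # a non-significant difference
```

With these inferred counts the test statistic falls well inside the 95% critical bounds (|z| < 1.96), consistent with the paper's conclusion that the difference in overall satisfaction was not statistically significant.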
Parent survey: Twenty-eight parents of children with a DO visit were approached to participate in the study, and 27 enrolled (one parent refused for unspecified reasons). Fifty-five control parents were approached to enroll and 54 enrolled (one excluded due to participation as an intervention family on a previous clinic visit).
Parents were highly satisfied with care in both DO and traditional visits. Seventy percent of parents in DO visits described their overall satisfaction with the visit as excellent, compared with 80% of parents in traditional visits. This difference was not statistically significant (respective CIs, DO 50%-86% and controls 66%-89%). The median Likert score also did not differ between visit types (DO, median 5, range 3-5 and controls, median 5, range 2-5; p=0.40). Additionally, the percentage of parents rating their satisfaction with healthcare provided by the doctor as excellent did not differ between groups (DO 85%, CI 66%-96% and controls 88%, CI 75%-95%), and there was no difference in the ability to recall the name of the doctor who provided care during that visit (DO 59%, CI 39%-78% and controls 63%, CI 52%-81%). However, parents in the traditional visit group were more likely to rate their satisfaction with the amount of time spent in the room as excellent (95%, CI 85%-99% vs. 78% in the DO group, CI 58%-91%), though the difference in median rating was of borderline significance (DO 5, range 3-5; control 5, range 4-5; p=0.053).
Parents who participated in DO were asked to rate their impressions of the process. Thirty percent of subjects in the DO group agreed (either somewhat or strongly) that having another doctor observe their child’s doctor resulted in a higher level of care for that visit; 63% felt that having DO “every once in a while” is a good way to improve the level of care their child receives in clinic. A minority (11%) of parents felt that DO lengthened the time of the visit and, similarly, 11% of parents did not like DO because having an extra doctor present in the room was uncomfortable.
Resident preferences: Eighty-nine percent of residents in the residency program (n=50) completed a survey. Incomplete surveys, surveys from residents who did not complete both a pre- and post-implementation survey and surveys from residents who did not experience a DO during the year were excluded from analysis. This left 35 subjects with matched-pair survey results for inclusion (63% participation). There was no difference in age (mean age 28.0 and 27.8, respectively; p=0.86), gender (91% and 73% female; p=0.09) or year of training (p=0.68) between those who participated per protocol and those who were excluded. Additionally, the percentage of respondents did not vary with level of training (PL1: 66%, PL2: 56%, PL3: 63%; p=0.81).
Twenty-six percent of 2nd or 3rd year residents and 50% of 1st year residents reported never having been observed during a complete outpatient clinical visit during residency. Despite limited experience with DO prior to implementation of this program, 51% of residents were in favor of DO in clinic, 34% were neutral to DO and 14% of residents were opposed to DO in clinic. This favorable impression continued after experiencing DO, with 63% in favor after DO implementation (in favor before implementation 51%, CI 34-69%, after 63%, CI 45-79%). Fourteen percent of residents were opposed to DO prior to implementation and 11% of residents were still opposed to DO in clinic after implementation (respective CIs 5-30% and 3-27%).
We surveyed resident preferences for DO format both before and after initiation of DO in clinic (see Table 1). Multiple qualitative comments from residents suggested that physical examination skills should be taught and demonstrated during DO, and 94% of residents agreed that DO would be a helpful way to improve physical examination skills. However, 97% of residents did not want attendings to repeat a complete physical examination after the resident; instead, they preferred that attendings perform a brief focused examination sometimes (50%) or always (12%), or not examine the patient at all (29%). Some resident preferences for DO format showed notable, but not statistically significant, changes after experiencing a DO. For example, prior to DO, 79% of residents wanted immediate feedback provided outside of the room; this rose to 94% after experiencing DO. Interestingly, there was also a trend for a greater percentage of residents to prefer that attendings remain quiet during the clinical encounter after experiencing DO (69% preferred this before and 84% after, p=0.34).
Table 1: Resident preferences for DO format before and after implementation of DO program in continuity clinic
A summary of resident perceptions about the utility of DO as a way to provide feedback about specific clinical skills is displayed in Table 2. Every resident reported that DO would be a helpful mechanism for feedback in at least two of the following clinical skills: history-taking, interpersonal communication, physical examination or dissemination of information. After experiencing direct observation, a significantly higher percentage of residents agreed that DO would help to improve interpersonal communication skills.
Table 2: Percentage of residents who somewhat or strongly agree that DO improves various clinical skills before and after implementation of DO
There were some negative impressions of DO among residents. For example, after experiencing a DO, 56% of residents were somewhat or very uncomfortable having an attending present in the room during an entire clinic visit and 38% of residents agreed somewhat or strongly that DO would make continuity clinic visits less efficient. Despite this discomfort, 63% of residents felt that assessment after DO represents an accurate evaluation of their clinical skills and only 14% of residents worried that DO would have a negative impact on relationships with their patients.
Our study suggests that parent satisfaction and experience with clinical care during the DO process did not differ from those during standard clinic visits. We did, however, find a small difference in parent satisfaction with the amount of time physicians spent in the room during DO visits.
There is often a perceived tension between the education and teaching of physician-trainees and the provision of high quality patient care. Recent studies have suggested that parents of children who are seen in training clinics believe their children are receiving high-quality care (Krugman et al., 2007). Yet, prior to our study, it was not known whether the newly-mandated ACGME requirement for attending physician DO of resident-patient encounters might negatively impact parents’ perceptions of high quality care.
We were surprised to find parents were somewhat less satisfied with the amount of time physicians spent in the room during DO visits. In traditional clinic visits, residents often inefficiently enter and leave the room two or three times: first entering for the history and examination, then leaving to discuss with an attending, then returning and leaving again to discuss and implement agreed-upon plans such as medication prescriptions. For DO visits, residents were often able to finish the visit while the attending was present in the room. We assumed that families would appreciate the improved efficiency. We did not record actual visit times and thus cannot assess whether DO visits were shorter or longer than traditional visits.
Previous research has raised concern about declining resident physical examination skills (Dunnington et al., 1994; Gaskin et al., 2000; Li, 1994; Mangione et al., 1995; Willett et al., 2007). Ninety-four percent of residents were hopeful that DO would help improve physical examination skills, and future research should explore how to use DO to maximize the ability to teach residents components of the physical examination.
The present study has several important limitations. Resident and family perceptions of the DO process will likely vary depending on the setting and how DO is implemented. However, assuming population factors are equal, DO implementation using a similar process should produce similar results in other institutions. Another limitation is that we examined resident perceptions of educational value, but we did not attempt to assess the actual educational utility of DO. Follow-up studies should attempt to demonstrate improved performance by resident physicians after DO and, more convincingly, demonstrate improved patient outcomes after resident physicians have undergone DO.
This study suggests that DO does not adversely affect parent satisfaction with care or the parent-resident physician relationship. Additionally, residents were generally in favor of DO. We remain encouraged that carefully implemented DO programs will face little resistance from residents or parents and can become a central and useful educational intervention in pediatric residency training programs.
The authors would like to thank Drs. Kenneth Roberts, Eliana Perrin and Lewis First for their thoughtful reviews of early versions of this manuscript as well as the residents and staff of the UNC Continuity Clinic for their assistance with completion of this project.
ACGME. (2007). Program requirements for pediatric residency training programs. http://www.acgme.org/acWebsite/RRC_320/320_prIndex.asp.
Benenson, R. S., & Pollack, M. L. (2003). Evaluation of emergency medicine resident death notification skills by direct observation. Academic Emergency Medicine, 10, 219-223.
Burdick, W. P., & Schoffstall, J. (1995). Observation of emergency medicine residents at the bedside: How often does it happen? Academic Emergency Medicine, 2, 909-913.
Cydulka, R. K., Emerman, C. L., & Jouriles, N. J. (1996). Evaluation of resident performance and intensive bedside teaching during direct observation. Academic Emergency Medicine, 3, 345-351.
Dunnington, G. L., Wright, K., & Hoffman, K. (1994). A pilot experience with competency-based clinical skills assessment in a surgical clerkship. American Journal of Surgery, 167, 604-606.
Epstein, R. M. (2007). Assessment in medical education. The New England Journal of Medicine, 356, 387-396.
Gaskin, P. R., Owens, S. E., Talner, N. S., Sanders, S. P., & Li, J. S. (2000). Clinical auscultation skills in pediatric residents. Pediatrics, 105, 1184-1187.
Holmboe, E. S., Hawkins, R. E., & Huot, S. J. (2004). Effects of training in direct observation of medical residents' clinical competence: A randomized trial. Annals of Internal Medicine, 140, 874-881.
Howley, L. D., & Wilson, W. G. (2004). Direct observation of students during clerkship rotations: A multiyear descriptive study. Academic Medicine, 79, 276-280.
Krugman, S. D., Racine, A., Dabrow, S., Sanguino, S., Meyer, W., Seid, M., et al. (2007). Measuring primary care of children in pediatric resident continuity practices: A Continuity Research Network study. Pediatrics, 120, e262-e271.
Lane, J. L., & Gottlieb, R. P. (2000). Structured Clinical Observations: A method to teach clinical skills with limited time and financial resources. Pediatrics, 105, 973-977.
Li, J. T. (1994). Assessment of basic physical examination skills of internal medicine residents. Academic Medicine, 69, 296-299.
Mangione, S., Burdick, W. P., & Peitzman, S. J. (1995). Physical diagnosis skills of physicians in training: A focused assessment. Academic Emergency Medicine, 2, 622-629.
Norcini, J., & Burch, V. (2007). Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher, 29, 855-871.
Rubin, H. R., Gandek, B., Rogers, W. H., Kosinski, M., McHorney, C. A., & Ware, J. E., Jr. (1993). Patients' ratings of outpatient visits in different practice settings. Results from the Medical Outcomes Study. Journal of the American Medical Association, 270, 835-840.
Stern, D. T., Ben David, M. F., De Champlain, A., Hodges, B., Wojtczak, A., & Schwarz, M. R. (2005). Ensuring global standards for medical graduates: a pilot study of international standard-setting. Medical Teacher, 27, 207-213.
Wiles, C. M., Dawson, K., Hughes, T. A., Llewelyn, J. G., Morris, H. R., Pickersgill, T. P. et al. (2007). Clinical skills evaluation of trainees in a neurology department. Clinical Medicine, 7, 365-369.
Willett, L. L., Estrada, C. A., Castiglioni, A., Massie, F. S., Heudebert, G. R., Jennings, M. S., et al. (2007). Does residency training improve performance of physical examination skills? The American Journal of the Medical Sciences, 333, 74-77.
Appendix A: Parent Survey
Appendix B: Direct Observation Resident Survey