
Examining the July Effect: A National Survey of Academic Leaders in Medicine

      Abstract

      Background

Whether the “July Effect” shapes the perspectives of leaders in US Internal Medicine residency programs or has prompted programmatic changes is unknown.

      Methods

      We designed a survey-based study to assess views and efforts aimed at preventing harm in July. A convenience sampling strategy (email listserv and direct messages to program leaders via the Electronic Residency Application Service) was used to disseminate the survey.

      Results

The response rate was 16% (65/418 programs); however, a total of 262 respondents from all 50 US states in which residency programs are located were included. Most respondents (n = 201; 77%) indicated that errors occur more frequently in July compared with other months. The most commonly identified errors included incorrect or delayed orders (n = 183; 70% and n = 167; 64%, respectively), errors in discharge medications (n = 144; 55%), and inadequate information exchange at handoffs (n = 143; 55%). Limited trainee experience (n = 208; 79%), lack of familiarity with hospital workflow, and difficulty using electronic medical record systems (n = 194; 74% and n = 188; 72%, respectively) were reported as the most common factors contributing to these errors. Programs reported instituting several efforts to prevent harm in July: for interns, additional electronic medical record training (n = 178; 68%) and education on handoffs and discharge processes (n = 176; 67% and n = 108; 41%, respectively) were introduced. Similarly, for senior residents, teaching sessions on how to lead a team (n = 158; 60%) and preferential placement of certain residents on harder rotations (n = 103; 39%) were also reported. Most respondents (n = 140; 53%) also solicited specific “July attendings” on a volunteer basis or according to the highest teaching ratings.

      Conclusion

      Residency programs in Internal Medicine appear to have instituted various changes to mitigate harm in July. Further evaluation to understand the impact of these interventions on trainee education and patient safety is necessary.


      Clinical Significance
      • Internal Medicine residency program leaders believe in the existence of a July Effect.
      • Many programs have enacted changes to offset the July Effect.
      • Greater understanding of both intended and unintended consequences of these efforts and their impact on resident training and patient safety is necessary.

The dawn of the academic medical year in July often heralds the arrival of medical errors [1-3]. Although the extent and impact of this proverbial “July Effect” are often debated [4-8], transitions during this period make its existence likely. For example, newly minted interns must learn hospital workflow and electronic medical record (EMR) systems while taking care of patients. Similarly, second-year residents must make the leap from supervised intern to supervisor, doubling the number of patients they oversee in a single night. Finally, these changes often occur under the guidance of junior attendings, further aligning the holes of the Swiss-cheese model of error [9]. It is not surprising, then, that interventions such as simulation-based training and “boot camps” for interns [10,11], education to improve team communication [12], and feedback on supervisory styles [13] have all been trialed to improve safety in July.
      However, to date, no study has examined whether training programs in Internal Medicine (IM) across the US have taken steps to prevent the July Effect. This knowledge gap is important, as approaches to offset such harm may substantially influence resident training, cultural norms, and patient safety during this period. Furthermore, identifying and defining such approaches may inform a more robust research agenda and improve sharing of successful practices across the country.
      Therefore, we designed and conducted a survey-based study to understand perspectives of IM program leadership regarding the July Effect. We also sought to quantify and describe the types of changes enacted by residency programs to mitigate this potential concern.

      Methods

We created an electronic survey targeting program leaders in academic IM that focused on 2 domains: perspectives regarding the July Effect and programmatic changes directed toward interns, residents, and attendings to prevent harm in July. Following cognitive pretesting, the survey instrument was refined and programmed into an online tool (SurveyMonkey, Palo Alto, Calif) for dissemination. The survey was disseminated using a convenience sampling strategy: 1) a listserv of Program Directors (PDs), Assistant/Associate Program Directors (APDs), and Chief Medical Residents (CMRs) from the Alliance for Academic Internal Medicine; and 2) direct e-mails to program leadership identified from records in the Electronic Residency Application Service. All responses were obtained anonymously.
      Because multiple individuals from a training program could potentially respond (eg, PD, APD or CMR), the survey response rate was calculated at the program level (eg, number of unique program director responses/number of IM training programs in the US in the Electronic Residency Application Service [n = 418]). Descriptive statistics (eg, mean, percentage) were used to tabulate all data. Stata MP/SE (StataCorp LP, College Station, Texas) was used for all analyses.
      The study received a “Not Regulated” status by the University of Michigan Institutional Review Board (IRBMED HUM00109041).
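
As an illustration of the arithmetic behind the program-level response rate and the descriptive percentages reported in the Results, a minimal sketch is shown below in Python. This is not the authors' analysis code (analyses were performed in Stata MP/SE); the counts are taken from the Results section, and the variable names are ours.

    # Minimal illustrative sketch (not the study's Stata code) of the
    # program-level response rate and descriptive percentages reported below.
    TOTAL_IM_PROGRAMS = 418      # US IM programs listed in the Electronic Residency Application Service
    unique_pd_responses = 65     # unique Program Director responses
    total_respondents = 262      # all included respondents (PDs, APDs, CMRs, others)

    # Program-level response rate: unique PD responses / total US IM programs
    response_rate = unique_pd_responses / TOTAL_IM_PROGRAMS
    print(f"Program-level response rate: {response_rate:.1%}")  # ~15.6%, reported as 16%

    # Descriptive statistics: each item count expressed as a percentage of all respondents
    item_counts = {
        "Errors occur more frequently in July": 201,
        "Incorrect orders": 183,
        "Delayed orders": 167,
        "Errors in discharge medications": 144,
    }
    for item, n in item_counts.items():
        print(f"{item}: n = {n} ({n / total_respondents:.0%})")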

      Results

A total of 262 respondents representing 65 of the 418 IM residency programs across the US completed the survey (program response rate = 16%). Along with PDs (n = 65; 25%), APDs (n = 78; 30%), CMRs (n = 70; 27%), IM/Pediatrics program leaders (n = 13; 5%), and others (n = 36; 14%) responded and were included. Most respondents were affiliated with large academic medical centers (n = 155; 70%); others were affiliated with community-based hospitals (n = 84; 38%) or Veterans Affairs medical centers (n = 48; 22%). Almost a third of respondents (n = 78; 30%) indicated that their training programs had ≥111 residents. Although only 65 unique residency programs were represented, respondents came from all 50 US states in which IM residency programs are located.

      Perspectives Regarding the July Effect

      Compared with other months, the majority of respondents (n = 201; 77%) agreed with the statement that errors occur slightly or much more frequently in July. Most respondents (n = 140; 53%) reported using informal reviews to evaluate errors in July; however, formal multidisciplinary reviews were also commonly reported (n = 95; 36%). According to respondents, the most common errors in July were incorrect or delayed orders (n = 183, 70% and n = 167, 64%, respectively), errors in discharge medications (n = 144, 55%), and inadequate information exchanged at handoffs (n = 143, 55%). Limited trainee experience (n = 208, 79%) and lack of familiarity with hospital workflow and EMR systems (n = 194, 74% and n = 188, 72%, respectively) were cited as the 3 most common factors contributing to errors in July (Table 1).
Table 1. General Characteristics and Perspectives of Respondents (N = 262)
Values are n (%) of 262 respondents.

General characteristics
 Respondent role
  Internal Medicine Program Director: 65 (24.8%)
  Internal Medicine Associate/Assistant Program Director: 78 (29.8%)
  Internal Medicine Chief Medical Resident: 70 (26.7%)
  Internal Medicine and Pediatrics Program Leader: 13 (4.9%)
  Other/unknown: 36 (13.7%)
 Program size (number of residents)
  ≤15: 5 (1.9%)
  16-35: 24 (9.2%)
  36-80: 61 (23.3%)
  81-110: 43 (16.4%)
  ≥111: 78 (29.8%)
  Other: 51 (19.4%)
 Hospital type
  Academic/University-based medical center: 155 (59.2%)
  Community-based hospital: 84 (32.1%)
  Veterans Affairs medical center: 48 (18.3%)
  Other: 8 (3.1%)
Perspectives regarding the July Effect
 Perceived frequency of errors during July
  More than other months: 201 (76.7%)
  The same as other months: 49 (18.7%)
  Less than other months: 12 (4.6%)
 Approach to tracking and evaluating errors
  Formal reviews or evaluation: 94 (35.9%)
  Informal reviews or evaluation: 140 (53.4%)
  Have not investigated errors in July: 69 (26.3%)
 Most frequent errors encountered during July
  Incorrect orders: 183 (69.8%)
  Delayed orders: 167 (63.7%)
  Errors in medications prescribed at discharge: 144 (55.0%)
  Lack of adequate information exchange at handoffs: 143 (54.6%)
  Discharge errors regarding follow-up: 114 (43.5%)
  Other: 81 (30.9%)
 Most important factors contributing to errors in July
  Limited trainee experience: 208 (79.4%)
  Lack of familiarity with hospital workflow: 194 (74.0%)
  Lack of familiarity with electronic medical record systems: 188 (71.8%)
  Inadequate communication between interns, residents, and attendings: 82 (31.3%)
  Inadequate training of interns on discharge processes: 81 (30.9%)
  Inadequate communication between medical teams and nurses: 68 (26.0%)
  Inadequate supervisory skills of senior residents: 66 (25.2%)
  Turnover of skilled residents and attendings: 64 (24.4%)
  Other: 103 (39.3%)

      Programmatic Changes Aimed at Ameliorating Harm in July

To avoid intern errors in July, programs reported introducing a number of measures, including in-depth debriefs of tasks after rounds (n = 131; 50%), increased CMR presence (n = 122; 47%), and senior resident presence during handoffs (n = 118; 45%). Correspondingly, respondents stated that dedicated education focused on EMRs, handoffs of care, and discharge processes (n = 178, 68%; n = 176, 67%; and n = 108, 41%, respectively) was specifically added to training to reduce errors in July. Respondents also reported interventions aimed at senior residents, including dedicated teaching sessions on how to lead a team during orientation (n = 158; 60%) and mandatory follow-up with attendings after rounds (n = 108; 41%). Interventions directed at attendings were less common, not routinely enforced, and often vague, such as “requests for more frequent check-ins with the team” (n = 109; 42%). More than a third of respondents stated that no specific efforts were directed toward attendings during the month of July (n = 93; 35%) (Table 2).
Table 2. Programmatic Interventions to Prevent Errors in July (N = 262)
Values are n (%) of 262 respondents.

Interventions directed toward interns
 More in-depth discussion of intern tasks on or after rounds: 131 (50.0%)
 Increased Chief Resident presence: 122 (46.6%)
 Senior (PGY2 or greater) presence at sign-out/handoffs: 118 (45.0%)
 Increased attending follow-up after rounds: 109 (41.6%)
 Fewer patients per clinic day: 101 (38.5%)
 Supervision for procedures: 91 (34.7%)
 Float/extra senior (PGY2 or greater) available or assigned to team: 75 (28.6%)
 Specific instructions to attendings regarding oversight of interns: 70 (26.7%)
 Lower census caps on inpatient services: 51 (19.5%)
 Other: 85 (32.4%)
Changes to intern orientation to prevent errors in July
 Electronic medical record training: 178 (67.9%)
 Training on patient sign-out/handoffs: 176 (67.2%)
 Discharge process training: 108 (41.2%)
 Training on common medical procedures: 107 (40.8%)
 Interdisciplinary training: 84 (32.1%)
 Training on what and when to discuss things with supervisors: 77 (29.4%)
 Education about when to ask questions: 64 (24.4%)
 Training on how to conduct code status discussions: 60 (22.9%)
Interventions directed toward senior residents
 Teaching sessions on how to lead a team: 158 (60.3%)
 Greater Chief Resident presence: 119 (45.4%)
 Directions to check all intern orders: 116 (44.3%)
 Direction to follow up with attendings after rounds: 108 (41.2%)
 Tips on identifying struggling interns: 105 (40.1%)
 Training on running codes: 101 (38.5%)
 Float/extra senior (PGY2 or greater) available or assigned to team: 73 (27.9%)
 Other: 116 (44.3%)
Changes to resident scheduling made to limit errors in July
 Avoid placing certain residents in a senior role on inpatient wards: 119 (45.4%)
 Preferentially place PGY3 or PGY4 seniors on “harder” services: 103 (39.3%)
 Preferentially place interns from your medical school on harder services: 93 (35.5%)
 None: 47 (17.9%)
Interventions directed toward attendings
 More frequent “check-ins” with teams during the day: 109 (41.6%)
 Tips on how to identify struggling interns or seniors: 59 (22.5%)
 Ask that attendings review orders: 42 (16.0%)
 Encourage at least one formal check-in with teams overnight: 26 (9.9%)
 Other: 30 (11.4%)
 None: 93 (35.5%)
Reported process of selecting attendings for July
 Highest teaching evaluations: 43 (16.4%)
 Volunteer basis (self-selection): 43 (16.4%)
 More years of experience: 40 (15.3%)
 Other: 18 (6.9%)
 None: 122 (46.6%)
PGY = postgraduate year.
Respondents also often reported explicit scheduling approaches, including not placing certain senior residents on inpatient services in July (n = 119; 45%), preferentially placing postgraduate year 3 or 4 seniors on harder rotations (n = 103; 39%), and placing interns from the host medical school on harder services (n = 93; 35%). However, some respondents (n = 47; 18%) stated that they did not implement any type of scheduling consideration for trainees in July. In contrast to attending-specific interventions, most respondents (n = 144; 55%) stated that they selected specific attendings for service in July. With respect to how such attendings were assigned, respondents stated that they often assigned attendings with the highest teaching ratings or those who specifically volunteered for service during this period (n = 43; 16% for both).

      Discussion

This first-of-its-kind national survey suggests that the majority of leaders in US IM residency programs believe in the existence of a July Effect. Additionally, many programs reported making specific changes to orientation, training, and workflow processes during this period. Because the most commonly cited errors in July were operational in nature (eg, placing orders and safe use of EMRs), strategies such as structured intern orientation and focused EMR training sessions were often described as tactics to reduce these risks. Additionally, preferential selection of interns, residents, and attendings to ensure safety was also reported. These findings highlight that leaders in IM residency programs not only perceive the July Effect as a veritable threat, but have also instituted several changes to counteract this hazard.
Several studies suggest that the July Effect is mediated by an influx of inexperienced medical trainees. For example, a study examining fatal medication errors reported a substantial 10% increase in July related to the arrival of new interns [1]. In an analysis of national data, another study reported a 6% greater risk of hospital-acquired complications in July (potentially related to inexperienced trainees) compared with other months [2]. Similarly, a systematic review of the July Effect reported that high-quality studies more often revealed increased patient mortality during this period than did studies at greater risk of bias [3]. Despite this background and reasonable theories to explain these outcomes, the precise mechanisms by which patients may be harmed in July are not known. Our findings, which provide a bird's-eye view from training program leadership, are valuable because they suggest that lack of familiarity with workflow, team handoffs, and EMR systems is considered a root cause of this phenomenon. Consequently, residency programs have devised strategies such as greater supervision of interns, tailored EMR training, and schedules that deploy internally trained or higher-performing residents and interns to offset this risk. While creative, the impact of these interventions on patient safety, residency training, or the July Effect is not known. In an era of duty-hour restrictions, EMR rollouts, and concerns regarding fragmented continuity of care, these knowledge gaps are highly relevant and merit further study.
It is pertinent to note that some studies have argued against the existence of a July Effect [4-8]. These contrarian data often emanate from studies of surgical trainees, in which no appreciable increase in rates of mortality or surgery-specific adverse events (eg, wound infections or unplanned returns to the operating room) in July, compared with other months, has been observed. Although interesting, these findings may not translate to IM training programs or experiences in medicine for several reasons. First, unlike in the operating room, where surgical trainees work under direct attending supervision, harm to medical patients is more likely to occur outside of direct attending supervision. Second, medical patients are prone to a penumbra of errors, ranging from delayed orders and incorrect drug doses or formulations to errors in discharge medications and instructions. The impact of such errors is likely not only more heterogeneous, but also more difficult to measure using available patient safety outcomes. Finally, the proximity of an error to measurable harm is more direct in surgical settings than in medical settings. Thus, harm in surgical settings is not only easier to attribute, but potentially also more preventable. Collectively, these differences suggest that dedicated studies of IM trainees with appropriate metrics are needed to better understand the July Effect.
Notably, we found that many programs have developed strategies to prevent patient harm in July. For example, many respondents stated that they have improved intern orientation or enhanced training of senior residents to better lead teams in July. Additionally, scheduling efforts to deploy internally trained or higher-performing residents and interns were also reported. Although creative, these approaches have unknown effects on patient safety, team culture, and efficiency in July. Further, we were surprised by the comparatively small effort directed toward orienting or training attendings. While some respondents reported recruiting attendings with better teaching evaluations or more years of experience to serve in July, many did not specifically select attendings in this fashion. As attending supervision is critical to the safety of new trainees, further study and evaluation of attending-directed interventions may prove valuable.
Our study has several limitations. First, when measured by the number of participating programs, the overall response rate was low, creating the possibility of respondent or selection bias. Relatedly, as responses were collected anonymously, the degree to which multiple respondents represented the same program is not known. However, we note that respondents were recruited from all US states in which IM training programs are located, possibly limiting these biases. Second, as most respondents were from academic programs with ≥80 residents, the generalizability of our findings to smaller academic programs or community-based programs is not known. Lastly, although we report the use of various methods to mitigate harm in July, the efficacy, risks, and benefits of these approaches cannot be determined from this study.
Despite these limitations, our study has several strengths. To our knowledge, this is the first survey to assess residency programs' views of the July Effect and the structural changes made to offset it in our specialty. Consequently, our study sheds new light on which specific errors are of most concern to program leadership and which practices have been implemented to offset these threats. Second, by identifying and categorizing approaches directed toward interns, residents, and attendings, our study helps inform the development of interventions to prevent errors in July. Third, by including multiple tiers of leadership, including APDs and CMRs, our findings are likely to be representative and externally valid across training programs in the US. These aspects lend credibility to the data and suggest that further inquiry directed toward IM training programs in this regard may be helpful.

      Conclusion

      Most leaders in US IM residency programs believe in the existence of a July Effect. While several programs have enacted changes to offset this risk, the influence of such changes remains unknown. A better understanding of both intended and unintended consequences of these efforts and their impact on resident training and patient safety thus appears necessary.

      References

1. Phillips DP, Barker GE. A July spike in fatal medication errors: a possible effect of new medical residents. J Gen Intern Med. 2010;25:774-779.
2. Wen T, Attenello FJ, Wu B, et al. The July effect: an analysis of never events in the nationwide inpatient sample. J Hosp Med. 2015;10:432-438.
3. Young JQ, Ranji SR, Wachter RM, et al. “July effect”: impact of the academic year-end changeover on patient outcomes: a systematic review. Ann Intern Med. 2011;155:309-315.
4. McDonald JS, Clarke MJ, Helm GA, Kallmes DF. The effect of July admission on inpatient outcomes following spinal surgery. J Neurosurg Spine. 2013;18:280-288.
5. Ravi P, Trinh VQ, Sun M, et al. Is there any evidence of a “July effect” in patients undergoing major cancer surgery? Can J Surg. 2014;57:82-88.
6. Bohl DD, Fu MC, Golinvaux NS, et al. The “July effect” in primary total hip and knee arthroplasty: analysis of 21,434 cases from the ACS-NSQIP database. J Arthroplasty. 2014;29:1332-1338.
7. Karipineni F, Panchal H, Khanmoradi K, Parsikhia A, Ortiz J. The “July effect” does not have clinical relevance in liver transplantation. J Surg Educ. 2013;70:669-679.
8. Lieber BA, Appelboom G, Taylor BE, et al. Assessment of the “July Effect”: outcomes after early resident transition in adult neurosurgery. J Neurosurg. 2015 Dec 15:1-9 [e-pub ahead of print].
9. Reason J. Human error: models and management. BMJ. 2000;320:768-770.
10. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: simulation-based mastery learning during intern boot camp. Acad Med. 2013;88:233-239.
11. Wright M, Mankey CG, Miller BW. Improving upon the ‘July effect’: a collaborative, interdisciplinary orientation for internal medicine interns. Med Educ Online. 2013;18:23249.
12. Loo L, Puri N, Kim DI, et al. “Page me if you need me”: the hidden curriculum of attending-resident communication. J Grad Med Educ. 2012;4:340-345.
13. Goldszmidt M, Faden L, Dornan T, et al. Attending physician variability: a model of four supervisory styles. Acad Med. 2015;90:1541-1546.