More than 22,000 residents train each year in the 385 Accreditation Council for Graduate Medical Education (ACGME)-accredited core programs in internal medicine.1
The Residency Review Committee for Internal Medicine (RRC-IM) relies on a site visit, the results of a resident questionnaire, and the American Board of Internal Medicine (ABIM) pass rate to judge the effectiveness of these training programs.2
In 2004, the Long-Range Planning Committee of RRC-IM proposed a plan to reform the accreditation process that would emphasize local educational innovation and utilize outcomes-based assessment to determine the effectiveness of the training program.3
ACGME has encouraged innovations that link quality training to quality patient care, simplify training requirements, and define outcomes that will measure the success of the training program.4
Recently, the internal medicine community has called for a redesign of training to reflect the needs of patients and society.5,6,7
In response to these challenges, RRC-IM developed the Educational Innovations Project (EIP) to begin the process of accreditation reform. This project provides an opportunity for internal medicine educational leaders to reconsider the regulations that currently govern training, experiment with new approaches to resident education and evaluation, and lead the way in determining the critical educational requirements that will shape the internist of the future. In this article, we describe EIP in the context of previous ACGME requirements and showcase the innovations of those programs that are participating in EIP as an alternate accreditation pathway.

Perspectives Viewpoints
- The Accreditation Council for Graduate Medical Education and the Residency Review Committee for Internal Medicine established the Educational Innovations Project to test alternative approaches to meeting program requirements in order to improve education and patient care.
- Programs have developed a variety of innovations, including geographic wards, hospitalist-resident partnerships, longer ambulatory blocks, and handoff simulations.
- Educational Innovations Project programs can focus more on the process of quality improvement rather than solely on its outcomes.
Graduate Medical Education and the Evolution of the ACGME RRC-IM
The genesis of graduate medical education can be found in a grassroots movement of physicians who desired practical experience to prepare them to care for patients and a more prolonged experience to develop expertise in a given field of medicine. In the second half of the 19th century, the term "intern" was used to designate the paid house officer whose function was to help the medical staff of hospitals. By 1889, Johns Hopkins Hospital coined the term "resident" to represent individuals who had completed an internship, but who wished to stay on to gain greater expertise in a specific area, clearly differentiating an educational experience as distinct from hands-on, paid work.8
In 1914, the American Medical Association Council on Medical Education published its first list of hospitals appropriate for internship and adopted standards for adequate education and facilities. Surprisingly familiar are the requirements of those early years: a review of basic underlying pathophysiology, hospitals and clinics with patients and important operative techniques of the day, expert physicians who could guide learners, and opportunities for research in science. It was not until 1972, however, that the American Medical Association, the American Board of Medical Specialties, the Association of American Medical Colleges, the subspecialty societies, and the major teaching hospitals formed the Coordinating Council of Medical Education, the forerunner of the current ACGME, which came into being in 1982.9
Today, more than 100,000 residents train in 2500 institutions that receive approximately $8 billion of federal funding.8,10,11
The requirements for ACGME accreditation have understandably proliferated as educators have sought to protect their learners and the educational process and to assure the public that physicians who complete these training programs are competent to practice in their chosen specialty.

Until 1965, internal medicine functioned under the guidelines for "Essentials of Approved Residencies."12
The special requirements for internal medicine broadly included the need for experience in disciplines in addition to medicine, such as psychiatry, neurology, dermatology, and pediatrics; a focus on education, with staff members providing regular instruction in diagnostic and therapeutic methods; the assumption of graduated responsibility for patient care under supervision; the need to review pathology along with other fields of medical and basic science literature; and participation in clinical rounds. Most critical of all, adequate numbers of patients had to be admitted to the medicine "department" to ensure an adequate environment for education. These requirements comprised one half of a page and formed the basis for the first "Training Requirements in Internal Medicine" in 1974.13
Table 1 summarizes the internal medicine training requirements as they have evolved over the past 30 years. Increased specificity and detail are reflected in the increased length of these requirements (number of pages). However, also evident is the enhanced focus on education and processes, as physicians-in-training have cared for sicker patients in health care systems with greater complexity. In contrast, Table 2 delineates the core requirements maintained in EIP, which was meant to provide relief from extensive process-based regulation while also extending the time between RRC accreditation reviews to 10 years.
Summarized from the corresponding Bulletins of the ACGME on requirements for internal medicine training, 1978-2004.14
Intended to stimulate outcome measures of program effectiveness and to foster a positive and flexible learning environment, EIP challenges educational leaders to design internal medicine training programs that are better aligned with the needs of patients and physicians in the current era.

Table 1. Development of Internal Medicine Training Requirements
| Year | Length (Pages) | Time and Experience | Faculty and Environment | Education | Evaluation |
|---|---|---|---|---|---|
| 1978 | 1 | | | | ABIM exam elective |
| 1980 | 3.5 | | | | |
| 1986 | 4.5 | | | | |
| 1990 | 5.8 | | | | |
| 1995 | 7.5 | | | | |
| 1998 | 9 | | | | |
| 2000 | | Transplant experience limited to 1 month | PD with greater authority and oversight | | 80% must take, 60% must pass the ABIM |
| 2004 | 11 | Continuity all 3 years, at least 108 ambulatory clinics | | | |
ABIM=American Board of Internal Medicine; PD=program director; PGY=postgraduate year; ICU=intensive care unit; KCF=key clinical faculty; PIF=program information form; CEX=observed clinical examination.
Table 2. Educational Innovations Project: Synopsis of Program Requirements

- Length of time and required experiences
- Environment and faculty
- Education
- Evaluation
- Assessment of training program
The Internal Medicine Educational Innovations Programs
In fall 2005, 123 programs were notified of eligibility for EIP, based upon the stability of program leadership, a minimum of 8 years of accreditation during the last 2 ACGME review cycles, and an ABIM program pass rate of more than 80%.
A request for applications was distributed that specified several critical requirements: support of the institutional leadership, with a guarantee that resources would be available for the proposed projects; a plan for an electronic medical record in the near future; and the completion of a 20-page proposal describing program innovations that would link residency training to quality patient care and use new tools to assess patient outcomes or the effectiveness of resident education. Seventy-three programs submitted letters of intent, and 40 were encouraged to complete the application process. Applications were reviewed by the EIP subcommittee of the RRC-IM and selected based on compliance with the requirements and goals of EIP. Seventeen programs entered the EIP accreditation pathway in July 2006. Table 3 shows the initial programs accepted into EIP and working under EIP requirements for more than 1 year: 8 community programs, 8 university programs, and 1 municipal program, with a broad range of program size (30-168 trainees; median, 82 trainees).
Table 3. Phase One Educational Innovations Project Participants, 2006-2016
Program Name | Location | Type | Residents |
---|---|---|---|
Abington Hospital | Philadelphia, PA | Community | 44 |
Aurora Health | Milwaukee, WI | Community | 39 |
Banner Good Samaritan | Phoenix, AZ | Community | 95 |
Baystate Medical Center | Worcester, MA | Community | 88 |
Henry Ford Hospital | Detroit, MI | Community | 125 |
Long Island Jewish | Long Island, NY | Community | 82 |
Scripps Mercy Hospital | San Diego, CA | Community | 30 |
Summa Health System/Northeast Ohio University College of Medicine | Akron, OH | Community | 48 |
Hennepin County Medical Center | Minneapolis, MN | Municipal | 67 |
Beth Israel Deaconess | Boston, MA | University | 158 |
Duke University | Durham, NC | University | 163 |
Mayo Clinic | Rochester, MN | University | 168 |
Ohio State University | Columbus, OH | University | 108 |
Southern Illinois University | Springfield, IL | University | 53 |
University of Cincinnati | Cincinnati, OH | University | 108 |
University of Wisconsin | Madison, WI | University | 81 |
Westchester New York Medical Center | Westchester, NY | University | 66 |
Participation in EIP also required the submission of an annual report to RRC-IM. The purpose of these reports is to ensure ongoing compliance with the goals and requirements of EIP, confirm continued institutional support, and review ongoing educational and clinical outcomes. Feedback is provided to each program, and RRC-IM captures narratives about successes, barriers, and solutions.
Educational Innovations Projects
Each EIP proposal included 3 to 5 projects. Upon review of all accepted program proposals, these projects fell into 1 of 5 broad themes summarized in Table 4. “Learning environment innovations” resulted in a substantive reorganization of the training program for all residents. “Safety and performance improvement projects” were listed in separate categories because these interventions were considered a particular focus of EIP. “Other curricular changes” included the development of a new rotation or new curriculum that enhanced ongoing experiences in the training program. Finally, EIP considered the development of “new evaluation tools” or the testing of existing tools as a necessary contribution for the EIP programs to advance the future assessment capabilities of all training programs. While it is not possible to detail all the projects, each program is featured with an example of an innovation in 1 of these 5 areas.
Table 4. Themes of EIP Proposed Projects
| Theme | Examples |
|---|---|
| Learning environment innovations | Geographic wards with interdisciplinary teams and patients admitted daily; focused inpatient and outpatient experiences with care delivered by training partners; immersion inpatient and outpatient experiences by year of training; tailored third year of training; learner-manager sequence as graded responsibility |
| Patient safety enhancements | Resident patient safety and disclosure council; patient safety conference; patient safety rotation; faculty patient safety training; use of AHRQ patient safety survey and checklist; patient safety curricula; transitions of care (including handoffs) curricula |
| Performance improvement | "Pay for performance" reward system; performance improvement measures in ambulatory care; chronic disease registries; ABIM Practice Improvement Modules implementation; chronic care collaboratives; chart audits |
| Other curricular changes | Administrative rotations; focused learning laboratory; hospital apprenticeships; clinical skills curricula; communications curricula; ultrasound examinations; procedural simulation curricula; professionalism curricula; interpretive skills curricula; teamwork curriculum |
| New evaluation tools | Praise and concern reporting for professionalism; template "grading" for discharge summaries; standardized norms of performance across all trainee levels; performance improvement conference and residency group projects; professionalism OSCE; ICD tracking tool for patient experiences; individualized learning plans; handoff templates; disclosure form; web language translator; inpatient rounding tool; health care access survey; health literacy tool; competency-based postgraduate assessment |
EIP=Educational Innovations Project; AHRQ=Agency for Healthcare Research and Quality; ABIM=American Board of Internal Medicine; OSCE=Objective Structured Clinical Examination; ICD=International Classification of Diseases.
Several proposed structural changes allowed residents to enter a learning environment with enhanced team interactions and learner-teacher relationships. For example, geographic wards at Harvard Medical School Beth Israel Deaconess Medical Center and Long Island Jewish Medical Center aligned nursing units and firm-based hospitalists, so that residents integrate into a single team with continuity between residents and faculty and between residents and other health care personnel. Additionally, residents assumed ownership of the quality improvement and utilization measures on those same wards. Baystate Medical Center paired a second-year resident with a hospitalist teacher to allow residents to mature in their patient management skills before postgraduate year 3 (PGY-3), when they then manage and teach their own teams. This program aimed to aggressively address the "milestones" that residents must reach before they progress, so that competence is no longer presumed to be a product of time in an experience. Aurora Sinai Medical Center assessed nurse and social worker satisfaction with intern responsiveness and competency before and after a new multidisciplinary apprenticeship experience. Taken together, these innovations maximize social network learning opportunities and strong learner-teacher and patient-learner-teacher relationships.15
Ambulatory care structural innovations aimed at reducing residents' distractions during continuity experiences have been particularly robust across the initial 17 EIP programs. For example, Hennepin County Medical Center initiated a partnership experience in which residents are scheduled into their continuity clinic twice weekly during outpatient and consultative rotations (about one half of the curricular schedule) and not at all during inpatient ward rotations. Faculty members follow a challenging but parallel schedule. During the 2-month inpatient cycles, patients from the resident's continuity panel are seen by a designated practice partner, another resident with the complementary schedule. Resident-patient continuity reached 70% with this model. Duke University School of Medicine has a similar schedule, which it terms "practice partnerships." University of Cincinnati College of Medicine introduced its year-long "long block" ambulatory group practice and elective experience, during which, between months 17 and 28 of the 36-month training sequence, PGY-2 residents transition from primarily inpatient services to an expanded outpatient experience. They see patients in continuity clinic 3 half-days per week and remain responsible for responding to their patients at all times. Residents perceive that they have developed a healing relationship with nearly 50% of their patients, compared with 15% before the long block was implemented. Resident satisfaction has increased, and the hospital has experienced a decrease in emergency room utilization by the residency's continuity patients.
Several new experiences or curricula for residents attest to the new content and skills required to become an internist competent for practice in today's world. Many of these changes are patient safety-oriented, because EIP aimed to address safety and implement curricula directed at the transitions of care. At Southern Illinois University School of Medicine, difficult handoff situations are recreated in a simulation laboratory. Inpatient handoffs are then directly observed on medicine ward rotations, with attention to quality, safety, and error reduction, using a standardized tool: the situation, background, assessment, and recommendation (SBAR) process. At Summa Health System/Northeast Ohio University College of Medicine, computerized intensive care unit handoffs decreased sign-out cycle time by 25%. Ohio State University College of Medicine implemented specific instruction in computerized sign-out, and faculty members observe all sign-outs until the resident is judged competent. Night teams have formal nocturnal rounds on patients designated by the day teams. Scripps Mercy Hospital measured the effectiveness of each standardized-format discharge summary in conveying the patient's hospital course to subsequent care providers on a 9-point scale. Abington Memorial Hospital developed an "educational patient laboratory service," where trainees are systematically and intensively observed by specifically trained faculty in all aspects of patient care, with the majority of interactions providing opportunities for feedback and evaluation. At New York Medical College, residents participated in an objective structured clinical examination of case management skills, with simulated scenarios that focused on the challenges of posthospital care in patients with language barriers, noncompliance, and complex home care needs.
The link between education and improving the quality of patient care has already become evident as new approaches focus on this critical aspect of education. University of Wisconsin School of Medicine and Public Health is developing a formal, resident-specific patient outcome database of 18 care measures, tracked sequentially over all 3 years of training. Mayo Clinic College of Medicine has established the “Patient Portal” to track resident-specific patient preventive screening rates along with an accompanying web-based tool that brings to one screen all costs and charges related to a patient's care over any time period, even permitting the resident to “delete” a charge to remodel the cost estimate. Banner Good Samaritan Regional Medical Center implemented a health literacy tool and a “Who's My Doctor?” program so patients can more easily identify the physician who is always responsible for their care. Henry Ford Hospital developed a competency-based survey of residents' postgraduate employers and fellowship directors to compare these assessments with the summative evaluation of the program director. This comparison should reveal opportunities to modify assessment and training in their program.
Discussion
EIP was designed to facilitate the transition from process-based accreditation requirements to outcomes-based requirements by encouraging high-performing programs to experiment with new facets of education that will better prepare physicians for the health care environment. While some programs initiated significant structural changes, the majority of innovations focused on new curricula, the development of tools for future program use, and the requisite faculty development needed to implement the proposed projects. The EIP programs overall did address the critical agenda of patient safety and the physicians' commitment to systematically review the quality of patient care.
Process-based program requirements imposed by RRC-IM have long been considered significant barriers to innovation. However, most EIP innovations did not require the regulatory relief afforded by the EIP program requirements. Interestingly, the majority of programs maintained the general structure and requirements mandated by RRC-IM. It remains unclear whether this adherence to the previous requirements resulted from a fear of resident or faculty dissatisfaction, a concern about future ACGME requirements, a belief that the current requirements are sensible, or simply a failure to believe that structural innovation would be feasible or embraced.
Traditional program evaluation has centered on the measurement of process, with the single outcome measurement limited to ABIM certification rates. In contrast, EIP evaluation focuses on the effectiveness of local continuous improvement processes that will eventually inform the educational community: what was tried, how the program evaluates the outcome, and what was learned that would improve patient care or resident education. In this setting, it is not just the attained outcomes that are important; the obstacles and their solutions, failures and their root causes, and anticipated barriers are also of value to both regulators and educators.
As EIP programs began to implement their proposed innovations, challenges clearly arose. For example, the quality and safety agenda can be unfamiliar to learners and faculty ensconced in traditional teaching hospitals. Reducing variation in rounding habits and record-keeping may be perceived by both faculty and residents as an intrusion on professionalism and academic freedom. Increased ambulatory training may be viewed as a threat to some subspecialties, as explicit competency assessment can result in an uneven demand for specialty education when residents seek to correct their deficiencies. Enhanced curricula and educational methods demand well-trained and engaged faculty members, straining faculty resources and clinical productivity and exposing the most involved faculty to burnout and attrition. Separation of service from educational activities, traditionally difficult, can be further threatened when rapid cycles of improvement and innovation increase the tension between these competing interests.
EIP programs are required to present their progress annually, in conjunction with the Association of Program Directors in Internal Medicine Spring Conference. In the first year of EIP accreditation, RRC-IM members and staff organized these sessions and programs discussed their intended innovations. Since the initiation of EIP, it is evident that these educational leaders have leveraged their enthusiasm to build a national coalition to improve resident education while improving health care. The educational community will look to the EIP programs to lead the way in exploring opportunities to design or modify experiences as each program deems appropriate, rather than depending upon RRC-IM's judgment that all programs should follow an identical recipe for training the internist of the future.
Acknowledgements
We would like to thank Allan H. Goroll, MD, Henry J. Schultz, MD, and Paul Rockey, MD for their helpful comments; Karen L. Lambert and William E. Rodak, MD, for locating the historical documents of ACGME; current EIP subcommittee members of RRC-IM (in addition to the authors): Rosemarie L. Fisher, MD, Glenn M. Mills, MD, Eileen E. Reynolds, MD, Richard H. Simon, MD, and previous EIP subcommittee members, Carl Sirio, MD, and Richard F. LeBlond, MD; Debra L. Dooley, who provides ongoing support to the EIP project, and Thomas J. Nasca, MD, who initiated this project during his tenure as Chair of RRC-IM.
Addendum: As of July 2007, 4 additional programs have been admitted to the EIP: St. Barnabas Medical Center, Indiana University School of Medicine, University of California, San Francisco, School of Medicine, and University of Pittsburgh School of Medicine.
References
1. Number of accredited programs by academic year (7/1/2007-6/30/2008). Accessed January 12, 2008.
2. Internal medicine program requirements. Accessed December 2, 2007.
3. Residency Review Committee for Internal Medicine. Ann Intern Med. 2004;140:902-909.
4. Accessed February 29, 2008.
5. Redesigning training for internal medicine. Ann Intern Med. 2006;145:865-866.
6. Redesigning residency education in internal medicine: a position paper from the Association of Program Directors in Internal Medicine. Ann Intern Med. 2006;144:920-926.
7. Redesigning residency training in internal medicine: the consensus report of the Alliance for Academic Internal Medicine Education Redesign Task Force. Acad Med. 2007;82:1211-1219.
8. Council on Medical Education. JAMA. 1957;165:60-65.
9. History and Organization of the ACGME, Orientation Program. Revised 1993.
10. AAMC GME and IME payments. Accessed January 12, 2007.
11. Graduate Medical Education Data Resource Book, Academic Year 2005-2006. Chicago, IL: ACGME; 2007.
12. Directory of Approved Internships and Residencies. 1965. On file, ACGME, Chicago, IL.
13. Guides for Residency Program in Internal Medicine. 1974.
14. Accessed January 9, 2007.
15. Overview of current learning theories for medical educators. Am J Med. 2006;119:903-907.
Copyright
© 2009 The Association of Professors of Medicine. Published by Elsevier Inc. All rights reserved.