APM Perspective | Volume 122, Issue 4, P398-404, April 2009

Internal Medicine's Educational Innovations Project: Improving Health Care and Learning

      More than 22,000 residents train in the 385 Accreditation Council for Graduate Medical Education (ACGME)-accredited core programs in internal medicine each year.
      The Residency Review Committee for Internal Medicine (RRC-IM) relies on a site visit, the results of a resident questionnaire, and the American Board of Internal Medicine (ABIM) pass rate to judge the effectiveness of these training programs.
      In 2004, the Long-Range Planning Committee of RRC-IM proposed a plan to reform the accreditation process that would emphasize local educational innovation and utilize outcomes-based assessment to determine the effectiveness of the training program.
      ACGME has encouraged innovations that link quality training to quality patient care, simplify training requirements, and define outcomes that will measure the success of the training program.
      Recently, the internal medicine community has called for a redesign of training to reflect the needs of patients and society.
      In response to these challenges, RRC-IM developed the Educational Innovations Project (EIP) to begin the process of accreditation reform. This project provides an opportunity for internal medicine educational leaders to reconsider the regulations that currently govern training, experiment with new approaches to resident education and evaluation, and lead the way in determining the critical educational requirements that will shape the internist of the future. In this article, we describe EIP in the context of previous ACGME requirements and showcase the innovations of those programs that are participating in EIP as an alternate accreditation pathway.
• The Accreditation Council for Graduate Medical Education and the Residency Review Committee for Internal Medicine established the Educational Innovations Project to test alternative approaches to meeting program requirements in order to improve education and patient care.
      • Programs have developed a variety of innovations including geographic wards, hospitalist-resident partnerships, longer ambulatory blocks, and handoff simulations.
      • Educational Innovations Project programs can focus more on the process of quality improvement rather than solely the outcomes.

      Graduate Medical Education and the Evolution of the ACGME RRC-IM

      The genesis of graduate medical education can be found in a grassroots movement of physicians who desired practical experience to prepare them to care for patients and a more prolonged experience to develop expertise in a given field of medicine. In the second half of the 19th century, the term “intern” was used to designate the paid house officer whose function was to help the medical staff of hospitals. By 1889, Johns Hopkins Hospital coined the term “resident” to represent individuals who had completed an internship, but who wished to stay on to gain greater expertise in a specific area, clearly differentiating an educational experience as distinct from hands-on, paid work.
      In 1914, the American Medical Association Council on Medical Education published its first list of hospitals appropriate for internship and adopted standards for adequate education and facilities. Surprisingly familiar are the requirements of those early years: a review of basic underlying pathophysiology, hospitals and clinics with patients and important operative techniques of the day, expert physicians who could guide learners, and opportunities for research in science. It was not until 1972, however, that the American Medical Association, the American Board of Medical Specialties, the Association of American Medical Colleges, the subspecialty societies, and the major teaching hospitals formed the Coordinating Council of Medical Education, the forerunner of the current ACGME, which came into being in 1982.


      Today, more than 100,000 residents train in 2500 institutions that receive approximately $8 billion of federal funding.
      The requirements for ACGME accreditation have understandably proliferated as educators have sought to protect their learners and the educational process and to assure the public that physicians who complete these training programs are competent to practice in their chosen specialty.
      Until 1965, internal medicine functioned under the guidelines for “Essentials of Approved Residencies.”
      The special requirements for internal medicine broadly included need for experience in disciplines in addition to medicine, such as psychiatry, neurology, dermatology, and pediatrics; focus on education, with staff members providing regular instruction in diagnostic and therapeutic methods; assumption of graduated responsibility for patient care under supervision; need to review pathology along with other fields of medical and basic science literature; and participation in clinical rounds. Most critical of all, adequate numbers of patients had to be admitted to the medicine “department” to ensure an adequate environment for education. These requirements comprised one half of a page and formed the basis for the first “Training Requirements in Internal Medicine” in 1974.
      Table 1 summarizes the internal medicine training requirements as they have evolved over the past 30 years. Increased specificity and detail are reflected in the increased length of these requirements (number of pages). However, also evident is the enhanced focus on education and processes, as physicians-in-training have cared for sicker patients in health care systems with greater complexity. In contrast, Table 2 delineates the core requirements maintained in EIP, which was meant to provide relief from extensive process-based regulation while also extending the time between RRC accreditation reviews to 10 years. Intended to stimulate outcome measures of program effectiveness and to foster a positive and flexible learning environment, EIP challenges educational leaders to design internal medicine training programs that are better aligned with the needs of patients and physicians in the current era.
Table 1. Development of Internal Medicine Training Requirements
      Summarized from the Corresponding Bulletins of the ACGME on Requirements for Internal Medicine training, from years 1978-2004.
Year | Length (# Pages) | Time and Experience | Faculty and Environment | Education | Evaluation
      • 24 months patient responsibility
      • Broad experience in prevention, diagnosis, treatment, rehabilitation
      • Experience in non-medical specialties
      • Chief of Medicine
      • Subspecialty faculty participate in national societies
      • Specialty labs
      • Strong consult services
      • Enough residents for peer interaction
      • Adequate patient volume
      • Not primarily service
      • Bedside teaching
      • Departmental conferences
      • Graduated responsibility
      ABIM exam elective
      • 3 years of training including emergency and intensive care training
      • Weekly half-day continuity clinic
      • Broad experience in all ages and both sexes
      • Experience in neurology, orthopedics, gynecology, psychology, dermatology, ophthalmology
      • Key faculty should do research
      • Division heads desirable
      • Committed teachers
      • Program director (PD) with authority
      • Medical records available
      • 24-h library
      • Mainly educational
      • Teaching rounds 3×/week
      • Research conferences
• PGY-1 limited to 10-15 patients
      • Learn to function as team
      • Residents should be made aware of their strengths and weaknesses
      • There should be records of a personal interview and evaluation
      • Rotations at least 4 weeks
      • Consultation and geriatrics required
      • Service for patients of trainees only
      • Trainee should not go off duty until the care of their patient is assured
      • Evaluation at end of each rotation
      • Faculty should be evaluated
      • 25% ambulatory care
      • Continuity preferred for 3 years, but block is acceptable
      • ICU no more than 6 months
      • PD with 5 years experience
      • 4 key clinical faculty (KCF) at 20 h/week
      • All faculty and division heads ABIM-certified
      • 24-h ancillary support
      • Ambulatory care clinics supported
      • Adequate call rooms and food
      • Computerized search capability
      • 75% should take, 50% should pass the ABIM
      • PD should regularly evaluate the clinical competence of residents.
      • 24 months of meaningful patient responsibility
      • Must have emergency medicine, ambulatory, dermatology, and neurology
      • Emergency medicine not more than 3 months
      • Must have a written curriculum that is revised by residents and faculty
      • Limit number of learners on teaching rounds
      • Procedures delineated
      • Must participate in scholarly project
      • CEX recommended
      • Residents should review program annually
      • 36 months with:
      • - one third ambulatory
      • - one third inpatient
• Continuity all 3 years, one day per week
      • PD role expanded to oversight of subspecialties
      • PIF must be accurate
      • Cardiac catheterization facility must be present at primary teaching site
      • Teaching rounds delineated as separate from management rounds
      • Expanded educational topics required
      • Must repeat core conferences
      • Journal club required
      • PD must provide formative and summative evaluation of trainee
      • Trainee must be evaluated verbally and in writing at end of each rotation
      • Chart audit of trainees
2000 | Transplant experience limited to 1 month | PD with greater authority and oversight
      • Detailed sign-outs required;
      • Resident work-load numbers defined
      80% must take, 60% must pass the ABIM
2004 | 11 | Continuity all 3 years, at least 108 ambulatory clinics
      • PD must have 3 years administrative experience
      • Associate PD and KCF formula
      • Site coordinator required
      • Resident service limits;
      • Minimum numbers of inpatients for PGY1s
      • Sex-specific competencies
      • 80% must take, 70% must pass the ABIM
      • Confidential evaluation
      • 360 evaluations
• Competency-based performance improvement measures
      • CEX during first year.
ABIM = American Board of Internal Medicine; PD = program director; PGY = postgraduate year; ICU = intensive care unit; KCF = key clinical faculty; PIF = program information form; CEX = clinical evaluation exercise.
Table 2. Educational Innovations Project – Synopsis of Program Requirements
      Length of time and required experiences
      • 36 months of required training during which residents must form long-term healing relationships with patients.
      • Residents must complete emergency medicine, inpatient, critical care unit, and consultative medicine experiences.
      • Adequate exposure to all medicine subspecialties.
      Environment and faculty
• A qualified program director, sufficient numbers of associate program directors (20 h/week), board-certified faculty, and educational site coordinators must be available for the training program.
      • The institution should guarantee adequate resources, and there should be an electronic health record or plans to implement one.
      • Trainees must be able to access onsite library or reference material.
      • Advanced residents should supervise first-year residents at some time during training.
      • Patient care must be integrated with performance improvement and medical education.
      • The program must ensure rigorous sign-in and sign-out procedures.
      • There must be a well-organized curriculum, with goals and objectives organized by training level, to achieve competence in the specialty of internal medicine.
      • Structured educational exercises must occur.
      • Residents must participate in scholarly activity.
• The program must demonstrate that it has effective mechanisms to assess resident performance.
      • Trainees must undergo a formative evaluation that is developed through close observation of the trainee.
      • Each trainee must be evaluated using 2 evaluation tools for each ACGME-required competency, including a global evaluation.
      • 360-degree (multi-rater) evaluations must be utilized in the program.
      • The trainee must receive monthly feedback and semi-annual summative evaluation.
      Assessment of training program
      • Residents must evaluate the faculty annually.
      • Resident performance and patient outcomes must be used to judge the effectiveness of the training program.
      • The program must be evaluated annually.

      The Internal Medicine Educational Innovations Programs

In fall 2005, 123 programs were notified of eligibility for EIP, based upon the stability of program leadership, a minimum of 8 years of accreditation during the last 2 ACGME review cycles, and an ABIM program pass rate of more than 80%.
A request for applications was distributed that specified several critical requirements: support of the institutional leadership, with a guarantee that resources would be available for the proposed projects; a plan for an electronic medical record in the near future; and the completion of a 20-page proposal describing program innovations that would link residency training to quality patient care and use new tools to assess patient outcomes or the effectiveness of resident education. Seventy-three programs submitted letters of intent, and 40 were encouraged to complete the application process. Applications were reviewed by the EIP subcommittee of the RRC-IM and selected based on compliance with the requirements and goals of EIP. Seventeen programs entered the EIP accreditation pathway in July 2006. Table 3 shows the initial programs accepted into EIP and working under EIP requirements for more than 1 year: 8 community programs, 8 university programs, and 1 municipal program, with a broad range of program sizes (30-168 trainees; median, 82 trainees).
Table 3. Phase One Educational Innovations Project Participants, 2006-2016
Program Name | Location | Type | Residents
Abington Hospital | Philadelphia, PA | Community | 44
Aurora Health | Milwaukee, WI | Community | 39
Banner Good Samaritan | Phoenix, AZ | Community | 95
Baystate Medical Center | Springfield, MA | Community | 88
Henry Ford Hospital | Detroit, MI | Community | 125
Long Island Jewish | Long Island, NY | Community | 82
Scripps Mercy Hospital | San Diego, CA | Community | 30
Summa Health System/Northeast Ohio University College of Medicine | Akron, OH | Community | 48
Hennepin County Medical Center | Minneapolis, MN | Municipal | 67
Beth Israel Deaconess | Boston, MA | University | 158
Duke University | Durham, NC | University | 163
Mayo Clinic | Rochester, MN | University | 168
Ohio State University | Columbus, OH | University | 108
Southern Illinois University | Springfield, IL | University | 53
University of Cincinnati | Cincinnati, OH | University | 108
University of Wisconsin | Madison, WI | University | 81
Westchester New York Medical Center | Westchester, NY | University | 66
Participation in EIP also required the submission of an annual report to RRC-IM. The purpose of these reports is to ensure ongoing compliance with the goals and requirements of EIP, confirm continued institutional support, and review ongoing educational and clinical outcomes. Feedback is provided to each program, and RRC-IM captures narratives about successes, barriers, and solutions.

      Educational Innovations Projects

      Each EIP proposal included 3 to 5 projects. Upon review of all accepted program proposals, these projects fell into 1 of 5 broad themes summarized in Table 4. “Learning environment innovations” resulted in a substantive reorganization of the training program for all residents. “Safety and performance improvement projects” were listed in separate categories because these interventions were considered a particular focus of EIP. “Other curricular changes” included the development of a new rotation or new curriculum that enhanced ongoing experiences in the training program. Finally, EIP considered the development of “new evaluation tools” or the testing of existing tools as a necessary contribution for the EIP programs to advance the future assessment capabilities of all training programs. While it is not possible to detail all the projects, each program is featured with an example of an innovation in 1 of these 5 areas.
Table 4. Themes of EIP Proposed Projects
      Learning Environment Innovations
       Geographic wards with interdisciplinary teams and patients admitted daily; Focused inpatient and outpatient experiences with care delivered by training partners; Immersion inpatient and outpatient experiences by year of training; Tailored third year of training; Learner-manager sequence as graded responsibility.
      Patient Safety Enhancements
 Resident patient safety and disclosure council; Patient safety conference; Patient safety rotation; Faculty patient safety training; Use of AHRQ patient safety survey and checklist; Patient safety curricula; Transitions of care (including handoffs) curricula.
      Performance Improvement
       “Pay for Performance” reward system; Performance improvement measures in ambulatory care; Chronic disease registries; ABIM Practice Improvement Modules implementation; Chronic care collaboratives; Chart audits.
      Other Curricular Changes
       Administrative rotations; Focused learning laboratory; Hospital apprenticeships; Clinical skills curricula; Communications curricula; Ultrasound examinations; Procedural simulation curricula; Professionalism curricula; Interpretive skills curricula; Teamwork curriculum.
      New Evaluation Tools
 Praise and concern reporting for professionalism; Template “grading” for discharge summaries; Standardized norms of performance across all trainee levels; Performance improvement conference and residency group projects; Professionalism OSCE; ICD tracking tool for patient experiences; Individualized learning plans; Handoff templates; Disclosure form; Web language translator; Inpatient rounding tool; Health care access survey; Health literacy tool; Competency-based postgraduate assessment.
      EIP=Educational Innovations Project; AHRQ=Agency for Healthcare Research and Quality; ABIM=American Board of Internal Medicine; OSCE=Objective Structured Clinical Examination; ICD=International Classification of Diseases.
      Several proposed structural changes allowed residents to enter a learning environment with enhanced team interactions and learner-teacher relationships. For example, geographic wards at Harvard Medical School Beth Israel Deaconess Medical Center and Long Island Jewish Medical Center aligned nursing units and firm-based hospitalists, so that residents integrate into a single team with continuity between residents and faculty and between residents and other health care personnel. Additionally, residents assumed ownership for the quality improvement and utilization measures on those same wards. Baystate Medical Center paired a second-year resident with a hospitalist teacher to allow residents to mature in their patient management skills before postgraduate year 3 (PGY-3), when they then manage and teach their own teams. This program aimed to aggressively address the “milestones” that residents must reach before they progress, so that competence is no longer presumed to be a product of time in an experience. Aurora Sinai Medical Center assessed nurse and social worker satisfaction with intern responsiveness and competency before and after a new multidisciplinary apprenticeship experience. Taken together, these innovations maximize social network learning opportunities and strong learner-teacher and patient-learner-teacher relationships.
Ambulatory care structural innovations aimed at reducing residents' distractions during continuity experiences have been particularly robust across the initial 17 EIP programs. For example, Hennepin County Medical Center initiated a partnership experience in which residents are scheduled into their continuity clinic twice weekly during outpatient and consultative rotations (about one half of the curricular schedule) and not at all during inpatient ward rotations. Faculty members follow a challenging but parallel schedule. During the 2-month inpatient cycles, patients from the resident's continuity panel are seen by a designated practice partner, another resident with the complementary schedule. Resident-patient continuity reached 70% with this model. Duke University School of Medicine has a similar schedule, which they term “practice partnerships.” University of Cincinnati College of Medicine introduced its year-long “long block” ambulatory group practice and elective experience, during which, between months 17 and 28 of the 36-month training sequence, PGY-2 residents transition from primarily inpatient services to an expanded outpatient experience. They see patients in continuity clinic 3 half-days per week and remain responsible for responding to their patients at all times. Residents perceive that they have developed a healing relationship with nearly 50% of their patients, compared with 15% before the long block implementation. Resident satisfaction has increased, and the hospital has experienced a decrease in emergency room utilization by the continuity patients of the residency.
Several new experiences or curricula for residents attest to the new content and skills required to become an internist competent for practice in today's world. Many of these changes are patient safety-oriented, because EIP aimed to address safety and implement curricula directed at the transitions of care. At Southern Illinois University School of Medicine, difficult handoff situations are recreated in a simulation laboratory. Inpatient handoffs are then directly observed on medicine ward rotations, with attention to quality, safety, and error reduction, using a standardized tool: the situation, background, assessment, and recommendation (SBAR) process. At Summa Health System/Northeast Ohio University College of Medicine, computerized intensive care unit handoffs decreased sign-out cycle time by 25%. Ohio State University College of Medicine implemented specific instruction in computerized sign-out, and faculty members observe all sign-outs until the resident is judged competent. Night teams have formal nocturnal rounds on patients designated by the day teams. Scripps Mercy Hospital measured the effectiveness of each standardized-format discharge summary in conveying the patient's hospital course to subsequent care providers on a 9-point scale. Abington Memorial Hospital developed an “educational patient laboratory service,” where trainees are systematically and intensively observed by specifically trained faculty in all aspects of patient care, with the majority of interactions providing opportunities for feedback and evaluation. At New York Medical College, residents participated in an objective structured clinical examination assessing case management skills, with simulation of scenarios that focused on the challenges of posthospital care in patients with language barriers, noncompliance, and complex home care.
      The link between education and improving the quality of patient care has already become evident as new approaches focus on this critical aspect of education. University of Wisconsin School of Medicine and Public Health is developing a formal, resident-specific patient outcome database of 18 care measures, tracked sequentially over all 3 years of training. Mayo Clinic College of Medicine has established the “Patient Portal” to track resident-specific patient preventive screening rates along with an accompanying web-based tool that brings to one screen all costs and charges related to a patient's care over any time period, even permitting the resident to “delete” a charge to remodel the cost estimate. Banner Good Samaritan Regional Medical Center implemented a health literacy tool and a “Who's My Doctor?” program so patients can more easily identify the physician who is always responsible for their care. Henry Ford Hospital developed a competency-based survey of residents' postgraduate employers and fellowship directors to compare these assessments with the summative evaluation of the program director. This comparison should reveal opportunities to modify assessment and training in their program.


      EIP was designed to facilitate the transition from process-based accreditation requirements to outcomes-based requirements by encouraging high-performing programs to experiment with new facets of education that will better prepare physicians for the health care environment. While some programs initiated significant structural changes, the majority of innovations focused on new curricula, the development of tools for future program use, and the requisite faculty development needed to implement the proposed projects. The EIP programs overall did address the critical agenda of patient safety and the physicians' commitment to systematically review the quality of patient care.
Process-based program requirements imposed by RRC-IM have long been considered significant barriers to innovation. However, most EIP innovations did not require the regulatory relief afforded by the EIP program requirements. Interestingly, the majority of programs maintained the general structure and requirements mandated by RRC-IM. It remains unclear whether this adherence to the previous requirements resulted from a fear of resident or faculty dissatisfaction, a concern about future ACGME requirements, a belief that the current requirements are sensible, or simply a sense that structural innovation would not be feasible or embraced.
      Traditional program evaluation has centered on the measurement of process, with the single outcome measurement limited to ABIM certification rates. In contrast, EIP evaluation focuses on the effectiveness of local continuous improvement processes that will eventually inform the educational community: what was tried, how the program evaluates the outcome, and what was learned that would improve patient care or resident education. In this setting, it is not just the attained outcomes that are important; the obstacles and their solutions, failures and their root causes, and anticipated barriers are also of value to both regulators and educators.
As EIP programs began to implement their proposed innovations, challenges have clearly arisen. For example, the quality and safety agenda can be unfamiliar to learners and faculty ensconced in traditional teaching hospitals. Reducing variation in rounding habits and record-keeping may be perceived by both faculty and residents as an intrusion on professionalism and academic freedom. Increased ambulatory training may be viewed as a threat to some subspecialties, as explicit competency assessment can result in an uneven demand for specialty education when residents seek to correct their deficiencies. Enhanced curricula and educational methods demand well-trained and engaged faculty members, straining faculty resources and clinical productivity and exposing the most involved faculty to burnout and attrition. Separation of service from educational activities, traditionally difficult, can be further threatened when rapid cycles of improvement and innovation increase the tension between these competing interests.
      EIP programs are required to present their progress annually, in conjunction with the Association of Program Directors in Internal Medicine Spring Conference. In the first year of EIP accreditation, RRC-IM members and staff organized these sessions and programs discussed their intended innovations. Since the initiation of EIP, it is evident that these educational leaders have leveraged their enthusiasm to build a national coalition to improve resident education while improving health care. The educational community will look to the EIP programs to lead the way in exploring opportunities to design or modify experiences as each program deems appropriate, rather than depending upon RRC-IM's judgment that all programs should follow an identical recipe for training the internist of the future.


      We would like to thank Allan H. Goroll, MD, Henry J. Schultz, MD, and Paul Rockey, MD for their helpful comments; Karen L. Lambert and William E. Rodak, MD, for locating the historical documents of ACGME; current EIP subcommittee members of RRC-IM (in addition to the authors): Rosemarie L. Fisher, MD, Glenn M. Mills, MD, Eileen E. Reynolds, MD, Richard H. Simon, MD, and previous EIP subcommittee members, Carl Sirio, MD, and Richard F. LeBlond, MD; Debra L. Dooley, who provides ongoing support to the EIP project, and Thomas J. Nasca, MD, who initiated this project during his tenure as Chair of RRC-IM.
      Addendum: As of July 2007, 4 additional programs have been admitted to the EIP: St. Barnabas Medical Center, Indiana University School of Medicine, University of California, San Francisco, School of Medicine, and University of Pittsburgh School of Medicine.


  • Accreditation Council for Graduate Medical Education (ACGME). Number of accredited programs by academic year (7/1/2007-6/30/2008). Accessed January 12, 2008.
  • Accreditation Council for Graduate Medical Education (ACGME). Internal medicine program requirements. Accessed December 2, 2007.
  • Goroll AH, Sirio C, Duffy FD, et al; Residency Review Committee for Internal Medicine. A new model for accreditation of residency programs in Internal Medicine. Ann Intern Med. 2004;140:902-909.
  • Committee on Innovation in the Learning Environment (CILE). Accessed February 29, 2008.
  • Fitzgibbons JP, Meyers FJ. Redesigning training for Internal Medicine. Ann Intern Med. 2006;145:865-866.
  • Fitzgibbons JP, Bordley DR, Berkowitz LR, Miller BW, Henderson MC. Redesigning residency education in Internal Medicine: a position paper from the Association of Program Directors in Internal Medicine. Ann Intern Med. 2006;144:920-926.
  • Meyers FJ, Weinberger SE, Fitzgibbons JP, Glassroth J, Duffy FD, Clayton CP. Redesigning residency training in Internal Medicine: the consensus report of the Alliance for Academic Internal Medicine Education Redesign Task Force. Acad Med. 2007;82:1211-1219.
  • Turner EL. Council on Medical Education: background and development of residency review and conference committees. JAMA. 1957;165:60-65.
  • History and Organization of the ACGME, Orientation Program. Revised 1993.
  • Association of American Medical Colleges (AAMC). AAMC GME and IME Payments. Accessed January 12, 2007.
  • Accreditation Council for Graduate Medical Education (ACGME). Graduate Medical Education Data Resource Book, Academic Year 2005-2006. Chicago, IL: ACGME; 2007.
  • Accreditation Council for Graduate Medical Education (ACGME). Directory of Approved Internships and Residencies. 1965. On file, ACGME, Chicago, IL.
  • Residency Review Committee in Internal Medicine (approved by the American Board of Internal Medicine, the American College of Physicians, and the Council on Medical Education of the American Medical Association). Guides for Residency Program in Internal Medicine. 1974.
  • Educational Innovation Project (EIP). Accessed January 9, 2007.
  • Torre DM, Daley BJ, Sebastian JL, Elnicki DM. Overview of current learning theories for medical educators. Am J Med. 2006;119:903-907.