Requests for reprints should be addressed to Lisa L. Willett, MD, MACM, Department of Medicine, University of Alabama at Birmingham, Tinsley Harrison Residency Program, BDB 341, 1720 2nd Avenue S, Birmingham, AL 35294-0012.
American Board of Internal Medicine 3-year rolling pass rates vary across geographic region, program size, and program director tenure.
Programs offer a variety of methods to prepare residents for the examination, and report making changes in response to failures.
Program directors attribute resident failures to low performance on prior standardized tests, and are giving greater consideration to US Medical Licensing Examination scores when ranking student applicants.
The American Board of Internal Medicine (ABIM) establishes uniform standards for physicians in practice.
The ABIM certifies that internists demonstrate knowledge, skills, and attitudes essential for patient care. For initial certification, a physician must complete the requisite medical education and clinical training, and pass a high-stakes written examination.
The ABIM certification examination is also high stakes for residency programs. For accreditation in Internal Medicine (IM), a program must have at least 80% of its graduates pass the examination on the first attempt in the most recently defined 3-year period, called the “3-year rolling pass rate.”
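The rolling-rate definition above can be sketched in a few lines. This is a minimal illustration with hypothetical counts, not data from the study: the 3-year rolling pass rate pools all first-time takers across the most recent 3-year window rather than averaging the yearly rates.

```python
# Minimal sketch (hypothetical numbers): a program's 3-year rolling pass
# rate pools first-time takers across the most recent 3-year window.
def rolling_pass_rate(passed, taken):
    """passed, taken: yearly counts of first-time takers for the last 3 years."""
    return 100 * sum(passed) / sum(taken)

# e.g. 10/12, 9/11, and 8/10 first-time takers passing in successive years
rate = rolling_pass_rate([10, 9, 8], [12, 11, 10])  # pooled, not a mean of yearly rates
meets_acgme = rate >= 80  # the ACGME accreditation threshold described above
```

Note that pooling weights each year by its number of takers, so one large failing cohort can pull a small program below the 80% threshold.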
Over the past 6 years, the proportion of residency programs achieving an 80% 3-year program pass rate declined from 96% in 2008 to 75% in 2013, with 22% of programs falling below the accreditation requirement for the most recently reported interval (2012-2014) (Figure 1).
Given the importance of the certification examination for residents, program accreditation, and patients, the decline in program pass rates is concerning and warrants study. Understanding factors that place trainees and programs at risk of failure may suggest ways to better prepare residents for success. Residency program directors (PDs) are uniquely aware of certification examination outcomes at both the individual and program level. We therefore conducted a national survey of IM PDs to identify program characteristics associated with program pass rates and to explore perceived reasons for, and program responses to, the decline.
The Association of Program Directors in Internal Medicine Survey Committee develops yearly questionnaires to track characteristics and issues facing IM residency programs. We used the 2013 survey to ask questions related to the declining ABIM program pass rates. E-mail notification with a program-specific hyperlink to a Web-based questionnaire was sent August-November 2013 to all 365 Association of Program Directors in Internal Medicine member programs, representing 93.4% of Accreditation Council for Graduate Medical Education (ACGME)-accredited IM residency programs.
Prior to blinding program identity for analysis, survey responses were appended with data from publicly available sources, accessed December 2013. Programs were assigned to geographic regions by US Census Bureau definition.
These publicly available data included accreditation cycle length, government affiliations, number of approved training positions, and PD appointment date. We included programs with >10 residents meeting all reported criteria. This study was deemed exempt by the Mayo Clinic Institutional Review Board.
To evaluate sample representativeness, we compared responding programs with nonresponders across the publicly available variables (program type, region, size, PD tenure, and ABIM program pass rates) using Fisher's exact tests or 2-sample t tests, as appropriate. We assessed differences in 3-year program pass rates (years 2010-2012)
across program demographics (program type, region, size, and PD tenure), resident demographics (percentages of US medical graduates, women, and underrepresented minorities), and PD responses to survey items using multivariate analysis of variance models. PDs' reported correlates of resident examination failure were summarized descriptively.
We assessed differences in PD agreement and program offerings in four areas: 1) perceived reasons for the program pass rate decline, 2) perceptions of why individual residents in their program did not pass, 3) methods used to prepare residents for the examination, and 4) responses to the decline.
We collapsed the original 5-point agreement scale (1 = “strongly disagree,” 2 = “disagree,” 3 = “neutral,” 4 = “agree,” 5 = “strongly agree”) to a binary response (1 = “strongly agree” or “agree,” 0 = “neutral,” “disagree,” or “strongly disagree”). Associations were assessed with Fisher's exact tests for 2-way tables, 2 sample t tests, or logistic regression, as appropriate. Continuous-valued numeric characteristics were trichotomized using sample tertiles to create similarly sized groups within demographics.
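The two recodings described above can be sketched briefly. This is an illustrative example with hypothetical data, using Python's standard library rather than the SAS procedures the authors used; the tertile cut points come from `statistics.quantiles`.

```python
# Minimal sketch (hypothetical data) of the recodings described above.
from statistics import quantiles

def collapse_agreement(score):
    """Collapse a 5-point Likert score to binary: 1 = 'agree' or 'strongly agree'."""
    return 1 if score >= 4 else 0

def trichotomize(values):
    """Assign each value to a tertile group (0, 1, 2) using sample tertiles."""
    lo, hi = quantiles(values, n=3)  # two cut points -> three similarly sized groups
    return [0 if v <= lo else (1 if v <= hi else 2) for v in values]

binary = [collapse_agreement(s) for s in [5, 4, 3, 2, 1]]  # -> [1, 1, 0, 0, 0]
groups = trichotomize([10, 20, 30, 40, 50, 60])            # -> [0, 0, 1, 1, 2, 2]
```

Collapsing to a binary response trades granularity for interpretability and stable cell counts in Fisher's exact tests; tertile grouping does the same for continuous characteristics.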
The significance level for all analyses was set at 0.01 to account for multiple comparisons, and all tests were 2-sided. Analyses used SAS version 9.3 (SAS Institute, Cary, NC).
Three-Year Program Pass Rates by Program Characteristics and Resident Demographics
Surveys were returned by 265 PDs (73%). Responding programs did not differ from nonresponding programs (Table 1). Of ACGME-accredited programs, 370/391 (94.6%) had 3-year program pass rates (years 2010-2012) available from the ABIM, with a mean (95% confidence interval [CI]) pass rate of 85.0% (84.0-86.0); the mean (95% CI) program pass rate among survey respondents was 85.7% (84.6-86.9).
Table 1. Association of Program Directors in Internal Medicine 2013 Survey Responders vs Nonresponders for Publicly Available Characteristics of 365 Internal Medicine Residency Programs
ABIM = American Board of Internal Medicine; ACGME = Accreditation Council for Graduate Medical Education; ANOVA = analysis of variance; CI = confidence interval; Q1 = first quartile; Q3 = third quartile.
Average 3-year program pass rates varied significantly across regions, program sizes, and PD tenures (all P ≤ .004) (Table 2). Military-based programs and programs in Puerto Rico had the lowest pass rates. Programs with >80% of positions filled with US medical graduates had the highest pass rates. Mean program pass rates did not differ according to programs' percentages of women or underrepresented minority graduates (Table 3).
Table 2. Publicly Available Program Characteristics vs ABIM Program Pass Rate (2010-2012): Multivariate ANOVA Summaries (N = 370). Values are mean (95% CI) 2010-2012 ABIM program pass rates.
Program type: community-based, university-affiliated hospital
Program size: large (≥76 approved positions); medium (43-75 approved positions); small (≤42 approved positions)
Program director tenure: long (>7 years); medium (2-7 years); short (<2 years)
[Cell values not preserved in this extract.]
ABIM = American Board of Internal Medicine; ANOVA = analysis of variance; CI = confidence interval.
Table 3. Resident Cohort Demographic Tertiles Calculated from Association of Program Directors in Internal Medicine Survey Responses vs ABIM Program Pass Rate (2010-2012): Multivariate ANOVA Summaries (N = 265). Values are mean (95% CI) 2010-2012 ABIM program pass rates by resident cohort demographic.
[Cell values not preserved in this extract.]
ABIM = American Board of Internal Medicine; ANOVA = analysis of variance; CI = confidence interval; URM = underrepresented minority; USMG = United States medical graduate.
Program Directors' Perceived Reasons for Decline in 3-Year Program Pass Rates
The majority of PDs agreed that the national decline in program pass rates was attributable to residents spending less time independently reading to improve their medical knowledge and reflecting about patients, and having less clinical experience due to duty hour limitations (Figure 2).
Perceived Reasons Why Individual Residents Did Not Pass the Examination
PDs reported a number of factors correlated with their residents' examination failures (Table 4). Most frequently cited were standardized test scores, especially low In-Training Examination (ITE) scores.
Most programs provided an internal board review program (79.3%), with a mean of 57.9 hours/year. There was no association between the number of board review hours offered and program pass rate. Programs used several teaching methods, most commonly the Medical Knowledge Self-Assessment Program
(Figure 3). We found no association between types of methods and program pass rate (all P > .07). Some provided funding for the Medical Knowledge Self-Assessment Program (61.1%), study materials (49.8%), commercial review courses (26.0%), ABIM-specific study materials (20.8%), and travel expenses for courses (14.3%).
The majority administered the ITE to all postgraduate year (PGY) levels (PGY2 93.6%, PGY3 91.7%, PGY1 86.4%). Eighty-five percent used an ITE threshold score to identify residents at risk for failing the ABIM. The majority (81.8%) used a national percentile rank compared with peers, with a median (Q1-Q3) percentile of 30 (27.5-35). Programs with lower average pass rates (82.5% vs 87.4%) adopted a higher threshold, mean (95% CI) difference of 4.9% (2.1-7.8) (P = .001). We found no association between ITE thresholds used by PDs and 3-year program pass rates (P = .29).
Changes in Response to Failures
Sixty-four percent of PDs made changes to their board preparation methods in response to failures. Those who made changes had lower pass rates than those who had not (83.2% vs 92.7%), a mean (95% CI) difference of 9.6% (7.8-11.3) (P < .0001), and were more likely to fall below the ACGME requirement for ≥80% program pass rate (28.4% vs 1.4%; odds ratio 27.8; 95% CI, 3.8-205.4; P = .001). Sixty-two percent reported giving stronger consideration to US Medical Licensing Examination (USMLE)
scores for ranking residency applicants than in years past. PDs who reported giving increased consideration of USMLE scores were more likely to have lower program pass rates than those who did not (83.7% vs 90.6%), a mean (95% CI) difference of 6.9% (4.9-8.9) (P < .0001), and were more likely to fall below the ACGME requirement for ≥80% pass rate (27.8% vs 5.3%; odds ratio 6.9; 95% CI, 2.4-20.1; P = .0004).
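The odds ratios reported above follow directly from the paired percentages. As a check on the arithmetic, this sketch recomputes both from the stated proportions (point estimates only; the CIs and P values require the underlying counts, which are not reproduced here):

```python
# Recompute the reported odds ratios from the stated proportions.
def odds_ratio(p1, p2):
    """Odds ratio comparing two proportions (given as fractions)."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Falling below the 80% ACGME threshold: 28.4% of PDs who changed board
# preparation vs 1.4% of those who did not.
or_prep = odds_ratio(0.284, 0.014)   # ~27.9, matching the reported 27.8

# Falling below the threshold: 27.8% of PDs giving increased USMLE
# consideration vs 5.3% of those who did not.
or_usmle = odds_ratio(0.278, 0.053)  # ~6.9, matching the reported 6.9
```

Small rounding differences against the published values are expected, since the published ORs were computed from raw counts rather than rounded percentages.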
Stable pass rates among individual first-time test takers in the face of declining 3-year program pass rates present a paradox worth exploring. The stability of individual pass rates suggests that the ABIM certification examination has not become more difficult and that there are no intrinsic differences in the aggregate pool of medical trainees pursuing IM residencies. The decline in program pass rates is likely multifactorial and may reflect changes in how residents who fail the examination are distributed across programs. Residents who failed may previously have been concentrated in a few programs and may now be dispersed broadly across many; conversely, residents most at risk of failing may now be concentrated within programs sharing the characteristics we identified. For example, programs that place less emphasis on USMLE scores when ranking applicants, because of other highly desirable applicant qualities, may have lower pass rates. Any such shift may have disproportionately affected programs with at-risk residents. We found examination pass rates to be lower for programs in Puerto Rico, smaller programs, programs with the shortest PD tenures, and those with more international medical graduates. Our findings are consistent with other studies reporting regional variance and lower pass rates among smaller programs.
Reasons for these associations are not known, and may reflect differences in educational support, baseline knowledge of trainees, variations in clinical exposures, availability of resources for board preparation, or unstable programs. Further study to identify why certain programs are at risk is essential to ensure that programs fulfill accreditation requirements and that all trainees get the support they need to pass the examination.
In our survey, PDs perceived the national decline in program pass rates to be due to residents spending less time reading and reflecting about patients, and having less clinical experience because of duty hour limitations. Although prior studies show residents spend less time with patients and in educational conferences under duty hour limitations, there is no direct evidence linking these changes to program pass rates; it is plausible that some programs diverted resources away from board preparation to meet the 2009 mandates.
Interestingly, the perceived decreased time in the clinical environment reported by PDs occurred concurrently with an increased emphasis on competencies other than medical knowledge, including system-based practice and practice-based learning and improvement.
Changes in the work environment that align with these competencies, including use of electronic health records, complex coordination of care transitions, increased documentation requirements, and other system-level complexities, may distract from residents' attainment of medical knowledge. PDs attributed most resident failures to low performance on prior tests of medical knowledge; few attributed failures to low performance on clinical evaluations in other competency domains. How best to prepare residents for modern practice, while ensuring the fundamental medical knowledge of a competent physician ready for unsupervised practice, remains an important area of study.
Because of declining program pass rates, the majority of PDs reported giving higher priority to USMLE scores to rank medical students on their match list than in years past, especially those with lower pass rates. This is important information for all medical students, faculty advisors, and admissions committees. Given the increasingly competitive match, due to slow growth of graduate medical education positions compared with increased numbers of medical school applicants,
students with lower USMLE scores may be in jeopardy not only of failing to match at the IM residency program of their choice, but of not matching at all.
Although our study is based on a large, nationally representative survey of PDs, it does have limitations. We assessed PDs' perceptions, not proven reasons, for the national decline in program pass rates and individual resident failures. For example, PDs perceive that ITE scores correlated with individual resident failures, but we did not collect actual score data with our survey. In addition, given the limited length of the survey, we may not have captured all perceptions about pass rates.
In this survey of IM residency PDs, certain program characteristics were associated with lower program pass rates on the ABIM certification examination. Programs offered a variety of methods to prepare their residents for the examination and reported making changes in response to failures. The majority reported giving stronger consideration to USMLE scores than in years past when ranking students on their match lists. Further study is needed to understand the paradox of declining program pass rates despite stable individual pass rates.
We are grateful to the Mayo Clinic Survey Research Center for assistance with the survey design and data collection.
Funding: This work was supported by a grant from the Alliance for Academic Internal Medicine, on behalf of the Association of Program Directors in Internal Medicine Survey and Scholarship Committee. This work was also supported in part by the Mayo Clinic Internal Medicine Residency Office of Educational Innovations as part of the Accreditation Council for Graduate Medical Education Educational Innovations Project.
Conflict of Interest: FSM is a full-time employee of the American Board of Internal Medicine as the Senior Vice President for Academic and Medical Affairs. VMA is a member of the Board of Directors for the American Board of Internal Medicine (ABIM) and receives grant funding from the ABIM Foundation. Her faculty appointment and employment is at the University of Chicago. There are otherwise no conflicts of interest to report for all authors.
Authorship: All authors had access to the data and a role in writing the manuscript.