Usefulness of the ACGME Resident Survey: A View from Internal Medicine Program Directors

Published: December 27, 2013. DOI: https://doi.org/10.1016/j.amjmed.2013.12.010
      Perspectives Viewpoints
      • Recent changes in the Accreditation Council for Graduate Medical Education (ACGME) resident survey generated significant concerns from internal medicine program directors about the validity of the survey for accreditation purposes.
      • Clarification of potentially ambiguous survey terms may reduce misinterpretation by residents and reassure program directors that the survey accurately represents their programs.
      • Program directors would welcome investigation of the survey for validity and consistency, as the ACGME has carefully done in previous survey versions.
      On July 1, 2013, the Accreditation Council for Graduate Medical Education (ACGME) began implementation of the Next Accreditation System (NAS) (ACGME, Next Accreditation System). Annual data reporting for residency programs in the NAS includes 9 reporting components. In addition to 3 new components (Milestones reporting, the Clinical Learning Environment Review, and the annual faculty survey), the ACGME recently made substantial changes to the annual resident survey (ACGME, Resident and Fellow Survey).
      Residents within a program will no longer all receive the same survey; rather, surveys will vary depending on residents' previous survey responses and level of education. Program directors will no longer be able to preview the survey to help residents understand the questions asked, nor will the aggregate reports program directors receive provide the specific questions residents were asked. In response, program directors have expressed concerns about the changes to the survey instrument and the perceived weight placed on the survey in accreditation decisions.
      Before the NAS, program directors in multiple specialties (General Surgery, OB/GYN, Psychiatry) questioned the validity and accuracy of the ACGME resident survey (Balon, 2012; Fahy et al., 2010; Todd et al., 2010; Sticca et al., 2010).
      Specific areas of perceived misinterpretation on the survey include the definitions of “service” and “education” (Balon, 2012; Reines et al., 2007; Sanfey et al., 2011; Smith et al., 2012), “sufficient supervision” by attendings (Balon, 2012), perceived compliance with research requirements (Oakley et al., 2013), and accuracy in reporting of duty hours (Todd et al., 2010; Chadaga et al., 2012).
      Program directors and specialty organizations encouraged ACGME to scientifically evaluate the resident survey and to make changes to improve the validity of the instrument.
      In response to these concerns, the ACGME and others analyzed the survey (before the 2013 changes) and found it to have good overall internal consistency (Holt and Miller, 2009) and to be a reliable and valid tool for evaluating duty hours (Holt et al., 2010). This study set out to evaluate internal medicine program directors' perceptions of the new ACGME survey and to identify specific areas of perceived or real inconsistency, or areas open to misinterpretation.

      Methods

      The Association of Program Directors in Internal Medicine (APDIM) Survey Committee conducts an annual survey of member programs of APDIM to explore the characteristics of programs and opinions of program directors about issues important to internal medicine and residency training. Three hundred seventy programs/program directors were sent a link to the survey in August 2012, representing 95.6% of all 387 accredited US training programs. The 2012 survey included 13 questions asking program directors' views of the change in the ACGME resident survey. Program directors were asked their opinions of the accuracy of the new survey, their level of agreement with the changes in the survey, how they helped prepare residents to complete the survey, and open-ended questions assessing program directors' use of the survey to leverage resources, as well as program directors' general assessment of the validity and use of the survey. The APDIM survey instruments and summary files are available on the APDIM Web site at http://www.im.org/toolbox/surveys/APDIMSurveyData.

      Results

      Two hundred seventy-two of 370 (73.5%) program directors responded to the survey. The characteristics of the programs that responded compared with nonresponders are outlined in Table 1. University-based, larger-sized programs were more likely to respond. There were no regional differences in response rates, nor were American Board of Internal Medicine pass rates or program director tenure different for responder programs versus nonresponders.
      Table 1. Program Characteristics
      Characteristic | Responders (n = 272) | Nonresponders (n = 115) | P-Value
      Program description, n (%)* |  |  | .02
        Community-based, university-affiliated hospital | 144 (52.9) | 57 (49.6) |
        University-based | 100 (36.8) | 33 (28.7) |
        Community-based | 23 (8.5) | 21 (18.3) |
        Military | 5 (1.8) | 4 (3.5) |
      Region, n (%)* |  |  | .20
        Northeast | 89 (32.7) | 45 (39.1) |
        South | 74 (27.2) | 28 (24.4) |
        Midwest | 65 (23.9) | 23 (20.0) |
        West | 41 (15.1) | 14 (12.2) |
        Unincorporated territory | 3 (1.1) | 5 (4.4) |
      ABIM pass rate, mean (SD) percent† | 86.2 (9.5) | 86.8 (9.7) | .62
      Program director tenure, mean (SD) years† | 6.6 (6.5) | 6.2 (6.9) | .59
      Program size, mean (SD) approved positions† | 68.2 (39.2) | 56.6 (35.8) | .005
      ABIM = American Board of Internal Medicine.
      * Fisher's exact test. † Welch's t test.
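      As an illustration of the Welch's t tests reported in Table 1, the continuous comparisons can be reproduced from the published summary statistics alone; no raw data are needed. This is a minimal sketch (not the authors' analysis code), using the program-size row as the worked example:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's two-sample t statistic and Welch-Satterthwaite
    degrees of freedom, computed from summary statistics."""
    v1, v2 = sd1**2 / n1, sd2**2 / n2  # per-group variance of the mean
    t = (mean1 - mean2) / math.sqrt(v1 + v2)
    df = (v1 + v2)**2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# Program size row of Table 1: responders 68.2 (SD 39.2), n = 272;
# nonresponders 56.6 (SD 35.8), n = 115.
t, df = welch_t(68.2, 39.2, 272, 56.6, 35.8, 115)
print(f"t = {t:.2f}, df = {df:.0f}")  # t = 2.83, df = 234
```

      The resulting t of about 2.83 on roughly 234 degrees of freedom is consistent with the reported P = .005 for the program-size difference.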
      One hundred eighty-one program directors (67%) did not agree with the new practice of program-specific surveys, and 202 (74%) were concerned about the transparency of how the survey will be used to accredit programs. Two hundred fifty-nine (95%) agreed that they should receive aggregate responses to the specific questions asked of their residents. Only a minority of program directors (111; 41%) felt that the survey somewhat or very accurately represented the status of their program. In terms of ease of ensuring that residents complete the survey, 127 (47%) found it “difficult” to ensure completion, whereas 112 (41%) found it “easy.”
      We asked program directors for open-ended responses about the potential positive and negative consequences of the survey change. The responses are collated and detailed in Tables 2 and 3. In general, program directors perceived the change as negative, with the most common concern being misinterpretation of questions by residents. The questions program directors believe residents most commonly misunderstand are detailed in Table 4, with the definition of “service versus education” and duty hours leading the list.
      Table 2. Program Directors' Negative Comments About the Change in the ACGME Survey
      Comment | Number of Times Cited
      Survey is vague/ambiguous/misinterpreted by residents | 63
      PDs cannot improve the program or clarify questions without knowing questions in advance | 33
      Survey questions are misleading/misrepresent the program | 13
      Survey carries too much weight/high stakes | 8
      Survey questions not well written | 6
      ACGME does not trust PDs/creates an adversarial environment | 6
      The new survey implementation lacks transparency/is inconsistent with professional values | 5
      Misinterpretation based upon ESL or cultural differences with IMGs | 4
      Unintended consequences of ACGME survey | 4
      Survey tool is not meaningful/accurate | 3
      Responses are resident perceptions, not reality | 1
      Too long | 1
      Abbreviations: ACGME = Accreditation Council for Graduate Medical Education; ESL = English as a second language; IMG = international medical graduate; PD = program director.
      Table 3. Program Directors' Positive Comments About the Change in the ACGME Survey
      Comment | Number of Times Cited
      Prevents “coaching” by PDs | 7
      Survey clarity better | 2
      Can lead to beneficial changes to a program | 1
      Abbreviations: ACGME = Accreditation Council for Graduate Medical Education; PD = program director.
      Table 4. Items on the Survey Most Commonly Misunderstood by Residents, According to Program Directors
      Item | Number of Times Cited
      Service vs education | 4
      Duty hours | 3
      Definition of “sometimes” in survey questions | 3
      Too many learners | 2
      “Intimidation” misinterpreted | 1
      Number of attendings of record | 1
      Nonmedicine patients on service | 1
      “Scholarly activity” | 1
      One hundred sixty-three program directors (60%) agreed that the survey was useful in helping them leverage their institution for more resources (Table 5). The most common resources program directors leveraged were hiring hospitalists, physician extenders, and administrative support. Program directors were neutral about the effect of the resident survey on their most recent site visit, with 70 (26%) stating there was no effect, 54 (20%) stating the survey affected the site visit negatively, and 69 (25%) stating it affected the site visit positively.
      Table 5. Institutional Resources Requested Based on Survey Responses
      Item | Frequency Count | Percent of Total Frequency
      Hiring more hospitalists | 63 | 38.7
      Hiring physician extenders (NPs, PAs) | 55 | 33.7
      Increased administrative staff/support | 44 | 27.0
      Facility improvements (call rooms, lounges, etc.) | 33 | 20.2
      Increased protected time for PD/APDs | 30 | 18.4
      Increasing residency positions | 26 | 16.0
      Increased salary/salary support for PD/APDs | 15 | 9.2
      Additional resources to decrease service demands | 9 | 5.5
      Increase in subspecialty support | 6 | 3.7
      Improved information technology infrastructure/resources | 6 | 3.7
      Total number who used the ACGME survey to leverage resources = 163.
      Abbreviations: ACGME = Accreditation Council for Graduate Medical Education; APD = associate program director; NP = nurse practitioner; PA = physician assistant; PD = program director.
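      Because respondents could cite more than one resource, the percentages in Table 5 use the 163 program directors who leveraged resources as the denominator, so the column need not sum to 100%. A minimal sketch of that arithmetic (a subset of rows, shown for illustration only):

```python
# Frequency counts from Table 5; the denominator is the 163 program
# directors who used the survey to leverage resources (multi-select,
# so percentages can total more than 100).
counts = {
    "Hiring more hospitalists": 63,
    "Hiring physician extenders (NPs, PAs)": 55,
    "Increased administrative staff/support": 44,
    "Facility improvements": 33,
    "Increasing residency positions": 26,
}
N = 163
percents = {item: round(100 * n / N, 1) for item, n in counts.items()}
print(percents["Hiring more hospitalists"])  # 38.7, matching Table 5
```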

      Discussion

      Our study demonstrated a significant level of concern among program directors about the new format of the ACGME resident survey. Most program directors disagreed with the changes in the resident survey, did not believe there was sufficient transparency in its use, and believed they should receive aggregate responses from their residents for each specific question, particularly if the goal is to use survey results to stimulate ongoing program improvement.
      The most common negative responses by program directors about the survey were concerns about residents' misinterpretation of survey questions and the inability of program directors to be able to clarify the intent of questions and specific definitions with their residents in advance. Program directors understood the need to not coach residents into providing specific answers, but felt many of the questions and terms used on the survey were vague and open to misinterpretation. An interesting finding was the concern that the survey had potential to be misinterpreted by residents who were English-as-a-second-language speakers, suggesting the need for a careful review of the cultural interpretation of the survey questions.
      The concerns program directors shared about specific survey questions focused on “service versus education,” the accuracy of residents' responses to duty-hours questions, and interpretation of how many learners constitute “too many” (Table 4). These concerns are consistent with those raised about the previous version of the ACGME survey and suggest a potential need to investigate the new survey for validity and consistency. Suggestions program directors offered to improve the survey process included adding detail to the ACGME's “frequently asked questions” to further explain how the ACGME derives program-specific surveys or determines a threshold of “noncompliance” on the new survey. Program directors also suggested defining terms that may be considered ambiguous, such as “sometimes” or “opportunities for scholarship.”
      Given that the results of the resident survey are used for accreditation, a “high-stakes” decision, it is not surprising that program directors express high levels of concern about the validity of the survey. Test-based accountability, defined as the use of tests to hold individuals or institutions responsible for performance through the application of rewards or sanctions, has become a debated issue. In the past decade, the US federal education system has adopted test-based accountability systems, which has generated concerns about superficial solutions and about focusing efforts at excellence on the wrong targets (Supovitz). The corollary in internal medicine may be that program directors feel added pressure when making unpopular changes within their programs, even when those changes are made for the right educational reasons. Unpopular rotations may make for better physicians in the long run if educationally justified, but may be misinterpreted by trainees as “service” rather than “education.”
      Results of the survey data can have positive implications, as a majority of program directors used resident survey data to secure program resources. In addition, some program directors feel the new format prevents coaching of residents, suggesting the belief that ACGME survey results in the new format will have improved validity.
      Despite the concerns about the survey, program directors do not seem to find it overly burdensome to have residents complete the survey, which has become notably shorter than previous versions.
      In conclusion, program directors' initial reaction to the changes made to the ACGME resident survey is negative, with specific concerns surrounding the proper interpretation of questions, the validity of the survey instrument, and the ramifications of the results for programs. Studies in surgical education have suggested that discrepancies exist between ACGME survey responses and actual duty hours (Fahy et al., 2010; Sticca et al., 2010). Nonetheless, the ACGME has carefully analyzed previous versions of the resident survey and found internal reliability and validity, as well as good correlation between duty-hours noncompliance on the survey and other areas of program noncompliance (Holt and Miller, 2009; Holt et al., 2010). A careful analysis of the consistency and validity of the new ACGME survey is warranted, given program directors' concerns and the perceived high-stakes nature of how the results will be used in the NAS.

      Acknowledgment

      We are grateful for the support of the Association of Program Directors in Internal Medicine and the members of its Survey Committee, and to the residency program directors who completed this survey. This study was supported in part by the Mayo Clinic Internal Medicine Residency Office of Educational Innovations as part of the ACGME Educational Innovations Project. The Mayo Clinic Survey Research Center provided assistance with survey design and data collection.

      References

      1. Accreditation Council for Graduate Medical Education (ACGME). ACGME Next Accreditation System. Available at: http://www.acgme-nas.org. Accessed October 1, 2013.

      2. Accreditation Council for Graduate Medical Education (ACGME). ACGME Data Collection System: Resident and fellow survey. Available at: http://www.acgme.org/acgmeweb/DataCollectionSystems/ResidentFellowSurvey.aspx. Accessed October 1, 2013.

      3. Balon R. The unspoken tyranny of regulatory agencies: a commentary on the ACGME Resident Survey. Acad Psychiatry. 2012;36:351-352.

      4. Fahy BN, Todd SR, Paukert JL, Johnson ML, Bass BL. How accurate is the Accreditation Council for Graduate Medical Education (ACGME) Resident survey? Comparison between ACGME and in-house GME survey. J Surg Educ. 2010;67:387-392.

      5. Todd SR, Fahy BN, Paukert JL, Mersinger D, Johnson ML, Bass BL. How accurate are self-reported resident duty hours? J Surg Educ. 2010;67:103-107.

      6. Sticca RP, Macgregor JM, Szlabick RE. Is the Accreditation Council for Graduate Medical Education (ACGME) Resident/Fellow survey a valid tool to assess general surgery residency programs' compliance with work hours regulations? J Surg Educ. 2010;67:406-411.

      7. Reines HD, Robinson L, Nitzchke S, Rizzo A. Defining service and education: the first step to developing the correct balance. Surgery. 2007;142:303-310.

      8. Sanfey H, Cofer J, Hiatt JR, et al. Service or education: in the eye of the beholder. Arch Surg. 2011;146:1389-1395.

      9. Smith DE, Johnson B, Jones Y. Service versus education, what are we talking about? J Surg Educ. 2012;69:432-440.

      10. Oakley SH, Crisp CC, Estanol MV, Fellner AN, Kleeman SD, Pauls RN. Attitudes and compliance with research requirements in OB/GYN residencies: a national survey. Gynecol Obstet Invest. 2013;75:275-280.

      11. Chadaga SR, Keniston A, Casey D, Albert RK. Correlation between self-reported resident duty hours and time-stamped parking data. J Grad Med Educ. 2012;4:254-256.

      12. Holt KD, Miller RS. The ACGME Resident Survey Aggregate Reports: an analysis and assessment of overall program compliance. J Grad Med Educ. 2009;1:327-333.

      13. Holt KD, Miller RS, Philibert I, Heard JK, Nasca TJ. Residents' perspectives on the learning environment: data from the Accreditation Council for Graduate Medical Education resident survey. Acad Med. 2010;85:512-518.

      14. Supovitz J. Is high stakes testing working? Penn Graduate School of Education. Available at: http://www.gse.upenn.edu/review/feature/supovitz. Accessed October 1, 2013.