
Northeast Florida Medicine, Vol. 70, No. 1, January 2019

Out with the Old and In with the New: The Next Accreditation System

Leslie Caulder, BAS, C-TAGME,1
Jennifer Hamilton, BA, C-TAGME,1
Danielle Palmer, BAA,1
Denise West, MA,1
and Linda R. Edwards, MD2
1Office of Educational Affairs, University of Florida College of Medicine-Jacksonville, Jacksonville, FL
2Department of Medicine, University of Florida College of Medicine-Jacksonville

Address Correspondence to:

Leslie Caulder, BAS, C-TAGME
Office of Educational Affairs
University of Florida College of Medicine-Jacksonville
653-1 West 8th Street, Box L15, Jacksonville, FL 32209
Phone: (904) 244-6950
Fax: (904) 244-4771
Email: Leslie.Caulder@jax.ufl.edu

Date of Release: January 1, 2019
Date Credit Expires: January 1, 2021
Estimated Completion Time: 1 hour
Background:

The Duval County Medical Society (DCMS) is proud to provide its members with free continuing medical education (CME) opportunities in subject areas mandated and suggested by the State of Florida Board of Medicine to obtain and retain medical licensure. The DCMS would like to thank the St. Vincent’s Healthcare Committee on CME for reviewing and accrediting this activity in compliance with the Accreditation Council for Continuing Medical Education (ACCME). This issue of Northeast Florida Medicine includes an article, “Out with the Old and In with the New: The Next Accreditation System,” authored by Leslie Caulder, BAS, C-TAGME, Jennifer Hamilton, BA, C-TAGME, Danielle Palmer, BAA, Denise West, MA, and Linda R. Edwards, MD, which has been approved for 1 AMA PRA Category 1 Credit™. For a full description of CME requirements for Florida physicians, please visit www.dcmsonline.org.

Faculty/Credentials:

Leslie Caulder, BAS, C-TAGME, Assistant Director, Education and Training Programs, Jennifer Hamilton, BA, C-TAGME, Residency & Fellowship Coordinator II, Danielle Palmer, BAA, GME Accreditation Administrator, and Denise West, MA, GME Accreditation Administrator, are with the Office of Educational Affairs, University of Florida College of Medicine-Jacksonville. Linda R. Edwards, MD, is the Senior Associate Dean and DIO at University of Florida College of Medicine-Jacksonville.

Needs Assessment:

Physicians across the U.S. complete ACGME-accredited training. Over the years, the ACGME has changed its accreditation requirements tremendously. It is important for all physicians to develop an understanding of the training that current resident physicians go through, to help close the generational gap between physicians currently in practice and those who will be entering practice soon.

Objectives:

1. Identify the differences between the ACGME’s current Next Accreditation System (NAS) and the pre-NAS process.
2. Understand the purpose of the Annual Program Evaluation and the Self-Study.
3. Recognize the ACGME’s Milestones.

CME Credit Eligibility:

A minimum passing grade of 70% must be achieved. Only one re-take opportunity will be granted. If you take your test online, a certificate of credit/completion will be automatically downloaded to your DCMS member profile. If you submit your test by mail, a certificate of credit/completion will be emailed within 4 weeks of submission. If you have any questions, please contact the DCMS at 904-355-6561 or dcms@dcmsonline.org. 

Faculty Disclosure:

Leslie Caulder, BAS, C-TAGME, Jennifer Hamilton, BA, C-TAGME, Danielle Palmer, BAA, Denise West, MA, and Linda R. Edwards, MD report no significant relationships to disclose, financial or otherwise, with any commercial supporter or product manufacturer associated with this activity.

Disclosure of Conflicts of Interest:

St. Vincent’s Healthcare (SVHC) requires speakers, faculty, CME Committee and other individuals who are in a position to control the content of this educational activity to disclose any real or apparent conflict of interest they may have as related to the content of this activity. All identified conflicts of interest are thoroughly evaluated by SVHC for fair balance, scientific objectivity of studies mentioned in the presentation and educational materials used as basis for content, and appropriateness of patient care recommendations.

Joint Sponsorship Accreditation Statement:

This activity has been planned and implemented in accordance with the Essential Areas and policies of the Accreditation Council for Continuing Medical Education through the joint sponsorship of St. Vincent’s Healthcare and the Duval County Medical Society. St. Vincent’s Healthcare designates this educational activity for a maximum of 1 AMA PRA Category 1 Credit™. Physicians should only claim credit commensurate with the extent of their participation in the activity.

Understanding and adhering to the Accreditation Council for Graduate Medical Education (ACGME) program requirements is essential for programs seeking initial accreditation or for those wishing to maintain accreditation. ACGME’s implementation of the Next Accreditation System (NAS) dramatically changed the way the organization accredits programs. This system has moved from a cyclical accreditation review process to a continuous accreditation model. The University of Florida College of Medicine-Jacksonville has implemented several processes, allowing institutional oversight of program accreditation to mirror the NAS process.

The Accreditation Council for Graduate Medical Education (ACGME), founded in 1981, brought much-needed organizational structure and educational standards to graduate medical education (GME) programs in the United States.1 In 2017-2018, the ACGME had at least 821 sponsoring institutions2 and 11,140 residency and fellowship programs, with 136,828 housestaff receiving specialty-specific training in those programs.3 Through the use of specialty Review Committees (RC), the Council started on a journey to standardize educational practices, foster public confidence, and document physician competence.1 Physicians who completed an ACGME-accredited residency or fellowship program would have achieved specialty-specific competence. Over the years, the ACGME has launched several influential initiatives to foster and encourage physicians to broaden their learning beyond the textbook. According to Nasca et al in the article, The next GME accreditation system: rationale and benefits, “The aims of the Next Accreditation System (NAS) are threefold: to enhance the ability of the peer-review system to prepare physicians for practice in the 21st century, to accelerate the ACGME’s movement towards accreditation on the basis of educational outcomes and to reduce the burden associated with the current structure and process-based approach.”1 The purpose of this article is to review the changes in the accreditation process and share how the University of Florida College of Medicine-Jacksonville (UFCOM-J) and its accredited residencies and fellowships have moved from the standard accreditation model to the Next Accreditation System (NAS), which launched in July 2013.1

The Accreditation Process

In the NAS, initial accreditation has remained remarkably similar to the pre-NAS process. First, the application is submitted to the ACGME. The review committee either does a paper review or schedules a pre-accreditation site visit. Following the paper review or Site Visit (SV), the RC makes the accreditation status decision and approved programs receive initial accreditation.4

However, the NAS has changed the continued accreditation process. At the conclusion of the initial accreditation period, the ACGME awards one of four accreditation statuses: Continued Accreditation, Accreditation with Warning, Probationary Accreditation, or Accreditation Withdrawn.4 Prior to the implementation of the NAS, the ACGME determined a program’s accreditation status based upon data provided to the RC through submission of the Program Information Form (PIF) and the associated SV. Programs began preparing nine months to a year before the anticipated SV date. The program submitted the PIF to the site visitor approximately ten to fifteen days before the scheduled visit. After the visit, the SV team submitted their report to the RC for consideration at their next meeting. During the Review Committee meeting, members evaluated the materials and determined the accreditation status based on the information from the PIF and the Site Visit Report. Programs received notification of the committee’s accreditation decision eight to ten months after the PIF submission and SV. Continued accreditation statuses were set for periods between two and five years. Programs on probation could expect to have another SV within 15 months. Review committees made accreditation decisions based on a snapshot in time, or a biopsy, of the program’s adherence to the RC standards (Figure 1).1 The NAS has dramatically changed this process by replacing cyclical reviews with annual reviews.

 

Figure 1: Pre-NAS, ACGME’s cyclical accreditation pattern evaluated program compliance with RC requirements based on a snapshot or biopsy of the program on a given date.

During the pre-NAS cyclical accreditation process, institutions were required to perform a programmatic Internal Review (IR) at the mid-point of the accreditation cycle. The institutional requirements did not specify how this internal review was to be conducted, only that it would occur. At UFCOM-J, the process included a panel review and interview session. The panel consisted of the Designated Institutional Official (DIO), who was also the Senior Associate Dean for Educational Affairs, the Associate Dean for Educational Affairs, two or more program directors and/or associate program directors, a hospital administrator, and a resident/fellow from another program. The program director was required to complete an internal review form addressing citations, attrition, board certification, in-service examinations, and program requirements. In addition, the program’s residents/fellows and faculty completed an anonymous survey assessing the program. The panel reviewed the completed PIF and survey results prior to a scheduled IR meeting. The meeting included separate group interviews with the faculty and trainees to discuss any issues identified on the survey and the program as a whole, as well as an interview with the program director and coordinator. An internal review report was provided to the Graduate Medical Education Committee for review and approval. The internal review report identified issues and assisted the program director with the development of solutions prior to the next ACGME Site Visit. Post-NAS, the ACGME no longer requires internal reviews. The UFCOM-J continues to conduct IRs of all programs in the initial accreditation phase to ensure that new programs remain in substantial compliance with the ACGME common and program-specific requirements.

 

Annual Program Evaluation (APE)

As a part of the new Common Program Requirements (CPRs) implemented through NAS, all programs are required to complete an Annual Program Evaluation (APE). The APE is a self-assessment that mirrors information requested annually by the ACGME through the Accreditation Data System (ADS). Even though the institution is no longer required to complete internal reviews, programs in continued accreditation status complete the annual review process using the institution’s Annual Program Evaluation Review (APER) form.

Again, the institutional requirements do not specify how to accomplish the annual review. To meet this requirement, the UFCOM-J established a Committee for Annual Program Evaluation and Review (CAPER), a subcommittee of the Graduate Medical Education Committee (GMEC), charged with reviewing each program’s self-assessment/APE for compliance with ACGME program standards. Table 1 lists the key focus areas for CAPER participants. The program director and coordinator complete a UFCOM-J standardized Annual Program Evaluation and Review (APER) form and provide supporting data, which is then reviewed and evaluated by faculty and residents during their Program Evaluation Committee (PEC) meeting. During the PEC meeting, the program develops action items and timelines for improvements in the following areas, as applicable: 1) self-identified areas of weakness; 2) citations or areas of concern in the ACGME’s most recent Accreditation Letter; and 3) items marked as non-compliant or needing improvement during the previous year’s APE review.

 

Table 1: CAPER Key Focus Areas

A primary and a secondary reviewer analyze the submission, comparing the program’s data with the RC’s requirements. The reviewers share their analysis at the CAPER meeting, where the committee collectively decides on the recommended final status: continued annual review, follow-up review/progress report, or special review. The committee chair provides the GMEC with an executive summary of each program and its recommended status for the GMEC’s final approval. The CAPER’s detailed review provides timely identification of areas of concern or potential non-compliance.

The CAPER provides the DIO with data across all programs, allowing the institution to examine trends and to identify areas of improvement needed at the program and/or institutional level. The Office of Educational Affairs (OEA) identifies trends through data analysis of the focus areas. The trends sheet (“dashboard”) provides a visualization of each focus area, which is used to create action items for possible implementation during the next academic year. The trends sheet in Table 2 lists the programs across the top and key areas from the APER form down the left side.

Table 2: The UFCOM-J Committee for Annual Program Evaluation and Review tracks program and institutional trends based on key focus areas on the institutional dashboard. Here is a sample representation of several focus areas and the committee’s outcome assessment.

 

Self-Study and Programmatic Site Visit

To bolster this new era of program introspection, the ACGME also implemented the Self-Study (SS). The program’s self-assessment through the APE provides the foundation for the Self-Study. According to the ACGME, “The Self-Study is an objective, comprehensive evaluation of the residency or fellowship program, with the aim of improving it. Underlying the Self-Study is a longitudinal evaluation of the program and its learning environment, facilitated through sequential annual program evaluations that focus on the required components, with an emphasis on program strengths and ‘self-identified’ areas for improvement.”5

The Self-Study is an eight-step process: 1) Assemble the Self-Study group; 2) Engage program leaders and constituents in a discussion of program aims; 3) Aggregate and analyze data from the program’s APE and the Self-Study to create a longitudinal assessment of program strengths and areas for improvement; 4) Examine the program’s environment for opportunities and threats; 5) Obtain stakeholder input on strengths, areas for improvement, opportunities, and threats to prioritize actions; 6) Interpret the data and aggregate the Self-Study findings; 7) Discuss and validate the findings with stakeholders; and 8) Develop a Self-Study document for use in further program improvement and as the documentation for the program’s 10-year Site Visit.5

The ACGME has not specifically defined how the self-study should be conducted, but has provided guidance through its website, as well as through workshops at the organization’s national educational conference. To support programs through the process, the UFCOM-J created a Self-Study subcommittee to provide guidance and oversight. When a program receives its SS notification letter, the program’s self-study group conducts the self-study, including the development of program aims, and completes the document. Prior to submission to the ACGME through ADS, the Associate Dean for Educational Affairs and the DIO review it for content and structure. Approximately 18 months following the submission of the Self-Study document, the ACGME will conduct a Site Visit. During the 18-month span, programs track strengths and improvement outcomes, based on information provided on the initial form.

In preparation for the 10-year Site Visit, programs use the SS Summary of Achievements document to describe strengths and improvements noted in the original submission. Programs that have identified major changes since the original Self-Study Summary document also complete the Self-Study Update document. The programs submit drafts of the document(s) to the Self-Study subcommittee every six months until the SV. The subcommittee reviews the drafts and provides feedback to the programs. At least one month prior to the due date to the ACGME, the DIO and Associate Dean for Educational Affairs review the final draft. At least 12 days prior to the expected Site Visit date, the program director or coordinator uploads the documents through ADS. Per the ACGME’s website, the 10-year Site Visit will not differ much from previous full Site Visits and will mirror what is being done by other educational accrediting bodies.1 The major differences will be the absence of the Program Information Form (PIF) and a shorter window between notification and the actual visit.

 

Milestones

Milestones have become an important outcomes component of the accreditation system for GME. The NAS more fully embraced the outcomes-based principles that started with the release of the General Competencies in 1999 and the launch of the Outcomes Project in 2001. The ACGME and programs struggled to evaluate achievement of the competencies and to create meaningful outcomes-based assessments. Recognizing these challenges, the NAS introduced Milestones, designed to continuously improve educational outcomes, leading to enhanced clinical outcomes at the level of the individual learner and the program. Milestones also help ensure the ACGME’s accountability to the public and support the educational process.6

Prior to the introduction of Milestones, residency programs, including the UFCOM-J’s programs, evaluated residents’ success in learning the skills of their specialty based upon what the program director thought was important, while adhering to American Board of Medical Specialties and general ACGME requirements. Most programs utilized subjective formative and summative evaluations of trainees at random intervals, with the exception of the required rotation evaluations. Several programs began using Milestones in 2013, with all programs reporting Milestones data to the ACGME in 2015.

Milestones provide a description of the performance levels that residents must demonstrate for skills, knowledge, and behaviors in the six competency domains. They also give a framework of observable behaviors and are one indicator of a program’s educational effectiveness. Milestones answer these questions about residents: what do they know (Medical Knowledge), what can they do (Patient Care), and how do they conduct themselves (Interpersonal and Communication Skills, Practice-based Learning and Improvement, Professionalism, and Systems-based Practice)?6

Residency programs use Milestones to guide curriculum development, set explicit expectations for residents, support better assessment, and enhance opportunities for early identification of under-performers. Certification boards can use Milestones to ascertain whether individuals have demonstrated the qualifications needed to sit for board exams. Milestones can provide increased transparency of performance requirements, encourage resident self-assessment and self-directed learning, and provide better feedback to residents (Figure 2).6

Figure 2: A residency program’s spider graph showing the program’s compliance with ACGME Milestones between July 2016 and December 2016.

The UFCOM-J uses a web-based residency management system, allowing programs to complete confidential and anonymous evaluations electronically. Programs have the ability to map individual Milestones to evaluation questions, providing convenient tracking and making data analysis possible. Mapping Milestones to evaluations provides an objective scoring system and encourages evaluators to complete a more meaningful review, rather than grading on a subjective scale. As evidenced by the spider graph in Figure 3 from a UFCOM-J program, Milestones allow for a peer-based comparison of residents over time. Milestones also afford program directors the opportunity to compare trainee levels to national programmatic Milestone averages generated by the ACGME.

Figure 3: A residency program’s spider graph showing the program’s compliance with ACGME Milestones between July 2015 and December 2017.

 

Clinical Learning Environment Review (CLER)

Another impactful component of the NAS was the implementation of the Clinical Learning Environment Review (CLER), with the aim of expanding the evaluation of the GME learning community. Per the ACGME’s CLER Committee, “The CLER program’s ultimate goal is to move from a major targeted focus on duty hours to that of broader focus on the GME learning environment and how it can deliver both high-quality physicians and higher quality, safer, patient care.”7,4 The CLER assesses six focus areas: 1) patient safety, 2) quality improvement, 3) transitions of care, 4) supervision, 5) physician well-being, and 6) professionalism.7,8 These focus areas build on and extend the ACGME’s 2003 patient safety initiatives. The ACGME is seeking information on how clinical environments address patient safety and quality, as these domains affect all areas of medical education.

The ACGME schedules an institutional CLER visit every 18 to 24 months.9 The CLER Site Visit team meets with leadership from the sponsoring institution and the major participating site being surveyed, patient safety and quality officers, residents and fellows, faculty members, and program directors.9 The team collects data from participant interviews and surveys using an audience response system. Site visitors also go into the clinical center(s) with residents to speak with other members of the healthcare team, such as nurses, respiratory therapists, and surgical technicians, and to observe hand-offs, procedures, and interdisciplinary teamwork.9 Depending on the size of the organization, the visit can take between one and four days. Three CLER visits have occurred at UFCOM-J since the inception of the program at the institution’s major participating site, UF Health-Jacksonville.

 

Out with the Old and In with the New

Pre-NAS, programs with a five-year accreditation cycle may not have been compelled to address citations immediately, as they had “time to address” issues before the next site visit. Conversely, programs with a short accreditation cycle may not have had adequate time to implement meaningful change. Various external factors could affect a program’s compliance, such as long durations between accreditation visits, attrition, data gathering errors or omissions, and outdated information. This method placed an emphasis on the accreditation process rather than program quality, outcomes, or continuous program improvement and innovation.

The Next Accreditation System’s model emphasizes outcomes and improvements, innovation, quality, and patient safety initiatives through the continuous review process. Review Committees receive program information during the annual Accreditation Data System (ADS) update. The data reviewed include program characteristics, clinical learning environment, program and institutional leadership, environment of scholarship, faculty and resident survey responses, resident clinical experience data, Milestones, and board pass rate information. Review committees assess overall performance, looking for trends and instances of potential non-compliance with accreditation standards to identify underperforming or non-compliant programs. Programs in substantial non-compliance may receive accreditation with warning or probation. Review committees can make accreditation decisions proactively, moving programs out of “Probationary Accreditation” or “Continued Accreditation with Warning” status when programs address issues, as reflected in ADS. Major program changes, such as a new program director, the addition of a new participating site, a change in a site’s ownership, or a change in the institutional governing body, can be described and documented in ADS immediately. This continuous accreditation cycle provides the RC more freedom to review program changes as they occur (Figure 4). The committees now provide program director appointment approval or denial within weeks of the ADS request. Programs are actively engaged in self-reflection through the APER process.

 

Figure 4: Post-NAS, the ACGME’s continuous accreditation pattern evaluates program compliance in real time, providing each program with its annual accreditation status decision. Citations can be removed, retained, or added based on the RC’s review of the program.

The Next Accreditation System brought various changes to the accreditation process that moved towards continuous quality improvement rather than the “biopsy” method of the past. To embrace these changes, the University of Florida College of Medicine-Jacksonville has developed innovative administrative processes that strive to enhance continuous improvement not only in individual programs but also in the institution as a whole. It is clear that while the ACGME expects programs to demonstrate continuous quality improvement, the organization sees value in self-reflection and does not exclude itself from improvement initiatives. There are many changes to come, and graduate medical education will certainly continue to evolve.

1. Nasca TJ, Philibert I, Brigham T, et al. The next GME accreditation system: rationale and benefits. N Engl J Med. 2012 Mar 15;366(11):1051-6.

2. Accreditation Council for Graduate Medical Education. Data resource book, academic year 2016-2017. Chicago: ACGME; 2017. 107 p.

3. Accreditation Council for Graduate Medical Education (ACGME) – Public Advanced Sponsor Search [Internet]. 2018 [cited 2018 Feb 12]. Available from: https://apps.acgme.org/ads/Public/Sponsors/Search

4. Accreditation Council for Graduate Medical Education. Accreditation policies and procedures. Chicago: ACGME; 2018 Feb 3. 161 p.

5. Self-study [Internet]. Chicago (IL): Accreditation Council for Graduate Medical Education; 2018 [cited 2018 Feb 4]. Available from: https://www.acgme.org/What-We-Do/Accreditation/Self-Study.

6. Holmboe S, Edgar L, Hamstra S. The milestones guidebook. Chicago: ACGME; 2016. 41 p.

7. CLER Evaluation Committee. Clinical learning environment review: executive summary 2.0. ACGME; 2012 June 10. 1 p.

8. CLER Evaluation Committee. CLER pathways to excellence: expectations for an optimal clinical learning environment to achieve safe and high quality patient care - executive summary. ACGME; 2014. 6 p.

9. CLER Evaluation Committee. CLER pathways to excellence: expectations for an optimal clinical learning environment to achieve safe and high quality patient care, version 1.1. ACGME; 2017. 40 p.

10. Surdyk P. The history of sponsoring institutions, 1982-2017. J Grad Med Educ. 2017 Dec;9(6 Suppl):7-10.