Understanding the Accreditation Reform
Presentation Overview
Accreditation Reform: Overview
New Accreditation Standards and Processes
Progressive, Eight-Year Accreditation Cycle
AMS Platform: A New Digital System
Emphasis on Self-Study and CQI
Onsite Visits Moving Forward
Next Steps: What to Expect

We know that there is a lot to digest when it comes to understanding the Accreditation Reform, so we have broken this presentation into seven key sections. If you have additional questions, we encourage you to reach out to us at:
Accreditation Reform: Overview
Understanding Accreditation, CanRAC, and the Reform
What is accreditation? Quality improvement process
Safeguards adherence to national standards.
Ensures all graduates are ready for practice.

Within residency education, accreditation is a quality improvement process that safeguards adherence to national standards across all programs and universities, and ensures that the medical doctors graduating from a Canadian university are ready for independent practice.

Understanding the standards
The Royal College, CFPC, and CMQ have developed national standards for evaluation and accreditation. Evaluations of both the schools and the programs are based on compliance with these standards. There are three different sets of accreditation standards:
A Standards (apply to the university, specifically the PGME office)
B Standards (apply to each residency program)
Specialty-specific standards
What is CanRAC?
CanRAC is a collaborative between the Royal College, CFPC, and CMQ. It developed a proposed plan to introduce a new system of residency accreditation that aligns with 21st-century best practices and supports the shift towards CBME. CanRAC oversees or participates on multiple committees that contribute to a progressive accreditation reform:

Integration Committee
Responsible for reviewing and providing feedback on the draft standards developed by the Standards Development Working Groups, focused on identifying and addressing cross-cutting issues and facilitating dialogue as needed. Also responsible for approving the accreditation standards for presentation to the governing bodies for accreditation at the Royal College, CFPC, and CMQ. Membership includes representation of postgraduate deans, program directors, faculty, residents, FMRAC, CFPC, CMQ, and the Royal College.

Accreditation Process Advisory Committee (APAC)
A committee representing PGME stakeholders: PG deans, PG managers, program directors, program administrators, and residents, from both family medicine and specialty medicine. Provides guidance, suggestions, and feedback to CanRAC on process reform proposals.

Standards Working Groups
Six standards development working groups have worked to develop the draft new residency standards over the past year and a half. Each working group is responsible for one of the six FMEC-PG accreditation domains. Membership includes representation of postgraduate deans, program directors, faculty, residents, administrative personnel, FMRAC, CFPC, CMQ, and the Royal College.

Conjoint Taskforce on Resident Input into the Accreditation Process
A collaborative initiative of the Royal College, CFPC, CMQ, RDoC, and FMRQ to ensure that resident input into the residency education accreditation process and its related processes is effective, confidential, valid, consistent, transparent, and of high quality.
Background: Initiation of change process
No system-wide review in many years. A Conjoint Accreditation Taskforce conducted interviews with PG deans; challenges were identified and recommendations were made for a reformed system. 30% of PG deans called for transformative change.

The current residency accreditation system had not undergone review in many years. To address this gap, the Royal College struck an accreditation reform committee in 2011/2012. Feedback received from the PG deans, as well as other feedback received over the years, suggested that despite the current system's positive reputation, the current process of residency education accreditation is onerous, with problematic reliability and consistency, including inconsistent interpretation of the standards and the need for better training and support. Given these concerns and the numerous changes underway in medical education, it was determined that a complete review of the system was needed.
How did we get here? High-level timeline
2011/2012: Royal College struck an accreditation reform committee.
2012: Focus groups with PG deans and other stakeholders confirm the need for change.
2013: Three colleges partner to evaluate the current accreditation system and start brainstorming.
2014: Three colleges begin developing/consulting on a plan to introduce a new system.
Late 2014-2015: Begin to release elements of the proposed plan.
Fall 2015-Spring 2017: Prototype 1 begins (Laval, Saskatchewan, Sherbrooke, Memorial and Ottawa).
February 2016: All three colleges receive endorsement (in principle) from the accreditation councils.

In 2011/2012, the Royal College struck an accreditation reform committee. In 2012, focus groups were held with PG deans and other stakeholders regarding the need for change, identifying strengths, challenges, and frustrations of the current system. In 2013, in a first of its kind in 20 years, the three colleges officially partnered to align the accreditation system with 21st-century best practices, including CBME approaches (Royal College: CBD; CFPC: Triple C); it is important to note that although the changes align with CBME, they would have happened regardless of CBME. In 2014, development and consultations began on a plan to introduce the new system. In late 2014 and into 2015, CanRAC began releasing elements of the proposed plan; this is an ongoing and evolving process that is still going on today. The first round of prototype testing, also known as the Development Prototype, began in fall 2015 and continues through spring 2017 (Calgary, McMaster, Saskatchewan, Sherbrooke, Memorial and Ottawa).
Why change the system? Proposed reform elements were built on stakeholder feedback, and aim to:
Create new general standards that provide clarity, reflect the updated CanMEDS Framework, and support the transition to CBME practices.
Reduce the burden of work, promote continuous evaluation and quality improvement, and integrate innovative practices.
Digitize the accreditation system.
9/19/2018
What’s changing? What’s not?
Old System → Proposed System
Retained in both: systematic, rigorous process; peer review.
Too much paperwork → Digital platform
High-stakes, "snapshot in time" site visit → Continuous cycle of accreditation
Lack of focus on QI outcomes → Increased focus on outcomes, competency-based programs
Inappropriate categories of accreditation → Revised categories
Resident input not optimized → Robust system for resident involvement
Idiosyncratic nature of decision-making → Clearer standards, standardized and reproducible decisions
High stakes, punitive → Emphasis on continuous improvement

What is staying the same? The proposed plan aims to preserve the strengths of the current system, including national standards, the onsite evaluation program, and peer reviews.

Within accreditation, the major changes are to the standards (what is being evaluated) and the processes (how it is being evaluated).

How will standards change? The new general standards will:
Provide greater clarity, without being overly prescriptive.
Reflect the new content of the CanMEDS 2015 Framework.
Support the transition to competency-based medical education (CBME).
Place greater emphasis on the learning environment.

How will the accreditation process change? The proposed changes aim to:
Reduce the burden of work on schools.
Promote continuous evaluation and quality improvement.
Integrate innovative practices into the accreditation system.
New Accreditation Standards and Processes
Understanding what the changes are, and how the overall system will benefit
Understanding the proposed reform components
Proposed Conjoint Accreditation System Reform Components:
New standards for programs and institutions.
A new evaluation framework of standards for residency programs, including exemplary ratings and best practices.
A new progressive accreditation cycle of regular accreditation visits, supported by continuous data monitoring.
Introduction of a digital Accreditation Management System.
Increased emphasis on self-study and continuous quality improvement.

Together with input from various postgraduate stakeholders, CanRAC has developed a framework for a new conjoint accreditation system for Canadian residency education. This system comprises ten key components:
1. A new framework of standards for residency programs, with an emphasis on high-yield markers and program outcomes.
2. A new institutional review process, standards system, and status category.
3. A renewed emphasis on the quality and safety of learning environments.
4. Introduction of a digital Accreditation Management System that makes the accreditation process more efficient.
5. A new eight-year cycle of regular accreditation visits, supported by continuous data monitoring.
Understanding the proposed reform components
Enhanced onsite review processes, such as tracer methods.
A new institutional review process, standards system, and status category.
A renewed emphasis on the quality and safety of learning environments.
New decision categories, with thresholds to improve consistency of decision-making.
A systematic approach to evaluation, research, and continuous improvement of the system.

6. Increased emphasis on self-evaluation and continuous quality improvement.
7. Enhanced onsite review processes, such as tracer methods.
8. New decision categories, with thresholds to improve consistency of decision-making.
9. A new category of "exemplary" ratings to identify programs that have developed outstanding innovations.
10. A systematic approach to evaluation, research, and continuous improvement of the system.
Benefits of the new proposed standards
The new proposed general standards will:
Provide greater clarity, without being overly prescriptive.
Reflect the new content of the CanMEDS 2015 Framework.
Support the transition to competency-based medical education.
Place greater emphasis on the learning environment.
Example Markers & Evidence
Standards domains, with example markers and evidence:

Institutional Governance (institution level): standards that relate to the overall oversight of medical education at the institutional level and governance of the educational mission. Example: support for education (promotion policies).

Program Organization: structural and functional standards related to the administration of the education program. Example: PD protected time (interviews with the PD and others).

Education Program: standards related to the design of the education program, its goals/objectives, the specific content required in the academic curriculum, and the assessment of learners and their readiness for practice (assessment and achievement of competencies). Example: comprehensive plan for teaching and assessment (curriculum map or blueprint).

Resources: standards dedicated to the sufficiency of all resources, both education-program specific and broader resource issues. Example: patient/procedural volumes (eLog, ePortfolio).

Learners, Teachers & Administrative Personnel: standards relevant to the people most directly involved in the delivery of residency education, namely teachers, learners, and administrative personnel. Example: a learning environment that protects patient, resident, and faculty safety (learner survey).

Continuous Improvement: standards related to ensuring the program/institution has effective continuous improvement mechanisms and processes. Example: institution and program involvement in CQI.

New institution-level standards:
Clear, transparent expectations for postgraduate deans in oversight of residency programs.
Increased focus on the outcomes of education, continuous improvement, and the learning environment.
Emphasis on high-yield markers of quality postgraduate offices and learning sites.
Identify institutional-level and/or site-specific strengths or areas for improvement that affect the delivery of quality residency education.

New program-level standards:
Clear, transparent expectations.
Increased focus on the outcomes of education, continuous improvement, and the learning environment.
Emphasis on high-yield markers of quality residency programs.
Flexible to accommodate traditional and competency-based programs.
Why are the processes changing?
The proposed changes aim to:
Reduce the burden of work on schools.
Promote continuous evaluation and quality improvement.
Integrate innovative practices into the accreditation system.
Progressive Accreditation Cycle
Why change from a six to eight year cycle?
Current challenges:
Overemphasis on the high-stakes, "snapshot in time" onsite visit.
Episodic nature of accreditation.
High burden of preparation for accreditation visits.

Future improvements:
Enhanced emphasis on accreditation as CQI.
Progressive, continuous accreditation cycle supported by ongoing data monitoring.
Reduction of the high-stakes, punitive nature of accreditation.
What will an 8 year cycle look like?
8 years between regular accreditation visits, with predictable 2-year follow-ups.
Introduction of data collection from a variety of sources to enhance evaluation of the clinical learning environment.
Aggregate data (e.g., surveys) intended to contextualize program quality and safety.
Multiple sources of aggregate data, including graduates, residents, teachers, and others.

Data monitoring processes could be used as an outcomes-focused performance barometer for programs and institutions throughout the accreditation cycle. Furthermore, ongoing data monitoring will inform particular lines of questioning or evaluation during onsite visits. With 8 years between visits, possibilities could include:
Mini survey teams deployed to manage ERs.
The AC randomly sampling APORs as quality assurance.
Continuous Data Monitoring in the New Accreditation Cycle

Timeline diagram (2020 base year through 2028): an onsite survey in the base year (2020); APOR or ER in year 2 (2022); an internal (mid-cycle) review plus APOR or ER in year 4 (2024); APOR or ER in year 6 (2026); and the next onsite survey in year 8 (2028). The institution holds AI status with follow-up by RS. Onsite surveys comprise an institutional review and program reviews; the expectation is that the AI conducts the program IRs, with APOR/ER of programs as required between visits.

LEGEND: AI = Accredited Institution; RS = Regular Survey; APOR = Action Plan Outcomes Report; ER = External Review; IR = Internal Review. (In the original diagram, rectangles denote accreditation activities, ovals denote accreditation statuses, purple denotes the institution level, and blue the program level.)

Data monitoring potential:
Gradual implementation as the new system is established.
Aggregate data available over time: survey data (residents, faculty, graduates), learner data, faculty data, graduate data, and possibly clinical outcomes.

CQI and data monitoring:
Ongoing data monitoring of surveys (graduates, residents, faculty) and milestones: deviation from the mean is a trigger for an accreditation activity (e.g., progress report, external review).
A robust process to identify significant program changes is another trigger for accreditation activity (e.g., progress report, application).
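The "deviation from the mean = trigger" idea behind ongoing data monitoring could be sketched, purely for illustration, as a simple outlier check on aggregate survey scores. The function name, data shape, and threshold below are hypothetical assumptions; the Colleges' actual monitoring method is not specified in this presentation.

```python
# Illustrative sketch only: flag programs whose mean survey score
# deviates from the national mean by more than `threshold` standard
# deviations. All names and numbers here are hypothetical.
from statistics import mean, stdev

def flag_outlier_programs(scores_by_program, threshold=2.0):
    """Return programs whose mean score is a statistical outlier."""
    program_means = {p: mean(s) for p, s in scores_by_program.items()}
    national_mean = mean(program_means.values())
    spread = stdev(program_means.values())  # needs at least 2 programs
    if spread == 0:
        return []  # all programs identical: nothing to flag
    return [p for p, m in program_means.items()
            if abs(m - national_mean) / spread > threshold]
```

A flagged program would not automatically face sanctions; under the proposed system, the deviation would simply trigger a follow-up accreditation activity such as a progress report or external review.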
AMS Platform: A New Digital System
Understanding the new Accreditation Management System (AMS)
Recommendations for change
Current challenges with the PSQ:
Administrative burden; high-intensity, time-condensed work.
Used episodically: typically completed only prior to onsite and internal reviews.
Concerns have been raised regarding the administrative burden of the current method (AC, PGME offices/programs, FMEC-PG, 2012 Conjoint Task Force).
Does not provide a reliable, easy-to-use program/institution CQI tool.

Future improvements with a digital AMS:
A digital tool to support conjoint accreditation activities for both internal and external stakeholders.
A single repository for accreditation information and evidence.
Ability to integrate with other systems for optimal information sharing and flow.
An intuitive, user-friendly interface for organizing, tracking, automating, and conducting accreditation activities, for programs and institutions, for surveyors and committee members, and for college administrators.
Guides continuous improvement activities.
Proposed AMS functionality
An Accreditation Management System (AMS) equipped with: Program portfolio Self-Study Tool Action Plan
Program Portfolio
Houses all accreditation information (e.g., documentation, policies, etc.).
Directly linked to the new standards.
Customized notifications.
Prepared at all times for internal and mandated CQI activities.
Guidance tool available.

Note: the program profile will not serve as the sole source of information used by the Colleges to evaluate the standards. Evaluation of the requirements will include consideration of information collected via data collection instruments (internal/external) and the onsite component of the accreditation review; the new standards will ensure that this is transparent.

The Program Portfolio is a single place within the AMS for each program to house all information related to residency education accreditation (e.g., documentation, policies, etc.). Required information will directly link to the new accreditation standards. Users will be notified via "prompts" should a change to their profile require attention to ensure alignment with accreditation standards. The portfolio is updated on a continuous basis, so it is prepared at all times for internal and mandated CQI activities. An easily accessible guidance tool will support users, in addition to support from the three Colleges.
Self-study and CQI Current challenges
Limited self-evaluation tool within the PSQ (Self-Identification of Strengths and Weaknesses).
Accreditation as high stakes, sometimes punitive.

Future improvements:
A Self-Study tool that mimics the onsite process and could be used for IRs.
Easily identify strengths and AFIs at any time.
Accreditation aligned with principles of ongoing CQI.
Self-Study Tool Easy to use Directly linked to new standards
Easy access to required sources:
An "evergreen" program profile (documents, policies, etc.).
Information/data from external sources (e.g., faculty survey, aggregate data from an ePortfolio or equivalent, such as "time stamps").
Facilitates alignment to the new Continuous Improvement standards.
The three Colleges would not see the results of the actual self-evaluation upon which the action plan is based.

Did you know?
Challenges of self-study: a large commitment of resources; the reliability and validity of self-study and self-report data.
Benefits of self-study: increased focus on quality improvement; preparation for accreditation/external peer review; improvements in team functioning.
Use of self-study is well established in accreditation (particularly healthcare accreditation), but is not currently widespread throughout the Canadian medical education continuum (with the exception of CACMS).
Onsite Visits Moving Forward
Program and Institutional Reviews
Recommendations for change
Current challenges:
Time-condensed, high-volume preparation.
Lots of paperwork.
Does not support continuity of workflow for accreditation activities.

Future improvements:
Continuous access to and updating of information in the AMS.
Personalized prompts.
Submit information online via the "Publish" functionality.
Reduction in time-condensed, high-volume preparation, since accreditation-related information will already be in the AMS.
Ensure all data in the AMS is accurate and up to date, then submit it through the AMS "Publish" function: a snapshot of the program/institution portfolios is captured and sent to the College (including surveyors) for review.
Onsite Visits: What stays the same?
Peer review.
Experiencing an onsite visit: scheduled meetings with key stakeholders; documents/systems made available to surveyors.
Identification of program strengths and areas for improvement.

Surveyors will continue to:
Conduct onsite visits.
Meet with people and talk, gathering qualitative data about the program and the learning environment.
Drill down ad hoc to investigate areas needing further focus (similar to tracers!).
Use their experience in medical education to evaluate the general and discipline-specific standards.
Onsite Visits: What will be new?
All information online.
Flexibility in the review schedule.
Interview guides to facilitate surveyors' work.
Conduct tracers mapped to standards.
Identify and facilitate sharing of innovative and best practices between programs.

With the AMS, surveyors will now be able to:
Access all information online, and input their data digitally.
Use the AMS to help adjust the review schedule (incorporating tracer* areas).
Use interview guides to help keep track of what to ask, and which standards to evaluate.
Follow tracers mapped to standards, for both the program and the institution (and the linkages between them).
Put more emphasis on the learning environment, identifying innovations and best practices to be shared later.

*WHAT IS TRACER METHODOLOGY? Originally a clinical concept, tracers are a methodology new to the medical education field and to the accreditation of residency education programs. The three Colleges intend to use this concept by "tracing" specific files, processes, or themes to evaluate postgraduate medical education. Tracers focus on evaluating "what was actually done" instead of "what you say you do". While surveyors may already ask these types of questions, the introduction of tracers aims to guide questioning along a path where actual actions and events have occurred, spanning different departments, topics, and levels. It is meant to be flexible and investigative: evaluation from a different point of view, a tracer's point of view. Each tracer can contain questions that address different standards from different domains.
Next Steps: What to expect in the years ahead
What’s Next Through 2016? National Consultations
Key principles and implementation (Spring/Summer 2016):
Continue consultation via standards development working groups, advisory committees, and the three colleges' accreditation committees.
Individualized impact letters to PG deans (complete).
Presentation to Faculty of Medicine/School of Medicine deans.

Standards and detailed process elements, for a wider audience (fall 2016): detailed plan under development.

Preparation for final approvals via all three colleges' accreditation committees (2017), followed by phased implementation (2017 and beyond).
We’re here to help! We know that transitioning to a new accreditation system is complex. If you have questions/feedback about the reform, we encourage you to reach out to us at: