Coalition for Physician Enhancement Maintaining Physician Competence: A Global Perspective Ottawa, Ontario Oct 4, 2012 The New PEER Model: A (R)evolution in Peer Assessment at the CPSO


Coalition for Physician Enhancement Maintaining Physician Competence: A Global Perspective Ottawa, Ontario Oct 4, 2012 The New PEER Model: A (R)evolution in Peer Assessment at the CPSO Presented by CPSO Research and Evaluation Team: Rhoda Reardon, Nanci Harris, Wendy Yen, Craig Nathanson

Presentation Outline
‘Back story’
Change Drivers
Vision
Delivering change – a work in process

Who We Are
The CPSO registers and regulates Ontario’s 30,000 physicians.

History of Peer Assessment at CPSO
Around since 1980; first of its kind in Canada
~385 trained peer assessors, ~1,700 peer assessments annually
Random and targeted selection (e.g., physicians aged 70+)

Current CPSO Peer Assessment Process
Random/targeted selection of physicians
Match assessor to physician
Assessment visit: review charts, interview physician
Complete assessment module and report
Quality Assurance Committee reviews report and decides outcome

In the CPSO peer assessment program, 2–3% of the province’s 28,000 practising physicians are selected for assessment each year. Selection is random and independent of the complaints and investigative process, where quality assessment is used as part of an investigation. Each peer assessment is performed by a single trained practising physician who achieved excellent ratings in previous peer assessments and who has a similar practice structure and patient population. Inter-assessor reliability for peer-review assessment of the acceptability of quality of care is excellent (Kappa = 0.89).

Who gets assessed? Random assessments (~700/year); age 70+ (~150/year); incident-driven (~40/year); most specialties.
Who does the assessing? Trained practising physicians, practice-matched and previously assessed.

For the purposes of this study, assessment outcomes were dichotomized into: (i) acceptable, with no further action; or (ii) unacceptable, requiring either (a) reassessment or (b) an interview with the Quality Assurance Committee to determine an immediate remediation plan.
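The Kappa figure above is an inter-rater agreement statistic that corrects raw agreement for chance. As a minimal sketch of how Cohen's kappa is computed (the chart ratings below are hypothetical, not CPSO data):

```python
# Cohen's kappa: inter-rater agreement corrected for chance agreement.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items where the two raters match
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies
    p_e = sum((rater_a.count(k) / n) * (rater_b.count(k) / n) for k in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two assessors rate 20 charts as
# acceptable (1) or unacceptable (0), disagreeing on one chart.
a = [1] * 16 + [0] * 4
b = [1] * 15 + [0] * 5
print(round(cohens_kappa(a, b), 2))  # 0.86
```

By common benchmarks (e.g., Landis and Koch), values above about 0.8 indicate almost-perfect agreement, which is why a kappa of 0.89 is described as excellent.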

Drivers of Change
No common understanding of the purpose of the peer assessment program
Limited tracking of outcomes, making impact difficult to measure
Eroding of structured delivery
Lack of alignment with modern frameworks (e.g., CanMEDS)
Isolated, under-used assessors
Generic model not useful for specialties

Vision
Develop a common understanding of the purpose of the peer assessment program
Set the bar for peer assessment in Ontario and lead by example
Systematically track outcomes to determine impact (individual and program)
Link assessment to life-long learning and continuous improvement
Align the peer assessment program with CanMEDS and the quality “agenda”
Provide assessor training and resources to deliver the vision

Peer Assessment Purpose
Promote continuous quality improvement by providing physicians with feedback to validate appropriate care and identify opportunities for practice improvement.

The PEER Model

The PEER phases (formerly known as the Courtship Model):
Phase 1 (pre-visit): Plan & Prepare
Phase 2 (practice visit): Evaluate & Engage
Phase 3 (post-visit): Reflect & Respond

Purpose of Phase 1 – Plan & Prepare
Introduce physician and assessor
Familiarize the physician with program goals and process
Collect basic practice information
Familiarize the assessor with the physician’s practice
Create a plan for the assessment visit
Collect discipline-relevant practice information
Initiate the process for multi-source feedback (pilot in 2013)
Initiate physician self-assessment using quality indicator tools

Purpose of Phase 2 – Evaluate & Engage
Chart review
Review of other relevant sources of information (e.g., simulation)
Tailored, discipline-specific assessment tools and chart selection
Facilitated feedback: review feedback and information and consider improvement opportunities (assessor feedback, MSF, self-audit)
Record improvement opportunities; develop a QI or CPD plan (aligned with CFPC and RCPSC)
Assessor training, e.g., effective feedback and CPD coaching

Purpose of Phase 3 – Reflect & Respond
Assessor reports to the College
Physician evaluates the PEER experience
Physician completes his/her QI or CPD plan and provides it to the College (alignment with CFPC & RCPSC)
Follow-up with the physician on the outcome of QI/CPD plans
Feedback and outcomes of CPD inform PEER program impact

Delivering change – a work in process

Assessor Networking Groups
~385 peer assessors organized into 38 networking groups
Assessor Network Coordinator: two-way communication conduit; network meetings; assessor recruitment & training; QI resources
Promote a ‘community of peer assessors’ committed to program improvement

Quality Indicator Template (Element / Question / What does this mean?)
Definition (“What is quality?”): statement to define quality (i.e., the indicator itself)
Evidence (“Says who?”): source of evidence/authority that supports this quality indicator
CanMEDS (“How is this item aligned with CanMEDS?”): indicator is clearly linked to one or more of the CanMEDS roles and competencies
Criteria (“How will you know quality when you see it?”): common evaluation criteria to allow assessors to consistently determine if this indicator has been met
Resources (“If quality is absent, how can it be improved?”): resources available to assist physicians who have not met this indicator
Agreement (“Who needs to agree on this indicator?”): broad consensus among relevant stakeholders that this indicator is accurate, relevant and feasible to assess
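The six template elements amount to a simple record schema for each indicator. Purely as an illustration (the field names and the sample indicator below are hypothetical, not taken from the CPSO instrument), one indicator could be captured as:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class QualityIndicator:
    """One quality indicator, mirroring the six template elements."""
    definition: str            # What is quality? (the indicator itself)
    evidence: str              # Says who? (supporting source/authority)
    canmeds_roles: List[str]   # CanMEDS roles/competencies it maps to
    criteria: str              # How assessors recognize the indicator is met
    resources: List[str]       # Improvement supports if the indicator is unmet
    agreement: str             # Stakeholders who endorsed the indicator

# Hypothetical example indicator (illustrative content only):
example = QualityIndicator(
    definition="Medication list is reconciled at each patient visit",
    evidence="Hypothetical: a published medication-safety guideline",
    canmeds_roles=["Medical Expert", "Communicator"],
    criteria="Reconciled medication list documented in sampled charts",
    resources=["Hypothetical CPD module on medication reconciliation"],
    agreement="Consensus of the assessor working group and stakeholders",
)
print(example.canmeds_roles[0])  # Medical Expert
```

Structuring indicators this way makes the “Agreement” and “Resources” elements as mandatory as the indicator statement itself, which matches the template’s intent.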

Building with Our Assessors
7 networking groups identified as ‘trail-blazers’ for this task
Volunteers identified to form working groups
Initial in-person meetings completed
Facilitation and measurement expertise from CPSO staff
Use of online collaboration and survey tools to facilitate the process

Development Process Overview
Step 1: Assessor working group drafts (a) a tailored pre-assessment questionnaire, (b) a medical expert assessment module, and (c) a chart selection process
Step 2: All assessors in the networking group reach consensus on (a), (b) and (c); selected stakeholder and cross-discipline consultation
Step 3: Offline pilot
Step 4: Live pilot
Step 5: Implement

Building Assessment Tools with Assessors
Strengths:
Building on 30 years of experience with peer assessment
Cadre of skilled assessors with focused practice expertise
Assessors arranged into networking groups with defined leads
Staff support to coordinate assessor networks
Challenges:
Maintaining existing operations while ‘revising’
Pace: risk of frustrating assessors anxious to use the new instruments
Moving focus beyond identifying ‘quality’ to include evidence, assessment criteria and practice improvement resources
Effective, yet feasible, piloting processes

Hospitalist assessment module

Interaction Goals
Engaging learning process for all participants
Opportunity to contribute where your interest lies
Useful input for our program

Discussion Themes
Measuring outcomes in practice
Setting the bar for peer assessment
Assessment as a trigger for QI or CPD planning
Building assessor skills to deliver the PEER vision
Wild card
