Research and Evaluation Overview

Research and Evaluation Overview
Sanjeev Sockalingam & Jane Zhao
June 15, 2018

LEARNING OBJECTIVES
At the conclusion of this session, you will:
1. Describe past and current research activities for Project ECHO®
2. Differentiate research and evaluation requirements for Project ECHO®
3. Identify evaluation frameworks and measures for Project ECHO®

Resources: Publications & Data Analyses
- 80+ publications and growing
- Across at least 13 partners and 12 different conditions
- Majority of ECHOs evaluate levels one to four
- Four ECHOs have included patient health outcomes:
  - UNM: NEJM study on HCV (2011)
  - Beth Israel Deaconess Medical Center: two JAMDA studies on geriatric mental health (2014, 2016)
  - VHA: Journal of Telemedicine and Telecare study on endocrinology (2015)
- Cost-effectiveness analyses:
  - Conference paper on HCV that showed cost effectiveness at $3,500 per QALY
  - Travel data in some papers that could be translated to dollars

Systematic Review: Current Evidence To Support ECHO Model (Zhou, Crawford, Serhal et al., 2016)

Moore's Evaluation Framework | # of Studies | Results
Level 1: Participation | 12 | Median participants = 38
Level 2: Satisfaction | 13 | All studies showed high levels of participant satisfaction
Level 3: Learning / Knowledge | 4 | Increased pre-post knowledge scores
Level 4: Competence | 8 | Used surveys and/or semi-structured interviews; 7 of 8 studies showed improved participant self-efficacy
Level 5: Performance | 1 | Chronic pain: change in service utilization (less mental health, more physical health) and increased non-opioid medication use
Level 6: Patient Health | 7 | Hepatitis C: SVR rates similar to specialists. Dementia: fewer hospitalizations and improved behavioural issues. Diabetes: improved HbA1c levels
Level 7: Community Health | None | 

Publications by Conditions and Outcome Levels

Overall ECHO Impact
- Total number of ECHO publications to date: 86
- Total number of citations by ECHO-addressed conditions: 1223

Existing Data Sources for ECHOs
- iECHO / ECHO Admin statistics
- Records (chart reviews, referrals, travel savings)
- Existing databases
- Electronic Health Records
- Insurance claims data (e.g., Medicaid, managed care organizations)
- Statistics Canada census data

Research vs. Evaluation: How would you describe the difference?

Research Versus Evaluation
Both use similar methods and analytical procedures, but they:
- Serve different purposes and goals
- Are guided by different questions
- Target different audiences

Why Is It Important To Evaluate?

Importance of Evaluation
- To determine the effectiveness of the ECHO program and whether we are meeting our objectives (i.e., benefiting under-serviced, remote, and rural areas)
- To identify quality improvement opportunities
- To measure participants' change in knowledge and self-efficacy
- To meet the MOHLTC funding requirements for reporting
- To document & distribute CME credits
- To demonstrate accountability within your institution
- To publish and add knowledge to your area

Moore's Evaluation Framework
Why use this framework? We want to see how ECHO affects individual-level change, but also how it relates to patient- and population-level health outcomes.

Common ECHO Ontario Metrics
- Number of Spoke sites per cycle
- Number of sessions attended (per participant and per Spoke site)
- Number of professions/disciplines participating
- Participants' knowledge: pre/post change
- Participants' self-efficacy (includes attitudes and confidence): pre/post change
- Clinical effectiveness: are recommendations implemented?
- Participants' satisfaction: per session and/or post cycle
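The pre/post change metrics above reduce to a simple paired comparison: each participant's post-cycle score minus their pre-cycle score. A minimal sketch in Python, assuming hypothetical knowledge scores out of 10 (the function name and data are illustrative, not part of any ECHO toolkit):

```python
# Illustrative sketch: mean pre/post knowledge change across participants.
from statistics import mean

def mean_paired_change(pre_scores, post_scores):
    """Return the mean post-minus-pre change across paired participants."""
    if len(pre_scores) != len(post_scores):
        raise ValueError("pre and post scores must be paired per participant")
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

# Hypothetical knowledge scores (out of 10) for five participants
pre = [4, 5, 6, 3, 7]
post = [7, 6, 8, 5, 9]
print(mean_paired_change(pre, post))  # mean change of 2 points
```

Keeping scores paired by participant (rather than comparing group averages) is what lets the evaluation attribute the change to individual learning.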

Evaluation Framework For ECHO

Moore's Evaluation Framework | Evaluation Measures | Sources of Data
Level 1: Participation | Number of Spoke sites; number of sessions attended; number of professions/disciplines participating | Registration records; attendance records
Level 2: Satisfaction | Satisfaction surveys (looking at IT, format, learning environment) | Session satisfaction surveys; post-cycle satisfaction surveys; focus groups; interviews
Level 3: Learning / Knowledge | Change in knowledge | Pre and post test multiple-choice questions around knowledge
Level 4: Competence | Changes in perceived self-efficacy | Pre-post survey rating scales
Level 5: Performance | Degree to which attendees perform what ECHO intended them to do | Observed performance in clinical setting; patient chart reviews to see recommendations implemented; self-report performance surveys
Level 6: Patient Health | How much the health of patients changes as a result of ECHO | Patient chart reviews; patient interviews or self-reported questionnaires on health status
Level 7: Community Health | Degree to which health in the community of patients changes due to ECHO-related changes in practice | Epidemiological data and reports
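When planning which data source feeds each level, the framework table above can be kept as a simple lookup structure. A minimal sketch (level names come from the slide; the dictionary itself is an illustrative planning aid, not an official ECHO artifact):

```python
# Moore's levels mapped to example data sources, for evaluation planning.
MOORES_LEVELS = {
    1: ("Participation", ["registration records", "attendance records"]),
    2: ("Satisfaction", ["session surveys", "focus groups", "interviews"]),
    3: ("Learning / Knowledge", ["pre/post multiple-choice tests"]),
    4: ("Competence", ["pre-post self-efficacy rating scales"]),
    5: ("Performance", ["chart reviews", "self-report performance surveys"]),
    6: ("Patient Health", ["chart reviews", "patient questionnaires"]),
    7: ("Community Health", ["epidemiological data and reports"]),
}

name, sources = MOORES_LEVELS[3]
print(name)  # → Learning / Knowledge
```

A structure like this makes it easy to check, per ECHO, which levels your current data collection actually covers.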

The Evaluation Process
Evaluation needs to be embedded in the project planning process.
Meet with relevant stakeholders:
- Discuss a plan for your protocol
- Rationale for doing this evaluation: what do you want this evaluation to achieve?
- Identify your target population
- Do you want to publish?
Conduct a literature search:
- What studies currently exist?
- How will this evaluation add to knowledge in this area?

The Evaluation Process
Start developing your protocol:
- Determine a design: qualitative, quantitative, mixed methods?
- Determine methods/tools to be used: focus groups, paper surveys, online surveys, observations, interviews?
- Determine types of questions to be used and how you want to ask them (consider Moore's Framework):
  - Knowledge, self-efficacy, attitudes, satisfaction, open-ended questions, etc.?
  - Multiple choice, Likert scale, pre/post, etc.?
- Determine database and analysis tools:
  - REDCap (data capture software) or SurveyMonkey
  - SPSS or SAS for quantitative data analysis
  - NVivo for qualitative data analysis
- Submit for REB or QI approval and start evaluating
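For Likert-scale satisfaction questions, a common first-pass summary is the response distribution plus a "top-box" share. A minimal sketch in Python, assuming hypothetical 1-5 ratings from one session (function name and threshold are illustrative assumptions, not a prescribed ECHO analysis):

```python
# Illustrative sketch: summarising 1-5 Likert satisfaction responses.
from collections import Counter

def summarize_likert(responses, top_box=4):
    """Return response counts and the share of ratings >= top_box."""
    counts = Counter(responses)
    favourable = sum(1 for r in responses if r >= top_box)
    return counts, favourable / len(responses)

# Hypothetical satisfaction ratings from one ECHO session
responses = [5, 4, 4, 3, 5, 2, 4]
counts, top_box_share = summarize_likert(responses)
print(dict(counts), round(top_box_share, 2))
```

For formal analysis the slide's suggested tools (SPSS, SAS) would take over, but a quick summary like this is often enough for per-session feedback loops.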

Choosing between QI and REB
Quality Improvement
- Improves a program, process, or system
- Minimal participant risk
- Adaptive, iterative design over time
- Findings applicable to the local institution
REB (Research)
- Answers a research question
- Risk must be assessed by an ethics committee
- Rigid protocol
- Findings are generalized
*For further guidance, connect with your institution's REB Coordinator

REB Application Process
- Identify PI
- Develop protocol (include background, rationale, methodology, risks/benefits, confidentiality, etc.)
- Evaluation development
- Gather supplementary documents: consent forms, assent forms, recruitment materials, data collection forms, staff credentials/certificates
- REB submission: create your REB submission package with all required documents, following your institution's REB checklist
- REB decisions: revisions may be required; additional documents may need to be submitted; be aware the REB approval process can take months
*For further guidance on your institution's REB process, connect with your REB Coordinator

REB Review Process
*For further guidance on your institution's REB process, connect with your REB Coordinator

Potential Barriers To Keep In Mind
- Delays with REB approval
- Balancing evaluation needs with program needs
- Development of knowledge metrics for inter-professional Spokes
- Completion of research forms by Spokes (pre/post, weekly, etc.)
- Gathering patient- and community-level data
- Data sharing between ECHOs for larger evaluations

Take Home Messages
- Plan your evaluation early: include it in your project planning process
- Think about which evaluation methods are the most practical and feasible for your ECHO