Evaluation of Health Canada's Hepatitis C Program: Engaging Stakeholders in the Evaluation Process

Presented by: Gail V. Barrington, PhD, CMC
Barrington Research Group, Inc.

National Hepatitis Coordinators' Conference
Program Evaluation for Viral Hepatitis Integration Projects Workshop
San Antonio, Texas
January 30, 2003

Key Principles of the Evaluation Process
- Wide stakeholder engagement, from the evaluation design through to Program recommendations
- Consultation with hepatitis C experts
- Guided by a Program Logic Model and Data Collection Matrix
- Use of traditional social science and applied research methods
- Peer reviews of the final report

The purpose of this presentation is to demonstrate how these key principles were used in the mid-term evaluation of Health Canada's Hepatitis C Program, with the ultimate goal of increasing the utility of the evaluation.

Initial Consultations
- Preliminary interviews with all regional Program staff: informal telephone interviews conducted by the Project Director
- The Project Director "listened in" on Program teleconferences early in the evaluation
- Purpose:
  - To build rapport and acclimatize regional staff to the evaluation and the Evaluators
  - To understand the regional perspective on the evaluation
- Two hepatitis C experts were invited to educate Barrington Research staff on hepatitis C issues

Building rapport with the regional Program staff was critical because they are the gatekeepers: several components of the Program are delivered through a regional structure, and it was recognized that regional cooperation would be essential for the evaluation to proceed.

Instrument Design
- Each survey/interview question was designed to address an evaluation question from the Data Collection Matrix (DCM)
- The DCM was based on the Program Logic Model
- This use of the Logic Model and DCM helped to focus the questions asked in the surveys/interviews
- Survey/interview tools were also reviewed by stakeholders to ensure question relevancy, appropriateness, and comprehensiveness:
  - Health Expert Survey
  - Various case study tools
  - Community-Based Support Implementation and Outcome Achievement Survey
  - Researcher Survey

For each of the four areas addressed by the mid-term evaluation (Scope of the Problem, Program Implementation, Progress Toward Outcome Achievement, and Lessons Learned), the DCM contains a set of evaluation questions and related indicators (evidence that the evaluation question has or has not been addressed by the Program). The Program Logic Model outlines the primary Program activities and their expected outputs, as well as the expected immediate, intermediate, and long-term outcomes of those activities.

Reviewers of the instruments:
- Health Expert Survey: Mark Swain
- Case study interview tools: Lynn Schindel, who ensured that potential social issues were addressed, and the case study sites themselves
- Community-Based Survey: regional Program staff (this review process did not achieve the preferred response rate)
- Researcher Survey: CIHR (with both positive and negative aspects)

This engagement of the stakeholders likely contributed to the high response rates exhibited in most cases.
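The presentation does not reproduce the Data Collection Matrix itself, but its role — linking each evaluation question to indicators and to the instruments that supply evidence — can be sketched as a simple data structure. The sketch below is purely illustrative: the question text, indicators, and source names are hypothetical placeholders, not the contents of the actual Hepatitis C Program DCM.

```python
# Illustrative sketch of a Data Collection Matrix (DCM) as a data structure.
# All question text, indicators, and sources are hypothetical placeholders,
# not the contents of the actual Hepatitis C Program DCM.
from dataclasses import dataclass, field


@dataclass
class DCMEntry:
    area: str                                             # one of the four evaluation areas
    question: str                                          # evaluation question
    indicators: list[str] = field(default_factory=list)    # evidence of progress
    sources: list[str] = field(default_factory=list)       # instruments / data sources


dcm = [
    DCMEntry(
        area="Program Implementation",
        question="Have funded community-based support projects been implemented as planned?",
        indicators=["Proportion of projects operational by the planned date"],
        sources=["Community-Based Support Survey", "Case study interviews", "Program documents"],
    ),
    DCMEntry(
        area="Progress Toward Outcome Achievement",
        question="Has awareness of hepatitis C increased among health professionals?",
        indicators=["Self-reported awareness among surveyed experts"],
        sources=["Health Expert Survey"],
    ),
]


def instruments_for(area: str) -> set[str]:
    """List every instrument that feeds evaluation questions in a given area."""
    return {src for entry in dcm if entry.area == area for src in entry.sources}


if __name__ == "__main__":
    print(instruments_for("Program Implementation"))
```

Organizing instrument items this way makes it straightforward to confirm that every survey or interview question traces back to an evaluation question, which is the focusing effect described on the slide.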

Sample Selection
- Because this was a formative evaluation, participants from many groups were to be surveyed to gain a broad overview of the Program
- Purposive sampling best addressed the need to collect data from populations of unknown size (e.g., "health experts")
- As resources were limited, this sampling approach was appropriate
- Key informants were used to identify potential participants:
  - Health Canada: national and regional Program staff and other stakeholders at the national/regional levels
  - Hepatitis C experts: Health Expert Survey
  - Canadian Institutes of Health Research (research fund manager): Researcher Survey

Sample Selection (cont'd)
- At the local level, case studies were conducted in 7 sites across Canada to:
  - Access those infected with or affected by hepatitis C, community-based support projects funded by the Program, and other hepatitis C service providers in the community
  - Explore Program implementation at the micro level
- Methodology based on the work of case study theorists (Yin & Chelimsky)
- Sites were selected in consultation with Health Canada staff, based on criteria such as regional representation, service to a priority population, project type, and willingness to participate (a selection sketch follows below)

The case studies were valuable because they:
- Provided a sense of how the Program plays out at the community level;
- Provided insight into social issues in the community;
- Allowed access to other community agencies (and individuals, such as the Lethbridge pharmacist) that might not otherwise have been identified;
- Allowed us to strengthen our conclusions (e.g., clarifying information such as "doctors don't understand my needs");
- Allowed for the discovery of information that might not otherwise have been uncovered (e.g., the Hepatitis C Society was supported at the national level but falling apart at the community level); and
- Provided important insight into building community capacity (e.g., money cannot be given to community groups without support for infrastructure, board development, etc.).
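The actual site selection was a consultative judgment made with Health Canada staff, not an algorithm. Purely as an illustration of how the stated criteria could be screened across a candidate list, the sketch below applies them to invented sites; all names and attributes are hypothetical.

```python
# Illustrative only: real sites were chosen in consultation with Health Canada staff.
# This sketch just applies the criteria named on the slide to invented candidates.
from dataclasses import dataclass


@dataclass
class CandidateSite:
    name: str
    region: str
    serves_priority_population: bool
    project_type: str            # diversity of project types was balanced across the final set
    willing_to_participate: bool


candidates = [
    CandidateSite("Site A", "Prairies", True, "peer support", True),
    CandidateSite("Site B", "Atlantic", False, "education", True),
    CandidateSite("Site C", "Prairies", True, "outreach", False),
]


def meets_criteria(site: CandidateSite, regions_still_needed: set[str]) -> bool:
    """Screen a candidate against regional representation, priority-population
    service, and willingness to participate."""
    return (site.region in regions_still_needed
            and site.serves_priority_population
            and site.willing_to_participate)


shortlist = [s for s in candidates if meets_criteria(s, {"Prairies", "Atlantic"})]
print([s.name for s in shortlist])   # only "Site A" passes in this toy example
```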

Data Collection
- Reviewed Program documents
- Survey/interview data collection proceeded smoothly due to:
  - Support of stakeholders
  - Respectful treatment of participants
  - Understanding of community agencies
  - Multiple response options for each instrument (in-person or phone interviews; e-mail, mail, or fax surveys)
  - Thank-you cards, gifts to the participating case study sites, and incentives (grocery coupons) for clients interviewed

Analysis & Write-Up
- Used SPSS and NVivo for data analysis
- Utility of the findings was kept in mind: "What does this Program need?"
- The Data Collection Matrix was used to guide qualitative analysis and the "triangulation" of all data sources
- All data sources that addressed an evaluation question were compiled in a Data Summary
- Similarities and differences across groups of respondents were compared
- Findings included in the final report represent themes that stood out across all groups of respondents and/or documents

To reduce the potential for bias and to obtain as balanced a picture as possible, data were collected from multiple sources to address each evaluation question.
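The analysis itself was done in SPSS and NVivo, so no code accompanies the presentation. As a minimal sketch of the Data Summary idea — grouping findings by evaluation question and noting which respondent groups raised each theme — the example below uses invented coded records; the questions, groups, and themes are placeholders, not actual findings.

```python
# Minimal sketch of a "Data Summary": per evaluation question, which respondent
# groups raised each theme. Records below are hypothetical placeholders.
from collections import defaultdict

# (evaluation_question, respondent_group, theme) tuples standing in for coded data
coded_findings = [
    ("Q1: Is care accessible?", "clients", "physicians unfamiliar with hepatitis C needs"),
    ("Q1: Is care accessible?", "service providers", "physician training gaps"),
    ("Q1: Is care accessible?", "health experts", "physician training gaps"),
    ("Q2: Is community capacity growing?", "project staff", "no support for infrastructure"),
]


def build_data_summary(findings):
    """Compile, per evaluation question, the respondent groups that raised each theme."""
    summary = defaultdict(lambda: defaultdict(set))
    for question, group, theme in findings:
        summary[question][theme].add(group)
    return summary


summary = build_data_summary(coded_findings)
for question, themes in summary.items():
    for theme, groups in themes.items():
        # Themes that recur across several groups are the ones that stand out
        # across respondents and would be carried into a final report.
        print(f"{question} | {theme} | groups: {sorted(groups)}")
```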

Analysis & Write-Up (cont'd)

Case studies:
- The research team met to draft a report template (a series of questions)
- Team leaders organized case study data using the report template
- The completed template was given to the Project Director for the case study write-up
- Preliminary review of each case by the research team
- Review of each case by the site coordinator
- Each case study was signed off by the site prior to distribution to Health Canada
- Health Canada regional staff reviewed and made final changes
- This process increased stakeholder ownership of the findings

Final Report:
- Guided by the Logic Model for report structure
- Peer reviewers helped clarify and focus the report
- Various national and regional stakeholders reviewed the draft report

Bill Reeves was brought in for an external perspective at a late stage. This took us a step back, but it was necessary given the complexity of the study. Bill also had extensive expertise, for example in training requirements for physicians, and his comments helped us interpret some of the findings, particularly around community capacity.

Lessons Learned
- The evaluation identified lessons learned and highlighted Program strengths and weaknesses
- Through these insights, the Evaluators were able to propose 19 recommendations
- Health Canada staff were consulted on the wording of the recommendations to facilitate their implementation