
Blending the Theoretical and the Empirical in Evaluation Science: A Case Example from Cancer Control/Prevention and a General Discussion
Ross Conner, University of California Irvine, USA; Senior Adviser and Past President, International Organisation for Cooperation in Evaluation (IOCE)

Workshop Outline
1. Evaluation Overview
2. The Theoretical Component
3. The Empirical Component
4. Case Example: Chinese-Korean Cancer Prevention/Control Program
5. Conclusions
6. Questions & Answers; General Discussion

1. Evaluation Overview
Inputs → Processes → Outputs → Outcomes → Impacts
A simple example: program = a mathematics class
- Inputs: students, teacher, lesson texts, paper and pencils, classroom setting
- Processes: lessons and exercises taught by the teacher
- Outputs: exercises completed by students
- Outcomes (short-term): students learn new concepts and gain new understandings (measure: tests)
- Impacts (long-term): students apply the new concepts and understandings; students advance to the next level of learning
For simplicity, this is set out as a linear track of five stages; in practice, it is often more complex, with multiple tracks for the different components of the program.
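The five-stage track above can be sketched as a simple data structure. This is a minimal illustration in Python using the mathematics-class example from the slide; the `LogicModel` class and field names are illustrative, not part of the original presentation.

```python
# Minimal sketch of the five-stage logic model (illustrative names).
from dataclasses import dataclass

@dataclass
class LogicModel:
    inputs: list      # resources that go into the program
    processes: list   # activities carried out
    outputs: list     # direct products of the activities
    outcomes: list    # short-term changes
    impacts: list     # long-term changes

math_class = LogicModel(
    inputs=["students", "teacher", "lesson texts", "paper and pencils", "classroom setting"],
    processes=["lessons and exercises taught by the teacher"],
    outputs=["exercises completed by students"],
    outcomes=["students learn new concepts (measured by tests)"],
    impacts=["students apply new concepts", "students advance to the next level"],
)

# The linear track: Inputs -> Processes -> Outputs -> Outcomes -> Impacts
for stage in ("inputs", "processes", "outputs", "outcomes", "impacts"):
    print(f"{stage}: {getattr(math_class, stage)}")
```

In practice a program with multiple components would hold several such tracks rather than one linear chain, as the slide notes.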

2. The Theoretical Component
Theory from Evaluation Science
- Evaluation science: bringing the tools of science and discovery to the assessment and evaluation of programs or policies.
- Tracking and monitoring inputs and processes; identifying and documenting outputs; measuring outcomes; tracking and measuring impacts.
Theory from the Program or Policy Area
- Program or policy area: bringing past learning from the area to bear in order to advance it.
- Building upon past experimental and theoretical work in the area; developing program components based on past research and practice.
Using both types of theory, the program components and the evaluation components are developed.

3. The Empirical Component
Program Design
- Specify planned inputs and planned processes
- Specify expected outputs, outcomes and impacts
- Specify evaluation questions
Evaluation Design
- Specify the evaluation design(s) to answer the evaluation questions (design = the plan for data collection)
- Specify the evaluation measures to obtain the necessary data
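The evaluation-design steps above can be sketched as a simple record linking each evaluation question to a design and its measures. The questions, designs and measures below are hypothetical illustrations, not the actual design of the case-example evaluation.

```python
# Sketch of an evaluation-design record: each evaluation question gets a
# design (the plan for data collection) and the measures that supply the data.
# All entries are hypothetical illustrations.
evaluation_design = [
    {
        "question": "Did cancer screenings increase after the program?",
        "design": "pre/post comparison of screening counts",
        "measures": ["clinic screening records", "event sign-in logs"],
    },
    {
        "question": "Did knowledge about cancer improve?",
        "design": "post-session knowledge assessment",
        "measures": ["short knowledge questionnaire"],
    },
]

for item in evaluation_design:
    print(item["question"], "->", item["design"], "via", ", ".join(item["measures"]))
```

The point of the structure is the dependency the slide describes: measures exist to serve a design, and a design exists to answer a stated evaluation question.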

4. Case Example
Area: health – cancer control and prevention
Populations: Chinese and Koreans in California, USA
Program focus: women's cancers initially, later men's cancers
Program aim: disease destigmatization

Overview: Chinese Program Components
- Materials in the Chinese language
- Large free luncheons with expert lecturers
- Free screenings for cancers
- Cancer survivors involved in the programs
- High visibility in the community

Overview: Korean Program Components
- Materials in the Korean language
- Networks of Korean Christian ministers used to destigmatize the disease; cancer survivors featured
- Free screenings for cancers
- Prominent role at public events and fairs

The Program
Inputs → Processes → Outputs → Outcomes → Impacts
- Planned inputs: materials in the native languages, large gatherings of community members, religious leaders (Korean), cancer experts (Chinese)
- Planned processes: dissemination of materials and information to 4,000 people, cancer screenings, discussions among community members
- Planned outputs: increased knowledge about cancer, increased understanding of the consequences of disease stigmatization
- Expected outcomes: increased (by 10%) cancer screenings, decreased disease stigmatization
- Expected impacts: 100 community cancer survivors serving as volunteers and outreach workers for the program
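Because the program set numeric targets (4,000 people reached, a 10% screening increase, 100 survivor volunteers), achievement can be judged by a simple target-vs-actual comparison. The sketch below takes the targets from the slide; the "actual" figures are hypothetical placeholders for illustration, not program data.

```python
# Target-vs-actual check for the measurable objectives (targets from the slide).
targets = {
    "people reached with materials": 4000,
    "increase in cancer screenings (%)": 10,
    "survivor volunteers and outreach workers": 100,
}
actuals = {  # hypothetical illustration only, not program data
    "people reached with materials": 5200,
    "increase in cancer screenings (%)": 12,
    "survivor volunteers and outreach workers": 85,
}

objectives_met = {name: actuals[name] >= target for name, target in targets.items()}
for name, met in objectives_met.items():
    print(f"{name}: target={targets[name]}, actual={actuals[name]}, met={met}")
```

This is what the conclusions slide means by measurable objectives serving as "anchors": each objective yields a yes/no judgment once the corresponding figure is tracked.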

The Evaluation
Inputs → Processes → Outputs → Outcomes → Impacts
Examples of evaluation components:
- Inputs: tracked numbers of attendees at events
- Processes: observed community events – materials distributed, lectures given, discussions held
- Outputs: assessed knowledge about cancer
- Outcomes: tracked cancer screenings; tracked decreased disease stigmatization
- Impacts: tracked cancer survivors serving as volunteers for the program

Selected Evaluation Results

Outputs and Outcomes                                 Chinese   Korean   Totals
Health Education Sessions Conducted (attendance)*      4,912    8,026   12,938
Community Outreach Workers Trained                        12        4     27**
Cancer Screenings Conducted+                           2,212    1,739    3,951

* Estimated duplicate counts: 40%
** Ministers
+ Includes breast, uterus, liver, prostate and colon cancer screenings
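As a quick sanity check on the two fully legible rows of the table above, the Totals column should equal the Chinese figure plus the Korean figure:

```python
# Arithmetic check on the results table: Totals = Chinese + Korean
# for the attendance and screening rows (figures from the slide).
table = {
    "health education session attendance": (4912, 8026, 12938),
    "cancer screenings conducted": (2212, 1739, 3951),
}

checks = {row: chinese + korean == total
          for row, (chinese, korean, total) in table.items()}
print(checks)
```

Note that the attendance total is a raw sum; per the table's footnote, an estimated 40% of those counts are duplicates.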

5. Conclusions
- The program and the evaluation blended theoretical components from evaluation science and from the area of health education/disease prevention.
- The program set measurable objectives. These anchors provided a basis to judge achievements.
- The evaluation involved empirical assessments of inputs, processes, outputs, outcomes and impacts.
- The program planners/implementers from the Chinese and Korean communities were partners with the evaluation team, from start to finish.
- The programs continue today, with a smaller funding base.

6. Audience Inputs and Comments
- Questions about the program
- Questions about the evaluation
- Questions about the results
- General questions about evaluation
- Any other general questions for discussion

Evaluation: Next Steps
New book available soon: Оценка программ: методология и практика (Program Evaluation: Methodology and Practice). Editors: A. Kuzmin, R. O'Sullivan, N. Kosheleva