Controlling Your Evaluation Destiny
Marla Steinberg, Director of Evaluation, MSFHR & The CAPTURE Project
www.thecaptureproject.ca


Can you spot the PEAR cake?

What does evaluation mean to you?
– “Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object.”
– Evaluation can be defined as the systematic collection and analysis of information on the performance of a policy, program, or initiative to make judgments about relevance, progress or success and cost-effectiveness, and/or to inform future programming decisions about design and implementation.
– Evaluation means asking good, critical questions about programs to improve them and help them be accountable for the wise use of resources.
Don’t be put off by the language of evaluation.

Planning Tips
Stakeholders to involve: Program Participant, Practitioner or Manager, Funder
– Set up an evaluation advisory committee
– Jointly determine the purpose of the evaluation and generate evaluation questions
– Jointly develop the program logic model so multiple perspectives are represented
– Jointly determine the data collection strategy
– Choose methods that meet the required level of rigour and are acceptable to participants
– Jointly identify other stakeholders and develop an engagement and communication strategy
– Jointly develop a reporting and dissemination strategy
– Think about different products for different audiences
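For teams that keep planning artifacts electronically, the advisory committee's decisions map naturally onto a small structured record. The sketch below is illustrative only and not part of the original slides; the field names (purpose, questions, logic_model, methods) are assumptions about how one might capture the plan.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One jointly developed program logic model (simplified)."""
    inputs: list
    activities: list
    outputs: list
    outcomes: list

@dataclass
class EvaluationPlan:
    """Decisions the advisory committee settles up front."""
    purpose: str
    questions: list
    stakeholders: list
    logic_model: LogicModel
    methods: list = field(default_factory=list)

# Hypothetical example, for illustration only.
plan = EvaluationPlan(
    purpose="Improve the program and account for the wise use of resources",
    questions=["Is the program reaching its intended participants?"],
    stakeholders=["Program Participant", "Practitioner or Manager", "Funder"],
    logic_model=LogicModel(
        inputs=["staff", "funding"],
        activities=["workshops"],
        outputs=["participants trained"],
        outcomes=["changed practice"],
    ),
)
print(plan.purpose)
```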

Data Collection Tips
– Choose data collection methods that meet the required level of rigour and are acceptable to participants
– Include qualitative and quantitative data; get people to tell their stories
– Use multiple methods
– If possible, try to use engaging methods
– Use lay or participant data collectors

Analysis and Interpretation Tips
– Hold interpretation sessions
– Find creative ways to display data that can speak to different audiences
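As one illustration of display for different audiences, Likert-style survey results are often easier for a mixed audience to read as a single stacked bar than as a table of counts. A minimal matplotlib sketch; the question text and response counts are made up for illustration.

```python
import matplotlib.pyplot as plt

# Hypothetical counts for one survey question, for illustration only.
labels = ["Strongly Disagree", "Disagree", "Neutral", "Agree", "Strongly Agree"]
counts = [2, 5, 8, 20, 10]
colors = ["#d7191c", "#fdae61", "#cccccc", "#a6d96a", "#1a9641"]

fig, ax = plt.subplots(figsize=(8, 2))
left = 0
for label, count, color in zip(labels, counts, colors):
    # Stack each response category onto one horizontal bar.
    ax.barh(0, count, left=left, color=color, label=label)
    left += count

ax.set_yticks([])
ax.set_xlabel("Number of responses")
ax.set_title("Q1: The program met my needs (hypothetical, n=45)")
ax.legend(ncol=5, fontsize=8, loc="upper center", bbox_to_anchor=(0.5, -0.4))
plt.tight_layout()
plt.show()
```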

Reporting and Dissemination Tips
Don’t just create one long evaluation report; consider:
– Video
– Art
– Brief reports

Using Results Tips
– Quality improvement
– Management decision-making
– Jointly develop an action plan

Joint Action Planning
Action Planning Worksheet
Evaluation Recommendation: ______
Columns: Action Step(s) | Person(s) Responsible | Timeline | Plans for monitoring results
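If the worksheet is kept electronically rather than on paper, its columns map directly onto a flat table. A minimal sketch writing the worksheet to CSV; the filename and the example row are hypothetical.

```python
import csv

# Column headings from the Action Planning Worksheet on the slide.
FIELDS = ["Evaluation Recommendation", "Action Step(s)",
          "Person(s) Responsible", "Timeline", "Plans for monitoring results"]

# Hypothetical example row, for illustration only.
rows = [{
    "Evaluation Recommendation": "Shorten the intake survey",
    "Action Step(s)": "Draft revised survey; pilot with five participants",
    "Person(s) Responsible": "Program manager",
    "Timeline": "Q3",
    "Plans for monitoring results": "Compare completion rates before and after",
}]

with open("action_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```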

Final Tip: Plan Evaluation Up Front
How many evaluators does it take to change a light bulb?
– one to do a needs assessment
– one to do a feasibility study
– one to do a qualitative study to find out what bulb to change
– one to empower the bulb to change
– one to tender a contract for further study
– one to write performance indicators for success
– one to do a cost-benefit analysis to determine the best bulb to buy
– one to do a meta-evaluation showing that all previous studies have left everyone in the dark
SO how many evaluators does it take?