Presentation transcript:

Slide 1: Qualitative Evaluation Terms
Coding/categorization: The process of condensing qualitative data through the identification of common themes.
Data matrix or display: Grouping data within common themes in order to identify patterns in the data.
Storytelling: Using quotes and detailed description to capture the "story" or essence of a program or event.
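
To make the "data matrix or display" idea concrete, here is a minimal Python sketch that groups coded excerpts by theme and data source. The themes, sources, and quotes are hypothetical, and real qualitative coding is usually done by hand or in dedicated software; this only illustrates the structure of a theme-by-source display.

```python
# Hypothetical sketch: building a simple data matrix (theme x data source)
# from coded excerpts. All themes, sources, and quotes are illustrative only.
from collections import defaultdict

# Each coded excerpt: (data source, theme, quote)
coded_excerpts = [
    ("Youth interview 1", "Belonging", "I feel like people here know me."),
    ("Youth interview 2", "Belonging", "Staff remember my name and ask about school."),
    ("Youth interview 1", "Skill growth", "I can write a resume now."),
    ("Parent focus group", "Skill growth", "She practices her writing at home."),
]

# Group quotes by theme, then by source: the rows and columns of a data display.
matrix = defaultdict(lambda: defaultdict(list))
for source, theme, quote in coded_excerpts:
    matrix[theme][source].append(quote)

# Print the display so patterns (which themes appear in which sources) are visible.
for theme, sources in matrix.items():
    print(f"Theme: {theme}")
    for source, quotes in sources.items():
        print(f"  {source}: {len(quotes)} excerpt(s)")
        for quote in quotes:
            print(f'    - "{quote}"')
```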

Slide 2: Analyzing Quantitative Data
Quantitative data analysis consists of:
Simple calculations yielding factual information on attendance, usage, changes in performance, and/or changes in knowledge or attitudes (pre/post tests). These can be done by program staff with the help of a spreadsheet (such as Excel).
Or
Complex manipulations of the data designed to understand causal or correlational effects of program interventions. These are best done by an evaluation contractor or trained researcher.
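
As a rough sketch of the "simple calculations" side, the snippet below computes average attendance and average pre/post change, the kind of arithmetic a spreadsheet would also handle. All numbers are invented for the example.

```python
# Illustrative calculations on made-up program data: average attendance
# and average change in knowledge scores from pre-test to post-test.
attendance = [18, 22, 20, 17, 24]    # participants at each session
pre_scores = [55, 60, 48, 70, 65]    # knowledge test before the program
post_scores = [68, 72, 59, 80, 77]   # same participants after the program

avg_attendance = sum(attendance) / len(attendance)

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_change = sum(changes) / len(changes)

print(f"Average attendance over {len(attendance)} sessions: {avg_attendance:.1f}")
print(f"Average pre/post change in knowledge score: {avg_change:+.1f} points")
```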

Slide 3: 4. Analyze Your Data
Quantitative data analysis: analysis of numbers. Best presented in the form of pictures, such as graphs and charts.
Qualitative data analysis: analysis of words and pictures. Best presented as "word stories" or "video stories."
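
For the "graphs and charts" point, a minimal plotting sketch is shown below. It assumes the matplotlib library is available and uses invented pre/post averages; any spreadsheet charting tool would serve the same purpose.

```python
# Minimal chart sketch (assumes matplotlib is installed); the scores are illustrative.
import matplotlib.pyplot as plt

labels = ["Pre-test", "Post-test"]
average_scores = [59.6, 71.2]

plt.bar(labels, average_scores, color=["#999999", "#4878a8"])
plt.ylabel("Average knowledge score")
plt.title("Change in participant knowledge (illustrative data)")
plt.savefig("pre_post_scores.png")  # or plt.show() for an interactive window
```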

Slide 4: 2. Gather First Impressions
Guiding questions:
What did you talk about with interviewees?
What are your initial theories of what is working and what is not?
Did anything surprise you?
Discuss the quality and breadth of the data with the data collection team.

Slide 5: 3. Organize and "Clean" Your Data
Examine all the data related to each research question separately. Ideally, there is more than one data source for each question.
Throw out surveys or written responses that are less than halfway completed.
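
The "less than halfway completed" rule can be applied mechanically once responses are in a structured form. The sketch below shows one way to do it in Python; the survey fields and responses are hypothetical.

```python
# Hypothetical sketch: drop survey responses that are less than half complete.
surveys = [
    {"id": 1, "q1": "Yes", "q2": "4", "q3": "Weekly", "q4": "Agree"},
    {"id": 2, "q1": "No", "q2": None, "q3": None, "q4": None},      # mostly blank
    {"id": 3, "q1": "Yes", "q2": "5", "q3": None, "q4": "Agree"},
]
question_keys = ["q1", "q2", "q3", "q4"]

def completion_rate(response):
    """Fraction of questions with a non-empty answer."""
    answered = sum(1 for key in question_keys if response.get(key) not in (None, ""))
    return answered / len(question_keys)

# Keep only responses that are at least halfway completed.
usable = [r for r in surveys if completion_rate(r) >= 0.5]
print([r["id"] for r in usable])  # prints [1, 3]
```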

Slide 6: Overview of the Data Analysis Process
1. Form a data collection team
2. Gather first impressions of the data
3. Organize and "clean" your data
4. Analyze the data
5. Prepare findings and recommendations
6. Discuss emerging findings with key stakeholders

Slide 7: Tips for Designing & Choosing Methods
Consider your resource requirements.
Determine the appropriateness of methods for respondents, organizational approach, and setting.
Ensure that new data gathered fits well with existing data sources.
Determine that methods fit with the evaluation questions.
Select methods that are easy to use.

Slide 8: How, What, Why? Choosing Methods (cont’d)
2. Identify a desired outcome/output and choose a method to measure it.
Attendance: sign-in sheets, intake forms, etc.
Writing skills: writing samples.
Knowledge of community: survey of youth knowledge.
3. More than one method enhances credibility. Gathering multiple sources of evidence provides opportunities to include different perspectives.
Surveys and interviews: surveys provide a numerical account of youth experience and outcomes, while interviews highlight youth voices and stories.

Slide 9: How, What, Why? Choosing Methods
1. Consider evaluation questions. Your choice of tools should be driven by the types of questions you want answered (how, what, how many, why).
How questions help you understand how something happened or the process of implementing a program; they are best addressed through interviews and focus groups.
What questions help you document what program staff have done and what participants have experienced; they can be addressed by all methods.
How many questions are best addressed through surveys, activity logs, intake data, etc.
Why questions are best addressed through interviews and focus groups.

Slide 10: Connecting the Logic Model to Evaluation Questions
[Diagram of a logic model: Context, Inputs, Activities, Outputs, Short-term Outcomes, Intermediate Outcomes, Long-term Outcomes, annotated with Setting, Implementation (quality & quantity), Outcomes (effectiveness, magnitude & satisfaction), and Assumptions. Adapted from the WKKF Logic Model Development Guide, p. 36.]
What aspects of our situation most shaped our ability to do the work we set out to do in our community?
What did our program accomplish in the community?
What resulted from our work in the community?
What have we learned about doing this kind of work in a community like ours?
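
One plausible way to read this slide is that each evaluation question maps onto a region of the logic model. The sketch below pairs them in a simple Python structure; the pairing is an interpretation for illustration, not something stated on the slide.

```python
# Illustrative pairing of logic model regions with the evaluation questions on this
# slide (adapted from the WKKF Logic Model Development Guide, p. 36). The mapping of
# each question to a region is an assumption made for this example.
logic_model_questions = {
    "Context / setting": (
        "What aspects of our situation most shaped our ability to do the work "
        "we set out to do in our community?"
    ),
    "Activities and outputs (implementation)": (
        "What did our program accomplish in the community?"
    ),
    "Short-, intermediate-, and long-term outcomes": (
        "What resulted from our work in the community?"
    ),
    "Assumptions": (
        "What have we learned about doing this kind of work in a community like ours?"
    ),
}

for region, question in logic_model_questions.items():
    print(f"{region}: {question}")
```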

Slide 11: Identify Evaluation Questions
Evaluation questions focus on understanding how your program will meet its intended goals. Good questions enable you and others to get the answers you need to tell your story.
Pick a program activity that you want to evaluate.
Decide whether you want your questions to focus on the processes or outcomes of your work.
Use simple, concrete language that focuses on what you need to know.

Slide 12: Benefits of Evaluation
Identify strengths and weaknesses of program activities.
Inform practice to improve the program.
Build organizational capacity.
Inform and refine community change efforts.
Enhance personal growth and development among staff and youth participants.
Provide evidence of program effectiveness.
Improve the ability to plan and implement programs/campaigns.
Document program progress toward meeting goals.
Identify unmet community needs and assess the impacts of social change efforts.
Provide feedback to staff/participants on their work.
Recognize accomplishments and provide suggestions for improvement.
Report to the community and funders about program effectiveness.