UNLEASH the POWER of the Evaluation Framework, Methods, Tools and Data Analysis

Agenda
1. Why Evaluate?
2. Evaluation Framework
3. Purpose of the Evaluation
4. Evaluation Plan (Methods, Sample)
5. Evaluation Tools
6. Analyze and Summarize Results
7. Presenting Information
8. Wrap Up and Questions

Discussion Questions:
• What do you currently evaluate in the literacy service planning process?
• Why do you conduct evaluations?

Why Evaluate?
• Evaluation provides an opportunity to assess and correct your course on an ongoing basis
• Evaluation is important for accountability purposes
• Evaluation is important for learning, which leads to better-quality practice when widely shared

• Determine how the Literacy Service Plan has fared in action
• Assess the effectiveness of the Literacy Service Plan in terms of its perceived intentions and results
• Assess the effectiveness of the literacy service planning process (to be able to make improvements in the process)

Evaluation Framework
1. Identify the purpose of, and audience for, the evaluation results
2. Identify what information needs to be collected and from whom
3. Select from different research methods
4. Select the sample/population
5. Design evaluation tools
6. Administer evaluation tools
7. Analyze and summarize results
8. Present information

Identify the Purpose and Audience of the Evaluation Results
• You need to be clear about the purpose of the evaluation. What are you evaluating?
Five Forms of Evaluation
• Effort evaluation – inputs
• Performance evaluation – outputs
• Effectiveness evaluation – outcomes
• Efficiency evaluation – costs
• Process evaluation – what worked well and how could the process be improved

Formative or Summative Evaluation?
Formative Evaluation
• To collect information that can be used primarily for development and improvement
Summative Evaluation
• To make an overall judgement about the effectiveness of a program/service/process

• When you construct your evaluation questions, you will refer back to the purpose of the evaluation to make sure the questions you ask will produce the information you want

Discussion Question:
• Which form(s) of evaluation would you use in the literacy service planning process, and why?
• Effort evaluation – inputs
• Performance evaluation – outputs
• Effectiveness evaluation – outcomes
• Efficiency evaluation – costs
• Process evaluation – what worked well and how could the process be improved

Develop an Evaluation Plan
• What information will be collected (focus)
• How the information will be collected (methods)
• Who or what sources will provide the information (sample, sources)

Research Methods
Two basic ways to collect information:
1. Asking people for information – surveys, interviews, focus groups
2. Observing and analyzing existing materials – look at data that already exists

Samples
Questions to Ask:
• How many people or pieces of print material should be included in your evaluation?
• Which people or print material should be included in your evaluation?
• How should they be selected? (one common approach is sketched below)
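A minimal sketch of simple random selection using Python's standard library; the roster and sample size here are hypothetical stand-ins for your own list of people or print materials:

```python
# Minimal sketch: drawing a simple random sample.
import random

# Hypothetical roster of 200 clients; substitute your actual list of
# people or print materials.
roster = [f"client_{i}" for i in range(1, 201)]

# Draw 30 names without replacement so each has an equal chance.
sample = random.sample(roster, 30)
print(sample[:5])
```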

Evaluation Tools
Questions are the key!
• Questionnaires/Surveys
• Key Informant Interviews
• Focus Groups

Designing Questions
• Use both closed- and open-ended questions
• For written surveys, list all possible options for closed questions, including "no opinion" or "don't know"

Open-Ended
What are your hobbies? (please list below)
_______________________________

Closed
What are your hobbies? (please select all responses that apply to you)
[ ] Reading
[ ] Running
[ ] Knitting
[ ] Eating
[ ] Other (please list): ___________________

Do you like ice cream? (please select the most appropriate response)
[ ] Yes
[ ] No
[ ] Don't Know

• Be sure that questions are neutral – watch out for the halo effect
• Ask questions so that people can admit to none of the responses (what, if any,…)
• Avoid agree or disagree statements
• Avoid double-barrelled questions

Double-Barrelled Questions:
• What feedback do you have about our parenting groups and our youth groups?
• How would you rate your level of satisfaction with our volunteer training program and ongoing support for volunteers?

Avoid:
• Sensitive or embarrassing questions
• Double negatives (Do you agree or disagree that nonprofits should not receive government funding?)
• Too many alternatives
• Unimportant questions
• Complex language (jargon)

Analyze and Summarize Results
• What do you do with all of the information you've gathered?

Data Analysis
Quantitative Data
• Data that describes reality using numbers
Qualitative Data
• Data that describes events, persons, etc. without the use of numbers (in words)

Quantitative Data Analysis
• Tabulate the number of responses for each category in the questions
• Report the average, range, and most frequent response; for example:
  - 62% of clients are satisfied with the services provided by our organization
  - Youth programs received an average rating of 3.5 out of 4
  - All services were rated between 2.82 and 3.43 out of 4
(A sketch of these calculations is shown below.)
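A minimal sketch of these calculations in Python, assuming a simple list of 1–4 ratings from a closed question; the data is hypothetical, not from any real survey:

```python
# Minimal sketch: tabulating closed-question ratings (hypothetical data).
from collections import Counter
from statistics import mean

ratings = [4, 3, 4, 2, 3, 4, 3, 1, 4, 3]  # hypothetical ratings out of 4

counts = Counter(ratings)  # tabulate responses per category
print("Responses per rating:", dict(sorted(counts.items())))
print("Average:", round(mean(ratings), 2))
print("Range:", min(ratings), "to", max(ratings))
print("Most frequent:", counts.most_common(1)[0][0])

# Share of clients rating 3 or higher: the kind of figure behind a
# statement like "62% of clients are satisfied".
satisfied = sum(1 for r in ratings if r >= 3) / len(ratings)
print(f"Satisfied: {satisfied:.0%}")
```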

Qualitative Data Analysis
• Read the responses and identify general themes
• Sort each individual response into the themes
• Count the number of responses in each theme to determine priority (see the sketch below)
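A minimal sketch of the counting step, assuming the responses have already been read and a keyword list drafted for each theme; the responses, theme names, and keywords are all hypothetical, and keyword matching is only a rough aid to hand-coding:

```python
# Minimal sketch: counting open-ended responses per theme
# (hypothetical data; real coding is done by reading each response).
from collections import Counter

responses = [
    "More evening classes would help",
    "The tutors were friendly and helpful",
    "Scheduling was hard with my work hours",
    "Great tutors, but the room was crowded",
]

# Hypothetical theme keywords, drafted after reading the responses.
themes = {
    "scheduling": ["evening", "schedul", "hours"],
    "staff": ["tutor", "friendly"],
    "facilities": ["room", "crowded", "space"],
}

counts = Counter()
for text in responses:
    for theme, keywords in themes.items():
        if any(k in text.lower() for k in keywords):
            counts[theme] += 1  # each response counts once per theme

# Rank themes by number of responses to determine priority.
for theme, n in counts.most_common():
    print(theme, n)
```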

Considering Findings
Four processes to make sense of evaluation findings:
1. Description and Analysis
2. Interpretation
3. Judgement
4. Recommendations

Description and Analysis: Organizing the raw data into a form that reveals the basic results; presents the facts of the case and the actual data.

Interpretation: This goes beyond the data to meaning and significance. What do the results mean? What's the significance of the findings? Why did the findings turn out this way? What are the possible explanations?

Judgement: Bringing values to bear on the analysis and interpretations. To what extent and in what ways are the results positive or negative? What is good, bad, desirable or undesirable?

Recommendations: What should be done? What are the action implications of the findings?

Presenting Information
• Data needs to be arranged, ordered and presented in a reasonable format that allows readers to quickly detect patterns in the data
• Visual presentations of information (a small plotting sketch follows below):
  - Bar graphs
  - Pie charts
  - Line graphs
  - Scatterplots
  - Flow charts
  - Tables
  - Figures (graphs and other diagrams)
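A minimal sketch of producing one such visual, a bar graph, with matplotlib (assuming it is installed); the services and average ratings are hypothetical:

```python
# Minimal sketch: a bar graph of hypothetical average ratings.
import matplotlib.pyplot as plt

services = ["Youth programs", "Adult tutoring", "Family literacy"]
avg_rating = [3.5, 3.1, 2.9]  # hypothetical averages out of 4

fig, ax = plt.subplots()
ax.bar(services, avg_rating)
ax.set_ylim(0, 4)                         # keep the full rating scale visible
ax.set_ylabel("Average rating (out of 4)")
ax.set_title("Client ratings by service")
plt.tight_layout()
plt.savefig("ratings.png")                # or plt.show() for interactive use
```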

[Slide image: sample bar graph and pie chart]

[Slide image: sample scatter plot, flow chart, and line graph]

Reflection
• What learnings and reflections do I have from this session?
• What are some potential actions for my Regional Network?

Wrap Up and Questions

References
• Ministry of Tourism and Recreation. (1982). Enjoying Research. Government of Ontario.
• Patton, Michael Quinn. (1986). Utilization-Focused Evaluation. California: SAGE Publications.