CENTER FOR NONPROFIT EXCELLENCE EVALUATION WORKSHOP SERIES
Session IV: Analyzing and Reporting Your Data
Presenter: Patty Emord

OVERVIEW – SESSION 4
- Prerequisite: Session Three homework completed
- After this session, you will be able to:
  - Discuss the desired outcomes of your program
  - Identify outcome indicators
  - Explore and select various data collection methods
  - Select data analysis methods appropriate to your evaluation
  - Select communicating and reporting strategies appropriate to your evaluation

REVIEW: EVALUATION PROCESS
- A systematic, purposeful activity for gathering information, enhancing knowledge, and making decisions
- Planning
- Designing
  - Measurable outcomes
  - Guiding questions
  - Approach and design
- Collecting and Analyzing Data
- Reporting and Communicating Information

YET ANOTHER LOGIC MODEL
- Good Health for Kids (Innovation Network example) – Handout #1
- Evaluation Framework – Handout #2

SURVEYS OR INTERVIEWS
- Consider your guiding questions.
- Consider your outcomes and indicators.
- Does your method match up with your indicators?

SURVEY CONSTRUCTION
- Question types
  - Open-ended
  - Closed-ended
  - Multiple choice: single responses or multiple responses
- Value-free and not leading (for example, ask "How satisfied were you with the program?" rather than "How much did you enjoy our excellent program?")
- Linked to your indicators

INTERVIEWS
- Who you interview matters!
- Who is best positioned to know what you want to know?
- Recording responses
  - Handwritten notes
  - Notes taken on a laptop, notebook, etc.
  - Recorded and transcribed interviews
- Probing questions

ANALYZING YOUR DATA
- Assuming surveys and interviews
- Quantitative/numeric data can be counted
- Examples:
  - Number of responses that fall into a particular category
  - Scores on a test

ANALYZING QUANTITATIVE DATA
- Frequency counts and tables
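A frequency table takes only a few lines in most analysis tools. The minimal Python sketch below uses hypothetical single-response survey data; the workshop itself does not prescribe a tool.

    from collections import Counter

    # Hypothetical single-response data: each entry is one respondent's
    # answer to "How did you hear about the program?"
    responses = ["friend", "flyer", "friend", "website", "flyer", "friend"]

    counts = Counter(responses)  # frequency count per category
    total = len(responses)

    # Print a simple frequency table with counts and percentages
    for category, n in counts.most_common():
        print(f"{category:<10} {n:>3}  {n / total:6.1%}")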

MULTIPLE RESPONSES
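With multiple-response questions, respondents may check more than one option, so percentages are usually reported per respondent rather than per answer and can sum to more than 100%. A minimal Python sketch with hypothetical data:

    from collections import Counter

    # Hypothetical multiple-response data: each inner list holds the options
    # one respondent checked on "Which services did you use?"
    answers = [
        ["tutoring", "meals"],
        ["meals"],
        ["tutoring", "meals", "transport"],
    ]

    counts = Counter(option for person in answers for option in person)
    respondents = len(answers)

    # Percentages are per respondent, so the column can exceed 100%
    for option, n in counts.most_common():
        print(f"{option:<10} {n:>3}  {n / respondents:6.1%}")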

3 KEY MEASURES
- Mean: the average; best with continuous data
- Median: the data point at the 50th percentile; best with continuous data
- Mode: the category with the most responses; good for categorical data
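Python's standard statistics module computes all three measures directly; the scores and ratings below are hypothetical:

    import statistics

    # Hypothetical post-test scores (continuous data)
    scores = [72, 85, 85, 90, 61, 78, 85, 95, 70]

    print("mean:  ", statistics.mean(scores))    # the average
    print("median:", statistics.median(scores))  # middle value; robust to outliers

    # Hypothetical categorical responses: the mode is the most frequent category
    ratings = ["agree", "agree", "neutral", "disagree", "agree"]
    print("mode:  ", statistics.mode(ratings))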

QUALITATIVE DATA
- Words, pictures, observations, documents, artifacts
- Example with open-ended interview data

QUALITATIVE ANALYSIS
- Framed by the evaluation's guiding questions
- Look for particular themes or categories
- Listing responses is not analysis, but it is a necessary step
- Copy/paste responses into themes or categories (see the sketch below)
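The copy/paste-into-themes step can be prototyped in code. The Python sketch below groups open-ended responses by keyword matching; the themes, keywords, and responses are hypothetical, and this is only a rough first pass, since real qualitative coding requires a human reader's judgment.

    # Hypothetical theme codebook: theme name -> keywords that suggest it
    themes = {
        "access": ["transport", "distance", "schedule"],
        "staff": ["teacher", "mentor", "volunteer"],
    }

    responses = [
        "The mentor really listened to my son.",
        "Hard to attend because of the bus schedule.",
    ]

    # Sort each response into every theme whose keywords it mentions
    coded = {theme: [] for theme in themes}
    for response in responses:
        text = response.lower()
        for theme, keywords in themes.items():
            if any(word in text for word in keywords):
                coded[theme].append(response)

    for theme, items in coded.items():
        print(theme, "->", items)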

QUALITATIVE ANALYSIS EXAMPLE

REPORTING AND COMMUNICATING
- Handout #5 – Reporting Formats
- Handout #6 – Evaluation Communicating and Reporting Plan

QUESTIONS?