CECC Webinar Series: Measuring Program Outcomes Presented by Terry Tolan and John Roden.

CECC Webinar Series: Measuring Program Outcomes Presented by Terry Tolan and John Roden

Purpose To introduce you to the concept of outcome measurement and to a system for measuring your CECC’s outcomes.

What is Outcome Measurement? Outcome Measurement is the regular and systematic measuring of progress toward intended outcomes in order to:
- Increase the effectiveness of programs and services
- Communicate the value of those programs and services

Our Business Model
- Children Enter Kindergarten Ready
- High Quality Early Learning Environments
- Supportive Families
- Access to Data
- Participation in STARS
- A great early childhood workforce
- Families understand child health and developmental needs
- Common Kindergarten Entry Screener
- Scholarships & PD Plans
- Families are engaged
- Children have access to appropriate services
- Data is shared by early childhood programs

What CECCs Measure

Program Outcome Model: What we use → What we do → What we count

Program Outcome Model: What we use → What we do → What we count → How THEY change!

Outcomes Logic Model – A System of Measurement: SITUATION → INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES (INITIAL, INTERMEDIATE, LONG-TERM), all shaped by EXTERNAL INFLUENCES, ENVIRONMENTAL FACTORS, AND RELATED PROGRAMS

Parenting Education Program
1. Parents from 10 families attend the workshops
2. Six group workshops are conducted
3. Parents' understanding of children's developmental issues increases
4. Parents provide more age-appropriate guidance to children

Inputs Through Outcomes: The Conceptual Chain — Input → Activity → Output → Outcomes (Initial → Intermediate → Long-Term)
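The conceptual chain above can be sketched as a simple data structure. This is an illustrative sketch only; the class and field names are hypothetical, and the sample entries are drawn from the parenting education example on the previous slide.

```python
# A minimal sketch (hypothetical names) of the inputs-through-outcomes chain,
# recording each stage of a program's logic model in one place.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)       # what we use
    activities: list[str] = field(default_factory=list)   # what we do
    outputs: list[str] = field(default_factory=list)      # what we count
    outcomes: dict[str, list[str]] = field(default_factory=dict)  # how THEY change

model = LogicModel(
    inputs=["Facilitators", "Workshop space"],            # illustrative entries
    activities=["Six group workshops are conducted"],
    outputs=["Parents from 10 families attend the workshops"],
    outcomes={
        "initial": ["Parents' understanding of children's developmental issues increases"],
        "intermediate": ["Parents provide more age-appropriate guidance to children"],
    },
)
print(model.outcomes["initial"][0])
```

Keeping the stages in one structure makes it easy to check, for any planned program, whether each link in the chain has actually been filled in.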

Children Enter School Ready Logic Model Framework

INPUTS → ACTIVITIES → OUTCOMES (Initial, Intermediate, Long-term)

Figuring Out Our Outcomes
- What do we want to be true of participants because of their involvement in the program?
- What do we want to be able to say about them?
- If we succeed with a participant (or don't), what has changed (or hasn't)?
- If we conduct the activity, then what do participants believe, know, have, or do as a result? And what benefit or change follows that?

Program Outcome Criteria for Each Outcome
- Is it reasonable to think the program can influence the outcome in a non-trivial way, even though it can't control it?
- Would measurement of the outcome help identify program successes and pinpoint problems?
- Will the program's various "publics" accept this as a valid outcome of the program?

Outcomes Logic Model: SITUATION → INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES (INITIAL, INTERMEDIATE, LONG-TERM), all shaped by EXTERNAL INFLUENCES, ENVIRONMENTAL FACTORS, AND RELATED PROGRAMS

Outcomes vs. Indicators
Outcome: the benefit for participants. Example: Teens follow proper nutrition and health guidelines.
Indicator: the specific information that is tracked to indicate success in achieving the outcome. Examples: proper weight; does not smoke; takes a prenatal vitamin.

Decision Tree

Sample Indicators
Activity: Improving ECE programs through coaching
Outcome: ECE programs are high quality
Indicator: ECERS score of 5 or more
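An indicator like "ECERS score of 5 or more" is just a threshold test against collected data. The sketch below is hypothetical (the program names and scores are made up) but shows how an indicator turns an outcome into something checkable.

```python
# Hypothetical sketch: pairing the outcome "ECE programs are high quality"
# with its indicator, an ECERS score of 5 or more.
def ecers_indicator_met(score: float, threshold: float = 5.0) -> bool:
    """Return True if the ECERS score meets the quality indicator."""
    return score >= threshold

# Made-up scores for illustration.
program_scores = {"Program A": 5.4, "Program B": 4.1, "Program C": 6.0}
meeting = [name for name, s in program_scores.items() if ecers_indicator_met(s)]
print(meeting)
```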

Data Gathering

Outcomes Logic Model: Parents Reading Daily to Children
Situation: Only 25% of children enter school ready.
Inputs: Planning; facilities; books; printing costs.
Activities: Literacy training; book distribution; demonstration.
Outputs: Number of books distributed; number of parents attending training; number of brochures mailed.
Outcomes (initial): Gained knowledge of the importance of reading; awareness of role/impact; intentionality of modeling.
Outcomes (intermediate): Increase in the number of times a child is read to daily/weekly; increase in the number of parents reading.
Outcome (long-term): Children enter school ready.
External influences, environmental factors, related programs: Linking parents to GED, Adult Education, and ELL programming.

Gathering Data

Checklist for Selecting Data Collection Methods
This checklist can help you decide which data collection methods are most appropriate for your outcome measurement.
SURVEYS
1. Do I need data from the perspective of the participant, client, beneficiary, or customer?
2. Do I have a systematic way to get it from these individuals?
3. Do I need data that are standardized so that statistical comparisons can be made? (For example, will I need to report percentages or other statistics?)
4. Will participants be able to understand the survey questions? (Consider age, cultural backgrounds, etc.)
5. Do participants have the necessary knowledge or awareness to accurately answer questions about the outcomes?

General Guidelines for Survey Questions
- Don't ask a question if the answer is obvious.
- Avoid abbreviations and jargon. If they must be used, clearly define them.
- Ask yourself whether several questions are actually necessary, or whether you can get the information in one question. Don't try to cram too much into one question.
- Make your questions easy to understand, and make sure your sample population understands them.
- If a list of answers is provided, make sure all possible answers are present. Even with "yes" and "no" questions, it may be necessary to include a neutral "undecided" or "don't know."
- Don't mix "I feel" or "I think" questions with questions regarding facts. Keep factual and perception questions in separate groupings.
- Place sensitive demographic questions (such as age or income) at the end of the survey.

Avoid "Red Flag" Words
Sometimes it takes just one word to bias a question. Avoid using inflammatory words in surveys, such as: allege, allude, arbitrary, blame, claim, demand, error, failure, fault, ignore, ill-advised, ill-informed, incompetence, ineptness, insist, just, maintain, misinformed, must, neglected, one-sided, only, overreact, peremptory, purport, questionable, rejection, rigid, so-called, unfortunately, unilateral, unreasonable.
Value-laden questions, especially those that attempt to be global in scope, tend to overwhelm respondents. For example, making respondents choose between a healthy environment and a vital economy will probably bias results. Don't distill complex issues into "black" or "white" scenarios; rather, explore the "gray" areas.

Questions to Ask About Questions
- Is the question relevant? Is it consistent with survey goals?
- Does the question ask for "need to know" or merely "nice to know" information? What will be the value of a response? If 95 percent say "yes," would this affect decision making?
- Will respondents be able to answer the question? Will they have the information?
- Does the question lead to a particular response? (Is it a leading question?)
- If a set of answers is provided, are all possible answers listed? Is one side of the issue represented more than another?
- Does the question use negative phrases or words? Are positive adjectives or phrases used?
- If a scale is used for responses, is it balanced (for example, 1 to 5, with 3 being neutral)?
- Are "dead giveaway" words used, such as "all," "every," or "always"?
- Is the question wordy? Were ambiguous words used (words with more than one meaning)? Is the question worded simply?
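A few of these checks can even be automated. The sketch below is a hypothetical helper (the function name is made up) that flags the "dead giveaway" absolutes the checklist warns about, so draft survey questions can be screened before review.

```python
# Hypothetical helper: flag survey questions containing the "dead giveaway"
# absolute words ("all", "every", "always") named in the checklist.
GIVEAWAY_WORDS = {"all", "every", "always"}

def flag_giveaways(question: str) -> set[str]:
    """Return any giveaway words found in the question text."""
    words = {w.strip('?.,!').lower() for w in question.split()}
    return words & GIVEAWAY_WORDS

print(flag_giveaways("Do you always read to every child?"))
```

A reviewer still has to judge context, of course; the helper only surfaces candidates for a second look.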

Outcomes Logic Model: Parents Reading Daily to Children
Situation: Only 25% of children enter school ready.
Inputs: Planning; facilities; books; printing costs.
Activities: Literacy training; book distribution; demonstration.
Outputs: Number of books distributed; number of parents attending training; number of brochures mailed.
Outcomes: Gained knowledge of the importance of reading; awareness of role/impact; intentionality of modeling; children being read to daily/weekly; parents reading to children routinely; children enter school ready.
Measurement of process indicators (mostly counting; does not truly measure impact): Number of trainings held/number of attendees; number of books distributed; number of demonstrations held/number of parents attending events.
Measurement of outcome indicators (measuring change in behavior/practices): Pre-post survey of gained knowledge/awareness of parent impact; increase in the number of children being read to daily; increase in the number of parents reading to their children.
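The distinction between counting and measuring change comes down to comparing before and after. The sketch below uses made-up pre/post survey responses to show how an outcome indicator like "increase in the number of parents reading to their children daily" might be computed.

```python
# Illustrative sketch (made-up numbers): measuring change rather than counting.
# Each list holds one True/False answer per parent to "Do you read to your
# child daily?" on the pre-program and post-program surveys.
def share_reading_daily(responses: list[bool]) -> float:
    """Fraction of respondents who report reading daily."""
    return sum(responses) / len(responses)

pre = [True, False, False, False, True, False, False, False, False, False]
post = [True, True, False, True, True, False, True, False, True, False]

change = share_reading_daily(post) - share_reading_daily(pre)
print(f"Change in share reading daily: {change:+.0%}")
```

The output reports change as a signed percentage; it is that change, not the raw counts, that speaks to the program's impact.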

Resources
http://…urces/guidebooks/MeasuringOutcomes.pdf
http://…icModel.pdf

Happy Measuring! Terry Tolan John Roden