Collecting High Quality Outcome Data: Part 2. Copyright © 2012 by JBS International, Inc. Developed by JBS International for the Corporation for National & Community Service.

Collecting High Quality Outcome Data: Part 2
Copyright © 2012 by JBS International, Inc. Developed by JBS International for the Corporation for National & Community Service

Learning Objectives
By the end of this module, learners will be able to:
- Describe steps to implement data collection
- Recognize data quality
Key concepts are illustrated using a variety of program measurement scenarios.

Module Overview
Implementing data collection:
- Developing a data collection schedule
- Training data collectors
- Pilot testing instruments
Ensuring data quality:
- Reliability
- Validity
- Minimizing bias
Summary of key points; additional resources

Implementing Data Collection
After identifying a data source, method, and instrument:
1. Identify data collection participants
2. Set a schedule for collecting data
3. Train data collectors
4. Pilot test the data collection process
5. Make changes
6. Implement data collection
FOR BEST RESULTS: make key decisions about how to implement data collection BEFORE program startup!

Step 1: Identifying Data Collection Participants
Brainstorm a list of all the relevant players in the data collection process. This includes:
- Clients/beneficiaries
- National service participants
- Staff members
- Host site staff
- Other stakeholders

Step 2: Creating a Data Collection Schedule
- Identifies who will collect data, using which instrument, and when
- Share with team to keep everyone informed
- Include stakeholders in planning
- Include dates for collecting, analyzing, and reporting data
- Select a format
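A schedule like this can be kept as simple structured records in any format the team prefers. The sketch below shows one minimal way to track who collects what, with which instrument, and by when; the collectors, instruments, and dates are invented for illustration.

```python
# A minimal data collection schedule as a list of records.
# Names, instruments, and dates are hypothetical examples.
from datetime import date

schedule = [
    {"who": "Site coordinator", "instrument": "Pre-survey",
     "collect_by": date(2012, 9, 15), "analyze_by": date(2012, 10, 1)},
    {"who": "Site coordinator", "instrument": "Post-survey",
     "collect_by": date(2013, 5, 15), "analyze_by": date(2013, 6, 1)},
]

def upcoming(schedule, today):
    """Return tasks whose collection date has not yet passed, soonest first."""
    due = [task for task in schedule if task["collect_by"] >= today]
    return sorted(due, key=lambda task: task["collect_by"])

# Print what is still to be collected as of a given date.
for task in upcoming(schedule, date(2012, 9, 1)):
    print(task["who"], "-", task["instrument"], "-", task["collect_by"].isoformat())
```

Sharing a printout like this with the whole team keeps everyone aware of their deadlines for collecting, analyzing, and reporting data.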

Step 3: Training Data Collectors
- Determine best person(s) to collect data
- Provide written instructions for collecting data
- Explain importance and value of data for program
- Walk data collectors through instrument
- Practice or role play data collection
- Review data collection schedule
- Explain how to return completed instruments

Step 4: Pilot Testing for Feasibility and Data Quality
1. Try out instruments with a small group similar to program participants
2. Discuss instrument with respondents
3. Analyze pilot test data to ensure the instrument yields the right information
Questions for Debrief:
- How long did it take to complete?
- What did you think the questions were asking you about?
- Were any questions unclear, confusing, or difficult to answer?
- Were response options adequate?
- Did questions allow you to say everything you wanted to say?
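Debrief answers are easy to tally, and the tally quickly shows which questions need rewording. A minimal sketch, using invented pilot feedback (completion time in minutes and which questions each respondent flagged as unclear):

```python
# Tally pilot-test debrief feedback to spot problem questions.
# The feedback records below are invented for illustration.
from collections import Counter

pilot_feedback = [
    {"minutes": 12, "unclear": ["Q3"]},
    {"minutes": 9,  "unclear": []},
    {"minutes": 15, "unclear": ["Q3", "Q7"]},
]

# Average completion time tells you whether the instrument is too long.
avg_minutes = sum(r["minutes"] for r in pilot_feedback) / len(pilot_feedback)

# Count how many respondents flagged each question as unclear.
unclear_counts = Counter(q for r in pilot_feedback for q in r["unclear"])

print(f"Average completion time: {avg_minutes:.1f} minutes")
for question, n in unclear_counts.most_common():
    print(f"{question}: flagged unclear by {n} of {len(pilot_feedback)} respondents")
```

A question flagged by most pilot respondents is a strong candidate for revision before full data collection begins.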

Steps 5 & 6: Make Changes & Implement Your Plan
Make changes, based on pilot test analysis:
- Improve instrument
- Strengthen process
Implement your plan:
- Perform periodic quality control checks

Ensuring Data Quality: Key Criteria
Criteria for collecting high-quality, useful outcome data:
- Reliability
- Validity
- Minimizing bias

Reliability
Reliability: the ability of a method or instrument to yield consistent results under the same conditions. Requires that instruments be administered the same way every time:
- Written instructions for respondents
- Written instructions for data collectors
- Train and monitor data collectors

Reliability
Design instruments to improve reliability:
- Use clear and unambiguous language so question meaning is clear.
Unclear language: How has the availability of companionship services altered your capacity with respect to attending visits with medical practitioners in a timely manner?
Clear language: How has use of companionship services affected your ability to get to medical appointments on time?

Reliability
Design instruments to improve reliability:
- Use attractive, uncluttered layouts that are easy to follow.
Example question (shown on the original slide in both a cluttered and an uncluttered layout): What grade are you in? 6th grade / 7th grade / 8th grade

Ensuring Reliability
For surveys:
- Avoid ambiguous wording that may lead respondents to interpret the same question differently.
For interviews:
- Don't paraphrase or change question wording.
- Don't give verbal or non-verbal cues that suggest preferred responses.
For observation:
- Clearly define the behavior or conditions to be observed.
- Carefully train and monitor observers to ensure consistency in how they interpret what they see.
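For observation data, one simple consistency check is percent agreement: have two trained observers rate the same sessions and see how often their ratings match. A low figure signals that observers are interpreting what they see differently and need more training. A sketch, with invented ratings:

```python
# Percent agreement between two observers rating the same five sessions.
# The ratings are invented for illustration.
observer_a = ["on-task", "on-task", "off-task", "on-task", "off-task"]
observer_b = ["on-task", "off-task", "off-task", "on-task", "off-task"]

# Count the sessions where both observers gave the same rating.
agreements = sum(a == b for a, b in zip(observer_a, observer_b))
percent_agreement = 100 * agreements / len(observer_a)

print(f"Inter-rater agreement: {percent_agreement:.0f}%")
```

Percent agreement is the simplest such check; programs with more resources sometimes use chance-corrected statistics instead, but the idea is the same: consistent observers are a precondition for reliable observation data.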

Validity
Validity is the ability of a method or instrument to accurately measure what it is intended to measure.
- Instrument measures the same outcome identified in theory of change
- Instrument measures relevant dimensions of outcome (attitude, knowledge, behavior, condition)
- Instrument results corroborated by other evidence

Ensuring Validity: Example
Academic engagement (attachment to school)
- Instrument should measure the same type of outcome, in this case attitudes, as the intended outcome
- Instrument should measure the outcome dimensions targeted by the intervention, including feelings about: teachers, students, being in school, doing schoolwork
- Students showing improvement in attitudes towards school should not exhibit contradictory behavior

Minimizing Sources of Bias
Bias involves systematic distortion of results stemming from how data are collected and how instruments are designed.
- Who: non-responders = hidden bias
- How: wording that encourages or discourages particular responses
- When and where: timing and location can influence responses
Bias can lead to over- or under-estimation of program results.

7 Ways of Minimizing Bias
1. Get data from as many respondents as possible
2. Follow up with non-responders
3. Take steps to reduce participant attrition
4. Work with program sites to maximize data collection

7 Ways of Minimizing Bias (continued)
5. Pilot test instruments and data collection procedures
6. Mind your language
7. Time data collection to avoid circumstances that may distort responses
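Following up with non-responders (ways 1 and 2) starts with knowing who they are. A minimal sketch that compares a program roster against the set of people who returned an instrument; all names are invented:

```python
# Compute the response rate and list non-responders for follow-up.
# Non-responders are a hidden source of bias, so track them explicitly.
# The names below are invented for illustration.
roster = {"Ana", "Ben", "Carla", "Dev", "Elena"}
responded = {"Ana", "Carla", "Dev"}

response_rate = 100 * len(responded) / len(roster)
non_responders = sorted(roster - responded)

print(f"Response rate: {response_rate:.0f}%")
print("Follow up with:", ", ".join(non_responders))
```

A low response rate does not just shrink the sample: if the people who did not respond differ systematically from those who did, the results will over- or under-state the program's effect.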

Measurement Scenarios
Explore these measurement scenarios to see how programs address issues of reliability, validity, and bias.

Academic Engagement
Did students in the mentoring program increase their attachment to school?
Output: Number of disadvantaged youth/mentor matches that were sustained by the CNCS-supported program for at least the required time period (ED4A)
Outcome: Number of students in grades K-12 that participated in the mentoring or tutoring or other education program who demonstrated improved academic engagement (ED27)
How Measured: Pre/post survey of students to gauge attachment to school
Outcome Target: 90 (of 100) students in grades 6-8 that participate in the after-school program for 9 months will improve academic engagement, defined as feelings of attachment to school.
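Checking this outcome target comes down to counting how many students' post-survey engagement scores exceed their pre-survey scores and comparing that count against the 90-of-100 target. A sketch with four invented students and a hypothetical 1-5 engagement scale:

```python
# Compare pre/post engagement scores against an outcome target.
# Student IDs and scores (on a hypothetical 1-5 scale) are invented.
pre_scores  = {"s1": 2.0, "s2": 3.5, "s3": 4.0, "s4": 2.5}
post_scores = {"s1": 3.0, "s2": 3.5, "s3": 4.5, "s4": 3.5}

# A student counts as improved if their post score exceeds their pre score.
improved = [s for s in pre_scores if post_scores[s] > pre_scores[s]]

target_pct = 90 / 100  # from the outcome target: 90 of 100 students
actual_pct = len(improved) / len(pre_scores)

print(f"{len(improved)} of {len(pre_scores)} students improved "
      f"({actual_pct:.0%}); target is {target_pct:.0%}")
```

Note that only students with both a pre and a post survey can be counted, which is why the earlier points about attrition and non-response matter for outcome targets too.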

Academic Engagement: Reliability, Validity, Bias
Did students in the mentoring program increase their attachment to school?
Outcome: Number of students in grades K-12 that participated in the mentoring or tutoring or other education program who demonstrated improved academic engagement (ED27)
How Measured: Pre/post survey of students to gauge attachment to school
Reliability: Do survey responses reflect students' stable and established beliefs about school or just fleeting and changeable feelings?
Validity: Does the survey get at the dimensions of school attachment that are relevant to the intervention? Are students telling us how they really feel or what they think we want to hear?
Bias: Does the survey ask the questions in a neutral way? Have we timed the survey to avoid unrelated factors like exam stress that could contaminate our results?

Reading Ability
Did students in the tutoring program improve their reading skills?
Output: Number of students that complete participation in CNCS-supported K-12 education programs (ED2)
Outcome: Number of students with improved academic performance in literacy and/or math (ED5)
How Measured: Standardized tests in oral reading fluency and reading comprehension (pre/post)
Outcome Target: 160 (of 200) students that complete 12 weeks of reading tutoring will improve oral reading fluency and reading comprehension.

Reading Ability: Reliability, Validity, Bias
Did students in the tutoring program improve their reading skills?
Outcome: Number of students with improved academic performance in literacy and/or math (ED5)
How Measured: Standardized tests in oral reading fluency and reading comprehension (pre/post)
Reliability: Are standardized tests administered in a consistent manner?
Validity: Do standardized tests accurately reflect students' reading abilities? Are the tests we're using designed to measure the specific improvements our intervention is designed to produce?
Bias: How do we ensure that results are not affected by some schools being unable to administer the tests to student participants?

Housing
Did individuals who received counseling and referrals move into safe, healthy, affordable housing?
Output: Number of economically disadvantaged individuals, including homeless individuals, receiving housing services (O5)
Outcome: Number of economically disadvantaged individuals, including homeless individuals, transitioned into safe, healthy, affordable housing (O11)
How Measured: Tracking sheet (for follow-up calls and/or home visits)
Outcome Target: 70 (of 80) economically disadvantaged individuals will transition to safe, healthy, affordable rental housing within 9 months of receiving counseling and application assistance.

Housing: Reliability, Validity, Bias
Did individuals who received counseling and referrals move into safe, healthy, affordable housing?
Outcome: Number of economically disadvantaged individuals, including homeless individuals, transitioned into safe, healthy, affordable housing (O11)
How Measured: Tracking sheet (for follow-up calls and/or home visits)
Reliability: How do we measure "affordable" housing in light of differences in individuals' financial circumstances?
Validity: How do we know that individuals who we help transition to housing attain a stable housing situation?
Bias: Some individuals with housing needs tend to be transient. How can we collect outcome data from all of these individuals during the 9-month follow-up period?

Exercise Habits
Did children in the physical fitness program improve exercise habits?
Output: Number of children and youth engaged in in-school or afterschool physical education activities with the purpose of reducing childhood obesity (H5)
Outcome: Number of children that increase physical exercise
How Measured: Participant survey (pre-mid-post)
Outcome Target: 70 (of 80) children will report increased engagement in physical exercise compared to when they entered the program.

Exercise Habits: Reliability, Validity, Bias
Did children in the physical fitness program improve exercise habits?
Outcome: Number of children that increase physical exercise
How Measured: Participant survey (pre-mid-post)
Reliability: How reliably can children remember their physical exercise for a period covering days or even weeks?
Validity: How do we measure increased physical exercise? Is it sufficient to look at how frequently children exercise, or do we also need to consider the duration and intensity of exercise? How do we keep children from reporting what they think program staff want to hear?
Bias: How can we be sure that changes in children's exercise habits are due to the program rather than to a change in weather (i.e., availability of outdoor activities)?

Capacity Building: Recruiting Volunteers to Serve More People
Did volunteer recruitment efforts allow the program to serve more people in key focus areas?
Output: Number of community volunteers recruited by CNCS-supported organizations or national service participants (G3-3.1)
Outcome: Number of new beneficiaries that received services as a result of capacity building efforts in: Disaster Services, Economic Opportunity, Education, Environmental Stewardship, Healthy Futures, and/or Veterans and Military Families (G3-3.18)
How Measured: Tracking sheet measuring number of people served by volunteers (pre-post comparison)
Outcome Target: 100 new beneficiaries will be served.

Capacity Building: Reliability, Validity, Bias
Did volunteer recruitment efforts allow the program to serve more people in key focus areas?
Outcome: Number of new beneficiaries that received services as a result of capacity building efforts in: Disaster Services, Economic Opportunity, Education, Environmental Stewardship, Healthy Futures, and/or Veterans and Military Families (G3-3.18)
How Measured: Tracking sheet measuring number of people served by volunteers (pre-post comparison)
Reliability: How can the program ensure that everyone follows the same criteria for counting someone as having been served by a volunteer?
Validity: How can the program ensure the accuracy of data showing the number of people served by volunteers?
Bias: How can the program ensure that information is captured for everyone who was served by a volunteer?

Summary of key points
- Steps to implement data collection include identifying the players involved in data collection, creating a data collection schedule, training data collectors, pilot testing instruments, and revising instruments as needed.
- A data collection schedule identifies who will collect data, using which instrument, and when.
- Train data collectors by walking them through the instrument and role playing the process.
- Pilot testing involves having a small group of people complete an instrument and asking them about the experience.

Summary of key points
- Reliability, validity, and bias are key criteria for data quality.
- Reliability is the ability of a method or instrument to yield consistent results under the same conditions.
- Validity is the ability of a method or instrument to accurately measure what it is intended to measure.
- Bias involves systematic distortion of results due to over- or under-representation of particular groups, question wording that encourages or discourages particular responses, or poorly timed data collection.

Additional resources
CNCS Priorities and Performance Measurement
- Knowledge Network instrument-formatting checklist: nt_Development_Checklist_and_Sample.pdf
- Practicum materials: curriculum