2006 Washington State Prevention Summit
Analyzing and Preparing Data for Outcome-Based Evaluation Using the Assigned Measures and the PBPS Outcomes Report
October 20, 2006
Sarah Stachowiak, Organizational Research Services

Purpose and Goals
- Increased knowledge of Assigned Measures (AMs)
- Increased skills in collecting participant data
- Increased skills in interpreting PBPS Outcomes Report

DASA Required Measures
- Pre-Post Survey Questions for All Youth Participants 13-17 years old
- PPG Items for Family, Community, School and Individual Domains
- Questions on: Perceived Risk, Perceived Harm, Perceived "Wrongfulness" and 30-Day Use of Substances
- 15 Questions on PPG03 – Individual Domain Scale

DASA Assigned Measures – Development Process
- Search of literature on Impacts and Effects of Different Best Practices
- Search for common shorter-term, more direct outcomes for youth and parents participating in different programs and practices
- Definitions of Outcomes -> Measurable Indicators
- Search for Valid and Reliable Measurement Scales

DASA Assigned Measures
- Pre-Post Survey Questions for All Youth Participants 13-17 years old or All Parents/Guardians
- Set of Youth and Parent Outcomes that are aligned with different Best and Promising Practices (9 Youth Outcomes / 8 Parent Outcomes)
- Scales with 5-8 questions for each of the Assigned Measures – drawn from existing tools or scales

Measurement Scales
- Search Through Validated Instruments and Curriculum Surveys
- Identified Survey Items Consistent with Chosen Indicators Linked to Youth and Parent Outcomes
- 5-10 Additional Survey Questions per Outcome
- Data Collection Across Programs Addressing Outcome and Objectives

Parent Outcomes
- Improved Family Cohesion
- Improved Attitudes about Family Management Skills
- Increased Use of Family Management Skills
- Increased Family Involvement
- Improved Family Communication
- Reduced Family Conflict

Youth Outcomes
- Improved Bonding
- Less Favorable Attitudes
- Increased Refusal/Resistance Skills
- Improved Social Competence Skills
- Improved Personal Competence
- Reduced Anti-Social Behaviors
- Improved Academic Performance

Benefits of Assigned Measures
- More useful outcome data for County/Tribe and Provider purposes
- Ability to look at common changes across different Best Practices and other Programs
- More "realistic" questions for respondents
- Now have parent outcome data!

Collecting Participant Data
- Participant ID Issues
- Administering Surveys
- Managing Data Collection

Assigning ID Numbers
- Track participants over time
- Administer multiple tools (e.g., pre and post)
- Confidentiality versus anonymity
- Unique identifiers: Simple ID, Self-Generated ID, Local ID Field in PBPS

Self-Generated ID Numbers
- What is the last letter of your first name?
- What is the second letter of your last name?
- What is the month of your birthday?
- What is the first letter of your middle name?
(An illustrative sketch of combining these answers into a code follows.)

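As an illustration only (not part of the PBPS system), the minimal Python sketch below shows one way answers to the four questions above could be combined into a consistent, anonymous code for matching pre and post surveys. The function name, formatting, and example values are assumptions.

```python
def self_generated_id(first_name: str, last_name: str,
                      birth_month: int, middle_name: str) -> str:
    """Build an anonymous participant code from the four ID questions."""
    parts = [
        first_name.strip()[-1],   # last letter of first name
        last_name.strip()[1],     # second letter of last name
        f"{birth_month:02d}",     # month of birthday, zero-padded
        middle_name.strip()[0],   # first letter of middle name
    ]
    return "".join(parts).upper()

# The same answers always produce the same code, so pre and post
# surveys can be matched without storing participant names.
print(self_generated_id("Maria", "Lopez", 7, "Ann"))  # -> "AO07A"
```
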
Administering Surveys
- Share the purpose and intent
- Assure confidentiality
- Make sure everyone understands the ID code directions
- Consider type of administration (e.g., facilitator reads questions)

Managing Data Collection
- Maintain a survey tracking system
- Take steps to maximize response rate
- Use "data windows"
- Collect data when you have access to participants
- Consider incentives

PBPS Outcome Report
- Levels of Aggregation
- Types of Data Presented
- Service Characteristics
- Pre-Post Changes

Levels of Aggregation

Descriptive Data
- Frequencies: summaries of the number or percent of observations in each response category
- Averages: mean of responses
- Cross-tabulations: summaries of frequency distributions across different subgroups or levels of a second variable (not yet available)
(A short illustrative sketch of these summaries follows.)

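As an illustration only (not output from PBPS), the pandas sketch below computes the three descriptive summaries on a made-up survey extract; the column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical survey extract: one row per participant.
survey = pd.DataFrame({
    "participant_id": ["AO07A", "RL03J", "NS11K", "ET05M", "PB09C"],
    "gender": ["F", "M", "F", "M", "F"],
    "q1": [2, 3, 4, 3, 2],   # 1-4 response to one survey item
})

# Frequencies: number and percent of observations in each response category.
counts = survey["q1"].value_counts().sort_index()
percents = survey["q1"].value_counts(normalize=True).sort_index() * 100

# Averages: mean of responses.
average = survey["q1"].mean()

# Cross-tabulation: frequency distribution of q1 within each gender subgroup.
crosstab = pd.crosstab(survey["gender"], survey["q1"])

print(counts, percents, f"Average: {average:.2f}", crosstab, sep="\n\n")
```
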
T-Tests
- Test for statistically significant difference between mean values
- Paired Samples: comparison of mean values on one variable over time for the same participants (e.g., Pre vs. Post)
- Mean differences "not due to chance"
- Standard convention: p < .05 (probability that the difference is due to chance is less than 5 percent)
(A short worked sketch follows.)

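A minimal sketch of a paired-samples t-test on matched pre-post scores using scipy; the scores here are invented for illustration and are not PBPS data.

```python
from scipy import stats

# Matched pre and post scale scores for the same participants
# (values invented for illustration).
pre_scores  = [2.4, 3.0, 2.8, 3.2, 2.6, 3.1, 2.9, 3.3]
post_scores = [2.9, 3.4, 3.0, 3.5, 3.1, 3.2, 3.3, 3.6]

# Paired-samples t-test: are the mean pre and post scores different
# beyond what chance alone would produce?
result = stats.ttest_rel(pre_scores, post_scores)

print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
if result.pvalue < 0.05:
    print("Difference is statistically significant at the p < .05 convention.")
else:
    print("Difference is not statistically significant at p < .05.")
```
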
Interpreting Quantitative Data
Look at your data:
- What patterns do you see in the rows and columns?
- What findings are most interesting?
- What client characteristics might explain these patterns?
- What program strategies might explain these patterns?

Service Characteristics/Demographics
- Survey Completion Rate
- Average Attendance Rate
- Frequencies for: Gender, Race, Ethnicity, Age (not for parent programs)
Note: Data are dynamic; only relevant categories are shown.
Note: Demographics cover all participants, not only those with pre-post data.
(A small completion-rate example follows.)

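To make the completion-rate idea concrete, here is a small illustrative calculation. The definition assumed here (matched pre-post surveys divided by all participants) and the roster values are examples, not PBPS specifics.

```python
from collections import Counter

# Hypothetical roster and the subset with matched pre and post surveys.
enrolled = {"AO07A", "RL03J", "NS11K", "ET05M", "PB09C"}
completed_pre_and_post = {"AO07A", "NS11K", "ET05M"}

# Survey completion rate: matched pre-post surveys as a share of all participants.
completion_rate = len(completed_pre_and_post) / len(enrolled)
print(f"Survey completion rate: {completion_rate:.0%}")  # 60%

# Demographic frequencies are reported for all participants,
# not only those with matched pre-post data.
gender = {"AO07A": "F", "RL03J": "M", "NS11K": "F", "ET05M": "M", "PB09C": "F"}
print(Counter(gender.values()))  # Counter({'F': 3, 'M': 2})
```
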
Question Detail
- Scoring scale
- # Pre, Post, Pre and Post
- Results: Average scores, Statistical Significance, Better/Worse/No Change, % Change, State Comparison
- Sub-Scales/Average of Questions
- #/% Individuals whose scores were…
(An illustrative calculation of % change and better/worse/no change follows.)

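The snippet below shows one common way percent change and better/worse/no-change counts can be computed from matched pre-post scores; the values and the exact rules are assumptions for illustration, not the PBPS formulas.

```python
# Matched pre-post scores for one question (invented values).
pre  = [2.4, 3.0, 2.8, 3.2, 2.6]
post = [2.9, 3.4, 2.8, 3.5, 2.4]

pre_mean = sum(pre) / len(pre)
post_mean = sum(post) / len(post)

# Percent change in the average score from pre to post.
pct_change = (post_mean - pre_mean) / pre_mean * 100
print(f"Pre mean: {pre_mean:.2f}, Post mean: {post_mean:.2f}, "
      f"% change: {pct_change:+.1f}%")

# Count how many individuals scored better, worse, or the same.
better = sum(1 for a, b in zip(pre, post) if b > a)
worse = sum(1 for a, b in zip(pre, post) if b < a)
no_change = sum(1 for a, b in zip(pre, post) if b == a)
print(f"Better: {better}, Worse: {worse}, No change: {no_change}")
```
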
Interpretation Considerations
- Sample size
- Completion rate
- Representativeness
- Cross-tabulations (available 2007)

Group Exercise
Interpreting Outcome Report Data

Reporting Findings
Considerations:
- What do the data say about the outcomes?
- Who is your audience? What is your purpose?
- How can you best communicate what the data say?
- What are the implications of the findings for program development? For marketing?

Reporting Findings
Provide Context:
- Outputs (e.g., dosage: frequency and quantity of intervention, number of participants)
- Description of intervention
- Background information that will help you interpret the data
- Process information (e.g., fidelity)

Resources
- Updated Evaluation Guidebook
- Regional Prevention Managers

Final Thoughts
Goals of AMs and Outcome Report:
- Learning!
- Better decision-making
- Stronger prevention planning and programming
Work in progress

Contact Information
Sarah Stachowiak
Organizational Research Services
sarahs@organizationalresearch.com
206-728-0474 x10