2006 Washington State Prevention Summit: Analyzing and Preparing Data for Outcome-Based Evaluation Using the Assigned Measures and the PBPS Outcomes Report

Presentation transcript:

Washington State Prevention Summit: Analyzing and Preparing Data for Outcome-Based Evaluation Using the Assigned Measures and the PBPS Outcomes Report. October 20, 2006. Sarah Stachowiak, Organizational Research Services

2 Purpose and Goals  Increased knowledge of Assigned Measures (AMs)  Increased skills in collecting participant data  Increased skills in interpreting PBPS Outcomes Report

3 DASA Required Measures  Pre-Post Survey Questions for All Youth Participants years old  PPG Items for Family, Community, School and Individual Domains  Questions on: Perceived Risk, Perceived Harm, Perceived “Wrongfulness” and 30-Day Use of Substances  15 Questions on PPG03 – Individual Domain Scale

4 DASA Assigned Measures – Development Process  Search of literature on Impacts and Effects of Different Best Practices  Search for common shorter term, more direct outcomes for youth and parents participating in different programs and practices  Definitions of Outcomes -> Measurable Indicators  Search for Valid and Reliable Measurement Scales

5 DASA Assigned Measures  Pre-Post Survey Questions for All Youth Participants years old or All Parents/Guardians  Set of Youth and Parent Outcomes that are aligned with different Best and Promising Practices (9 Youth Outcomes / 8 Parent Outcomes)  Scales with 5-8 questions for each of the Assigned Measures – drawn from existing tools or scales

6 Measurement Scales  Search Through Validated Instruments and Curriculum Surveys  Identified Survey Items Consistent with Chosen Indicators Linked to Youth and Parent Outcomes  5-10 Additional Survey Questions per Outcome  Data Collection Across Programs Addressing Outcome and Objectives

7 Parent Outcomes  Improved Family Cohesion  Improved Attitudes about Family Management Skills  Increased Use of Family Management Skills  Increased Family Involvement  Improved Family Communication  Reduced Family Conflict

8 Youth Outcomes  Improved Bonding  Less Favorable Attitudes  Increased Refusal/Resistance Skills  Improved Social Competence Skills  Improved Personal Competence  Reduced Anti-Social Behaviors  Improved Academic Performance

9 Benefits of Assigned Measures  More useful outcome data for County/Tribe and Provider purposes  Ability to look at common changes across different Best Practices and other Programs  More “realistic” questions for respondents  Now have parent outcome data!!

10 Collecting Participant Data  Participant ID Issues  Administering Surveys  Managing Data Collection

11 Assigning ID Numbers  Track participants over time  Administer multiple tools (e.g., pre and post)  Confidentiality versus anonymity  Unique identifiers  Simple ID  Self-Generated ID  Local ID Field in PBPS

12 Self-Generated ID Numbers  What is the last letter of your first name?  What is the second letter of your last name?  What is the month of your birthday?  What is the first letter of your middle name?
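The four prompts above combine into a single code that stays stable from pre to post without identifying the participant. Below is a minimal sketch of one way to build such a code, assuming a format of letter + letter + two-digit birth month + letter; the function name and exact ordering are illustrative, not the official PBPS scheme.

```python
# Hypothetical helper: combine the four self-generated ID answers into one code.
# The exact format (ordering, casing, zero-padded month) is an assumption.
def self_generated_id(first_name: str, last_name: str,
                      birth_month: int, middle_name: str) -> str:
    return (
        first_name.strip()[-1].upper()    # last letter of first name
        + last_name.strip()[1].upper()    # second letter of last name
        + f"{birth_month:02d}"            # month of birthday, zero-padded
        + middle_name.strip()[0].upper()  # first letter of middle name
    )

print(self_generated_id("Sarah", "Jones", 4, "Ann"))  # -> "HO04A"
```

Because the same participant answers the same four questions the same way at pre and post, the code can be regenerated each time and used to match surveys without storing names.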

13 Administering Surveys  Share the purpose and intent  Assure confidentiality  Make sure everyone understands the ID code directions  Consider type of administration (e.g., facilitator reads questions)

14 Managing Data Collection  Maintain a survey tracking system  Take steps to maximize response rate  Use “data windows”  Collect data when you have access to participants  Consider incentives

15 PBPS Outcome Report  Levels of Aggregation  Types of Data Presented  Service Characteristics  Pre-Post Changes

16 Levels of Aggregation

17 Descriptive Data  Frequencies: summaries of the number or percent of observations in each response category  Averages: mean of responses  Cross-tabulations: summaries of frequency distributions across different subgroups or levels of a second variable (not yet available)
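A minimal sketch of what these three summaries look like in practice, using pandas on a made-up participant table; all column names and values are hypothetical, not the PBPS export format.

```python
import pandas as pd

# Hypothetical participant-level data; columns and values are illustrative only.
df = pd.DataFrame({
    "gender":    ["F", "M", "F", "F", "M"],
    "program":   ["A", "A", "B", "B", "B"],
    "pre_score": [2.0, 3.0, 2.5, 4.0, 3.5],
})

# Frequencies: number and percent of observations in each response category
print(df["gender"].value_counts())
print(df["gender"].value_counts(normalize=True) * 100)

# Averages: mean of responses
print(df["pre_score"].mean())

# Cross-tabulation: frequency distribution of one variable across subgroups of another
print(pd.crosstab(df["program"], df["gender"]))
```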

18 T-Tests  Test for statistically significant difference between mean values  Paired Samples – comparison of mean values on one variable over time for the same participants (e.g., Pre vs. Post)  Mean differences “not due to chance”  Standard convention p < .05 (probability that the difference is due to chance is less than 5 percent)
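A sketch of the paired-samples t-test described above, using scipy on made-up matched pre/post scores; the data and the .05 cutoff shown are illustrative of the convention on the slide, not the implementation used by PBPS.

```python
from scipy import stats

# Hypothetical matched pre/post scale scores for the same six participants
pre  = [2.1, 2.8, 3.0, 2.5, 3.2, 2.9]
post = [2.6, 3.1, 3.4, 2.7, 3.6, 3.3]

# Paired-samples t-test: is the mean change over time different from zero?
t_stat, p_value = stats.ttest_rel(pre, post)

# Standard convention: treat the difference as "not due to chance" when p < .05
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant = {p_value < 0.05}")
```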

19 Interpreting Quantitative Data Look at your data:  What patterns do you see in the rows and columns?  What findings are most interesting?  What client characteristics might explain these patterns?  What program strategies might explain these patterns?

20 Service Characteristics/Demographics  Survey Completion Rate  Average Attendance Rate  Frequencies for:  Gender  Race  Ethnicity  Age (not for parent programs)  Note: Data are dynamic; only relevant categories are shown  Note: Demographics cover all participants, not only those who had pre-post data
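A minimal sketch of how the completion and attendance rates might be computed from a participant roster; the roster fields and the definition of "completion" as a matched pre/post pair are assumptions, not the PBPS definitions.

```python
import pandas as pd

# Hypothetical roster; "completion" here means a matched pre/post pair (assumption).
roster = pd.DataFrame({
    "participant_id":    [1, 2, 3, 4, 5],
    "has_pre":           [True, True, True, False, True],
    "has_post":          [True, False, True, False, True],
    "sessions_attended": [8, 5, 10, 2, 9],
    "sessions_offered":  [10, 10, 10, 10, 10],
})

completion_rate = (roster["has_pre"] & roster["has_post"]).mean() * 100
attendance_rate = (roster["sessions_attended"] / roster["sessions_offered"]).mean() * 100
print(f"Survey completion rate: {completion_rate:.0f}%")
print(f"Average attendance rate: {attendance_rate:.0f}%")
```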

21 Question Detail  Scoring scale  # Pre Post  Pre and Post Results  Average scores  Statistical Significance  Better, Worse, No Change  % Change  State Comparison  Sub-Scales/Average of Questions  #/% Individuals whose scores were…
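A sketch of the pre/post summary behind these report fields: average scores, percent change in the average, and the number and percent of individuals whose scores got better, worse, or stayed the same. The percent-change formula is an assumption about how the report computes it, and the scores are made up.

```python
import pandas as pd

# Hypothetical matched pre/post scale scores for the same participants
scores = pd.DataFrame({
    "pre":  [2.0, 3.0, 2.5, 3.5, 2.8],
    "post": [2.6, 3.0, 3.1, 3.4, 3.3],
})

pre_mean, post_mean = scores["pre"].mean(), scores["post"].mean()
pct_change = (post_mean - pre_mean) / pre_mean * 100   # assumed formula

diff = scores["post"] - scores["pre"]
counts = pd.Series({
    "better":    int((diff > 0).sum()),
    "worse":     int((diff < 0).sum()),
    "no change": int((diff == 0).sum()),
})

print(f"Pre avg = {pre_mean:.2f}, Post avg = {post_mean:.2f}, % change = {pct_change:+.1f}%")
print(counts)                                # number of individuals in each group
print((counts / len(scores) * 100).round(1)) # percent of individuals in each group
```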

22 Interpretation Considerations  Sample size  Completion rate  Representativeness  Cross tabulations (available 2007)

23 Group Exercise  Interpreting Outcome Report Data

24 Reporting Findings Considerations:  What do the data say about the outcomes?  Who is your audience? What is your purpose?  How can you best communicate what the data say?  What are the implications of the findings for program development? For marketing?

25 Reporting Findings  Provide Context:  Outputs (e.g., dosage (frequency, quantity of intervention), number of participants)  Description of intervention  Background information that will help you interpret the data  Process information (e.g., fidelity)

26 Resources  Updated Evaluation Guidebook  Regional Prevention Managers

27 Final Thoughts  Goals of AMs and Outcome Report:  Learning!  Better decision-making  Stronger prevention planning and programming  Work in progress

28 Contact Information  Sarah Stachowiak, Organizational Research Services, x10