1
System Level Evaluation: Getting Oriented and Getting Started. PBS in Kalamazoo as an Example.
Agency Wide PBS Conference, Kalamazoo RESA, October 19th, 2010.
Becca Sanders, PhD, Evaluator, Kalamazoo Wraps. beccaanddan@gorge.net
2
Outline
Getting Oriented: System Level Evaluation
- Why invest in evaluation? A normative vs. empirical question
- How to invest in evaluation? A series of Shifts with a capital “S”, and the “Big Three”
Getting Started: System Level Evaluation
- Data Collection: just one of the “Big Three”
3
Why Invest in Evaluation?
What’s the most convincing reason for a system to invest in evaluation efforts?
- Demonstrating outcomes?
- Ensuring fidelity?
- Describing services and service recipients?
- Regulatory compliance for funders/bureaucracies?
Nope to all of the above…
4
Why Invest? The Normative Pitch
The most convincing overriding reason for investing in evaluation: systems often fail because stakeholders, at various levels and in various roles, did not know about, did not have, or did not use data that could have helped prevent that failure.
Cannon, M., & Edmondson, A. (2004). Failing to learn and learning to fail (intelligently): How great organizations put failure to work to improve and innovate.
5
In other words… Data can help with this!
6
Or maybe this…
7
OK, That’s the “Why” in a Nutshell. Moving On to the How…
How Now #1: Two Shifts with a capital “S”
- Changes that need to happen within our own field (evaluation).
- Stakeholders can help by changing their expectations and perceptions regarding the roles of evaluators.
- See y’all in 10 years… this is going to take a while.
8
Shift #1: The Model Shift. Goodbye Traditional Evaluation…
[Diagram contrasting two models: the “Traditional Onlooker,” where evaluation sits outside the multiple change efforts (practices/services, procedures, policies, etc.) and “helps us do the data stuff,” versus “In the Mix for a Fix,” where a data culture is woven into those same change efforts.]
9
Shift #2: The Onus Shift
Evaluation as “beside” system problems: part of the solution, versus evaluation as “within” system problems: part of the problem and the solution.
- Internalize evaluation functions
- Regard evaluation as another cog on the wheel that needs fixing
- Grow a “data guided culture” across your organization/systems
10
How Now? The “Big Three” for Getting Evaluation Systems in Place
1) Data Collection: Does the data exist? (Challenges: relevance/utility, quality)
2) Data Access: Can we access it? (Challenges: MIS, power structures, trust, fear, bureaucracy, agency cultures)
3) Data Dissemination: Are we using it? (Challenges: keeping it simple, timely, relevant, usable)
11
The “Big Three” Roadmap: A Mix of Technical and Adaptive Driving
- Technical: change in know-how, the “teachables”
- Adaptive: change in social conditions, values, and beliefs, the “less-teachables”
There’s a whack of adaptive work in system level evaluation…
12
The Big 3: Highly Technical to Highly Adaptive
For each of the Big Three (Data Collection, Data Access, Data Dissemination), evaluator work runs from highly technical to highly adaptive. Example activities: roll out on collection; coaching; methodology; help with MIS systems/database set up; instrument review; data collection/entry; technical assistance; MIS system access; analysis; data system management; MOU establishment with partnering organizations; catered reporting; timing; stakeholder-driven data splits; interpretation of research literature.
13
Getting Oriented: Relevance to PBS?
- PBS recognition of the fundamental value of data
- PBS already regards evaluation as “In the Mix for a Fix”
- Data functions fully internalized/woven into daily operations
- Numerous folks with data roles at many tables and many levels of systems
- Decision making in accordance with data at many levels of the system
14
Same point… this time in a picture.
[Diagram: the four interlocking PBS elements: Outcomes (supporting social competence and knowledge increase), Data (supporting decision making), Practices (supporting youth behavior), Systems (supporting staff behavior).]
15
Getting Started: Narrowing the Big Three Conversation
1) Data Collection: Does the data exist?
2) Data Access: Can we access it?
3) Data Dissemination: Are we using it?
Turbo Takeaway Tips for Getting Started with Data Collection in 15 Minutes or Less
16
Getting Started: 3 Kinds of Data: Descriptive, Process, Outcome
Tip #1: Go for the low-hanging fruit first! Who did you serve? What did you do?
Bye bye narrative… hello aggregate coding.
17
Descriptive: An Example (Thank you, Gretchen Lemmer! g.lemmer@prevention-works.org)
Excerpt from the PBS Observation Form:
- Was the PBS poster hung up? Y / N
- Before the start of the session, were behavior incentives explained? Y / N
- Was a GEAR Up Award given at the end of a session? Y / N
Data internalized as part of ongoing operations… capacity building for data-based decision making. Can help drive TA needs.
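The observation form above is a natural candidate for the “aggregate coding” mentioned two slides back. Below is a minimal Python sketch, not part of the original materials, showing one way Y/N items like these could be rolled up into descriptive percentages across sessions; the variable names and session data are hypothetical.

```python
# Hypothetical illustration: aggregate coding of dichotomous (Y/N)
# observation-form items into simple descriptive percentages.

observations = [
    {"poster_hung": "Y", "incentives_explained": "Y", "award_given": "N"},
    {"poster_hung": "Y", "incentives_explained": "N", "award_given": "N"},
    {"poster_hung": "N", "incentives_explained": "Y", "award_given": "Y"},
]

items = ["poster_hung", "incentives_explained", "award_given"]

for item in items:
    yes_count = sum(1 for session in observations if session[item] == "Y")
    pct = 100 * yes_count / len(observations)
    print(f"{item}: {yes_count}/{len(observations)} sessions ({pct:.0f}%)")
```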
18
Descriptive: Another Example, from Family and Children Services (Thank you, Maura Alexander! MauraW@fcsource.org)
Excerpt from the PBS individual tally sheet: behaviors (Stay Safe, Take Responsibility, Everyone Be Respectful, Positive Interactions, Support Others) are tallied by location (Kitchen, Dining Area, Great Room, Main Pool), with a total for the child.
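As a rough sketch of how a tally sheet like this might be summarized once counts are entered, here is a short Python example; the behavior and location labels come from the excerpt, but the counts and the code itself are illustrative assumptions, not Family and Children Services’ actual procedure.

```python
# Hypothetical roll-up of a PBS individual tally sheet:
# counts of each behavior by location, plus a total for the child.

tallies = {
    "Stay Safe":              {"Kitchen": 2, "Dining Area": 1, "Great Room": 3, "Main Pool": 0},
    "Take Responsibility":    {"Kitchen": 1, "Dining Area": 0, "Great Room": 2, "Main Pool": 1},
    "Everyone Be Respectful": {"Kitchen": 0, "Dining Area": 2, "Great Room": 1, "Main Pool": 2},
}

for behavior, by_location in tallies.items():
    breakdown = ", ".join(f"{loc}={n}" for loc, n in by_location.items())
    print(f"{behavior}: {sum(by_location.values())} total ({breakdown})")

total_for_child = sum(sum(by_location.values()) for by_location in tallies.values())
print(f"Total for child: {total_for_child}")
```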
19
Descriptive: What’s going on?
- Often a snapshot
- Dichotomous variables
- Checklist types of measures
Process: What’s going south?
- Captures the nature of process/implementation over time
- Subscales
- Mixed constructs
Tip #2: Step ladder advised for process work… and use the backdoor if you can find it!
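To make “capture the nature of process/implementation over time” with subscales a bit more concrete, here is a small sketch; the subscale names, items, 0-3 rubric, and scores are all hypothetical, and the point is only the mechanics of averaging items into subscale scores at each time point.

```python
# Hypothetical process-data example: item scores grouped into subscales
# and averaged at each time point so implementation trends are visible.

subscales = {
    "Teaching Expectations":  ["item1", "item2"],
    "Acknowledgement System": ["item3", "item4"],
}

# One dict of item scores (0-3 rubric) per check-in.
time_points = {
    "Fall":   {"item1": 1, "item2": 2, "item3": 0, "item4": 1},
    "Winter": {"item1": 2, "item2": 3, "item3": 2, "item4": 2},
    "Spring": {"item1": 3, "item2": 3, "item3": 2, "item4": 3},
}

for label, scores in time_points.items():
    means = {name: sum(scores[i] for i in item_list) / len(item_list)
             for name, item_list in subscales.items()}
    print(label, {name: round(m, 2) for name, m in means.items()})
```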
20
What’s the backdoor? When development of a measurement tool defines what a process (or outcome) should look like: the “ideal state” is revealed in the measure. Bye bye Likerts, hello conditional clarity via the items in the measurement tool.
What’s the front door? When you already have conceptual clarity on what you’re hoping to achieve (the outcome) and what happened (the intervention).
Systems rarely use the front door. I too avoid the front door.
21
Example of a Backdoor Process Measure
PBS Benchmarks of Quality (BOQ): measures level of PBS implementation.
Benchmark 11: Behaviors defined
- 3 points: Written documentation exists that includes clear definitions of all behaviors listed.
- 2 points: All of the behaviors are defined but some of the definitions are unclear.
- 1 point: Not all behaviors are defined or some definitions are unclear.
- 0 points: No written documentation of definitions exists.
www.kalamazoowrapsevaluation.org
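A rubric like this suggests a simple scoring roll-up. The sketch below shows the general idea of reporting an implementation level as a percentage of points possible; the benchmark names and scores are hypothetical, and it assumes every benchmark is worth up to 3 points, which is a simplification rather than the actual BOQ scoring rules.

```python
# Hypothetical BOQ-style roll-up: each benchmark earns 0-3 points, and
# overall implementation is reported as percent of points possible.

benchmark_scores = {
    "Behaviors defined": 2,
    "Team has broad representation": 3,
    "Data system in place": 1,
}

max_per_benchmark = 3
earned = sum(benchmark_scores.values())
possible = max_per_benchmark * len(benchmark_scores)
print(f"Implementation level: {earned}/{possible} points "
      f"({100 * earned / possible:.0f}% of possible)")
```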
22
3 Kinds of Data: Descriptive, Process, Outcome
Tip #3: Don’t climb the tree (outcome data) without the ladder! Psychometrics (reliability and validity) matter hugely!
23
Reliability/Validity in Outcome Data: Junk in = junk out
- Reliability: stability and consistency of a measure (Tip #5: calibrate, calibrate, calibrate)
- Validity: ability to capture the intended construct (Tip #6: do the construct search)
- Lots of cheesy outcome measures out there
- Lots of great attempts to develop outcome measures without researchers at the table
- Researchers create measures; evaluators bridge research and the field
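One concrete way to “calibrate, calibrate, calibrate” is to have two raters score the same sessions and check how closely they agree. The sketch below uses hypothetical ratings and computes percent agreement plus Cohen’s kappa (agreement corrected for chance); it is one common reliability check, not a procedure prescribed by these slides.

```python
# Hypothetical calibration check: two raters code the same 8 sessions.
from collections import Counter

rater_a = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N"]
rater_b = ["Y", "N", "N", "Y", "N", "Y", "Y", "Y"]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement, from each rater's marginal distribution of codes.
counts_a, counts_b = Counter(rater_a), Counter(rater_b)
codes = set(rater_a) | set(rater_b)
expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in codes)

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.0%}, Cohen's kappa: {kappa:.2f}")
```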
24
3 Kinds of Data: Descriptive, Process, Outcome
Moving from descriptive to process to outcome data, things get increasingly hard: to measure/capture well, to interpret, to analyze, and on the wallet.
Grand Finale Turbo Tip #7: When it comes to PBS data collection, invest in the search for, not the development of, measures. The PBS idea generation machine is huge.
25
Ready to Get Started? A visual of the PBS Model of Evaluation.
[Diagram elements: Evaluation; Relevant & Measurable Indicators; Team-based Decision Making & Planning; Continuous Monitoring; Regular Review; Effective Visual Displays; Efficient Input, Storage, & Retrieval.]