System Level Evaluation: Getting Oriented and Getting Started. PBS in Kalamazoo as an Example. Agency-Wide PBS Conference, Kalamazoo RESA, October 19th, 2010.

System Level Evaluation: Getting Oriented and Getting Started. PBS in Kalamazoo as an Example. Agency-Wide PBS Conference, Kalamazoo RESA, October 19th, 2010. Becca Sanders, PhD, Evaluator, Kalamazoo Wraps

Outline. Getting Oriented: System Level Evaluation. Why invest in evaluation? A normative vs. empirical question. How to invest in evaluation? A series of Shifts with a capital "S", and the "Big Three." Getting Started: System Level Evaluation. Data Collection, just one of the "Big Three."

Why Invest in Evaluation? What's the most convincing reason for a system to invest in evaluation efforts? To demonstrate outcomes? To ensure fidelity? To describe services and service recipients? To satisfy the regulatory requirements of funders and bureaucracies? Nope to all of the above…

Why Invest? The Normative Pitch. The most convincing, overriding reason for investing in evaluation: systems often fail because stakeholders, at various levels and in various roles, did not know enough about, have, or use data that could have helped prevent the failure. Cannon, M., & Edmondson, A. (2005). Failing to learn and learning to fail (intelligently): How great organizations put failure to work to innovate and improve.

In other words… Data can help with this!

Or maybe this…

OK, That's the "Why" in a Nutshell. Moving on to the How… How Now #1: Two Shifts with a capital S. These are changes that need to happen within our own field (evaluation). Stakeholders can help by changing their expectations and perceptions of evaluators' roles. See y'all in 10 years… this is going to take a while.

Shift #1: The Model Shift. Goodbye, Traditional Evaluation… [Diagram contrasting two models of evaluating multiple change efforts (practices/services, procedures, policies, etc.): the traditional onlooker who "helps us do the data stuff," versus the evaluator who is "in the mix for a fix," embedded in a data culture.]

Shift #2: The Onus Shift. Evaluation as "beside" system problems: part of the solution. Versus evaluation as "within" system problems: part of the problem and the solution. Internalize evaluation functions. Regard evaluation as another cog on the wheel that needs fixing. Grow a "data-guided culture" across your organization and systems.

How Now? The "Big Three" of Getting Evaluation Systems in Place. 1) Data Collection: Does the data exist? (Challenges: relevance/utility, quality.) 2) Data Access: Can we get to it? (Challenges: MIS, power structures, trust, fear, bureaucracy, agency cultures.) 3) Data Dissemination: Are we using it? (Challenges: keeping it simple, timely, relevant, and usable.)
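To make the "Big Three" questions concrete, here is a minimal sketch (not from the original presentation) of auditing each data source against collection, access, and dissemination; the source names and status flags are hypothetical.

```python
# Hypothetical sketch: auditing data sources against the "Big Three" questions.
# Source names and flags are illustrative, not from the Kalamazoo Wraps system.

data_sources = [
    {"name": "PBS observation forms", "collected": True,  "accessible": True,  "disseminated": False},
    {"name": "Behavior tally sheets",  "collected": True,  "accessible": False, "disseminated": False},
    {"name": "Youth outcome surveys",  "collected": False, "accessible": False, "disseminated": False},
]

for source in data_sources:
    gaps = [label for label, ok in
            [("collection", source["collected"]),
             ("access", source["accessible"]),
             ("dissemination", source["disseminated"])] if not ok]
    status = "all three in place" if not gaps else "gaps: " + ", ".join(gaps)
    print(f"{source['name']}: {status}")
```

Even a simple inventory like this makes visible where the collection, access, or dissemination work still needs to happen.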

The "Big Three" Roadmap: A Mix of Technical and Adaptive Driving. Technical: changes in know-how, the "teachables." Adaptive: changes in social conditions, values, and beliefs, the "less-teachables." There's a whack of adaptive work in system level evaluation…

The Big 3, from Highly Technical to Highly Adaptive. Data Collection: roll-out of collection, coaching, methodology, help with MIS systems/database set-up, instrument review, data collection/entry, technical assistance. Data Access: MIS system access, analysis, data system management, MOU establishment with partnering organizations. Data Dissemination: catered reporting, timing, stakeholder-driven data splits, interpretation of research literature.

Getting Oriented: Relevance to PBS? PBS recognizes the fundamental value of data. PBS already regards evaluation as "in the mix for a fix": data functions are fully internalized and woven into daily operations; numerous folks hold data roles at many tables and many levels of the system; and decision making is guided by data at many levels of the system.

Same point… this time in a picture. [The PBS diagram: Outcomes (supporting social competence and knowledge increase), surrounded by Data (supporting decision making), Systems (supporting staff behavior), and Practices (supporting youth behavior).]

Getting Started: Narrowing the Big Three Conversation. 1) Data Collection: Does the data exist? 2) Data Access: Can we get to it? 3) Data Dissemination: Are we using it? Turbo Takeaway Tips for Getting Started with Data Collection in 15 Minutes or Less.

Getting Started: 3 Kinds of Data: Descriptive, Process, Outcome. Tip #1: Go for the low-hanging fruit first! Who did you serve? What did you do? Bye-bye narrative… hello aggregate coding.

Descriptive: An Example (thank you, Gretchen Lemmer!). Excerpt from the PBS Observation Form… Was the PBS poster hung up? Y/N. Before the start of the session, were behavior incentives explained? Y/N. Was a GEAR UP Award given at the end of a session? Y/N. Data internalized as part of ongoing operations… Capacity building for data-based decision making. Can help drive TA needs.
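Here is a minimal sketch of the "aggregate coding" idea applied to Y/N observation-form items like those in the excerpt above; the item keys are paraphrased and the session records are invented for illustration.

```python
# Minimal sketch: aggregating dichotomous (Y/N) observation-form items across sessions.
# Item keys are paraphrased from the excerpt; the session records are made up.

sessions = [
    {"poster_hung": "Y", "incentives_explained": "Y", "award_given": "N"},
    {"poster_hung": "Y", "incentives_explained": "N", "award_given": "Y"},
    {"poster_hung": "N", "incentives_explained": "Y", "award_given": "Y"},
]

items = ["poster_hung", "incentives_explained", "award_given"]
for item in items:
    yes_count = sum(1 for s in sessions if s[item] == "Y")
    pct = 100 * yes_count / len(sessions)
    print(f"{item}: {pct:.0f}% of observed sessions")
```

Rolling checklists up this way turns narrative observation into simple percentages that can drive TA conversations.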

Descriptive: Another Example, from Family and Children Services (thank you, Maura Alexander!). Excerpt from a PBS individual tally sheet: behaviors (Stay Safe, Take Responsibility, Everyone Be Respectful, Positive Interactions, Support Others) are tallied by setting (Kitchen, Dining Area, Great Room, Main Pool), with a total for the child.
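A minimal sketch of how such a tally sheet can be summed by behavior and by setting, assuming the labels from the excerpt; all counts are invented for illustration.

```python
# Sketch: summing a PBS individual tally sheet by behavior and by setting.
# Behavior and setting labels come from the excerpt above; the counts are invented.

settings = ["Kitchen", "Dining Area", "Great Room", "Main Pool"]
tallies = {
    "Stay Safe":              [2, 1, 3, 0],
    "Take Responsibility":    [1, 0, 2, 1],
    "Everyone Be Respectful": [3, 2, 1, 2],
    "Positive Interactions":  [0, 1, 2, 3],
    "Support Others":         [1, 1, 0, 2],
}

# Total per behavior across settings.
for behavior, counts in tallies.items():
    print(f"{behavior}: {sum(counts)}")

# Total per setting across behaviors, then the grand total for the child.
setting_totals = [sum(counts[i] for counts in tallies.values()) for i in range(len(settings))]
for setting, total in zip(settings, setting_totals):
    print(f"{setting}: {total}")
print("Total for Child:", sum(setting_totals))
```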

Descriptive: What's going on? Often a snapshot; dichotomous variables; checklist-type measures. (3 Kinds of Data: Descriptive, Process, Outcome.) Process: What's going south? Captures the nature of process/implementation over time; subscales; mixed constructs. Tip #2: A step ladder is advised for process work… and use the backdoor if you can find it!

What's the backdoor? When development of a measurement tool defines what a process (or outcome) should look like; the "ideal state" is revealed in the measure. Bye-bye Likert scales, hello conditional clarity via the items in the measurement tool. What's the front door? When you already have conceptual clarity about what you're hoping to achieve (the outcome) and what happened (the intervention). Systems rarely use the front door. I too avoid the front door.

Example of a Backdoor Process Measure: the PBS Benchmarks of Quality (BOQ), which measures level of PBS implementation. Benchmark 11, Behaviors defined: 3 points, written documentation exists that includes clear definitions of all behaviors listed; 2 points, all of the behaviors are defined but some of the definitions are unclear; 1 point, not all behaviors are defined or some definitions are unclear; 0 points, no written documentation of definitions exists.
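As a simplified illustration (not the official BOQ scoring protocol), a backdoor process measure like this can be rolled up into an overall implementation percentage from the benchmark point ratings; the benchmark labels other than #11 are hypothetical.

```python
# Simplified sketch: rolling benchmark ratings (0-3 points, as in the excerpt)
# into an overall implementation percentage. This is NOT the official BOQ
# scoring protocol, just an illustration of scoring a backdoor process measure.

ratings = {
    "11. Behaviors defined": 2,
    "12. Hypothetical benchmark A": 3,  # invented label for illustration
    "13. Hypothetical benchmark B": 1,  # invented label for illustration
}

max_points_per_benchmark = 3
earned = sum(ratings.values())
possible = max_points_per_benchmark * len(ratings)
print(f"Implementation score: {earned}/{possible} = {100 * earned / possible:.0f}%")
```

The point of the backdoor is visible in the rubric itself: the 3-point description spells out the ideal state the team is working toward.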

3 Kinds of Data: Descriptive, Process, Outcome. Tip #3: Don't climb the tree without the ladder! Psychometrics (reliability and validity) matter hugely!

Reliability/Validity in Outcome Data: junk in = junk out. Reliability: the stability and consistency of a measure (Tip #5: calibrate, calibrate, calibrate). Validity: the ability to capture the intended construct (Tip #6: do the construct search). There are lots of cheesy outcome measures out there, and lots of great attempts to develop outcome measures without researchers at the table. Researchers create measures; evaluators bridge research and the field.
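A minimal sketch of what "calibrate, calibrate, calibrate" can look like in practice: two raters code the same sessions, and percent agreement and Cohen's kappa are computed directly; the ratings below are invented.

```python
# Sketch of rater calibration: two observers code the same sessions and we
# check how often they agree. Percent agreement and Cohen's kappa are computed
# by hand; the ratings are invented for illustration.

rater_a = ["Y", "Y", "N", "Y", "N", "N", "Y", "Y"]
rater_b = ["Y", "N", "N", "Y", "N", "Y", "Y", "Y"]

n = len(rater_a)
observed_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement for kappa: product of each rater's marginal proportions, summed over categories.
categories = set(rater_a) | set(rater_b)
chance_agreement = sum(
    (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
)
kappa = (observed_agreement - chance_agreement) / (1 - chance_agreement)

print(f"Percent agreement: {observed_agreement:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")
```

Low agreement is a signal to retrain raters and tighten the behavior definitions before trusting the outcome numbers.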

3 Kinds of Data: Descriptive, Process, Outcome. Each is increasingly hard to measure/capture well, to interpret, to analyze, and on the wallet. Grand Finale Turbo Tip #7: When it comes to PBS data collection, invest in the search for measures, not the development of measures. The PBS idea-generation machine is huge.

Ready to Get Started? A visual of the PBS Model of Evaluation: [diagram linking Evaluation to Relevant & Measurable Indicators; Team-based Decision Making & Planning; Continuous Monitoring; Regular Review; Effective Visual Displays; and Efficient Input, Storage, & Retrieval.]