Targets & Improvement Activities
State Improvement Planning
Carol Massanari, MPRRC
Western Regional Resource Center APR Clinic 2010
November 1-3, 2010, San Francisco, California

It’s been six years already?
Old age arrives suddenly, as does the snow. One morning, on awakening, one realizes that everything is white.

The path to get here has not always been obvious.

A New Opportunity
Using prior data to look at trends over time.
Reviewing and evaluating what we have been doing.
Shaping the form that is emerging from the chaos.

State Improvement Planning
Using the SPP/APR as a management tool
Evaluating to document and report results

Using the SPP/APR as a Management Tool
Jeanna Mullins, MSRRC, and Jim Leinen, WRRC
Western Regional Resource Center APR Clinic 2010
November 1-3, 2010, San Francisco, California

Development
Started with information from the November 2009 WRRC Think Tank on "Managing Improvement"
Field tested with Kentucky Part B staff
Gathered input at the Spring TA&D Conference and from RRC Program staff

Design
A TA tool for TA providers at the national, regional, State, or local levels
Guides conversations on the SPP process and management structures
Serves as an informal formative evaluation
Assists in identifying potential areas for improvement

Attributes of Promising Practices: Topical Areas
Leadership
Staff/Workforce Assignments
Strategic Planning
Technical Assistance and Improvement Activities
Measurement, Analysis and Knowledge Management
Customer Focus/Results

Rubric: Rating of Attributes
Fully Operational
Under Development
Under Consideration
Not Considered at This Time
Not Applicable

Needs Identification
Strengths
Potential opportunities for change
Comparison to federal, State, regional, or local priorities for targeting improvement
Resources available to support improvement

Using the SPP/APR as a Management Tool
Activity: Select one of the state system Attributes:
Leadership
Staff/Workforce Assignments
Strategic Planning
Technical Assistance and Improvement Activities
Measurement, Analysis & Knowledge Management
Customer Focus/Results
Then answer the questions related to your selected Attribute.

Activity #1 Team Report Out
What attribute did you select, and why?
Share 1-2 highlights about what you discovered while applying this tool.
Any comments about the tool?

Guiding Evaluation and Improvement Planning: Logic Models & Evaluation Plans
Jeanna Mullins, MSRRC, and Jim Leinen, WRRC
Western Regional Resource Center APR Clinic 2010
November 1-3, 2010, San Francisco, California

Logic Model Elements
Inputs (resources to be used)
Processes (activities and strategies to be used)
Outputs (immediate, countable results produced by activities)
Outcomes (benefits and changes that are realized)

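To make these four elements concrete, here is a minimal sketch in Python (the class and field names are illustrative additions, not part of the original materials) representing a logic model as a simple data structure. A concrete instance for the child outcomes scenario appears later in this transcript.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A minimal logic model: resources flow through activities to results."""
    inputs: list[str] = field(default_factory=list)     # resources to be used
    processes: list[str] = field(default_factory=list)  # activities and strategies to be used
    outputs: list[str] = field(default_factory=list)    # immediate, countable results
    outcomes: list[str] = field(default_factory=list)   # benefits and changes realized

    def describe(self) -> str:
        """Render the model as the usual left-to-right chain."""
        parts = [
            ("Inputs", self.inputs),
            ("Processes", self.processes),
            ("Outputs", self.outputs),
            ("Outcomes", self.outcomes),
        ]
        return " -> ".join(f"{name}: {', '.join(items) or '(none)'}" for name, items in parts)
```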

A New Resource: Evaluating SPP/APR Improvement Activities
Developed by members of the Systems and Improvement Planning Priority Team (NECTAC and RRCP), December 2009

Evaluating SPP/APR Improvement Activities provides:
1. Information about evaluation, improvement planning, and strategic systems thinking
2. Guidance on selecting an appropriate design for evaluating different types of improvement activities
3. Additional resources and tools that support SPP/APR implementation and evaluation

Foundation: Systems Thinking
Systems are made up of interrelated, interconnected components.
Systems change involves changing the capacity, interrelationships, and interdependencies among parts, levels, and stakeholders.
Desired changes in one part or level of the system must be accompanied by changes in other parts or levels.

Foundation: Logic Models
Inputs → Processes → Outputs → Outcomes
A "theory of change" model shows the process by which change is expected to occur. It is illustrated through a visual depiction, or logic model, that shows how the inputs (resources) and processes (activities) connect to outputs (products or units produced, such as the number of staff trained) and outcomes (intended results).

What is a Logic Model?
"A systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan to do, and the changes or results you hope to achieve."
W.K. Kellogg Foundation (2002)

Why Use a Logic Model?*
Serves as a clear roadmap for the work
Provides a framework for planning, implementation, and evaluation
Graphically illustrates the key elements of an improvement activity and how they are interrelated
*Source: George Liacopoulos, Ph.D., Evaluation of Improvement Activities: Considerations and Strategies. Mid-South Regional Resource Center Systems Improvement Forum, Baltimore, MD, April 2010.

Benefits of Logic Models*
Ensures stakeholders understand the purpose of the activities, the resources needed, the activities to be conducted, and their capacity to effect change
Tracks what has worked and what has not worked well, so that success can be replicated
Identifies and prioritizes the questions to be asked in an evaluation
*Source: George Liacopoulos, Ph.D., Evaluation of Improvement Activities: Considerations and Strategies. Mid-South Regional Resource Center Systems Improvement Forum, Baltimore, MD, April 2010.

Foundation: The SPP as a Long-Term Plan for Systems Change
EI/ECSE systems are made up of complex, interrelated components, with the goal of achieving outcomes for children and families.
Changing EI/ECSE systems requires a combination of interconnected improvement activities that support changes to infrastructure and work together to achieve the desired results.

Selecting and Reviewing a Set of Improvement Activities, e.g.:
Was a root cause analysis conducted to drive the activities?
Was a logical linkage established between the root cause(s) and the proposed activity outcomes?
Have specific action steps been developed (e.g., task, activity, resources, timelines)?
Have personnel been identified who will develop, implement, monitor, and evaluate the improvement activity?
Have methods to collect the outcome data been identified?
Has a data collection timeline been developed?
(And more … see the full paper.)

Steps for Evaluating Individual Improvement Activities
1. What is the goal or purpose?
2. What methods will you use for data collection?
3. What are your timelines for collecting, analyzing, and reporting the data?
4. What baseline data will you monitor for improvement?
5. How will the data be analyzed?
6. How will the evaluation results be used?
7. Who will be responsible for the various aspects of the evaluation?
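
As an illustration only (the record and field names below are assumptions, not drawn from the paper), the seven steps can be captured as a simple evaluation-plan record with a quick completeness check, so that no step is skipped:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """One record per improvement activity; each field answers one of the seven steps."""
    goal: str             # 1. goal or purpose
    data_collection: str  # 2. methods for data collection
    timelines: str        # 3. timelines for collecting, analyzing, and reporting data
    baseline: str         # 4. baseline data monitored for improvement
    analysis: str         # 5. how the data will be analyzed
    use_of_results: str   # 6. how the evaluation results will be used
    responsible: str      # 7. who is responsible for each aspect of the evaluation

    def unanswered(self) -> list[str]:
        """Return the names of any steps left blank."""
        return [name for name, value in vars(self).items() if not value.strip()]
```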

Scenario: Child Outcomes Data
Situation:
A considerable amount of data is missing for Indicators C-3/B-7.
Local programs are unclear about when the data should be collected and reported.
Written policies and procedures are unclear, e.g., on the roles and responsibilities of Part C vs. Section 619.
Not all local staff have been trained.

Scenario: Child Outcomes Data
State improvement activities to address the missing-data issue:
1. Collaborate across State Part C and Section 619 agencies to clarify roles and responsibilities
2. Revise written policies and procedures to clarify, e.g., roles and responsibilities
3. Provide additional training on the child outcomes data collection and reporting process

Scenario: Child Outcomes Data
Short-term goals of the three activities:
To clarify the roles and responsibilities of Part C and Section 619 school district staff in the collection and reporting of child outcomes data
To improve the written policies and procedures so that local program staff find them clear and comprehensive
To increase local provider/program understanding of the child outcomes data collection and reporting process
Long-term goal of the three activities:
To increase statewide reporting of child outcomes data (and decrease the missing data in the child outcomes indicator)

Logic Model for Scenario: Child Outcomes Data
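
The original slide presented this logic model as a diagram, which is not reproduced in the transcript. The sketch below reconstructs its likely content from the scenario slides above (the entries are paraphrased from those slides; the structure is the standard inputs/processes/outputs/outcomes chain, not a copy of the original figure):

```python
# Illustrative reconstruction of the scenario logic model (not the original diagram).
scenario_logic_model = {
    "inputs": [
        "State Part C and Section 619 agency staff",
        "Existing written policies and procedures",
        "Training resources for local programs",
    ],
    "processes": [
        "Collaborate across agencies to clarify roles and responsibilities",
        "Revise written policies and procedures",
        "Provide additional training on data collection and reporting",
    ],
    "outputs": [
        "Documented roles and responsibilities",
        "Revised written policies and procedures",
        "Number of local staff trained",
    ],
    "outcomes": {
        "short_term": [
            "Roles and responsibilities are clear",
            "Policies and procedures are clear and comprehensive",
            "Local providers understand the collection and reporting process",
        ],
        "long_term": [
            "Increased statewide reporting of child outcomes data",
            "Decreased missing data in the child outcomes indicator",
        ],
    },
}

for element, entries in scenario_logic_model.items():
    print(element.upper(), "->", entries)
```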

Contact Information
Christina Kasprzak, NECTAC/ECO
Jeanna Mullins, MSRRC

Key Considerations
Because it may not be possible to evaluate all improvement activities, you could develop criteria to determine which activities to evaluate:
Where is the state investing the most resources?
"More bang for your buck": which improvement activities can be clustered and/or influence multiple indicators?
Improvement activities associated with indicators for which no progress is being evidenced
Indicators for which you have little or no data
Extent of implementation (e.g., statewide or more targeted)
Degree of confidence in the improvement activity
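
One hypothetical way to apply such criteria (the criterion names and the equal weighting below are invented for illustration) is to score each activity and evaluate the highest-scoring ones first:

```python
# Illustrative prioritization sketch: count how many of the criteria above
# an improvement activity meets, then sort activities by that score.
CRITERIA = [
    "high_resource_investment",        # where the state invests most resources
    "influences_multiple_indicators",  # can be clustered / "more bang for your buck"
    "indicator_shows_no_progress",
    "little_or_no_data",
    "statewide_implementation",        # extent of implementation
    "low_confidence_in_activity",      # degree of confidence
]

def priority_score(activity: dict) -> int:
    """All criteria weighted equally; a state could weight them differently."""
    return sum(1 for criterion in CRITERIA if activity.get(criterion, False))

activities = [
    {"name": "Revise written policies", "high_resource_investment": True,
     "little_or_no_data": True},
    {"name": "Statewide training", "statewide_implementation": True,
     "influences_multiple_indicators": True, "indicator_shows_no_progress": True},
]

for activity in sorted(activities, key=priority_score, reverse=True):
    print(activity["name"], priority_score(activity))
```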

Evaluation Plan/Logic Model Team Activity
Instructions:
Select an improvement activity from your SPP.
Explain your reasons for selecting this improvement activity.
Draft an evaluation plan or an evaluation logic model for this activity.

Activity #2 Team Report Out
What indicator or cluster of indicators did you select?
What improvement activities did you select, and why?
Any questions or comments about the process?