

Evaluating SPP/APR Improvement Activities Presented by Jeanna Mullins, Mid-South Regional Resource Center, RRCP Document developed by members of the Systems and Improvement Planning Priority Team (NECTAC and RRCP)

Resource Document Highlights Content  Types of SPP/APR Improvement Activities  Selection and Review of Activities  Steps for Evaluation of Activities  Evaluation Scenarios

Poll Question  Have you reviewed the Evaluating SPP/APR Improvement Activities document?

Types of Improvement Activities  Improve data collection and reporting  Improve systems administration and monitoring  Build systems and infrastructures of technical assistance and support  Provide technical assistance/ training/professional development

Types of Improvement Activities  Clarify/examine/develop policies and procedures  Program development  Collaboration/coordination  Evaluation  Increase/adjust FTE

Selection and Review of Improvement Activities  Root cause analysis?  Links between root cause, data, and proposed outcomes?  Evidence-based practices?  Address more than one indicator?  Identification of collaborative partners?

Selection and Review of Improvement Activities  Detailed action plan (tasks, person(s) responsible, resources needed, timelines) for each activity?  Short-term and long-term outcomes?  Data sources, collection, analyses, and reporting?

Poll Question  Do you have an evaluation plan for one or more of your SPP/APR improvement activities?

Steps for Evaluating Improvement Activities  Goal/Purpose  Questions  Process/Impact  Data Collection Methods

Steps for Evaluating Improvement Activities  Timelines  Data Analysis Methods  Use and Reporting of Results  Person(s) Responsible

Sample Improvement Activity Collaborate across State Part C and Section 619 agencies to clarify roles and responsibilities of local programs at the data collection point when the child exits Part C and enters Part B.

Collaboration Scenario Context  Missing data on C3/B7  Local programs are unclear about when data should be collected and reported  Written policies and procedures are unclear (e.g., roles and responsibilities of Part C vs. Section 619)  Not all local staff have been trained

Goal/Purpose Goal  To clarify roles and responsibilities of Part C and Section 619 Question  To what extent did the collaboration result in clear descriptions of roles and responsibilities of Part C and 619 staff?

Impact and Methods Impact: Outcome Evaluation Data Collection Method: Focused discussion with small group

Timelines and Data Analysis Timelines: Prior to the dissemination of revised policies and procedures and provision of training (Identify the specific timelines) Data Analysis: Baseline: Local programs report that roles and responsibilities are not clear.

Data Analysis Method  Notes/transcripts from the reviewer discussion will be analyzed to determine whether roles and responsibilities for Part C and Section 619 are clear and, if not, how they could be improved.

Data Use and Reporting  Feedback will help ensure that written roles and responsibilities are clear. It will be incorporated into the overall policies and procedures for collecting and reporting outcomes data, as well as into training.

Persons Responsible  Part C and Section 619 staff will identify a set of local providers/program staff to participate in the review, and will discuss the reviewer feedback and its implications for any additional clarification needed regarding roles and responsibilities.

Improvement Planning Resources  RRCP SPP/APR Planning Calendar  Evaluating SPP/APR Improvement Activities  NCRRC SPP/APR Improvement Activity Review Form  State Systems Self-Assessment and Planning Guide  Using the SPP/APR as a Management Tool

Questions, Comments, Ideas  What are your thoughts?