Effective Programs for Successful Students The Network Summer Summit Presenters: Rachel Harrington (Performance-Based Monitoring Division) Judy Struve.


Effective Programs for Successful Students The Network Summer Summit Presenters: Rachel Harrington (Performance-Based Monitoring Division) Judy Struve (Program Monitoring and Interventions Division) June 30,

Purpose of Today’s Presentation  Introduce participants to the state’s monitoring system for special programs, with a particular focus on special education programs.  Review identification and interventions components of the monitoring system.  Provide information on how to:  Conduct meaningful data analyses regarding special education program effectiveness.  Connect causal factors identified in data analysis to improve program effectiveness. Copyright © Texas Education Agency All rights reserved. 2

How Do We Know if a Program is Effective? 1. Is it being implemented in accordance with the state and/or federal laws, rules, and requirements that govern it (i.e., compliance)? 2. Is it having measurable, quantifiable results on student performance? 3

What is the Appropriate Balance Between Those Two Questions? Compliance / Performance 4

Prior to 2003, Program Effectiveness was Largely Measured by Compliance. Compliance 5

2003 Change in State Law Leads to New Monitoring System  Revised alignment of agency functions resulting from:  Statute from the 78th Texas Legislature (2003) that limited and redirected agency monitoring  The appropriations bill from the 78th legislative session, which also significantly lowered the agency’s FTE cap and necessitated a more efficient agency organization  Shift away from process to results, i.e., program effectiveness and student performance  Strong emphasis on data integrity/validation  Focus on a coordinated approach to agency monitoring (not isolated program monitoring) 6

Monitoring System = Performance-Based Monitoring  Review of every district every year rather than a limited cyclical process  Monitoring decisions made on a variety of levels based on performance  Intervention activities based on levels of performance  LEA analysis of data  Submission of information and desk review by TEA  On-site reviews as appropriate  Emphasis on student performance and program effectiveness as well as compliance 7

After 2003, the Balance Shifted. Compliance Performance 8

Monitoring Definition Monitoring is: 1. Using a data-driven, performance-based model to observe, evaluate, and report on the public education system at the individual student group, campus, local education agency, regional, and statewide levels across diverse areas including program effectiveness; compliance with federal and state law and regulations; financial management; and data integrity for the purpose of assessing that student needs are being met; So, it’s not “getting dinged?” It’s about “ensuring that student needs are being met”? That sounds like something we all can agree on! 9

Monitoring Definition (continued) Monitoring is: 2. Promoting diagnostic and evaluative systems in LEAs that are integrated with the agency’s desk audit and intervention process; and 3. Relying on a research-based framework of interventions that ensures compliance and enhances student success. 10

Overall Goals for Monitoring  Deliver a consistent and coordinated response to identified areas of low performance/program ineffectiveness in districts/campuses.  Take into account both the extent and the duration of a district’s area(s) of low performance/program ineffectiveness.  Achieve an integration of indicators and interventions.  Program effectiveness and compliance monitoring — a balanced perspective 11

Components of PBM  Identification and Interventions  Performance-Based Monitoring Analysis System (PBMAS) and Data Validation System are used to identify performance and program effectiveness trends and concerns as well as data anomalies/concerns.  Performance-Based Monitoring Interventions include strategies used to address performance and program effectiveness trends and concerns as well as data anomalies/concerns. 12

Guiding Principles of PBM  School District Effectiveness: PBM is designed to assist school districts in their efforts to improve student performance and program effectiveness.  Statutory Requirements: PBM is designed to meet statutory requirements.  Indicator Design: PBM indicators reflect critical areas of student performance, program effectiveness, and data integrity. So, it’s not about “taking me away from my job?” It’s about “improving student performance and program effectiveness?” That sounds like something that’s part of my job! 13

Guiding Principles (continued)  Maximum Inclusion: PBM system is designed to evaluate a maximum number of school districts through the use of appropriate alternatives for analyzing districts with small numbers of students.  Individual Program Accountability: PBM system is structured to ensure that low performance in one area cannot be offset by high performance in another area and likewise that low performance in one area does not lead to interventions in program areas where performance is high. Copyright © Texas Education Agency All rights reserved. 14

Guiding Principles (continued)  High Standards: PBM system promotes high standards for all students in all districts. Standards are adjusted over time to ensure continued student achievement and progress.  Annual Statewide Focus: PBM system ensures the annual evaluation of all school districts in the state.  Public Input and Accessibility: The design, development, and implementation of the PBM system are informed by ongoing public input. School district performance information that the PBM system generates is accessible to the public. Copyright © Texas Education Agency All rights reserved. 15

Guiding Principles (continued)  System Evolution: PBM is a dynamic system in which indicators are added, revised, or deleted in response to changes and developments that occur outside of the system, including new legislation and the development of new assessments.  Coordination: PBM is part of an overall agency coordination strategy for the data-driven, performance-based evaluation of school districts. Copyright © Texas Education Agency All rights reserved. 16

What Are Some of the Measurable Characteristics of Effective Programs?  Effective Programs:  Demonstrate they have achieved strong performance results and/or gains in reading, mathematics, science, social studies, and writing for the students they serve.  Demonstrate that, when students are no longer receiving special education services, they perform well academically.  Make appropriate decisions regarding which test version students served in special education will take. 17

What Are Some of the Measurable Characteristics of Effective Programs? (continued)  Effective Programs:  Place students in the least restrictive environment.  Employ effective strategies to prevent students from dropping out of school.  Promote on-time graduation for as many students as possible.  Provide opportunities for as many students as possible to graduate under the Recommended or Distinguished Achievement diploma programs. 18

What Are Some of the Measurable Characteristics of Effective Programs? (continued)  Effective Programs:  Implement special education placement decisions based on a student’s disability, not his/her race, ethnicity, socioeconomic status, or English language proficiency.  Recognize that student success occurs when students have maximum access to a comprehensive curriculum taught by qualified educators in classrooms equipped with appropriate instructional supports as well as peer-to-peer interaction, and therefore:  Include proven strategies that reduce and/or prevent the need to remove students for disciplinary reasons. 19

Performance-Based Monitoring Analysis System (PBMAS)  Performance-Based Monitoring Analysis System (PBMAS) – an automated data system that reports annually on the performance of school districts and charter schools in selected program areas (bilingual education/ESL, career and technical education, special education, and certain Title programs under NCLB)  Specifically designed to measure the characteristics of program effectiveness listed on the previous slides 20

Performance-Based Monitoring Analysis System (PBMAS) (continued)  Each year’s PBMAS report is typically released to districts in August.  Your district’s PBMAS reports from are available at:  PBMAS reports for all 20 ESC regions as well as the state are also available on the PBM web site. 21

22

PBMAS Performance Levels  Required Improvement (0 RI)  Not Assigned (NA)  No Data (ND)  Special Analysis (0, 1, 2, or 3 SA)  Hold Harmless (3 HH or 4 HH)  Report Only 23

Characteristic 1: Demonstrate strong performance results and/or gains in reading, mathematics, science, social studies, and writing for the students they serve. Characteristic 2: Demonstrate that, when students are no longer receiving special education services, they perform well academically. PBMAS Program Area and Indicator Number / Indicator Name:  SPED #1(i-v): SPED STAAR 3-8 Passing Rate (M, R, S, SS, W)  SPED #2(i-v): SPED YAE* STAAR Passing Rate (M, R, S, SS, W)  SPED #3(i-iv): SPED STAAR EOC Passing Rate (M, S, SS, ELA) *YAE = Year-After-Exit 24

Notes for 2015 PBMAS (for Indicators on Slide 24)  STAAR A and STAAR Alternate 2 results will be included.  A targeted hold harmless provision will be implemented.  A PL 4 will be added.  For STAAR 3-8 indicators:  Mathematics results will be included, using passing standards equivalent to the previous mathematics tests.  PL 1, PL 2, and PL 3 cut-point ranges will be lowered.  For STAAR EOC indicators:  Performance Level (PL) cut-point ranges for math and science will be changed.  PL assignment for U.S. History will be added.  ELA indicator will continue as Report Only. 25

STAAR EOC Subject-Area PL 0 Cut Points
Subject | 2014 | 2015
Mathematics | 50%-100% | 60%-100%
Science | 50%-100% | 60%-100%
Social Studies | Report Only | 60%-100%
English Language Arts | Report Only | Report Only
26
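To make the effect of the cut-point change concrete, here is a minimal sketch that encodes the PL 0 ranges from the table above and checks whether a district's EOC passing rate falls in the PL 0 range. The function and dictionary names are our own illustration, not TEA code.

```python
# PL 0 lower bounds for the STAAR EOC passing-rate indicators, taken from
# the cut-point table above. "Report Only" subjects have no cut point, so
# they are simply absent from the lookup.
PL0_CUTPOINTS = {
    ("Mathematics", 2014): 50.0, ("Mathematics", 2015): 60.0,
    ("Science", 2014): 50.0,     ("Science", 2015): 60.0,
    ("Social Studies", 2015): 60.0,
}

def meets_pl0(subject: str, year: int, passing_rate: float):
    """Return True/False if the rate falls in the PL 0 range for that
    subject and year, or None when the subject is Report Only."""
    cut = PL0_CUTPOINTS.get((subject, year))
    if cut is None:
        return None
    return passing_rate >= cut

print(meets_pl0("Mathematics", 2014, 55.0))  # True: the 2014 cut point is 50%
print(meets_pl0("Mathematics", 2015, 55.0))  # False: the cut point rose to 60%
```

The same district-level passing rate can meet PL 0 in 2014 but miss it in 2015; that tightening is the practical effect of raising the cut points.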

Performance Level Table for SPED #1 (i-v)
Subject | PL 0 | PL 1 | PL 2 | PL 3 | PL 4
Mathematics | 70.0% - 100% | 55.0% - % | 40.0% - % | 25.0% - % | 0% - %
Reading | 70.0% - 100% | 55.0% - % | 40.0% - % | 25.0% - % | 0% - %
Science | 65.0% - 100% | 50.0% - % | 40.0% - % | 25.0% - % | 0% - %
Social Studies | 65.0% - 100% | 50.0% - % | 40.0% - % | 25.0% - % | 0% - %
Writing | 70.0% - 100% | 55.0% - % | 40.0% - % | 25.0% - % | 0% - %
27

Performance Level Table for SPED #3 (i-iv)
Subject | PL 0 | PL 1 | PL 2 | PL 3 | PL 4
Mathematics | 60.0% - 100% | 50.0% - % | 40.0% - % | 30.0% - % | 0% - %
Science | 60.0% - 100% | 50.0% - % | 40.0% - % | 30.0% - % | 0% - %
Social Studies | 60.0% - 100% | 50.0% - % | 40.0% - % | 30.0% - % | 0% - %
ELA | Report Only
28

Targeted Hold Harmless  What is the targeted hold harmless provision for SPED Indicator #1(i-v) and #3 (i-iv)?  It stipulates that any district that received a PL 0 or 0 RI on the SPED STAAR Modified Participation Rate indicator in the 2014 PBMAS, and that would otherwise receive a PL 3 or PL 4 on SPED Indicator #1(i-v) or SPED Indicator #3 (i-iv) in the 2015 PBMAS, will instead receive a PL 3 HH or PL 4 HH, as applicable, for those subject areas.  For 2015 PBMAS interventions purposes, the count of PL 3 HH or PL 4 HH under SPED Indicator #1 (i-v) or SPED Indicator #3 (i-iv) will not be considered in a district’s total PL 3 or PL 4 count in the special education program area. 29
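The rule described above can be sketched in a few lines of code. This is our own illustration of the logic, not TEA's implementation; the function name and the string encoding of performance levels are assumptions.

```python
def apply_targeted_hold_harmless(new_pl: int, prior_participation_pl: str) -> str:
    """Illustrative sketch of the targeted hold harmless rule described
    above. `prior_participation_pl` is the district's 2014 PL on the SPED
    STAAR Modified Participation Rate indicator (encoded as a string so
    "0 RI" can be represented); `new_pl` is the 2015 PL the district would
    otherwise receive on SPED #1 or SPED #3 for a subject area.
    """
    eligible = prior_participation_pl in ("0", "0 RI")
    if eligible and new_pl in (3, 4):
        # Held harmless: excluded from the PL 3 / PL 4 count used for
        # intervention staging in the special education program area.
        return f"{new_pl} HH"
    return str(new_pl)

print(apply_targeted_hold_harmless(3, "0 RI"))  # 3 HH
print(apply_targeted_hold_harmless(3, "1"))     # 3
```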

Characteristic 3: Make appropriate decisions regarding which test version students served in special education will take. PBMAS Program Area and Indicator Number / Indicator Name:  SPED #4: SPED STAAR Alternate 2 Participation Rate 30

Characteristic 4: Place students in the least restrictive environment. PBMAS Program Area and Indicator Number / Indicator Name:  SPED #5: SPED Regular Early Childhood Program Rate Ages 3-5 (Note for 2015 PBMAS: PLs will be assigned.)  SPED #6 and SPED #8: SPED Regular Class ≥80% Rate, Ages 6-11 / Ages  SPED #7 and SPED #9: SPED Regular Class <40% Rate, Ages 6-11 / Ages

Characteristic 5: Employ effective strategies to prevent students from dropping out of school. Characteristic 6: Promote on-time graduation for as many students as possible. Characteristic 7: Provide opportunities for as many students as possible to graduate under the Recommended or Distinguished Achievement diploma programs. PBMAS Program Area and Indicator Number / Indicator Name:  SPED #10: SPED Annual Dropout Rate Grades 7-12  SPED #11: SPED RHSP/DAP Diploma Rate  SPED #12: SPED Graduation Rate (Note for 2015 PBMAS: New PL 1 – PL 3 cut points will be implemented.) 32

Characteristic 8: Implement special education placement decisions based on a student’s disability, not his/her race, ethnicity, socioeconomic status, or English language proficiency. PBMAS Program Area and Indicator Number / Indicator Name:  SPED #13: SPED Representation  SPED #14: SPED African American Representation  SPED #15: SPED Hispanic Representation  SPED #16: SPED LEP Representation 33

Characteristic 9: Recognize that student success occurs when students have maximum access to a comprehensive curriculum taught by qualified educators in classrooms equipped with appropriate instructional supports as well as peer-to-peer interaction, and therefore include proven strategies that reduce and/or prevent the need to remove students for disciplinary reasons. PBMAS Program Area and Indicator Number / Indicator Name:  SPED #17: SPED Discretionary DAEP Placements  SPED #18: SPED Discretionary ISS Placements  SPED #19: SPED Discretionary OSS Placements Note for 2015 PBMAS: We will begin the transition to a new PL structure for these 3 indicators by reporting disproportionality rates. Disproportionality rates tell us how much higher the special education xxx rate is compared to the all students rate, rather than just telling us the absolute difference between the two rates. 34
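The difference between the old reporting (an absolute gap) and the new reporting (a disproportionality rate) can be shown with two small helpers. The rates below are hypothetical and the function names are our own.

```python
def rate_difference(sped_rate: float, all_rate: float) -> float:
    """Absolute gap in percentage points (the old way of reporting)."""
    return sped_rate - all_rate

def disproportionality_rate(sped_rate: float, all_rate: float) -> float:
    """How many times higher the special education rate is than the
    all-students rate (the new reporting described above)."""
    return sped_rate / all_rate

# Hypothetical ISS placement rates: 12% for students served in special
# education vs. 4% for all students.
print(rate_difference(12.0, 4.0))          # 8.0 percentage points
print(disproportionality_rate(12.0, 4.0))  # 3.0x the all-students rate
```

An 8-point gap can look modest on its own, but the same gap on a 4% base means the special education rate is three times the all-students rate, which is exactly what the ratio surfaces.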

2015 PBMAS Manual  2015 PBMAS Manual rule adoption is underway.  Proposed Amendment to 19 TAC Chapter 97, Planning and Accountability, Subchapter AA, Accountability and Performance Monitoring, § , Performance-Based Monitoring Analysis System  Summary: The proposed amendment would adopt the Performance-Based Monitoring Analysis System 2015 Manual 35

2015 PBMAS Manual (continued)  30-Day Public Comment Period: May 22, 2015 – June 22,  The rule will be effective July 30,  For current information on the rule adoption process:  Visit oner_Rules_(TAC)/Commissioner_of_Education_Rules_-_Texas_Administrative_Code/ and/or oner_Rules_(TAC)/Commissioner_of_Education_Rules_-_Texas_Administrative_Code/  Subscribe to the Rules Listserv at /new 36

2015 PBMAS Manual (continued)  Once the rule becomes effective, the Manual will be posted on our web site, followed by a listserv notification of the posting.  At that time, hard-copy versions will also be available from TEA’s Publications Office (see order form at the back of the posted Manual). 37

Effectiveness of PBMAS: Performance Gains and Positive Results for Students  The Performance-Based Monitoring Analysis System (PBMAS) was first implemented in  The most current data available are from the 2014 PBMAS.  The following tables provide longitudinal data summarizing performance gains achieved through the PBMAS, as shown in the changes in various indicators’ state rates over time. 38

Effectiveness of PBMAS: Performance Gains and Positive Results for Students (continued)  The tables are summarized by years of comparable data available for a given indicator.  As a result of several statutory and policy changes that occurred outside of the PBMAS (particularly changes to the state assessment system), some indicators have as few as three years of comparable data available while others have as many as ten. 39

Effectiveness of PBMAS (continued)
Table 1 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2004 State Rate | 2014 State Rate | Change
RHSP/DAP Diploma Rate | 12.8% | 25.5% | +12.7
SPED Representation | 11.6% | 8.5% | -3.1%
40

Effectiveness of PBMAS (continued)
Table 2 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2004 State Rate | 2013 State Rate | Change
Less Restrictive Environments for Students (Ages 12-21) | 46.8% | 63.6% | +16.8
41

Effectiveness of PBMAS (continued)
Table 3 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2005 State Rate | 2014 State Rate | Change
Less Restrictive Environments for Students (Ages 3-5) | 9.6% | 16.7% | +7.1
Discretionary DAEP Placement Rate | 1.5 percentage points higher than all students | 0.8 percentage points higher than all students | -0.7
Discretionary ISS Placement Rate | 23.2 percentage points higher than all students | 12.3 percentage points higher than all students |
42

Effectiveness of PBMAS (continued)
Table 4 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2007 State Rate | 2013 State Rate | Change
Less Restrictive Environments for Students (Ages 6-11) | 35.5% | 39.6% | +4.1
43

Effectiveness of PBMAS (continued)
Table 5 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2007 State Rate | 2014 State Rate | Change
Annual Dropout Rate (Grades 7-12) | 3.2% | 2.3% | -0.9
Graduation Rate | 72.7% | 77.8% | +5.1
44

Effectiveness of PBMAS (continued)
Table 6 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2008 State Rate | 2014 State Rate | Change
Discretionary OSS Placements | 12.7 percentage points higher than all students | 8.1 percentage points higher than all students | -4.6
45

Effectiveness of PBMAS (continued)
Table 7 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2009 State Rate | 2011 State Rate | Change
SPED TAKS Passing Rate (Mathematics) | 59.5% | 68.2% | +8.7
SPED TAKS Passing Rate (Reading) | 68.1% | 75.4% | +7.3
SPED TAKS Passing Rate (Science) | 51.1% | 59.9% | +8.8
SPED TAKS Passing Rate (Social Studies) | 69.9% | 77.5% | +7.6
SPED TAKS Passing Rate (Writing) | 70.3% | 76.6% | +6.3
46

Effectiveness of PBMAS (continued)
Table 8 – PBMAS Performance Gains and Positive Results for Students: SPED Program ( )
PBMAS Indicator | 2009 State Rate | 2011 State Rate | Change
SPED YAE TAKS Passing Rate (Mathematics) | 77.5% | 83.4% | +5.9
SPED YAE TAKS Passing Rate (Reading) | 83.3% | 86.8% | +3.5
SPED YAE TAKS Passing Rate (Science) | 73.4% | 81.0% | +7.6
SPED YAE TAKS Passing Rate (Social Studies) | 90.2% | 94.3% | +4.1
SPED YAE TAKS Passing Rate (Writing) | 88.1% | 89.8% | +1.7
*YAE = Year-After-Exit
47

Contact Information for Slides 3-48 Performance-Based Monitoring Phone: (512) 48

Why is it important to have effective programs? 49

Interventions to Improve Effectiveness of Programs Intervention: the act of becoming involved in something in order to have an influence on what happens 50

Staging for Interventions Based on results of PBMAS reports, districts are assigned a stage of intervention, determined by:  The number of performance levels of 2 or 3 assigned to performance indicators  The more PL 2s and 3s, the higher the stage (Stages 1-4)  The higher the stage, the more engagement with the agency 51

How My LEA Was Selected for Interventions
Stage 1: One individual SPED PBMAS indicator = 3, to two individual SPED PBMAS indicators = 3 and up to two individual SPED PBMAS indicators = 2
Stage 4: Six or more individual SPED PBMAS indicators = 3
52
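The staging idea above can be sketched as a function that maps counts of PL 3 and PL 2 indicators to a stage. Only the Stage 1 and Stage 4 criteria appear in full on the slide, so the Stage 2 and Stage 3 thresholds below are invented placeholders for illustration, not TEA's actual cut points.

```python
def intervention_stage(pl3_count: int, pl2_count: int) -> int:
    """Map counts of PL 3 and PL 2 PBMAS indicators to an intervention
    stage (0 = no staged interventions). Stages 1 and 4 follow the slide;
    the middle thresholds are placeholders for illustration only.
    """
    if pl3_count >= 6:
        return 4  # Stage 4: six or more indicators at PL 3
    if pl3_count >= 4:  # placeholder threshold, not shown on the slide
        return 3
    if pl3_count >= 2 or pl2_count >= 2:  # placeholder threshold
        return 2
    if pl3_count == 1:
        return 1  # Stage 1: one indicator at PL 3
    return 0

print(intervention_stage(pl3_count=1, pl2_count=0))  # 1
print(intervention_stage(pl3_count=6, pl2_count=3))  # 4
```

The monotonic shape is the point: more PL 2s and PL 3s mean a higher stage, and a higher stage means more engagement with the agency.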

53

Integrated Interventions From until interventions were conducted by individual program areas. Now districts and charter schools look at all programs exhibiting areas of need in an integrated system. 54

What Will Interventions Do for the Charter School? The PBMAS report tells you where there are concerns about the effectiveness of programs. Intervention activities help you see why, and what you can do about these concerns! 55

What Activities Are Required? What do you become involved in so that you have influence on what happens?  Analyzing data  Assessing needs  Conducting a compliance review  Continuous improvement planning through development of a targeted improvement plan  If the charter school has multiple issues, the agency could conduct an on-site review. 56

Who engages in these activities? District Leadership Team  7-9 members  Recommended members:  LEA administrator  Special Education administrator  Parent(s) of student(s) with disabilities  General education teacher  Special education teacher  Campus administrator  Representative from DAEP if DAEP indicator is to be analyzed  Secondary guidance counselor or person knowledgeable about dropout information if dropout indicator is to be analyzed 57

 Other optional members  Community stakeholders  Related service providers  Speech therapist  Special education evaluation personnel  Student(s) with disabilities  JJAEP representative  Others as determined by local needs 58

Focused Data Analysis  A focused review of data indicators for which a higher level of performance concern has been identified  Requires a specified team of individuals to gather, disaggregate, and review data to determine possible causes for the performance concern as reflected in PBMAS indicators  Results of the analysis generally are reflected as problem statements 59

Focused Data Analysis (continued)  Completed in all stages of intervention  Must be completed on each PBMAS indicator assigned a performance level (PL) of 2 or 3.  Other indicators may be reviewed at the LEA’s discretion. 60

61 PBMAS is a district/charter school report. However, it is important to know how each campus contributes to it!

SPED STAAR Alternate Participation Rate Calculation: (Number of students in Grades 3-9 served in special education tested on STAAR Alternate for all subjects applicable to the student’s grade level) ÷ (Number of students in Grades 3-9 served in special education for whom any STAAR assessment was submitted) 62
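The calculation above is a simple ratio; a minimal sketch follows, with our own function name and invented example counts.

```python
def alt_participation_rate(alt_tested: int, total_submitted: int) -> float:
    """SPED STAAR Alternate participation rate as defined above: students
    in Grades 3-9 served in special education who took STAAR Alternate for
    all subjects applicable to their grade level, divided by all students
    in Grades 3-9 served in special education for whom any STAAR
    assessment was submitted.
    """
    if total_submitted == 0:
        return 0.0  # avoid division by zero for very small groups
    return 100.0 * alt_tested / total_submitted

# Hypothetical campus: 12 of 480 tested special education students took
# STAAR Alternate for every applicable subject.
print(f"{alt_participation_rate(12, 480):.1f}%")  # 2.5%
```

Running the same calculation campus by campus is what shows which campuses drive the district-level rate.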

63 So now I know that special education students contributed to the campus’ low performance; now what?

64 You cannot fix a charter school’s or campus’ issues until you know exactly who is contributing to the issues!

Needs Assessment  After analysis of data and development of problem statements, identify the root causes of areas of concern Development of Targeted Improvement Plan  Determine annual goals and the strategies and interventions to accomplish them 65

Let’s see how this all works Based on the data analysis, the problem statements for Mayberry ISD are: Students with disabilities who are removed to ISS and DAEP are all enrolled at Barney Fife Middle School. BFMS’s rate of removal to ISS and DAEP for all students is 30%; the rate for students with disabilities is 45%. 75% of the students removed were in the 8th grade. Students removed to ISS and DAEP were failing the classes from which the referrals to the office were made. 66
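The rates in these problem statements come from straightforward counts. The figures below are invented to reproduce the 30% and 45% rates in the Mayberry ISD example; the function name is ours.

```python
def removal_rate(removed: int, enrolled: int) -> float:
    """Percentage of an enrollment group removed to ISS or DAEP."""
    return 100.0 * removed / enrolled

# Hypothetical BFMS counts chosen to match the problem statements above.
all_students = removal_rate(120, 400)     # 120 of 400 students removed
with_disabilities = removal_rate(27, 60)  # 27 of 60 SPED students removed
print(all_students, with_disabilities)    # 30.0 45.0
```

Disaggregating the same counts by grade and by referring class is what surfaces the 8th-grade concentration and the failing-class pattern in the remaining statements.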

Results of Needs Assessment BFMS does not have a PBIS (Positive Behavioral Interventions and Supports) framework in place. Teachers do not differentiate instruction. There is not a continuum of special education services at BFMS. No student with disabilities receives more than 15 minutes per class period of inclusion support services. Based on these results, strategies/interventions are identified in the targeted improvement plan. 67

68 Change just does not happen. You have to create it. You have to do something to get something different.

69 Effective programs don’t just happen. They require:  Knowledge of the program  Data to know if the program is working  Strategies to change outcomes if they are not what you want.

Contact Information for Slides Program Monitoring and Interventions Phone: (512) 70