Performance-Based Monitoring Analysis System (PBMAS) Training

Presentation transcript:

Performance-Based Monitoring Analysis System (PBMAS) Training 2018-2019 Region 9 ESC November 28, 2018

Welcome! Micki Wesley, Director of Accountability and Compliance Kara Fluty, Educational Specialist, ESSA Tracy Patrick, Educational Specialist, Accountability and Compliance Angelina Chapa, BE/ESL Michael Chapman, CTE Amy Blackwell, SpEd

PBMAS: district-level and data-driven, it is part of TEA's annual evaluation of school districts' performance and program effectiveness across four federally funded program areas.

Program areas: BE/ESL (Bilingual Education/English as a Second Language), CTE (Career & Technical Education), ESSA (Every Student Succeeds Act), and SPED (Special Education).

ACRONYMS: FRE = Federally Required Elements; PL = Performance Level; RI = Required Improvement; SA = Special Analysis; YAE = Year-After-Exit; MSR = Minimum Size Requirement; SD = Significant Disproportionality; RP = Reasonable Progress; DDV = Discipline Data Validation; NA = Not Assigned; ND = No Data.

PERFORMANCE LEVELS (shown lowest score to highest score): Not Assigned (N/A), 1, 2, 3, 0SA, 1SA, 2SA, 3SA, 0RI.

ADDRESSING N/A's: An N/A does not mean you met standard; it means you did NOT meet the minimum size requirement (MSR). N/A does NOT mean a PL of 0. The only data you do not need to address is a performance level of 0 or No Data. Ensure the accuracy of No Data, and look at the numbers (numerators/denominators) on the report to see the true performance level.
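The check described above (using the numerator and denominator on the report to distinguish an N/A from a real performance level) can be sketched as a small helper. This is a hypothetical illustration only: the MSR value of 30 below is a placeholder, not the actual requirement from the PBMAS Manual.

```python
# Hypothetical sketch of the "look at the numbers" check described above.
# The MSR of 30 is an illustrative placeholder, NOT the actual PBMAS value.

def check_indicator(numerator, denominator, msr=30):
    """Return 'N/A' if the group misses the MSR, else the passing rate (%)."""
    if denominator < msr:
        return "N/A"  # too few students: no PL assigned; NOT "met standard"
    return round(100 * numerator / denominator, 1)

print(check_indicator(12, 25))   # small group, below MSR -> "N/A"
print(check_indicator(60, 100))  # rate computed -> 60.0
```

The point of the sketch: an N/A arises from the denominator, not the rate, which is why it cannot be read as either meeting standard or a PL of 0.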

PBMAS Staging: The following LEAs will be required to engage in a district continuous improvement process: LEAs staged for interventions in either single or multiple PBMAS program areas, which includes the assignment of a determination level for one or more of the federally required elements (FREs) for the special education program.

Intervention Process Overview: LEAs are evaluated in each PBMAS program area and assigned a stage of intervention if they have one or more PBMAS indicators with a performance level (PL) of 3 or 4 and/or (for the special education program) a determination level for one or more of the FREs. LEAs staged for intervention are assigned Stage 1, 2, 3, or 4. Intervention activities for all program areas at any stage of intervention include engaging in a district continuous improvement process (i.e., data analysis, root cause analysis, strategy selection and planning, and implementation fidelity and monitoring). Only LEAs assigned Stage 3 or 4 in any program area must submit their targeted improvement plan (TIP) to TEA via ISAM. LEAs staged for interventions in any program area at Stage 1 or 2 develop their targeted improvement plan and retain it and supporting documentation locally.

District Leadership Team (DLT) and District Coordinator of School Improvement (DCSI): LEAs required to engage in interventions must establish a district leadership team, composed of key LEA personnel and stakeholders, to conduct and monitor the activities of the process. The DLT must include a DCSI. The DCSI is a district-level employee in a leadership position in special programs, school improvement, curriculum and instruction, or another position with responsibility for student performance. DLT membership should include representatives from programs staged for interventions, LEA staff responsible for school improvement and for curriculum and instruction, and other programs that may have an impact on student performance and program effectiveness. The LEA is not required to submit a list of DLT members unless requested by TEA. The LEA is required to submit the name of the DCSI through ISAM (Stage 3 & 4).

2018-2019 PBMAS Resources: 2018 PBMAS Manual: https://tea.texas.gov/pbm/PBMASManuals.aspx. The following resources are available at https://tea.texas.gov/Student_Testing_and_Accountability/Monitoring_and_Interventions/Program_Monitoring_and_Interventions/Performance-Based_Monitoring/: Performance-Based Monitoring Analysis System (PBMAS) interventions guidance; 2018-2019 Interventions and Submissions: PBMAS Districts (calendar); 2018-2019 PBMAS Staging Framework; District Targeted Improvement Plan.

PBMAS Masked Reports and Download Files: The 2018 Performance-Based Monitoring Analysis System (PBMAS) masked district/open-enrollment charter reports and masked data download files are available at the following links: https://rptsvr1.tea.texas.gov/pbm/distrpts.html (masked district/open-enrollment charter reports); https://rptsvr1.tea.texas.gov/pbm/download.html (masked data download).

Stage 3 or 4 ONLY

Bilingual Education/English as a Second Language

Bilingual/ESL Indicators BE/ESL Indicator #1(i-v): BE STAAR 3-8 Passing Rate BE/ESL Indicator #2(i-v): ESL STAAR 3-8 Passing Rate BE/ESL Indicator #3(i-v): LEP (Not Served in BE/ESL) STAAR 3-8 Passing Rate BE/ESL Indicator #4(i-v): LEP Year-After-Exit (YAE) STAAR 3-8 Passing Rate BE/ESL Indicator #5(i-iv): LEP STAAR EOC Passing Rate BE/ESL Indicator #6: LEP Annual Dropout Rate (Grades 7-12) BE/ESL Indicator #7: LEP Graduation Rate BE/ESL Indicator #8: TELPAS Reading Beginning Proficiency Level Rate (Report Only) BE/ESL Indicator #9: TELPAS Composite Rating Levels for Students in U.S. Schools Multiple Years (Report Only)

2018 PBMAS Preview: BE/ESL #1(i-v) — BE STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview. *The STAAR 3-8 passing rate is based on STAAR, STAAR Spanish, and STAAR Alternate 2.

2018 PBMAS Preview: BE/ESL #2(i-v) — ESL STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: BE/ESL #3(i-v) — LEP (Not Served in BE/ESL) STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: BE/ESL #4(i-v) — LEP YAE STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: BE/ESL #5(i-iv) — LEP STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: BE/ESL #6 — LEP Annual Dropout Rate (Grades 7-12). 2018 PBMAS: No changes.

2018 PBMAS Preview: BE/ESL #7 — LEP Graduation Rate. 2017 PBMAS: Add RI; two years of data available for analysis. 2018 PBMAS: No changes.

2018 PBMAS Preview: BE/ESL #8 — TELPAS Reading Beginning Proficiency Level Rate. 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: Align PLs with the TELPAS Composite Rating indicator: PL 0 = 0% to 7.5% (no change); PL 1 = 7.6% to 10.5%; PL 2 = 10.6% to 14.4%; PL 3 = 14.5% to 100%. Report only, due to the changes in TELPAS and the timing of the standard setting in late summer. *For 2017 and prior, composite ratings were calculated using, in part, the student's TELPAS Listening and Speaking (grades 2-12) performance as determined by a holistic rating system. In 2018, the composite ratings will instead use the student's TELPAS Listening and Speaking performance as determined by the new item-based standardized assessments. Report only for BE/ESL #8 and #9 is proposed due to the changes in TELPAS and the timing of the standard setting in late summer.
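The aligned PL cut points for BE/ESL #8 can be read as a simple mapping from a district's Beginning-proficiency rate to a performance level. The sketch below uses only the cut points listed on this slide; note that lower rates are better here (the rate counts students still at the Beginning level), and that in 2018 the indicator is report-only, so no PL is actually assigned.

```python
# Sketch of the aligned PL cut points for BE/ESL #8 (TELPAS Reading
# Beginning Proficiency Level Rate), using the percentages on the slide.
# Lower rates are better: the rate measures students still at Beginning.
# Report-only in 2018, so PBMAS does not actually assign this PL this year.

def telpas_beginning_pl(rate):
    """Map a Beginning-proficiency rate (percent) to a PL, 0 = best."""
    if rate <= 7.5:
        return 0
    elif rate <= 10.5:
        return 1
    elif rate <= 14.4:
        return 2
    else:
        return 3

print(telpas_beginning_pl(6.0))   # -> 0
print(telpas_beginning_pl(12.0))  # -> 2
```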

2018 PBMAS Preview: BE/ESL #9 — TELPAS Composite Rating Levels for Students in U.S. Schools Multiple Years. 2017 PBMAS: No changes. 2018 PBMAS: Report only, due to the changes in TELPAS and the timing of the standard setting in late summer (see the TELPAS note under BE/ESL #8).

Career and Technical Education

Career and Technical Education Indicators CTE Indicator #1(i-iv): CTE STAAR EOC Passing Rate CTE Indicator #2(i-iv): CTE LEP STAAR EOC Passing Rate CTE Indicator #3(i-iv): CTE Economically Disadvantaged STAAR EOC Passing Rate CTE Indicator #4(i-iv): CTE SPED STAAR EOC Passing Rate CTE Indicator #5: CTE Annual Dropout Rate (Grades 9-12) CTE Indicator #6: CTE Graduation Rate CTE Indicator #7: CTE Nontraditional Course Completion Rate-Males CTE Indicator #8: CTE Nontraditional Course Completion Rate-Females

2018 PBMAS Preview: CTE #1(i-iv) — CTE STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: CTE #2(i-iv) — CTE LEP STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add PL 4 for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: CTE #3(i-iv) — CTE Economically Disadvantaged STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: CTE #4(i-iv) — CTE SPED STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add PL 4 for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: CTE #5 — CTE Annual Dropout Rate (Grades 9-12) and CTE #6 — CTE Graduation Rate. 2018 PBMAS: No changes.

2018 PBMAS Preview: CTE #7 — CTE Nontraditional Course Completion Rate-Males and CTE #8 — CTE Nontraditional Course Completion Rate-Females. 2017 PBMAS: Implement new course list; Report only; one year of data available for analysis; no RI or SA. 2018 PBMAS: Add PL assignment: PL 0 = 40.0% to 100%; PL 1 = 23.0% to 39.9%; PL 2 = 15.0% to 22.9%; PL 3 = 0% to 14.9%. *Two courses were removed from Nontraditional for Females in 2017: Practicum in Transportation, Distribution and Logistics I and II.
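The new PL assignment for CTE #7/#8 works in the opposite direction from most rate indicators: a higher nontraditional completion rate earns a better (lower) PL. A minimal sketch of that mapping, using only the cut points listed above (real PBMAS reports also apply MSR and rounding rules not modeled here):

```python
# Minimal sketch of the 2018 PL assignment for CTE #7/#8 (nontraditional
# course completion rate), using the cut points on the slide above.
# Higher completion rates are better here; PL 0 is the best level.

def nontrad_pl(rate):
    """Map a nontraditional completion rate (percent) to a PL, 0 = best."""
    if rate >= 40.0:
        return 0
    elif rate >= 23.0:
        return 1
    elif rate >= 15.0:
        return 2
    else:
        return 3

print(nontrad_pl(45.2))  # -> 0
print(nontrad_pl(18.0))  # -> 2
```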

PBMAS Data Connections for CTE Are CTE students graduating? Dropping out? How are your CTE students performing on EOC tests? Collaborate with core teachers to determine standards where students are struggling. Do CTE courses address similar standards? Determine strategies to make connections. What can CTE teachers do to provide support to students taking EOC tests? Are students taking and completing nontraditional courses? Why or why not? Are there barriers in place?

Every Student Succeeds Act

Every Student Succeeds Act ESSA Indicator #1(i-v): Title I, Part A STAAR 3-8 Passing Rate ESSA Indicator #2(i-iv): Title I, Part A STAAR EOC Passing Rate ESSA Indicator #3: Title I, Part A Annual Dropout Rate (Grades 7-12) ESSA Indicator #4: Title I, Part A Graduation Rate ESSA Indicator #5(i-v): Migrant STAAR 3-8 Passing Rate ESSA Indicator #6(i-iv): Migrant STAAR EOC Passing Rate ESSA Indicator #7: Migrant Annual Dropout Rate (Grades 7-12) ESSA Indicator #8: Migrant Graduation Rate

2018 PBMAS Preview: ESSA #1(i-v) — Title I, Part A STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: ESSA #2(i-iv) — Title I, Part A STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: ESSA #3 — Title I, Part A Annual Dropout Rate (Grades 7-12) and ESSA #4 — Title I, Part A Graduation Rate. 2018 PBMAS: No changes.

2018 PBMAS Preview: ESSA #5(i-v) — Migrant STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: ESSA #6(i-iv) — Migrant STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: ESSA #7 — Migrant Annual Dropout Rate (Grades 7-12) and ESSA #8 — Migrant Graduation Rate. 2018 PBMAS: No changes.

Special Education

Special Education Compliance Review: Only LEAs assigned a Stage 3 or 4 for their special education program will conduct a compliance review for each PBMAS special education indicator assigned a PL of 2 or higher. Resources are available in ISAM and on the Division of School Improvement webpage to assist LEAs with completing the compliance review. LEAs retain the compliance review itself and submit it to TEA only if requested. However, LEAs at Stages 3 and 4 are required to submit a completed Special Education Compliance Review Summary to TEA by February 15. https://tea.texas.gov/pmi/SPEDmonitoring/

Special Education Compliance Review: For LEAs at any level of staging, if noncompliance is identified during the review process, the LEA will develop a Corrective Action Plan (CAP) that outlines the activities and steps the LEA will take to correct all substantiated findings of noncompliance, including ensuring that: policies and procedures, including operating guidelines and practices, are reviewed and revised as necessary; professional development is provided to identified staff; admission, review, and dismissal (ARD) committee meetings are convened to address the noncompliance and, when required, to determine whether the noncompliance denied students a free appropriate public education (FAPE) and to consider compensatory services, as appropriate; and monitoring activities are developed and carried out to ensure ongoing compliance. The LEA is required to correct any finding of noncompliance as soon as possible, and in no case may the correction take longer than one calendar year from the date of notification of noncompliance.

Special Education SPED Indicator #1(i-v): SPED STAAR 3-8 Passing Rate SPED Indicator #2(i-v): SPED Year-After-Exit (YAE) STAAR 3-8 Passing Rate SPED Indicator #3(i-iv): SPED STAAR EOC Passing Rate SPED Indicator #4: SPED STAAR Alternate 2 Participation Rate SPED Indicator #5: SPED Annual Dropout Rate (Grades 7-12) SPED Indicator #6: SPED Graduation Rate SPED Indicator #7: SPED Regular Early Childhood Program Rate (Ages 3-5) SPED Indicator #8: SPED Regular Class ≥80% Rate (Ages 6-21)

Special Education Continued SPED Indicator #9: SPED Regular Class ˂40% Rate (Ages 6-21) SPED Indicator #10: SPED Separate Settings Rate (Ages 6-21) SPED Indicator #11: SPED Representation (Ages 3-21) SPED Indicator #12: SPED OSS and Expulsion ≤10 Days Rate (Ages 3-21) SPED Indicator #13: SPED OSS and Expulsion >10 Days Rate (Ages 3-21) SPED Indicator #14: SPED ISS ≤10 Days Rate (Ages 3-21) SPED Indicator #15: SPED ISS >10 Days Rate (Ages 3-21) SPED Indicator #16: SPED Total Disciplinary Removals Rate (Ages 3-21)

2018 PBMAS Preview: SPED #1(i-v) — SPED STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #2(i-v) — SPED YAE STAAR 3-8 Passing Rate (M, R, S, SS, W). 2017 PBMAS: Add SA; three years of data available for analysis. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #3(i-iv) — SPED STAAR EOC Passing Rate (M, S, SS, ELA). 2017 PBMAS: Add RI for ELA; Add SA (except ELA); three years of data available for analysis (two years for ELA). 2018 PBMAS: Three years of data available for ELA; Add SA for ELA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #4 — SPED STAAR Alternate 2 Participation Rate. 2018 PBMAS: No changes.

2018 PBMAS Preview: SPED #5 — SPED Annual Dropout Rate. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #6 — SPED Graduation Rate. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #7 — SPED Regular Early Childhood Program Rate (Ages 3-5). 2017 PBMAS: Three years of data available for analysis. 2018 PBMAS: Three years of data available; Add SA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #8 — SPED Regular Class ≥80% Rate (Ages 6-21). 2017 PBMAS: Add RI; Discontinue SD RO by race/ethnicity; two years of data available for analysis. 2018 PBMAS: Three years of data available; Add SA. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #9 — SPED Regular Class <40% Rate (Ages 6-21). 2017 PBMAS: Revise the indicator based on final federal regulations under 34 CFR Part 300, issued by USDE on December 19, 2016; Assign SD Year 1 PLs based on race/ethnicity; Add RI to the overall component, where two years of data are available. 2018 PBMAS: Three years of data available; Add SA to the overall component. SPED #10 — SPED Separate Settings Rate (Ages 6-21). 2017 PBMAS: New indicator required by final federal regulations under 34 CFR Part 300, issued by USDE on December 19, 2016; overall component is Report Only. 2018 PBMAS: Report Only for the overall component.

2018 PBMAS Preview: SPED #11 — SPED Representation (Ages 3-21). 2017 PBMAS: One integrated indicator based on final federal regulations under 34 CFR Part 300, issued by USDE on December 19, 2016; Assign SD Year 1 PLs based on race/ethnicity and disability category. 2018 PBMAS: No changes. Additional provisions pertaining to this indicator will be covered in the 2018 PBMAS Other System Components preview.

2018 PBMAS Preview: SPED #12 — SPED OSS and Expulsion ≤10 Days Rate (Ages 3-21). 2017 PBMAS: Previously previewed in the 2017 DDV. 2018 PBMAS: Add overall PL assignment as Report Only. *Indicators #12-16 were previewed in DDV during fall 2017. Due to the U.S. Department of Education's proposed delay of the regulations (20 U.S.C. 1418(d) and 34 CFR 300.646 and 300.647) SD implementation from July 1, 2018 to July 1, 2020, PBMAS will assign overall Report Only.

2018 PBMAS Preview: SPED #13 — SPED OSS and Expulsion >10 Days Rate (Ages 3-21). 2017 PBMAS: Previously previewed in the 2017 DDV. 2018 PBMAS: Add overall PL assignment as Report Only.

2018 PBMAS Preview: SPED #14 — SPED ISS ≤10 Days Rate (Ages 3-21). 2017 PBMAS: Previously previewed in the 2017 DDV. 2018 PBMAS: Add overall PL assignment as Report Only.

2018 PBMAS Preview: SPED #15 — SPED ISS >10 Days Rate (Ages 3-21). 2017 PBMAS: Previously previewed in the 2017 DDV. 2018 PBMAS: Add overall PL assignment as Report Only.

2018 PBMAS Preview: SPED #16 — SPED Total Disciplinary Removals Rate (Ages 3-21). 2017 PBMAS: Previously previewed in the 2017 DDV. 2018 PBMAS: Add overall PL assignment as Report Only.

Other System Components

2018 PBMAS Preview: Other System Components. Required Improvement Calculation — 2018 PBMAS: No changes. CTE Tech Prep Status — 2017 PBMAS: The CTE Tech Prep code 3 was discontinued in the 2016-2017 Texas Student Data System (TSDS); however, it was available on the 2016-2017 STAAR EOC answer documents and is included in the accountable district's results. 2018 PBMAS: The discontinued code was likewise available on the Summer 2017 STAAR EOC answer documents and is included in the accountable district's results.

2018 PBMAS Preview: Other System Components. Format of Performance Levels — 2017 PBMAS: Available performance levels were ND, NA, NA DI, NA SA, 0, 0 RI, 0 SA, 1, 1 SA, 2, 2 SA, 3, 3 SA, 4, 4 SA, Report Only, and SD (Year 1). 2018 PBMAS: Add SD (Year 2).

Engaging in Continuous Improvement

Continuous Improvement Process Data Analysis Root Cause Analysis Strategy Selection and Planning Implementation and Monitoring

Focused Data Analysis

Data Analysis: Analyze data for each PBMAS indicator with a PL of 2 or higher and/or an area of noncompliance with an FRE. Identify specific campuses contributing to areas of low performance or noncompliance, and target those campuses for interventions. Use multiple data sources to examine areas that may have an impact. Identify why a campus did not meet standard and where the performance gaps are. Write data-driven problem statements and annual goals on which to base improvement planning.

Fahrig, R. SI Reorg Presentation: DCSI, 06/28/18. Key Idea: Analyzing our data as a first step lets us know exactly what we need to plan for and how much we need to accomplish. It tells us where we are now and how much we need to do to get to where we want to go.

Key Idea: When looking at state accountability data, evaluate by content area, grade level, teacher level, and student group to uncover where performance gaps really are.

A district report such as PBMAS is made up of data from campuses. Districts must understand this relationship.

ELA/Reading & Math only: targets are different for different student groups!

The first table shows the targets and whether you met them; the second table shows your data.

Trend Data: ESSA Indicators Must Address

Problem Statement Development: Use data analysis to determine what problems exist that are contributing to the ineffective program areas. A problem statement should: capture "where the LEA or program is" compared to "where the LEA or program wants to be"; be concise and objective; and not assign causation. Example: The district received a PL of 3 in social studies (SS) on the 2018 STAAR 3-8 for ESSA students. The district scored 40% in SS at the Meets Grade Level performance standard on the 2018 STAAR 3-8.

Annual Goal Annual goals should be aligned to the Closing the Gaps domain in the accountability system. An annual goal should: capture “where the LEA or program is” compared to “where the LEA or program wants to be”; be specific; and be measurable. Example: The district will achieve a PL of 0 in SS on the 2019 PBMAS report. The district will achieve 65% in SS and 70% in Writing on the 2019, 3-8 STAAR for ESSA students. Use the quadrant template provided to record your problem statement and annual goal.

Guided Root Cause Analysis Tracy Patrick

Key Idea Say (1 min): (show Slide 4) Identifying the root cause of low performance allows us to remove the condition that is causing (and will continue to cause) low performance. Key Idea: We analyze root causes so we can identify the real reason we are underperforming and select a strategy that creates sustainable gains in student achievement. updated 8/13/18

ESC staff adds introduction and training logistics/norms here. Say (1 min): In the next two hours, we are going to learn how to identify the root cause for our areas of low performance (the problem statements).

Objectives Say (5 mins): (show Slide 5) At the end of this session, you will be able to: engage in guided dialogue with stakeholder groups to brainstorm possible root causes; and validate and prioritize root causes. We will look at a couple of case studies to determine why these actions are important for planning, and then we will practice having a guided discussion to conduct a root cause analysis. When you return to your campus, you will be able to implement these tools for your own root cause analysis.

NEEDS ASSESSMENT

Key Idea Say (3 min): (show Slide 7) Key Idea: What we see from this case study is that we set ourselves up for successful planning when we invite a variety of stakeholders to the table to have a guided discussion and prioritize root causes.

What you will need…

How to Use the Guided Discussion Tool Under each topic: Start with the first element and GO IN ORDER. Determine whether your campus does that element consistently and with fidelity. If yes, provide evidence or artifacts to validate, then continue to the next element. If your answer is no, write the question on the Guided Discussion Documentation sheet, and then move to the next topic. Repeat for each topic. Take the Guided Discussion worksheet from your packet; you are going to use this document for your conversation. (show Slide 11) (Facilitator explains how to use the worksheet and where to record answers.) Since you don’t have all your campus data/artifacts with you today, we will just answer the questions based on your campus experience. (Remember, this is practice!) Take 10 minutes to work through the questions and record your answers in your note guide. (Facilitator walks around to see at what level participants are stopping.)

Key Idea Say (2 min): (show Slide 13) Key Idea: The 5 whys help us identify the barriers that prevent us from implementing systems and processes.

The Guided 5 Whys Ask and answer “why” 5 times for the question you stopped at. The first 2 why questions are set for you! Example: Under teaching and learning, our first NO answer was “Campus instructional leaders have consistent, documented expectations for maximizing instructional time and delivering effective instruction.” Why wasn’t this system/process implemented? Because we want teachers to work with their own style of instruction. Why didn’t we achieve success when we let teachers work with their own style of instruction? Because our teachers were inconsistent with the quality and rigor of instruction. Why were teachers inconsistent? Because we didn’t monitor quality and rigor consistently. Why didn’t we monitor consistently? Because we didn’t establish a system for observations and feedback that worked with our schedule. Why didn’t we create this system? Because we did not prioritize it. Say (3 mins): (show Slide 14) Now we are going to use a traditional root cause analysis tool, the 5 whys, to uncover the root cause that kept us from taking this step. Our 5 whys are also a bit more guided. For this practice, we will just use the question you stopped at in teaching and learning. (Facilitator reads over the example on Slide 14.) Take 10 minutes with your group to do this.

Guided 5 Whys Reflection Of the barriers you identified in the because statements: Which are due to mindsets? Which are due to action (or inaction)? Which are due to resources? Say (3 min): (show Slide 15) You have identified some of the barriers or threats that may have prevented you from implementing this system or process. You’ll want to keep these in mind when you write (and implement!) your plan. In your note-taking guide, answer these questions to focus your ideas for planning (take 2 minutes).

Determine which root cause has the highest leverage.

Key Idea Say (2 min): (show Slide 16) Key Idea: Removing barriers and ensuring that the conditions exist for campus improvement is one of the key responsibilities of the District Coordinator of School Improvement.

Writing a Root Cause Statement The root cause statement should: identify the system or process that was missing that led to low performance; and identify one or more key barriers that kept that system or process from being implemented. Example: Campus instructional leaders did not create consistent, documented expectations for delivering effective instruction because we prioritized teacher autonomy over student outcomes. Say (6 mins): (show Slide 17) Our last step in this section is to write the root cause statement that would go in our plan. With your table team, develop your root cause statement. The example we worked with earlier is written as a root cause statement on the slide: the system that was lacking is highlighted in yellow; the key barrier is highlighted in green. Take 5 minutes to do this. (Facilitator walks around to answer questions and help teams identify the biggest barrier(s).)

Aligned Strategy Identification Kara Fluty

Key Idea Say (1 min): (show Slide 4) Our work today will be about what campuses need to do to select an appropriate strategy. Key Idea: The best strategies are those that are aligned and scaffolded to the root cause of low performance.

Objectives Say (5 mins): (show Slide 5) Today, you are going to practice how to: identify the highest-leverage strategies that directly align with the root cause and address the source of performance gaps; and prioritize strategies in a scaffolded and sequenced manner. We will look at a couple of case studies to identify what makes a strategy aligned and prioritized, and then we will practice selecting aligned and prioritized strategies. When you return to your campus, you will be able to use your root cause and data to drive your strategy selection.

What a good strategy consists of: •Three to five strategies (“bets”) that outline the big areas of focus, i.e., what’s going to drive improvement •A series of strategic initiatives under each bet that identify the specific ways to make that bet a reality in practice •It can fit on one page! But how do you choose what to land on? …

Strategies to Consider Evidence for ESSA - https://www.evidenceforessa.org/ What Works Clearinghouse - https://ies.ed.gov/ncee/wwc/ TEA Strategic Priorities - https://www.region10.org/programs/title-i-capacity-building-initiative/tea-priorities/ ESSA - Using Evidence-Based Practices to Better Student Outcomes - https://www.region10.org/programs/title-i-capacity-building-initiative/essa-support/evidence-based/

Problem Statement and Annual Goal Review: Root Cause Presenter Notes: First, we need to gather the problem statements that we identified in our data analysis. Here we show a sample problem statement.

Linking the strategy to the root cause Problem Statement: Students who are English language learners have a 60% pass rate in reading. Annual Goal: Students who are English language learners will have a 70% pass rate in reading. Root Cause Strategy …because… Presenter Notes: We choose a strategy based on the root cause because the root cause is the reason the problem happened. Let’s look at an example.

Linking the strategy to the root cause Problem Statement: Students who are English language learners have a 60% pass rate in reading. Annual Goal: Students who are English language learners will have a 70% pass rate in reading. Root Cause: Administrators were not holding teachers accountable for implementing language strategies. Strategy …because… Presenter Notes: Here is a sample root cause for this problem statement. Discussion question: How might our strategy identification differ if we based it on the problem statement rather than on the root cause?

Selecting a strategy Problem Statement: Students who are English language learners have a 60% pass rate in reading. Annual Goal: Students who are English language learners will have a 70% pass rate in reading. Root Cause: Administrators were not holding teachers accountable for implementing language strategies. Strategy: Broad approach …because… Presenter Notes: Now we need to select a strategy that will help us address the adult behaviors in the root cause. Keep in mind that the strategy needs to be the BROAD approach to resolving the root cause.

Selecting the Strategy What’s the impact on student learning? What’s the ease of implementation?

Objectives Today’s session will help campuses at all IR levels, whether you are fine-tuning a plan or writing it for the first time, because we are going to practice the following (show Slide 3): Identify the action steps (in the proper sequence) necessary to effectively implement the aligned strategy. Determine what resources are needed to conduct those action steps (personnel, materials, time). Assign roles and responsibilities for implementation and monitoring. Establish metrics, milestones, and evidence of implementation fidelity that will measure progress at critical times through the school year. We will look at a sample improvement plan first, and then we will practice drafting an implementation plan. When you return to your campus, you will be able to use these skills to develop or update your improvement plan in a way that ensures you fully implement your chosen strategy.

Key Idea (show Slide 6) Key Idea: A strategy is only effective if the implementation plan is well written, with measurable outcomes and accountability for all stakeholders.

Practice: Writing an Implementation Plan (show Slide 7) Take a minute to read over the Problem Statement, Root Cause, Strategy, and Annual Goal. Note the alignment of these elements: the problem statement exists because of the root cause, so the campus will implement this strategy to resolve the root cause and achieve the annual goal. Remember that these elements must all be aligned in your plan. (Wait 1 minute for review.) Problem Statement and Root Cause: 75% of 5th graders did not meet grade level in math because we failed to adhere to teacher induction practices, and our new teachers struggled. Strategy and Annual Goal: We will develop and monitor a year-long teacher orientation program in which master teachers and instructional coaches provide 1:1 coaching for new teachers, so that we can reduce the number of 5th grade students who did not meet grade level in math by 20%.

Key Idea Say (2 mins): (show Slide 12) If any of these elements are missing, you will want to add them as you work on resources. Key Idea: The actions are the skeleton of the plan and must include: training for all staff; observations and feedback on training implementation; and a way to measure student progress.

Actions Required List the actions required to implement each strategy. Are these actions short-term, intermediate, or long-term? Who will be responsible for each action? What resources will be needed?

Key Idea Say (3 mins): Now that we have our action steps identified and sequenced, we need to make sure all these things become a reality. (show Slide 13) Key Idea: A plan only works if everyone knows what they need to do and if they are held accountable for doing it.

Key Idea Say (1 min): (show Slide 22) Key Idea: Goals measure results; they don’t just check off whether an action was completed.

Make Sure Your Goal Is S.M.A.R.T. Each goal should be Specific, Measurable, Attainable, Relevant, and Time-bound. Identify a goal for each activity.

Implementation

Reminders If you are already engaging in the continuous improvement process for accountability purposes, you may adjust the targeted improvement plan you have developed to include all components required to meet the minimum criteria of the intervention activities. Districts can complete a targeted improvement plan using the template of their choice; no parameters will be given regarding which template to use during the 2018-19 monitoring activities.

TEA Contact For questions about the PBMAS manual or the indicators described in the manual, please contact: Performance-Based Monitoring (512) 936-6426 pbm@tea.texas.gov