Division of Instructional Support Office of School Improvement, Accountability & Compliance

• A system implemented to address House Bill 3459 of the 78th Texas Legislature, Regular Session (2003), which:
  - Limited and redirected the Texas Education Agency's monitoring activities
  - Included a new performance-based section on bilingual education
  - Included new local board of trustees' responsibilities for ensuring school district compliance with all applicable requirements of state programs
  - Included an emphasis on data integrity
• First year of implementation
• A DATA-DRIVEN analysis system that focuses on STUDENT PERFORMANCE and PROGRAM EFFECTIVENESS
• Utilizes "performance indicators" and validation of "data integrity"

1. Using a data-driven, performance-based model to observe, evaluate, and report on the public education system at the individual student group, campus, local education agency (LEA), regional, and statewide levels across diverse areas, including program effectiveness; compliance with federal and state laws and regulations; financial management; and data integrity, for the purpose of assessing whether student needs are being met;
2. Promoting diagnostic and evaluative systems in LEAs that are integrated with the agency's desk audit and intervention process; and
3. Relying on a research-based framework of interventions that ensure compliance and enhance student success.

• School District Effectiveness: PBMAS is designed to assist school districts and charters in their efforts to improve local performance.
• Statutory Requirements: PBMAS is designed to meet statutory requirements.
• Valid Indicators of Performance: PBMAS indicators are designed to reflect critical areas of student performance, program effectiveness, and data integrity.
• Maximum Inclusion: PBMAS is designed to evaluate a maximum number of school districts and charters by using appropriate alternatives to analyze the performance of districts with small numbers of students.
• Individual Program Accountability: PBMAS evaluations are structured to ensure that low performance in one program area cannot be masked by high performance in other program areas or lead to interventions in program areas where performance is high.

• High Standards: PBMAS is designed to encourage high standards for all students in all districts and charters. Standards will be adjusted over time to ensure high expectations continue to be set.
• Annual Statewide Evaluation: PBMAS allows for the annual evaluation of a maximum number of school districts and charters in the state, and all evaluated school districts can access their PBMAS performance data on a yearly basis.
• Public Input and Accessibility: The design, development, and implementation of PBMAS are all informed by ongoing public input. Performance information that PBMAS generates is accessible to the public.
• System Evolution: PBMAS is a dynamic system in which indicators are added, revised, or deleted in response to changes and developments that occur outside of the system, including new legislation and the development of new assessments.
• Coordination: PBMAS is part of an overall agency coordination strategy for the performance-based evaluation of school districts and charters.

• Student Performance
• Program Effectiveness
• Compliance with State & Federal Requirements
• Data Quality & Integrity

PBMAS comprises four program areas:
• Special Education
• Bilingual/ESL
• Career and Technical Education
• No Child Left Behind

• Bilingual/ESL Program: 13 indicators
• NCLB Program: 8 indicators
• CTE Program: 10 indicators
• Special Education Program: 18 indicators

• Standards
  - The quantifiable level of minimally acceptable performance against which individual district and charter performance is measured.
• Types of Standards
  - Absolute Standard – tied to an absolute requirement or goal that all districts have the possibility of achieving each year.
  - Relative Standard – not tied to an absolute requirement or goal. Relative standards may be used in PBMAS to determine a baseline absolute standard for certain indicators in cases where an absolute standard is not yet possible (e.g., for new indicators) or may not be appropriate given the purpose of a particular indicator.

• Performance Level
  - The result that occurs when a standard is applied to a district's performance on an indicator.

• In PBMAS, the state accountability standard for Academically Acceptable is used as the point at which performance level 0 (Met Standard) is set for TAKS indicators.
• The standards for performance levels 1, 2, and 3 are based on how far a district's performance falls from the standard.
• PBMAS uses 5 general performance levels:
  - NE (Not Evaluated)
  - 0 (Met Standard)
  - 1 (Did Not Meet Standard)
  - 2 (Did Not Meet Standard)
  - 3 (Did Not Meet Standard)
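To make the distance-from-standard idea concrete, here is a minimal Python sketch. The 5- and 10-point bands are purely illustrative assumptions; actual PBMAS cut points vary by indicator and year and are published by TEA.

```python
def performance_level(rate: float, standard: float) -> int:
    """Assign a PBMAS-style performance level to one indicator.

    `rate` is the district's passing rate (percent) and `standard` is
    the minimally acceptable rate. The bands below are illustrative
    only; actual PBMAS cut points vary by indicator and year.
    """
    gap = standard - rate              # how far below the standard
    if gap <= 0:
        return 0                       # Met Standard
    elif gap <= 5:
        return 1                       # Did Not Meet Standard (nearest band)
    elif gap <= 10:
        return 2
    else:
        return 3                       # farthest from the standard

# Example: with a 70% standard, a district passing 62% lands at level 2.
assert performance_level(62.0, 70.0) == 2
```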

• There are two types of special analysis in PBMAS:
  - Automated Special Analysis (SA) – a tool that can be used to analyze the performance of districts and charters with small numbers of students. Will be used in PBMAS.
  - Professional Judgment Special Analysis (PJSA)
• Note the annotations for performance levels that are based on automated special analysis and professional judgment special analysis:
  - NE (Not Evaluated)
  - 0SA/0PJSA (Met Standard)
  - 1SA/1PJSA (Did Not Meet Standard)
  - 2SA/2PJSA (Did Not Meet Standard)
  - 3SA/3PJSA (Did Not Meet Standard)
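The SA annotation can be thought of as a suffix attached when the evaluated group falls below a minimum size. A small sketch building on the performance_level function above; the 30-student threshold is a made-up placeholder, not TEA's actual minimum group size.

```python
MIN_GROUP_SIZE = 30  # placeholder value; TEA defines the real threshold

def annotated_level(rate: float, standard: float, n_students: int) -> str:
    """Return the performance level as a string, suffixed with 'SA'
    when automated special analysis applies to a small group."""
    level = performance_level(rate, standard)   # sketch defined earlier
    suffix = "SA" if n_students < MIN_GROUP_SIZE else ""
    return f"{level}{suffix}"

# Example: 12 tested students just below standard -> '1SA' rather than '1'.
assert annotated_level(67.0, 70.0, 12) == "1SA"
```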

• See handout

Summary of Interventions
• Stage that triggers a TEA visit:
  - Bilingual Education – Stage 4
  - CTE – Stage 4
  - NCLB – Stage 4
  - Special Education – Stage 4

• Interventions are not one-size-fits-all. When higher levels of agency involvement are needed, they are individually designed based on specific LEA data and identified issues.
• The primary focus is a continuous improvement plan with strategies and activities that positively impact student performance and program effectiveness.
• TEA follow-up on implementation of the CIP is a given.
• Performance-level data are analyzed, and patterns or trends across indicators and program areas are examined, to inform intervention decision-making.
• Both the extent and the duration of a district's area(s) of low performance or program ineffectiveness are taken into account.

Bilingual Education/ESL Monitoring
• Focus Data Analysis
• Focus Data Analysis and System Analysis
• Public Program Performance Review (LEA Public Meeting)
• Program Effectiveness BE-ESL On-Site Review
• Continuous Improvement Plan

Career and Technical Education Monitoring
• Focus Data Analysis and System Analysis
• Compliance Review
• CTE On-Site Review
• Program Access Review
• Continuous Improvement Plan
• Corrective Action Plan

NCLB Program Monitoring
• Initial Compliance Analysis (ICA)
• Focus Data Analysis
• Public Program Performance Review (LEA Public Meeting)
• NCLB On-Site Review
• Continuous Improvement Plan
• Corrective Action Plan

Special Education Monitoring
• Focus Data Analysis
• Focus Data Analysis and System Analysis
• Public Program Performance Review (LEA Public Meeting)
• Compliance Review
• Special Education On-Site Review
• Continuous Improvement Plan
• Corrective Action Plan

• A focused review of data indicators for which a higher level of performance concern has been identified.
• Traditionally requires a specified core team of individuals to gather, disaggregate, and review data to determine possible causes for the performance concern.
• Results of the analysis generally are reflected as findings (strengths and areas in need of improvement).
• The core team must review pertinent data and complete the FDA template:
  - Each indicator with a performance level of 2 or 3 must be addressed within the template.
  - Describe issues and findings.
  - Identify data sources reviewed.
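In spreadsheet or code terms, populating the template starts as a simple filter on performance levels. A minimal sketch; the report structure (a list of dicts) and the indicator names are illustrative, not TEA's file format.

```python
# Illustrative PBMAS results; real values come from the district's report.
pbmas_report = [
    {"program": "BE/ESL", "indicator": "TAKS Math (LEP)",  "level": 3},
    {"program": "CTE",    "indicator": "TAKS ELA (CTE)",   "level": 0},
    {"program": "SPED",   "indicator": "TAKS Math (SPED)", "level": 2},
]

# Every indicator at performance level 2 or 3 gets an FDA template row.
fda_rows = [r for r in pbmas_report if r["level"] >= 2]
for row in fda_rows:
    print(f"{row['program']} - {row['indicator']} (PL {row['level']}): "
          "describe issues/findings; identify data sources reviewed")
```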

Data Sources Reviewed
• Reading and math TAKS scores, disaggregated by special population, campus, and grade level
• Summary report of test performance of LEP students
• Benchmark scores as provided by district/campus data analysis programs
• Master schedule
• Teacher certifications
• Staff development records
• District improvement plan
• Campus improvement plans
• PEIMS reports
• AEIS reports
• Lesson plans
• Course syllabi
• ASEIT reports of disaggregated data (by student expectation)
• TAKS remediation attendance rosters
• LEP/CTE 4-year plans
• TELPAS results
• Teacher interviews
• PBMAS data

• A process through which instances of performance concern and/or noncompliance are addressed through the identification of desired results, evidence of change, activities, resources, and interim and final review timelines that drive positive program change.
• Emphasis is on a continuous improvement process that promotes improved student performance and program effectiveness over time.
• Improvement planning occurs in a team environment, with required and recommended participants identified.

• Information from the system analysis and the FDA must be integrated into the continuous improvement planning process; this occurs at the district level.

• All district-level reviews should lead to interventions and/or improvements to the program.
• This is NOT a cyclical system; review is continuous, and progress is monitored.
• The bottom line is that the state systems are now interrelated, and campus/district teams must work together to improve student performance.

• Why was the district selected for an on-site review, and how did the campus contribute to that selection?
• What did the Focused Data Analysis show?
• What is in the CIP?
• What activities in the CIP should campus staff be doing to address the targeted needs?
• What, specifically, is the campus doing to meet the PBMAS standards?

Submission Deadlines – Bilingual Education/ESL
• Stage 1A: October 22, 2010 (Stage 1A submits only if selected through random/stratified selection)
• Stage 1B: October 22, 2010
• Stage 2: November 12, 2010
• Stage 3: November 19, 2010
• Stage 4: TEA timelines TBD on a case-by-case basis

Submission Deadlines – Career and Technical Education
• Stage 1: CTE staff reviews improvement activities in the Perkins eGrant PER; no additional submission required
• Stage 2: October 22, 2010
• Stage 3: November 19, 2010
• Stage 4: TEA timelines TBD on a case-by-case basis

Submission Deadlines – No Child Left Behind
• Stage 1: October 22, 2010
• Stage 2: October 22, 2010
• Stage 3: November 19, 2010
• Stage 4: TEA timelines TBD on a case-by-case basis

Submission Deadlines – Special Education
• Stage 1A: October 22, 2010 (Stage 1A submits only if selected through random/stratified selection)
• Stage 1B: November 19, 2010
• Stage 2: December 10, 2010
• Stage 3: January 14, 2011
• Stage 4: TEA timelines TBD on a case-by-case basis

Enhanced ISAM
Changes were made to underlying data structures and the user interface to improve the following:
• Transparency
• Communication
• Tracking
• Letter Generation
• Reporting


• The bag provided contains color-coded indicators for each program monitored.
• Divide into groups of 2 or 3 and sort all program indicators based on group consensus. (Each group sorts one program area.)
• Name the groups.
• List the categories on the chart tablet.
• Report the final categories.

• What categories were the indicators grouped into?
• Are there any indicators that are exclusive to a program? If so, which ones?
• Are there any considerations for including those exclusive indicators in another category?

• Assessment
• Completion
• Environment
• Identification
• Discipline

We now recognize that:
• While the Bil/ESL, CTE, NCLB, and SPED programs are different in name, they serve one purpose: each individual child.
• While program indicators are evaluated by individual program, they ultimately, and more importantly, indicate the degree of success of every individual child.
• The intent of the services provided through these programs is to recognize this interrelatedness and call for a systemic way to promote continuous improvement and maximize student success.

• PBMAS report
• PBMI staging
• Comprehensive data analysis
• Focus Data Analysis guidance document
• Core team identified
• Conduct Focus Data Analysis (FDA)
• Program-specific analysis and templates
• Develop continuous improvement plan (CIP)

You are the core team for a sample district. Each table has been assigned a program FDA.
• Review the FDA.
• On the Data Analysis Results of the FDA, highlight the factors that pertain to your assigned area: Leadership; Data; or Curriculum and Instruction.
• Identify strategies or initiatives the LEA should consider when creating the CIP to address the causal factors that will impact performance in the assigned category.
• Record CIP strategies or initiatives on the chart tablet, indicating the Data Analysis Result(s) addressed.
(15-minute activity)

Report to the whole group:
• What category did you have?
• Summarize the causal factor(s) you wrote strategies for.
• What strategies did you determine for the CIP to address the causal factors and impact performance in the assigned category?
• For each category and program, are there strategies that are relevant to the other programs?

• Evidence of implementation
• Evidence of impact
• What data sources might the district use to measure progress and impact?

[Figure: Region One 2010 TAKS Administration – Mathematics by Program Participation (Unduplicated Count). Counts shown for Migrant Only, LEP Only, Special Ed Only, and CTE Only students and for overlaps such as LEP & CTE and Migrant & Special Ed.]
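An unduplicated count assigns each student to exactly one cell, so overlapping program rosters must be partitioned rather than simply summed. A minimal sketch with made-up student IDs; real rosters would come from PEIMS and assessment files.

```python
# Made-up rosters of student IDs; real data come from PEIMS/assessment files.
migrant = {"s01", "s02", "s03"}
lep     = {"s02", "s04", "s05"}
sped    = {"s03", "s05", "s06"}
cte     = {"s05", "s07"}

programs = {"Migrant": migrant, "LEP": lep, "Special Ed": sped, "CTE": cte}

# "X Only" cells: students who appear on no other program's roster.
for name, roster in programs.items():
    others = set().union(*(r for n, r in programs.items() if n != name))
    print(f"{name} Only: {len(roster - others)}")

# Overlap cells, e.g. students served by two programs at once.
print("LEP & CTE:", len(lep & cte))
print("Migrant & Special Ed:", len(migrant & sped))
```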

• A targeted on-site review to address program effectiveness concerns related to documented substantial, imminent, or ongoing risks, based on current or longitudinal data.
• Reviewers:
  - Lead focus discussions;
  - Interview stakeholders, service providers, and administrators; and
  - Conduct classroom observations, document reviews, and student data reviews.

• Focus group scheduling
• Maps of district offices and campuses
• Core analysis team activity verification
• FDA data availability
• CIP status of activities
• District and campus information
• Lists (students, teachers, campuses)

• Data requests from the agency
• Folder retrieval system
• Records retrieval system
• Staff development records
• Facilities
• Cross-district collaboration and awareness
• Campus and district staff articulation

• District entry
• Administrator focus group
• Director(s) interviews
• Core team focus group
• Parent focus group
• Various teacher focus groups
• Campus visits
• Folder review
• Case studies
• Data clarification
• District exit

• Campus and district staff articulation
• Implementation with fidelity
• Student progress

"You cannot solve a problem from the same consciousness that created it. You must learn to see the world anew."
– Albert Einstein

• Connie Guerra, B/ESL
• Christina Salas, CTE
• Omar Chavez, Migrant
• Belinda Gorena, Title I
• Kelly Solis, Sp. Ed.