
1 Division of Instructional Support, Office of School Improvement, Accountability & Compliance

2
- A system implemented to address House Bill 3459 of the 78th Texas Legislature, Regular Session (2003), which limited and redirected the Texas Education Agency's monitoring activities.
- Included a new performance-based section on bilingual education.
- Included new local board of trustees' responsibilities for ensuring school district compliance with all applicable requirements of state programs.
- Included an emphasis on data integrity.
- 2004-2005 was the first year of implementation.
- A data-driven analysis system that focuses on student performance and program effectiveness.
- Utilizes "performance indicators" and validation of "data integrity."

3
1. Using a data-driven, performance-based model to observe, evaluate, and report on the public education system at the individual student group, campus, local education agency, regional, and statewide levels across diverse areas including program effectiveness; compliance with federal and state law and regulations; financial management; and data integrity, for the purpose of assessing whether student needs are being met;
2. Promoting diagnostic and evaluative systems in LEAs that are integrated with the agency's desk audit and intervention process; and
3. Relying on a research-based framework of interventions that ensure compliance and enhance student success.

4
- School District Effectiveness: PBMAS is designed to assist school districts and charters in their efforts to improve local performance.
- Statutory Requirements: PBMAS is designed to meet statutory requirements.
- Valid Indicators of Performance: PBMAS indicators are designed to reflect critical areas of student performance, program effectiveness, and data integrity.
- Maximum Inclusion: PBMAS is designed to evaluate a maximum number of school districts and charters by using appropriate alternatives to analyze the performance of districts with small numbers of students.
- Individual Program Accountability: PBMAS evaluations are structured to ensure that low performance in one program area cannot be masked by high performance in other program areas or lead to interventions in program areas where performance is high.

5
- High Standards: PBMAS is designed to encourage high standards for all students in all districts and charters. Standards will be adjusted over time to ensure high expectations continue to be set.
- Annual Statewide Evaluation: PBMAS allows for the annual evaluation of a maximum number of school districts and charters in the state, and all evaluated school districts can access their PBMAS performance data on a yearly basis.
- Public Input and Accessibility: The design, development, and implementation of PBMAS are all informed by ongoing public input. Performance information that PBMAS generates is accessible to the public.
- System Evolution: PBMAS is a dynamic system in which indicators are added, revised, or deleted in response to changes and developments that occur outside of the system, including new legislation and the development of new assessments.
- Coordination: PBMAS is part of an overall agency coordination strategy for the performance-based evaluation of school districts and charters.

6
- Student Performance
- Program Effectiveness
- Compliance with State & Federal Requirements
- Data Quality & Integrity

7 PBMAS comprises four program areas:
- Special Education
- Bilingual/ESL
- Career and Technical Education
- No Child Left Behind

8
- Bilingual/ESL Program: 13 indicators
- NCLB Program: 8 indicators
- CTE: 10 indicators
- Special Education: 18 indicators

9
- Standard: the quantifiable level of minimally acceptable performance against which individual district and charter performance is measured.
- Types of standards:
  - Absolute standard: tied to an absolute requirement or goal that all districts have the possibility of achieving each year.
  - Relative standard: not tied to an absolute requirement or goal. Relative standards may be used in PBMAS to set a baseline absolute standard where an absolute standard is not yet possible (e.g., for new indicators) or may not be appropriate given the purpose of a particular indicator (a sketch of one such approach follows).
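
The presentation does not spell out how a relative standard seeds a baseline, so the following is a minimal Python sketch under one plausible assumption: the baseline is set at a quartile of the statewide distribution of district rates on the new indicator. All rates, and the choice of the first quartile, are invented for illustration and are not actual PBMAS methodology.

```python
import statistics

# Hypothetical district rates on a brand-new indicator (no absolute goal yet).
district_rates = [41.0, 55.5, 62.0, 68.5, 71.0, 74.5, 80.0, 88.0]

# statistics.quantiles(n=4) returns the three quartile cut points;
# index 0 is the first quartile (25th percentile) of the distribution.
baseline_standard = statistics.quantiles(district_rates, n=4)[0]
print(f"Baseline absolute standard: {baseline_standard:.1f}")  # 57.1 here
```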


11
- Performance level: the result that occurs when a standard is applied to a district's performance on an indicator.

12
- In PBMAS, the state accountability standards for Academically Acceptable are used as the point at which performance level 0 (Met Standard) is set for TAKS indicators.
- The standards for performance levels 1, 2, and 3 are based on how far away a district's performance is from the standard (see the sketch below).
- PBMAS has 5 general performance levels:
  - NE (Not Evaluated)
  - 0 (Met Standard)
  - 1 (Did Not Meet Standard)
  - 2 (Did Not Meet Standard)
  - 3 (Did Not Meet Standard)
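
As an illustration of "how far away a district's performance is from the standard," here is a minimal Python sketch. The 5- and 10-point band widths are assumptions for illustration only; actual PBMAS cut points vary by indicator and year.

```python
from typing import Optional

def performance_level(rate: Optional[float], standard: float) -> str:
    """Map a district's passing rate (percent) against a standard.

    Level 0 = Met Standard; levels 1-3 reflect increasing distance
    below it. The band widths below are hypothetical.
    """
    if rate is None:
        return "NE (Not Evaluated)"        # e.g., too few students tested
    gap = standard - rate                  # distance below the standard
    if gap <= 0:
        return "0 (Met Standard)"
    if gap <= 5:                           # hypothetical band width
        return "1 (Did Not Meet Standard)"
    if gap <= 10:                          # hypothetical band width
        return "2 (Did Not Meet Standard)"
    return "3 (Did Not Meet Standard)"

print(performance_level(72.0, 70.0))       # 0 (Met Standard)
print(performance_level(58.0, 70.0))       # 3 (Did Not Meet Standard)
```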

13
- There are two types of special analysis in PBMAS:
  - Automated Special Analysis (SA): a tool used in PBMAS to analyze the performance of districts and charters with small numbers of students.
  - Professional Judgment Special Analysis (PJSA).
- Note the annotations for performance levels that are based on automated or professional judgment special analysis:
  - NE (Not Evaluated)
  - 0SA/0PJSA (Met Standard)
  - 1SA/1PJSA (Did Not Meet Standard)
  - 2SA/2PJSA (Did Not Meet Standard)
  - 3SA/3PJSA (Did Not Meet Standard)

14 See handout.

15 Summary of Interventions
- Stage 4 triggers a TEA visit in every program area:
  - Bilingual Education: Stage 4
  - CTE: Stage 4
  - NCLB: Stage 4
  - Special Education: Stage 4

16
- Interventions are not one-size-fits-all. When higher levels of agency involvement are needed, they will be individually designed based on specific LEA data and identified issues.
- The primary focus is a continuous improvement plan (CIP) with strategies and activities that positively impact student performance and program effectiveness. TEA will follow up on implementation of the CIP.
- Performance level data are analyzed, and patterns or trends are examined across indicators and program areas, to inform intervention decision-making.
- Both the extent and the duration of a district's area(s) of low performance or program ineffectiveness are taken into account.

17
Bilingual Education/ESL Monitoring:
- Focus Data Analysis
- Focus Data Analysis and System Analysis
- Public Program Performance Review (LEA Public Meeting)
- Program Effectiveness BE-ESL On-Site Review
- Continuous Improvement Plan
Career and Technical Education Monitoring:
- Focus Data Analysis and System Analysis
- Compliance Review
- CTE On-Site Review
- Program Access Review
- Continuous Improvement Plan
- Corrective Action Plan
NCLB Program Monitoring:
- Initial Compliance Analysis (ICA)
- Focus Data Analysis
- Public Program Performance Review (LEA Public Meeting)
- NCLB On-Site Review
- Continuous Improvement Plan
- Corrective Action Plan
Special Education Monitoring:
- Focus Data Analysis
- Focus Data Analysis and System Analysis
- Public Program Performance Review (LEA Public Meeting)
- Compliance Review
- Special Education On-Site Review
- Continuous Improvement Plan
- Corrective Action Plan

18
- A focused review of data indicators for which a higher level of performance concern has been identified.
- Typically requires a specified core team of individuals to gather, disaggregate, and review data to determine possible causes for the performance concern.
- Results of the analysis are generally reflected as findings (strengths and areas in need of improvement).
- The team must review pertinent data and complete the FDA template. Each indicator with a performance level of 2 or 3 must be addressed within this template (see the sketch after this list): describe issues and findings, and identify the data sources reviewed.
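
As a small illustration of the "performance level 2 or 3" rule, here is a hedged Python sketch; the indicator names and levels are invented and do not come from the presentation.

```python
# Hypothetical performance levels for one program area's indicators.
indicator_levels = {
    "TAKS Math Passing Rate (LEP)": 2,
    "TAKS Reading Passing Rate (LEP)": 0,
    "Annual Dropout Rate": 3,
    "TELPAS Composite Rating": 1,
}

# Only indicators at performance level 2 or 3 require an entry in the
# FDA template (issues and findings, plus the data sources reviewed).
for name, level in indicator_levels.items():
    if level >= 2:
        print(f"Address in FDA template: {name} (performance level {level})")
```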

19 Data Sources Reviewed
- Reading and math TAKS scores, disaggregated by special population, campus, and grade level (a disaggregation sketch follows this list)
- Summary report: test performance of LEP students
- Benchmark scores as provided by district/campus data analysis programs
- Master schedule
- Teacher certifications
- Staff development records
- District improvement plan
- Campus improvement plans
- PEIMS reports
- AEIS reports
- Lesson plans
- Course syllabi
- ASEIT reports of disaggregated data (by student expectation)
- TAKS remediation attendance rosters
- LEP/CTE 4-year plans
- TELPAS results
- Teacher interviews
- PBMAS data
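
Since the first data source calls for disaggregating TAKS scores by special population, campus, and grade level, here is a minimal pandas sketch of that kind of disaggregation. Every column name and value is a hypothetical stand-in, not an actual PEIMS/AEIS layout.

```python
import pandas as pd

# Hypothetical student-level TAKS results; all names and values invented.
scores = pd.DataFrame({
    "campus":     ["A", "A", "A", "B", "B", "B"],
    "grade":      [5, 5, 5, 8, 8, 8],
    "population": ["LEP", "LEP", "SpEd", "LEP", "CTE", "LEP"],
    "math_met":   [1, 0, 1, 1, 1, 0],   # 1 = met the passing standard
})

# Disaggregate: math passing rate by special population, campus, and grade.
rates = (
    scores.groupby(["population", "campus", "grade"])["math_met"]
          .mean()
          .mul(100)        # convert proportion to percent
          .round(1)
)
print(rates)
```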

20
- A process through which instances of performance concern and/or noncompliance are addressed through the identification of desired results, evidence of change, activities, resources, and interim and final review timelines that drive positive program change.
- Emphasis is on a continuous improvement process that promotes improved student performance and program effectiveness over time.
- Improvement planning occurs in a team environment, with required and recommended participants identified.

21 Information from System Analysis and the FDA must be integrated into the continuous improvement planning process; this occurs at the district level.

22
- All district-level reviews should lead to interventions and/or improvements to the program.
- This is not a cyclical system: review is continuous and progress is monitored.
- The bottom line is that the state systems are now interrelated, and campus and district teams must work together to improve student performance.

23
- Why was the district selected for an on-site review, and how did their campus contribute to that selection?
- What did the Focused Data Analysis show?
- What is in the CIP?
- What activities in the CIP should they be doing to address the targeted needs?
- What specifically are they doing on their campus to meet the PBMAS standards?

24 2010-2011 Submission Deadlines: Bilingual Education / ESL
- Stage 1A: October 22, 2010 (Stage 1A submits only if selected through random/stratified selection)
- Stage 1B: October 22, 2010
- Stage 2: November 12, 2010
- Stage 3: November 19, 2010
- Stage 4: TEA timelines to be determined case by case

25 2010-2011 Submission Deadlines: Career and Technical Education
- Stage 1: CTE staff reviews improvement activities in the Perkins eGrant PER; no additional submission required
- Stage 2: October 22, 2010
- Stage 3: November 19, 2010
- Stage 4: TEA timelines to be determined case by case

26 2010-2011 Submission Deadlines: No Child Left Behind
- Stage 1: October 22, 2010
- Stage 2: October 22, 2010
- Stage 3: November 19, 2010
- Stage 4: TEA timelines to be determined case by case

27 2010-2011 Submission Deadlines: Special Education
- Stage 1A: October 22, 2010 (Stage 1A submits only if selected through random/stratified selection)
- Stage 1B: November 19, 2010
- Stage 2: December 10, 2010
- Stage 3: January 14, 2011
- Stage 4: TEA timelines to be determined case by case

28 Enhanced ISAM
Changes were made to underlying data structures and the user interface to improve the following:
- Transparency
- Communication
- Tracking
- Letter Generation
- Reporting


34
- The bag provided contains the programs monitored and color-coded indicators.
- Divide into groups of 2 or 3 to sort all program indicators based on group consensus (each group sorts one program area).
- Name the groups.
- List the categories on the chart tablet.
- Report the final categories.

35
- What categories were the indicators grouped into?
- Are there any indicators that are exclusive to a program? If so, which ones?
- Are there any considerations for including those exclusive indicators in another category?

36 Assessment, Completion, Environment, Identification, Discipline

37 We now recognize that:
- Although the Bilingual/ESL, CTE, NCLB, and special education programs are different in name, they serve one purpose: each individual child.
- Although program indicators are evaluated by individual program, they ultimately, and more importantly, indicate the degree of success of every individual child.
- The intent of the services provided through these programs is to recognize their interrelatedness and call for a systemic way to promote continuous improvement and maximize student success.

38
- PBMAS report
- PBMI staging
- Comprehensive data analysis
- Focus Data Analysis guidance document
- Core team identified
- Conduct the Focus Data Analysis (FDA), using program-specific analysis and templates
- Develop the continuous improvement plan (CIP)

39
- You are the core team for Sample district. Each table has been assigned a program FDA.
- Review the FDA.
- On the Data Analysis Results section of the FDA, highlight the factors that pertain to your assigned area: leadership; data; or curriculum and instruction.
- Identify strategies or initiatives that the LEA should consider when creating the CIP to address the causal factors that will impact performance for the assigned category.
- Record CIP strategies or initiatives on the chart tablet, indicating the Data Analysis Result(s) addressed.
- This is a 15-minute activity.

40 Report to the whole group:
- What category did you have?
- Summarize the causal factor(s) you wrote strategies for.
- What strategies did you determine for the CIP to address the causal factors and impact performance in the assigned category?
- For each category and program, are there strategies that are relevant to the other programs?

41
- Evidence of implementation
- Evidence of impact
- What data sources might the district use to measure progress and impact?

42 [Venn diagram] Region One 2010 TAKS Administration: Mathematics by Program Participation (Unduplicated Count). Counts of students served by the Migrant, LEP, Special Education, and CTE programs, alone and in combination (e.g., 6,817 LEP & CTE; 371 Migrant & Special Ed).
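
To make "unduplicated count" concrete, here is a minimal Python sketch using sets: the union counts each student once across programs, while overlaps such as LEP & CTE can still be reported separately. All student IDs and rosters are invented.

```python
# Hypothetical program rosters keyed by student ID.
migrant = {"s01", "s02", "s03"}
lep     = {"s02", "s04", "s05", "s06"}
sped    = {"s03", "s07"}
cte     = {"s04", "s08"}

unduplicated_total = len(migrant | lep | sped | cte)  # union: each student once
duplicated_total   = sum(map(len, (migrant, lep, sped, cte)))  # naive sum
lep_and_cte        = len(lep & cte)                   # overlap, reported separately

print(unduplicated_total, duplicated_total, lep_and_cte)  # 8 11 1
```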


44 [Chart] Region One 2010 TAKS Administration: Mathematics by Program Participation (Unduplicated Count)


46
- A targeted on-site review to address program effectiveness concerns related to documented substantial, imminent, or ongoing risks based on current or longitudinal data.
- Reviewers lead focus discussions; interview stakeholders, service providers, and administrators; and conduct classroom observations, document reviews, and student data reviews.

47
- Focus group scheduling
- Maps of district offices and campuses
- Core analysis team activity verification
- FDA data availability
- CIP status of activities
- District and campus information
- Lists (students, teachers, campuses)

48
- Data requests from the agency
- Folder retrieval system
- Records retrieval system
- Staff development records
- Facilities
- Cross-district collaboration and awareness
- Campus and district staff articulation

49
- District entry
- Administrator focus group
- Director(s) interviews
- Core team focus group
- Parent focus group
- Various teacher focus groups
- Campus visits
- Folder review
- Case studies
- Data clarification
- District exit

50
- Campus and district staff articulation
- Implementation with fidelity
- Student progress

51 “You cannot solve a problem from the same consciousness that created it. You must learn to see the world anew.” Albert Einstein


53
- Connie Guerra, B/ESL, cguerra@esc1.net
- Christina Salas, CTE, csalas@esc1.net
- Omar Chavez, Migrant, ochavez@esc1.net
- Belinda Gorena, Title I, bgorena@esc1.net
- Kelly Solis, Sp. Ed., ksolis@esc1.net

