Slide 1
Performance-Based Monitoring: WWWWWH + 3
The who, what, when, where, why, and how of PBM and its data, plus 3 bonus questions
Network Summer Summit – June 2017
Slide 2
Purpose of Today's Presentation
- Provide the who, what, when, where, why, and how of Performance-Based Monitoring
- Provide the who, what, when, where, why, and how of Performance-Based Monitoring data
- Provide answers to questions about:
  - The most recently introduced data validation indicator, STAAR EOC Test Participation Rate
  - The program area of PBMAS for which charter school districts are staged at a higher rate than other school districts
  - Changes introduced in 2017 based on Final USDE Regulations 34 CFR Part 300
Slide 3
The seven "circumstances" (with apologies to Hermagoras of Temnos)
The ancient Greek rhetorician Hermagoras of Temnos divided each topic into its seven "circumstances" – quis, quid, quando, ubi, cur, quem ad modum, and quibus adminiculis. Translated to English, these are the 5Ws – who, what, when, where, why – and the how (in what way and by what means). Today, with apologies to Hermagoras, I appropriate his method of presentation without the promise of his rhetorical skills, but with the promise of maybe a few acronyms and even some math, but no more Latin.
Slide 4
WWWWWH of PBM: the who, what, when, where, why, and how of Performance-Based Monitoring
Slide 5
PBM WHO: Performance-Based Monitoring (PBM) is part of the Performance Reporting Division in the Office of Academics at the Texas Education Agency (TEA).
Phone: (512)
Address: Rachel Harrington, Director, Performance-Based Monitoring, Texas Education Agency, 1701 N. Congress Avenue, Austin, Texas 78701
Slide 6
PBM WHO (cont.)
Presenter: Cheryl Robinson, Director, PBM Data Processing and Systems Development
- Directs technical staff in PBM in production of PBM reporting and is the contact for districts in Regions 19 and 20
- 11 years at TEA, split between Student Assessment and PBM
- 13 additional years as a programmer/analyst
- 3 years as a secondary math teacher in Texas schools
Slide 7
PBM WHAT: Produces reporting:
- Performance-Based Monitoring Analysis System (PBMAS)
- Data Validation System
  - Leaver Records Data Validation
  - Discipline Data Validation
  - Student Assessment Data Validation
Slide 8
PBM WHAT (cont.): PBMAS reporting:
- Annual, district-level, data-driven monitoring system (since 2004)
- Performance of school districts and charter schools as determined by PBM in selected program areas:
  - Bilingual Education/English as a Second Language (BE/ESL)
  - Career and Technical Education (CTE)
  - Certain federal title programs – Title I, Part A and Migrant (previously NCLB; ESSA in 2017)
  - Special Education (SPED)
- Organized within PBMAS by these program areas as indicators and assigned performance levels
- Downloadable data files of reported (aggregated) PBM data for use by technical users
- State (since 2006) and region (since 2007) indicator reporting also produced
- Other data included in PBMAS reporting: four SPED State Performance Plan (SPP) Federally Required Elements (FRE) determined by Special Populations (since 2016)
Slide 9
PBM WHAT (cont.): Data Validation reporting:
- Annual, district-level data validity analyses
- Examines three data types (regardless of program area):
  - Leaver Records Data (since 2005)
  - Discipline Data (since 2004)
  - Student Assessment Data (since 2005)
- Organized by separate reporting of each data type, with indicators reported only if the data are determined anomalous or suggest a trend over time
Slide 10
PBM WHEN:
Report | When Released (Typical Month)
PBMAS District Report and data download (unmasked) | August
PBMAS District Report and data download (masked) | September
PBMAS State and Region Reports | October
Leaver Records Data Validation; Discipline Data Validation | November
Student Assessment Data Validation | December
Slide 11
PBM WHERE:
Report | When Released (Typical Month) | Where
PBMAS District Report and data download (unmasked) | August | TEASE
PBMAS District Report and data download (masked) | September | TEA Public Web
PBMAS State and Region Reports | October | TEA Public Web
Leaver Records Data Validation; Discipline Data Validation | November | TEASE
Student Assessment Data Validation | December | TEASE
Slide 12
PBM WHY: Performance-Based Monitoring was created in response to House Bill 3459 (78th Texas Legislature, Regular Session, 2003). This bill mandated important changes in the monitoring TEA is authorized to conduct. In an October 2003 letter to district administrators, the Commissioner stated:
"In response to these changes, the agency intends to develop and implement a data-driven and performance-based monitoring system that 1) reduces – to the extent possible – the burden of monitoring on school districts and charter schools by accurately identifying for further review only those with clear indicators of non-compliance or poor program quality and 2) encourages continuous improvement in alignment with the state's accountability system. This new data-driven system will enable the agency to monitor district and charter school performance on an ongoing, rather than cyclical, basis."
Slide 13
PBM HOW: How is PBMAS reporting used?
- PBM provides School Improvement (SI) with summarized PBMAS results for each district after district reports are released on TEASE.
- SI assigns a stage of intervention for each of the four program areas based on these PBMAS summaries.
- SI staff then monitors and supports intervention activities using a continuous improvement model. Activities targeted to improve student performance and program effectiveness emphasize data integrity and analysis, needs assessment, improvement planning, and progress reporting.
- If noncompliance, student performance, or program effectiveness concerns are identified, school districts are required to participate in these activities and may also be subject to additional sanctions and interventions, including on-site reviews.
- SI provides the integrated SPED intervention staging levels to the Performance Reporting Division. The Texas Academic Performance Report (TAPR) district-level reports include the SPED staging level as the Special Education Determination Status.
- PBM also provides each Education Service Center (ESC) with the summarized PBMAS results for each district in its region in order to facilitate district support by ESC staff.
Slide 14
PBM HOW (cont.): How is Data Validation reporting used?
- PBM provides SI with summarized data validation system results for each district after district reports for each data type (Leaver Records, Discipline, Student Assessment) are released on TEASE.
- SI staff reviews data validation reporting and works with districts identified for potential data inaccuracies, data anomalies, or data irregularities. On-site reviews may be conducted to validate implementation of the PBM system and the accuracy of data used in analysis.
- If noncompliance, student performance, or program effectiveness concerns are identified, school districts are required to take actions to address these concerns. If inaccurate data reporting is determined, a school district may be subject to additional sanctions and interventions.
- PBM also provides each Education Service Center (ESC) with the summarized data validation system results for each district in its region in order to facilitate district support by ESC staff.
Slide 15
PBM HOW (cont.): How are indicator results determined?
- Manuals for PBMAS and the data validation systems for each data type are posted on the TEA public website and may be downloaded from there. The PBMAS Manual includes a publication order form that may be used to order copies from TEA.
- Manuals are the primary resource for understanding indicator determinations. Each manual contains a section (or sections) with general information about that system, followed by an explanation of each indicator, and then any applicable appendices.
- General information about PBMAS includes introduction and components sections that explain guiding principles, changes from the previous year, and calculation methods unique to PBMAS, such as required improvement and special analysis for small numbers.
- General information about each data validation system includes an explanation of the differences between PBMAS and data validation, data sources, any additional resources, and a sample report.
- Each indicator explanation includes data sources, any calculations performed, any minimum size requirements or other criteria applied/allowed, additional notes, and, for PBMAS, performance level assignments or designations.
- Appendices may include code description tables in addition to the latest list of ESC PBM contacts and other helpful contact information. The PBMAS appendix "Comments, Questions, and Review of Incorrect Performance Level Assignments" includes a deadline for reporting an incorrect performance level based on a data or calculation error attributable to TEA or one of its data contractors.
Slide 16
PBM HOW (cont.): How – besides the manual – can I receive additional help concerning indicator determination?
- Each ESC has a PBM contact to assist you. Refer to any PBM manual's appendix for names, phone numbers, and e-mail addresses – or use the Search RESCs function of AskTED and specify PERFORMANCE-BASED MONITORING for the selected role.
- These PBM contacts typically attend multiple PBM and SI TETN sessions each year and are responsible for providing districts in their regions with relevant training and technical assistance.
- ESC PBM contacts and districts should refer to the applicable manual's appendix listing PBM contact information, or refer to this presentation's Slide 5. Questions about indicators should be addressed to PBM.
Slide 17
Point of Clarification
- Although questions about indicators should be directed to Performance-Based Monitoring, any staging and intervention questions – including ISAM inquiries – should be directed to School Improvement. Contact information for SI is included in each PBM manual's appendix. Phone: (512)
- For information about the four SPP FREs included in PBMAS district reporting, contact Special Populations. Phone: (512)
- For other program area information related to PBMAS, refer to the other helpful contact information in the PBMAS manual's appendix.
Slide 18
PBM HOW (cont.): How are PBM indicators in PBMAS and Data Validation developed and modified?
- Indicators are developed to meet statutory requirements. Some are referenced as background in PBM manuals.
- PBM coordinates with other TEA divisions and departments, including in the setting of high expectations to ensure continued student achievement and progress.
- Indicators are added, revised, or deleted in response to changes and developments that occur outside of the system, including new legislation and the development of new assessments.
- PBMAS design, development, and implementation are informed by public input received through stakeholder meetings, the public comment period included in the annual rule adoption of PBMAS manuals, and ongoing Texas Education Telecommunications Network (TETN) sessions. The public comment period for the 2017 PBMAS is June 2 through July 3, and a public hearing was held June 16.
- Comments concerning indicators in the Data Validation System are welcome and should be e-mailed or mailed to PBM by the date specified in the appendix of each data type's manual.
Slide 19
WWWWWH of PBM Data: the who, what, when, where, why, and how of Performance-Based Monitoring Data
Slide 20
PBM DATA WHO: Districts provide all PBM reporting data through two sources.
- Texas Student Data System Public Education Information Management System (TSDS PEIMS)
  - The responsibility of your PEIMS Coordinator
  - Submitted four times a year, with Fall (Submission 1) and Summer (Submission 3) data most commonly used for monitoring and accountability purposes
- Student Assessments (provided to TEA by the testing contractors)
  - The responsibility of your District Test Coordinator
  - Summer, Fall, and Spring administrations used for monitoring and accountability purposes
Slide 21
PBM DATA WHO (cont.):
- The Division of Research and Analysis identifies, categorizes, and aggregates TSDS PEIMS submissions and applies any applicable exclusions to graduation and dropout data. The resulting files are the data sources for PBMAS graduation rate and annual dropout rate indicators and for the continuing students' dropout rate indicator in Leaver Records Data Validation. (Data for your district are available in TEASE under the RES tab.)
- The Division of Research and Analysis also calculates the number and rate of underreported students from TSDS PEIMS submissions and provides those data to PBM for the underreported students indicator in Leaver Records Data Validation. (Data for your district are available in the TSDS Data Portal and, for previous years, in PEIMS EDIT+.)
Slide 22
PBM DATA WHO (cont.):
- The testing contractor consolidates testing data submitted by districts for the entire accountability year into one record per student (where possible); adds demographic and program area information from TSDS PEIMS fall enrollment snapshots; adds previous-year testing data; and provides the resulting Consolidated Accountability File (CAF) to TEA. A simplified sketch of this consolidation follows.
- The CAF is the data source for all PBMAS STAAR and TELPAS indicators and for all Student Assessment Data Validation indicators. (Your District Test Coordinator is provided a corresponding file that contains any student for whom the district submitted testing information during the current accountability year.)
- All other PBMAS and data validation indicators access TSDS PEIMS directly as their data source.
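To make the consolidation step concrete, here is a minimal, hypothetical Python sketch. The field names and the merge rule are invented for illustration; the contractor's actual process is far more involved.

```python
def build_caf_record(answer_docs: list[dict], fall_snapshot: dict,
                     prior_year_tests: dict) -> dict:
    """Consolidate one student's answer documents for the accountability year
    (Summer, Fall, Spring) into a single record, then attach TSDS PEIMS
    fall-snapshot demographics and previous-year testing data (illustrative)."""
    record: dict = {}
    for doc in answer_docs:
        record.update(doc)  # hypothetical rule: later administrations overwrite
    record["demographics"] = fall_snapshot        # from the fall enrollment snapshot
    record["previous_year_tests"] = prior_year_tests
    return record
```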
Slide 23
PBM DATA WHAT: TSDS PEIMS Data
Student Data Sub-Category | Sub-Category Code
Basic Information | 40100
Enrollment (commonly referred to as Fall Snapshot) | 40110
Special Education Program | 41163
Title I, Part A Program | 41461
Basic Attendance and Flexible Attendance | 42400 & 42500
Special Education Attendance and Flexible Attendance | 42405 & 42505
Disciplinary Action | 44425
Course Completion | 43415
Student School Leaver | 40203
Slide 24
PBM DATA WHAT (cont.): Student Assessment Data
File | Administrations
Consolidated Accountability File (CAF) | Summer of previous school year; Fall and Spring of current school year (the accountability year)
Summer EOC Files | Summer of current school year
Slide 25
PBM DATA WHEN: TSDS PEIMS Data
Student Data Sub-Category | Sub-Category Code | Submissions
Basic Information | 40100 | Submissions 1, 3, & 4
Enrollment (commonly referred to as Fall Snapshot) | 40110 | Submission 1
Special Education Program | 41163 | Submission 1
Title I, Part A Program | 41461 | Submission 1
Basic Attendance and Flexible Attendance | 42400 & 42500 | Submission 3
Special Education Attendance and Flexible Attendance | 42405 & 42505 | Submission 3
Disciplinary Action | 44425 | Submission 3*
Course Completion | 43415 | Submission 3
Student School Leaver | 40203 | Submission 1*
* Additional information is provided on Slide 27.
Slide 26
PBM DATA WHEN (cont.): Student Assessment Data
File | Administrations | Files Available
Consolidated Accountability File (CAF) | Summer of previous school year; Fall and Spring of current school year (the accountability year) | Mid-July to TEA; end of July or early August to districts and regions
Summer EOC Files | Summer of current school year | August to TEA; August to districts
Slide 27
PBM DATA WHEN (cont.)
TSDS PEIMS data submission deadlines dictate the most recent data that can be included in each year's PBM reporting. Some TSDS PEIMS data are reported based on students' statuses from the previous school year. For example, the most recent data available for the 2017 PBMAS released in August 2017 are:
- TSDS PEIMS discipline data from Submission 3 (disciplinary actions from school year 2015-16).*
- TSDS PEIMS course completion data from Submission 3 (course completions from school year 2015-16).
- TSDS PEIMS graduation data from Submission 1 (leaver records from school year 2015-16).
- TSDS PEIMS dropout data from Submission 1 (leaver records from school year 2015-16).
Most other 2017 PBMAS indicators are based on students' statuses from the most recent school year (school year 2016-17). This lag pattern is sketched below.
*The exception for the 2017 PBMAS is summarized on Slide 53 (second paragraph).
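The sketch below only restates the slide's timing rule in Python; the function and its labels are illustrative, not an official TEA mapping.

```python
def pbmas_data_years(report_year: int) -> dict[str, str]:
    """School year feeding each data source of a PBMAS report year,
    per the lag pattern on this slide (illustrative)."""
    def sy(end_year: int) -> str:
        return f"{end_year - 1}-{str(end_year)[-2:]}"  # 2016 -> "2015-16"
    prior, current = sy(report_year - 1), sy(report_year)
    return {
        "discipline (Submission 3)": prior,
        "course completion (Submission 3)": prior,
        "graduation/dropout (Submission 1)": prior,
        "most other indicators": current,
    }

print(pbmas_data_years(2017))
# discipline/completion/leaver sources: 2015-16; most other indicators: 2016-17
```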
Slide 28
PBM DATA WHEN (cont.)
- For Leaver Records Data Validation, the most recent graduate, dropout, and other leaver reasons evaluated are from the previous school year's leaver records from TSDS PEIMS Submission 1. These data are available to PBM the following spring. So, the school year 2016-17 data will first be evaluated in the 2018 Leaver Records Data Validation reporting released in October 2018.
- For Discipline Data Validation, the most recent disciplinary actions evaluated are from the previous school year's disciplinary actions from TSDS PEIMS Submission 3 (summer). These data are available to PBM in the fall, which allows school year 2016-17 data to first be evaluated in the 2017 Discipline Data Validation reporting released in November 2017.
- For Student Assessment Data Validation, with the exception of the STAAR EOC Test Participation Rate indicator, all data for the most recent accountability year are available in the CAF delivered by the test contractor in July. So, with that one exception, assessment data for the Summer 2016, Fall 2016, and Spring 2017 administrations will first be evaluated in the 2017 Student Assessment Data Validation reporting released in December 2017.
Slide 29
PBM DATA WHERE: Again, districts provide all PBM reporting data through two sources – TSDS PEIMS and Student Assessment administrations. So, if you want to see it, where is it?
- Data in TSDS PEIMS
  - Contact your district's PEIMS Coordinator for any files that were uploaded into TSDS PEIMS.
  - The PEIMS Coordinator has access – and other district staff may have access – to reporting in the TSDS Data Portal (and PEIMS Edit+ for previous school years). The data reported there may vary slightly from the uploaded data if PID errors are not corrected.
- Student Assessment Data
  - Contact your District Test Coordinator for any files that were uploaded by the district into the testing contractor's system. Data reported by the testing contractor will vary from the uploaded data because answer "documents" can be completed or altered during testing. Also, separate "documents" for a student may be combined during the contractor's processing. Finally, the Test Coordinator may request changes to the collected data during certain change "windows" and throughout the year; some are applied before the CAF is created.
  - The Test Coordinator receives student-level and aggregated reports and student-level data files from the testing contractor for each test administration. The Student Portal and Analytic Portal are available in the Texas Assessment Management System.
  - The Test Coordinator receives a district CAF with student-level data for the accountability year.
Slide 30
PBM DATA WHERE (cont.): Where can you find the data TEA reports from your data?
- Graduation and dropout data provided to PBM by the Division of Research and Analysis can be found in TEASE under the RES tab. Reporting is usually released in June.
- PBMAS district data can be downloaded from TEASE (unmasked) and from the TEA public website (masked).
- See Slide 29 for student-level reporting available from TSDS PEIMS and the CAF.
- Data Validation System student-level data are available from TEASE (for applicable indicators) as reports in PDF or RTF format. Refer to the applicable manual for availability. Also, refer to the manual for TSDS Data Portal or PEIMS Edit+ reports that may be used to examine student-level data.
Slide 31
PBM DATA WHY: The data selected for use in PBMAS reporting are driven by the requirements of each PBMAS indicator. The data examined in the Data Validation System are driven by the requirement to ensure the integrity of certain data. In the October 2003 letter to district administrators, the Commissioner stated:
"Clearly, data collection and analysis will be an integral part of whatever monitoring the agency undertakes – whether that monitoring is conducted remotely from the agency or on-site at the district or charter school. Districts and charter schools are, therefore, strongly encouraged to verify that their internal data collection procedures and systems are effectively designed to assure data quality and integrity."
The remaining discussion of PBM data will focus on the "why" and "how" of data integrity as examined in the Data Validation System.
Slide 32
PBM DATA WHY (cont.): House Bill 3459 (78th Texas Legislature, Regular Session, 2003) recognized that the reliability of any data evaluation and reporting system depends on the validity and accuracy of the data used in those systems. An example of Texas Education Code (TEC) related to the purpose of indicators in the Data Validation System is as follows:
TEC §7.028. Limitation on Compliance Monitoring.
(a) Except as provided by Section 29.001(5), 29.010(a), 39.056, or 39.057, the agency may monitor compliance with requirements applicable to a process or program provided by a school district, campus, program, or school granted charters under Chapter 12, including the process described by Subchapter F, Chapter 11, or a program described by Subchapter B, C, D, E, F, H, or I, Chapter 29, Subchapter A, Chapter 37, or Section 38.003, and the use of funds provided for such a program under Subchapter C, Chapter 42, only as necessary to ensure: . . . (3) data integrity for purposes of: (A) the Public Education Information Management System (PEIMS); and (B) accountability under Chapter 39.
(b) The board of trustees of a school district or the governing body of an open-enrollment charter school has primary responsibility for ensuring that the district or school complies with all applicable requirements of state educational programs.
Other examples related to data integrity can be found in a variety of state and federal laws, rules, and regulations.
Slide 33
PBM DATA WHY (cont.): Why is data integrity so important? Districts' data are used for many critical purposes:
- Funding
- Monitoring
- Evaluation
- Compliance
- Auditing
- Research
- Transparency and the public's right to know
Confirming the accuracy of data is a vital part of validating and safeguarding these purposes, which result in improved student performance and program effectiveness.
Slide 34
PBM DATA WHY (cont.): Why are Student School Leaver data the focus of Leaver Records Data Validation?
- The integrity of leaver records has been evaluated annually by TEA. In response to the 2003 legislation, the TEC was amended to require an annual electronic audit of dropout records and a report based on the findings of the audit. House Bill 3, passed during the 81st Texas Legislature, Regular Session (2009), maintained the requirement.
- Leaver records determine a district's completions and dropouts, and so affect, for example:
  - PBMAS annual dropout and graduation rate indicators (PBM)
  - State Accountability ratings (Performance Reporting)
  - Federal Accountability identification of Focus, Priority, and Reward schools (School Improvement and Support [SIS])
Slide 35
PBM DATA WHY (cont.): Why are Disciplinary Action data the focus of Discipline Data Validation?
- In 1995, the 74th Texas Legislature enacted the Safe Schools Act, which created Disciplinary Alternative Education Programs (DAEPs) and Juvenile Justice Alternative Education Programs (JJAEPs) to serve students who had committed disciplinary offenses.
- To evaluate districts' use of DAEPs and JJAEPs and to review the documentation of district-reported discipline information, Disciplinary Action records were added to PEIMS. The data collected for these analyses include both reason and action codes to capture the student's conduct and the district's subsequent response.
- In addition to the 2003 legislation that provides specific authority for TEA to monitor data integrity, the TEC specifically requires an electronic evaluation of discipline data. Finally, TEC §39.057 authorizes the commissioner to conduct special accreditation investigations.
Slide 36
PBM DATA WHY (cont.): Why are Student Assessment data the focus of Student Assessment Data Validation?
- In addition to the 2003 legislation that provides specific authority for TEA to monitor data integrity, TEC §39.057 allows for special accreditation investigations when anomalous data related to reported absences are observed in the administration of the state student assessment program:
"TEC §39.057. Special Accreditation Investigations. (a) The commissioner may authorize special accreditation investigations to be conducted: (1) when excessive numbers of absences of students eligible to be tested on state assessment instruments are determined;"
- Student Assessment data determine a district's performance on Texas assessments, and so affect, for example:
  - PBMAS STAAR and TELPAS indicators (PBM)
  - State Accountability ratings (Performance Reporting)
  - Federal Accountability identification of Focus, Priority, and Reward schools (School Improvement and Support [SIS])
Slide 37
PBM DATA HOW: How does PBM evaluate districts in the Data Validation System?
- Unlike PBMAS indicators, which yield definitive results, data validation indicators typically suggest an anomaly that may require a local review to determine whether the anomalous data are accurate.
- Unlike PBMAS indicators, which evaluate districts against a range of established cut points, data validation indicators typically require an annual review of data to identify what data may be anomalous or what trends can be observed over time. Therefore, indicator performance criteria generally are not, and generally cannot be, established in advance, although there are some exceptions (e.g., underreported students in Leaver Records Data Validation). A minimal sketch of the distinction follows.
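The sketch below contrasts the two approaches in Python. The cut points, the z-score rule, and the parameter values are illustrative assumptions, not TEA's actual criteria; the manuals define the real methods.

```python
from statistics import mean, stdev

# PBMAS-style evaluation (illustrative): preset cut points, known in advance,
# map a district's rate to a performance level.
PBMAS_CUT_POINTS = [(90.0, 0), (80.0, 1), (70.0, 2)]  # hypothetical values

def pbmas_performance_level(rate: float) -> int:
    """Return a performance level for a rate using the preset cut points."""
    for cut, level in PBMAS_CUT_POINTS:
        if rate >= cut:
            return level
    return 3  # lowest performance level

# Data-validation-style evaluation (illustrative): no preset cut points.
# Each year, a district's rate is compared with that year's statewide
# distribution, and only unusually extreme values are flagged for review.
def flag_anomalies(rates_by_district: dict[str, float],
                   z_cut: float = 3.0) -> list[str]:
    """Return districts whose rate is an outlier relative to all districts."""
    values = list(rates_by_district.values())
    if len(values) < 2:
        return []
    m, s = mean(values), stdev(values)
    return [d for d, r in rates_by_district.items()
            if s > 0 and abs(r - m) / s >= z_cut]
```

Note that a flagged district is not presumed to have inaccurate data; the flag only marks the data for local review, consistent with the slide above.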
Slide 38
PBM DATA HOW (cont.): How does PBM report districts in the Data Validation System?
- Unlike PBMAS reports, which include results for each evaluated indicator, Data Validation System district reports include only identified ("triggered") indicators.
- Districts without any triggered indicators will receive a message saying so. It is suggested that you ensure the message contains your district name or number and that you print the message and retain it for your records.
Slide 39
PBM DATA HOW (cont.): How can data integrity be improved?
- Districts that have staffed their data collection and reporting functions with highly competent personnel and expertise have been the most successful at designing internal procedures and systems to assure data quality, accuracy, and integrity.
- Likewise, districts that have not made the ongoing development of data expertise and data capacity a priority continue to struggle with issues of data quality, accuracy, and integrity.
Slide 40
PBM DATA HOW (cont.):
- Districts that developed, and have continued to sustain, strong communication and coordination among their data department, relevant program areas, and administrative leadership have been the most successful at designing internal procedures and systems to assure data quality, accuracy, and integrity.
- Likewise, districts that do not have these strong communication and coordination links continue to struggle with issues of data quality, accuracy, and integrity.
Slide 41
PBM DATA HOW (cont.):
- Districts that regularly review, analyze, and evaluate their data have been the most successful at designing and/or modifying internal procedures and systems to assure data quality, accuracy, and integrity.
- Likewise, districts that only review their data when asked by TEA to do so continue to struggle with issues of data quality, accuracy, and integrity and lack the necessary internal systems, procedures, and processes.
Slide 42
PBM DATA HOW (cont.): How can data integrity contribute to improved student performance and program effectiveness?
- Districts that approach data integrity monitoring as an ongoing, holistic endeavor have been the most successful at assuring data quality, accuracy, and integrity and implementing effective data-driven decisions.
- Likewise, districts that approach data integrity monitoring as an episodic, isolated activity completed by one individual continue to struggle with issues of data quality, accuracy, and integrity and are unable to implement effective data-driven decisions.
Slide 43
PBM DATA HOW (cont.):
- Districts that use their data to make policy, programmatic, and educational decisions have been able to effect real and lasting changes that promote program effectiveness and student achievement.
- Likewise, districts that rarely or never make data-driven decisions continue to struggle with issues of data quality, accuracy, and integrity as well as program effectiveness and student achievement.
Slide 44
Point of Emphasis
Data validation indicators may identify one or more districts that are collecting and reporting accurate data. The process districts use, however, to either validate the accuracy of their data or determine that erroneous data were submitted is fundamental to the integrity of the entire system. Erroneous data may be the result of:
- an isolated error that can be addressed locally through better training, improved quality control, or another targeted response; or
- a systemic issue within one data collection (e.g., Disciplinary Actions) or a pervasive issue (i.e., across all data systems), which requires an extensive district response, including more involvement by TEA and the application of sanctions as necessary and appropriate.
Slide 45
+ 3: Three Bonus Questions
Slide 46
BONUS QUESTION 1: The STAAR EOC Test Participation Rate indicator in Student Assessment Data Validation is the most recently introduced data validation indicator. It evaluates discrepancies between course completion data and STAAR EOC test participation. Since its introduction in 2013, PBM has most frequently received these questions about it:
- Who is evaluated?
- Which EOC completions are included and excluded?
- Why is a student who completed an EOC course in a previous year included in the indicator?
- How can a student's EOC be "Not Found" even though the testing contractor's Student Details report indicates an answer document was submitted for the student during one of the three administrations evaluated?
Slide 47
BONUS QUESTION 1 (cont.): STAAR EOC Test Participation Rate
Q: Who is evaluated?
A: Students first enrolled in grade 9 or below in the 2011-12 school year, as determined by graduation cohorts for the class of 2015 and later.
- These students must meet STAAR graduation requirements to earn a high school diploma from a Texas public or charter school, including taking the STAAR EOC upon completion of a corresponding course.
- When the indicator was introduced in 2013, many students were still considered TAKS students and were excluded from the indicator by enrolled grade level. Beginning in 2015, however, PBM began to examine graduation cohorts available in November for a more reliable indication of whether a student must meet STAAR or TAKS requirements.
Slide 48
BONUS QUESTION 1 (cont.): STAAR EOC Test Participation Rate
Q: Which EOC completions are included and excluded?
A: Using TSDS PEIMS Course Completion data, completions – whether passed or failed – are initially included. The specific courses are Algebra I, English I, English II, Biology, and U.S. History, including Alternate and Some Other Language (SOL) versions and IB and AP versions of U.S. History.
- The completion is then excluded if it is determined that the student also completed the same course, or a version of the course, in the previous school year.
- Beginning in 2016, the completion is also excluded if a valid STAAR EOC answer document was submitted prior to the fall administration of the evaluated school year. The submitted document is recorded in the CAF cumulative EOC testing data for the student. A sketch of this filtering logic follows.
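The sketch is a hypothetical Python rendering of the rules above. The record fields, course-name normalization, and lookup structures are assumptions; the Student Assessment Data Validation manual defines the authoritative logic.

```python
from dataclasses import dataclass

EOC_COURSES = {"ALGEBRA I", "ENGLISH I", "ENGLISH II", "BIOLOGY", "US HISTORY"}

@dataclass
class Completion:
    student_id: str
    course: str          # normalized course name (versions already mapped)
    school_year: int     # e.g., 2017 for the 2016-17 evaluated year

def is_evaluated(c: Completion,
                 prior_completions: set[tuple[str, str, int]],
                 pre_fall_answer_docs: set[tuple[str, str]]) -> bool:
    """Apply the inclusion/exclusion rules on this slide (illustrative)."""
    # Rule 1: only the five EOC courses (and their versions) are included.
    if c.course not in EOC_COURSES:
        return False
    # Rule 2: exclude if the same course (or a version of it) was also
    # completed in the previous school year.
    if (c.student_id, c.course, c.school_year - 1) in prior_completions:
        return False
    # Rule 3 (since 2016): exclude if a valid STAAR EOC answer document was
    # submitted before the fall administration of the evaluated year, per
    # the CAF cumulative EOC testing data.
    if (c.student_id, c.course) in pre_fall_answer_docs:
        return False
    return True
```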
Slide 49
BONUS QUESTION 1 (cont.): STAAR EOC Test Participation Rate
Q: Why is a student who completed an EOC course in a previous year included in the indicator?
A: The student either completed the course two or more years ago and did not take the STAAR test upon completion of the course, or, if tested, the student's identifiers do not match those in the CAF cumulative EOC testing data. A prior data integrity issue is often the cause of such a student's inclusion in the indicator. Examples:
- the student did not test after the initial course completion;
- a course completion was recorded incorrectly; or
- student identifiers were incorrectly submitted on an answer document and not corrected.
Slide 50
BONUS QUESTION 1 (cont.): STAAR EOC Test Participation Rate
Q: How can a student's EOC be "Not Found" even though the testing contractor's Student Details report indicates an answer document was submitted for the student during one of the three administrations evaluated?
A: Matching of course completions and EOC answer documents is by student ID (SSN or "S" state ID). Failure by the district to submit the correct student ID on the answer document therefore results in the student being identified as "Not Found". (An incorrect student ID may be a simple data entry error or may result from inconsistency in the transition from the "S" state ID to the SSN in PEIMS and assessment reporting.) After testing results are reported to the district, a district may (and should) request that the testing contractor correct the student ID in the testing history. Subsequently, the contractor's Student Details report, which is based on the dynamic student history database, will include the answer document. This indicator, however, cannot access the dynamic student history database and will report the student's assessment as "Not Found". The sketch below illustrates why.
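A minimal, hypothetical illustration of the ID-based match; the IDs and data structures are invented. The two sets stand in for the static CAF snapshot the indicator uses versus the contractor's dynamic history, which reflects later ID corrections.

```python
# Hypothetical data: the CAF snapshot is fixed when delivered in July;
# the dynamic student history reflects the district's later ID correction.
caf_answer_docs = {("123456789", "ALGEBRA I")}   # ID as originally submitted (wrong)
dynamic_history = {("987654321", "ALGEBRA I")}   # ID after correction

completion = ("987654321", "ALGEBRA I")          # PEIMS course completion (correct ID)

# The contractor's Student Details report queries the dynamic history,
# so the answer document appears there...
print(completion in dynamic_history)   # True

# ...but the indicator matches against the static CAF, where the old,
# incorrect ID remains, so the assessment is reported as "Not Found".
print(completion in caf_answer_docs)   # False -> "Not Found"
```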
Slide 51
BONUS QUESTION 2:
Q: For what program area of PBMAS are charter school districts staged at a higher rate than other school districts?
A: PBMAS staging levels in the NCLB (ESSA in 2017) program area in 2016 were significantly higher for charter school districts than for other school districts.
- Contributing to the higher staging levels were STAAR performance indicators for Title I, Part A. In particular, grades 3-8 mathematics and EOC English Language Arts (ELA) passing rates compared unfavorably.
- In addition, charter school districts were more often identified for lower Title I, Part A graduation rates.
Why?
Slide 52
BONUS QUESTION 3: Significant changes to PBMAS and Discipline Data Validation will be introduced in 2017, based on Final USDE Regulations 34 CFR Part 300. What should your leadership team, especially your Special Education Director, know?
- The annual rule action for adoption of the PBMAS manual was filed with the Texas Register on Monday, May 22, 2017, for publication in the June 2, 2017 issue. A public hearing was held June 16. The public comment period is June 2, 2017 through July 3, 2017. Comments on proposed rules may be submitted to TEA.
- Visit the TEA website and review the Proposed Amendment to 19 TAC §97.1005. Study pages 9-10 of the linked PDF (numbered pages 5-6 of the proposed manual) for an overview of changes to SPED indicators related to the USDE regulations. The indicator pages, of course, provide the details.
Slide 53
BONUS QUESTION 3 (cont.):
A very high-level summary of the required changes appears in the amendment's BACKGROUND INFORMATION AND JUSTIFICATION section:
"Since 2013, TEA has been implementing a transition plan for the PBMAS SPED program area in anticipation of new federal regulations under 34 Code of Federal Regulations Part 300, which were finalized and issued on December 19, 2016. These regulations require 98 separate indicators to evaluate districts' data regarding (a) special education representation [49 indicators]; (b) disciplinary removals [35 indicators]; and (c) educational placements [14 indicators]. These indicators will be used to assign PLs of significant disproportionality based on seven racial/ethnic groups and six disability categories, as required. The federal regulations also require thresholds be set to determine which districts will be identified for significant disproportionality. As with all PBMAS PL cut points, the 2017 PBMAS thresholds for these new indicators were set with advice from stakeholder groups.
Because the PBMAS representation and educational placements indicators were already well aligned with federal requirements and had included preliminary calculations of disproportionality, the expanded federal requirements pertaining to those two components can immediately be incorporated into SPED Indicator #11 (SPED Representation - Ages 3-21), SPED Indicator #7 (SPED Regular Class <40% Rate - Ages 6-21), and a new SPED Indicator #8 (SPED Separate Settings Rate - Ages 6-21). The three PBMAS discipline indicators, however, will need to be replaced with the 35 discipline indicators required to implement the new federal regulations. There is insufficient time for those indicators to be developed and included with the 2017 PBMAS. Therefore, the discipline indicators will be integrated into the 2018 PBMAS."
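For orientation, the quoted totals (49, 35, and 14 indicators) are consistent with seven racial/ethnic groups crossed with per-component categories. The category breakdown below is an assumption based on 34 CFR §300.647, not language from the slide.

```python
# Assumed breakdown per 34 CFR 300.647; the slide itself gives only the totals.
RACIAL_ETHNIC_GROUPS = 7

representation = 7 * RACIAL_ETHNIC_GROUPS  # all disabilities + 6 disability categories
discipline     = 5 * RACIAL_ETHNIC_GROUPS  # 5 categories of disciplinary removal
placements     = 2 * RACIAL_ETHNIC_GROUPS  # regular class <40%; separate settings

assert (representation, discipline, placements) == (49, 35, 14)
assert representation + discipline + placements == 98
```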
Slide 54
BONUS QUESTION 3 (cont.):
The summary of SPED indicators on numbered pages 5-6 of the proposed manual explains that the five new integrated discipline indicators will be previewed in the 2017 Discipline Data Validation system. The summary also explains the consequences for districts exceeding the threshold for any of the required 98 separate indicators:
"Under the federal regulations, consequences for districts that exceed the established thresholds are considerably greater than the requirements that comprise intervention staging for PBMAS. Specifically, any district that exceeds the established thresholds is required to: (a) provide for the review and, if appropriate, revision of the district's policies, procedures, and practices; (b) allocate 15% of its Part B funds to be used for comprehensive coordinated early intervening services to serve children in the district, particularly, but not exclusively, children in those groups that were significantly over-identified; and (c) publicly report on the revision to its policies, procedures, and practices. States have the flexibility to stipulate that districts must exceed an established threshold for up to three consecutive years before these requirements must be implemented."
Slide 55
BONUS QUESTION 3 (cont.):
What are some differences between the preliminary calculations of disproportionality implemented during the transition, starting in 2013, and the final calculations to be implemented in 2017?
- During the transition, student groups were compared to all students in the district. The final regulations require that a student group be compared to students not in that student group; so, for example, Asian students are compared to non-Asian students. Moreover, if the number of students not in the student group is less than the minimum size requirements (MSRs), the regulations specify using state data for the students not in the student group.
- Intermediate rounding was used in the calculation to determine significant disproportionality (SD) during the transition, but no intermediate rounding will occur in the final calculations.
- The term used for the calculation's result is "risk ratio", and it is not expressed as a percent, as it was during the transition. The term used for the risk ratio above which districts are identified as SD is "threshold". A sketch of the final-style calculation follows.
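This hypothetical Python sketch implements the risk ratio as described on this slide. The function names, the MSR value, and the threshold are illustrative assumptions; the PBMAS manual defines the actual parameters.

```python
def risk(identified: int, enrolled: int) -> float:
    """Risk: the share of a population identified (e.g., for special education)."""
    return identified / enrolled

def risk_ratio(group_identified: int, group_enrolled: int,
               other_identified: int, other_enrolled: int,
               state_other_identified: int, state_other_enrolled: int,
               msr: int = 10) -> float:
    """Final-style risk ratio: the group's risk divided by the risk of students
    NOT in the group. If the district has too few comparison students (below
    the MSR), state data substitute for the comparison group, as the
    regulations specify. No intermediate rounding is performed."""
    if other_enrolled < msr:
        other_identified, other_enrolled = (state_other_identified,
                                            state_other_enrolled)
    return risk(group_identified, group_enrolled) / risk(other_identified,
                                                         other_enrolled)

# Illustrative use: a district is identified as significantly disproportionate
# when its risk ratio exceeds the established threshold.
THRESHOLD = 3.0  # hypothetical; actual thresholds were set with stakeholder advice
rr = risk_ratio(group_identified=12, group_enrolled=60,
                other_identified=30, other_enrolled=600,
                state_other_identified=50_000, state_other_enrolled=1_000_000)
print(rr, rr > THRESHOLD)  # 0.2 / 0.05 = 4.0 -> True
```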
Slide 56
BONUS QUESTION 3 (cont.): Expect more information from your ESC about the implementation of Final USDE Regulations 34 CFR Part 300.
Slide 57
Your Questions? Your chance to ask the presenter a question or offer comments…