Effective Programs for Successful Students. The Network Summer Summit. Presenters: Rachel Harrington (Performance-Based Monitoring Division) and Judy Struve (Program Monitoring and Interventions Division). June 30, 2015. 1
Purpose of Today’s Presentation Introduce participants to the state’s monitoring system for special programs, with a particular focus on special education programs. Review identification and interventions components of the monitoring system. Provide information on how to: Conduct meaningful data analyses regarding special education program effectiveness. Connect causal factors identified in data analysis to improve program effectiveness. Copyright © Texas Education Agency 2015. All rights reserved. 2
How Do We Know if a Program is Effective? 1. Is it being implemented in accordance with the state and/or federal laws, rules, and requirements that govern it (i.e., compliance)? 2. Is it having a measurable, quantifiable impact on student performance? Copyright © Texas Education Agency 2015. All rights reserved. 3
What is the Appropriate Balance Between Those Two Questions? Copyright © Texas Education Agency 2015. All rights reserved. Compliance vs. Performance 4
Prior to 2003, Program Effectiveness was Largely Measured by Compliance. Copyright © Texas Education Agency 2015. All rights reserved. Compliance 5
2003 Change in State Law Leads to New Monitoring System Revised alignment of agency functions resulting from: Statute from the 78th Texas Legislature (2003), which limited and redirected agency monitoring Appropriations bill from the 78th legislative session, which also significantly lowered the agency’s FTE cap and necessitated a more efficient agency organization Shift away from process to results, i.e., program effectiveness and student performance Strong emphasis on data integrity/validation Focus on a coordinated approach to agency monitoring (not isolated program monitoring) Copyright © Texas Education Agency 2015. All rights reserved. 6
Monitoring System = Performance-Based Monitoring Review of every district every year rather than a limited cyclical process Monitoring decisions made on a variety of levels based on performance Intervention activities based on levels of performance LEA analysis of data Submission of information and desk review by TEA On-site reviews as appropriate Emphasis on student performance and program effectiveness, as well as compliance Copyright © Texas Education Agency 2015. All rights reserved. 7
After 2003, the Balance Shifted. Copyright © Texas Education Agency 2015. All rights reserved. Compliance Performance 8
Monitoring Definition Monitoring is: 1. Using a data-driven performance-based model to observe, evaluate, and report on the public education system at the individual student group, campus, local education agency, regional, and statewide levels across diverse areas including program effectiveness; compliance with federal and state law and regulations; financial management; and data integrity for the purpose of assessing that student needs are being met; Copyright © Texas Education Agency 2015. All rights reserved. So, it’s not “getting dinged?” It’s about “ensuring that student needs are being met”? That sounds like something we all can agree on! 9
Monitoring Definition (continued) Monitoring is: 2. Promoting diagnostic and evaluative systems in LEAs that are integrated with the agency’s desk audit and intervention process; and 3. Relying on a research-based framework of interventions that ensures compliance and enhances student success. Copyright © Texas Education Agency 2015. All rights reserved. 10
Overall Goals for Monitoring Deliver a consistent and coordinated response to identified areas of low performance/program ineffectiveness in districts/campuses. Take into account both the extent and the duration of a district’s area(s) of low performance/program ineffectiveness. Achieve an integration of indicators and interventions. Program effectiveness and compliance monitoring — a balanced perspective Copyright © Texas Education Agency 2015. All rights reserved. 11
Components of PBM Identification and Interventions Performance-Based Monitoring Analysis System (PBMAS) and Data Validation System are used to identify performance and program effectiveness trends and concerns as well as data anomalies/concerns. Performance-Based Monitoring Interventions include strategies used to address performance and program effectiveness trends and concerns as well as data anomalies/concerns. Copyright © Texas Education Agency 2015. All rights reserved. 12
Guiding Principles of PBM School District Effectiveness: PBM is designed to assist school districts in their efforts to improve student performance and program effectiveness. Statutory Requirements: PBM is designed to meet statutory requirements. Indicator Design: PBM indicators reflect critical areas of student performance, program effectiveness, and data integrity. Copyright © Texas Education Agency 2015. All rights reserved. So, it’s not about “taking me away from my job?” It’s about “improving student performance and program effectiveness?” That sounds like something that’s part of my job! 13
Guiding Principles (continued) Maximum Inclusion: PBM system is designed to evaluate a maximum number of school districts through the use of appropriate alternatives for analyzing districts with small numbers of students. Individual Program Accountability: PBM system is structured to ensure that low performance in one area cannot be offset by high performance in another area and likewise that low performance in one area does not lead to interventions in program areas where performance is high. Copyright © Texas Education Agency 2015. All rights reserved. 14
Guiding Principles (continued) High Standards: PBM system promotes high standards for all students in all districts. Standards are adjusted over time to ensure continued student achievement and progress. Annual Statewide Focus: PBM system ensures the annual evaluation of all school districts in the state. Public Input and Accessibility: The design, development, and implementation of the PBM system are informed by ongoing public input. School district performance information that the PBM system generates is accessible to the public. Copyright © Texas Education Agency 2015. All rights reserved. 15
Guiding Principles (continued) System Evolution: PBM is a dynamic system in which indicators are added, revised, or deleted in response to changes and developments that occur outside of the system, including new legislation and the development of new assessments. Coordination: PBM is part of an overall agency coordination strategy for the data-driven, performance-based evaluation of school districts. Copyright © Texas Education Agency 2015. All rights reserved. 16
What Are Some of the Measurable Characteristics of Effective Programs? Effective Programs: Demonstrate they have achieved strong performance results and/or gains in reading, mathematics, science, social studies, and writing for the students they serve. Demonstrate that, when students are no longer receiving special education services, they perform well academically. Make appropriate decisions regarding which test version students served in special education will take. Copyright © Texas Education Agency 2015. All rights reserved. 17
What Are Some of the Measurable Characteristics of Effective Programs? (continued) Effective Programs: Place students in the least restrictive environment. Employ effective strategies to prevent students from dropping out of school. Promote on-time graduation for as many students as possible. Provide opportunities for as many students as possible to graduate under the Recommended or Distinguished Achievement diploma programs. Copyright © Texas Education Agency 2015. All rights reserved. 18
What Are Some of the Measurable Characteristics of Effective Programs? (continued) Effective Programs: Implement special education placement decisions based on a student’s disability, not his/her race, ethnicity, socioeconomic status, or English language proficiency. Recognize that student success occurs when students have maximum access to a comprehensive curriculum taught by qualified educators in classrooms equipped with appropriate instructional supports as well as peer-to-peer interaction, and therefore: Include proven strategies that reduce and/or prevent the need to remove students for disciplinary reasons. Copyright © Texas Education Agency 2015. All rights reserved. 19
Performance-Based Monitoring Analysis System (PBMAS) Performance-Based Monitoring Analysis System (PBMAS) – an automated data system that reports annually on the performance of school districts and charter schools in selected program areas (bilingual education/ESL, career and technical education, special education, and certain Title programs under NCLB) Specifically designed to measure the characteristics of program effectiveness listed on the previous slides Copyright © Texas Education Agency 2015. All rights reserved. 20
Performance-Based Monitoring Analysis System (PBMAS) (continued) Each year’s PBMAS report is typically released to districts in August. Your district’s PBMAS reports from 2004-2014 are available at: http://ritter.tea.state.tx.us/pbm/distrpts.html PBMAS reports for all 20 ESC regions as well as the state are also available on the PBM web site. Copyright © Texas Education Agency 2015. All rights reserved. 21
PBMAS Performance Levels Performance levels: 0, 1, 2, 3, and 4. Additional designations: 0 Required Improvement (0 RI), Not Assigned (NA), No Data (ND), Special Analysis (0, 1, 2, or 3 SA), Hold Harmless (3 HH or 4 HH), and Report Only. Copyright © Texas Education Agency 2015. All rights reserved. 23
Characteristic 1: Demonstrate strong performance results and/or gains in reading, mathematics, science, social studies, and writing for the students they serve. Characteristic 2: Demonstrate that, when students are no longer receiving special education services, they perform well academically.
PBMAS Program Area and Indicator Number / Indicator Name:
SPED #1(i-v): SPED STAAR 3-8 Passing Rate (M, R, S, SS, W)
SPED #2(i-v): SPED YAE* STAAR Passing Rate (M, R, S, SS, W)
SPED #3(i-iv): SPED STAAR EOC Passing Rate (M, S, SS, ELA)
*YAE = Year-After-Exit 24
Notes for 2015 PBMAS (for Indicators on Slide 24) STAAR A and STAAR Alternate 2 results will be included. A targeted hold harmless provision will be implemented. A PL 4 will be added. For STAAR 3-8 indicators: Mathematics results will be included, using passing standards equivalent to the previous mathematics tests. PL 1, PL 2, and PL 3 cut-point ranges will be lowered. For STAAR EOC indicators: Performance Level (PL) cut-point ranges for math and science will be changed. PL assignment for U.S. History will be added. ELA indicator will continue as Report Only. 25
STAAR EOC Subject-Area PL 0 Cut Points
Mathematics: 2014 = 50%-100%; 2015 = 60%-100%
Science: 2014 = 50%-100%; 2015 = 60%-100%
Social Studies: 2014 = Report Only; 2015 = 60%-100%
English Language Arts: 2014 = Report Only; 2015 = Report Only
26
Performance Level Table for SPED #1(i-v)
Mathematics: PL 0 = 70.0%-100%; PL 1 = 55.0%-69.9%; PL 2 = 40.0%-54.9%; PL 3 = 25.0%-39.9%; PL 4 = 0%-24.9%
Reading: PL 0 = 70.0%-100%; PL 1 = 55.0%-69.9%; PL 2 = 40.0%-54.9%; PL 3 = 25.0%-39.9%; PL 4 = 0%-24.9%
Science: PL 0 = 65.0%-100%; PL 1 = 50.0%-64.9%; PL 2 = 40.0%-49.9%; PL 3 = 25.0%-39.9%; PL 4 = 0%-24.9%
Social Studies: PL 0 = 65.0%-100%; PL 1 = 50.0%-64.9%; PL 2 = 40.0%-49.9%; PL 3 = 25.0%-39.9%; PL 4 = 0%-24.9%
Writing: PL 0 = 70.0%-100%; PL 1 = 55.0%-69.9%; PL 2 = 40.0%-54.9%; PL 3 = 25.0%-39.9%; PL 4 = 0%-24.9%
27
Performance Level Table for SPED #3 (i-iv)
Mathematics: PL 0 = 60.0%-100%; PL 1 = 50.0%-59.9%; PL 2 = 40.0%-49.9%; PL 3 = 30.0%-39.9%; PL 4 = 0%-29.9%
Science: PL 0 = 60.0%-100%; PL 1 = 50.0%-59.9%; PL 2 = 40.0%-49.9%; PL 3 = 30.0%-39.9%; PL 4 = 0%-29.9%
Social Studies: PL 0 = 60.0%-100%; PL 1 = 50.0%-59.9%; PL 2 = 40.0%-49.9%; PL 3 = 30.0%-39.9%; PL 4 = 0%-29.9%
ELA: Report Only
28
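To make the cut-point tables above concrete, here is a minimal sketch (not TEA code) of how a district analyst might map a subject’s STAAR passing rate to a performance level using the SPED #1(i-v) lower bounds; the function and variable names are illustrative assumptions, and SPED #3(i-iv) would use its own cut points.

```python
# Illustrative only: assign a PBMAS performance level from a passing rate
# using the SPED #1(i-v) lower-bound cut points shown above.

SPED_1_CUT_POINTS = {
    # subject: lower bounds for PL 0, PL 1, PL 2, PL 3 (anything lower is PL 4)
    "mathematics":    (70.0, 55.0, 40.0, 25.0),
    "reading":        (70.0, 55.0, 40.0, 25.0),
    "science":        (65.0, 50.0, 40.0, 25.0),
    "social_studies": (65.0, 50.0, 40.0, 25.0),
    "writing":        (70.0, 55.0, 40.0, 25.0),
}

def assign_performance_level(subject, passing_rate):
    """Return PL 0-4 for a SPED #1(i-v) passing rate expressed in percent."""
    for level, lower_bound in enumerate(SPED_1_CUT_POINTS[subject]):
        if passing_rate >= lower_bound:
            return level
    return 4  # 0%-24.9%

print(assign_performance_level("science", 52.3))  # -> 1 (PL 1: 50.0%-64.9%)
```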
What is the targeted hold harmless provision for SPED Indicator #1(i-v) and #3 (i-iv)? It stipulates that any district that received a PL 0 or 0 RI on the SPED STAAR Modified Participation Rate indicator in the 2014 PBMAS and that would otherwise receive a PL 3 or PL 4 on SPED Indicator #1(i-v) or SPED Indicator #3 (i-iv) in the 2015 PBMAS will receive a PL 3 HH or PL 4 HH, as applicable, for those subject areas. For 2015 PBMAS intervention purposes, the count of PL 3 HH or PL 4 HH under SPED Indicator #1 (i-v) or SPED Indicator #3 (i-iv) will not be considered in a district’s total PL 3 or PL 4 count in the special education program area. Targeted Hold Harmless 29
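A short sketch of the hold harmless logic as described on this slide; the actual determination is made by TEA, and the function and argument names here are assumptions for illustration.

```python
# Illustrative sketch of the targeted hold harmless rule described above.
# A district at PL 0 or 0 RI on the 2014 SPED STAAR Modified Participation
# Rate indicator that would otherwise get PL 3 or PL 4 on SPED #1 or #3 in
# 2015 is reported as PL 3 HH or PL 4 HH, and those HH levels are excluded
# from the district's PL 3/PL 4 count for intervention purposes.

def apply_targeted_hold_harmless(pl_2015, modified_participation_pl_2014):
    eligible = modified_participation_pl_2014 in ("0", "0 RI")
    if eligible and pl_2015 in (3, 4):
        return f"{pl_2015} HH"  # reported, but not counted toward interventions
    return str(pl_2015)

print(apply_targeted_hold_harmless(4, "0 RI"))  # -> "4 HH"
print(apply_targeted_hold_harmless(4, "1"))     # -> "4"
```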
Characteristic 3: Make appropriate decisions regarding which test version students served in special education will take.
PBMAS Program Area and Indicator Number / Indicator Name:
SPED #4: SPED STAAR Alternate 2 Participation Rate
30
Characteristic 4: Place students in the least restrictive environment.
PBMAS Program Area and Indicator Number / Indicator Name:
SPED #5: SPED Regular Early Childhood Program Rate, Ages 3-5 (Note for 2015 PBMAS: PLs will be assigned.)
SPED #6 and SPED #8: SPED Regular Class ≥80% Rate, Ages 6-11 and Ages 12-21
SPED #7 and SPED #9: SPED Regular Class <40% Rate, Ages 6-11 and Ages 12-21
31
Characteristic 5: Employ effective strategies to prevent students from dropping out of school. Characteristic 6: Promote on-time graduation for as many students as possible. Characteristic 7: Provide opportunities for as many students as possible to graduate under the Recommended or Distinguished Achievement diploma programs.
PBMAS Program Area and Indicator Number / Indicator Name:
SPED #10: SPED Annual Dropout Rate, Grades 7-12
SPED #11: SPED RHSP/DAP Diploma Rate
SPED #12: SPED Graduation Rate (Note for 2015 PBMAS: New PL 1 – PL 3 cut points will be implemented.)
32
Characteristic 8: Implement special education placement decisions based on a student’s disability, not his/her race, ethnicity, socioeconomic status, or English language proficiency.
PBMAS Program Area and Indicator Number / Indicator Name:
SPED #13: SPED Representation
SPED #14: SPED African American Representation
SPED #15: SPED Hispanic Representation
SPED #16: SPED LEP Representation
33
Characteristic 9: Recognize that student success occurs when students have maximum access to a comprehensive curriculum taught by qualified educators in classrooms equipped with appropriate instructional supports as well as peer-to-peer interaction, and therefore include proven strategies that reduce and/or prevent the need to remove students for disciplinary reasons.
PBMAS Program Area and Indicator Number / Indicator Name:
SPED #17: SPED Discretionary DAEP Placements
SPED #18: SPED Discretionary ISS Placements
SPED #19: SPED Discretionary OSS Placements
Note for 2015 PBMAS: We will begin the transition to a new PL structure for these 3 indicators by reporting disproportionality rates. Disproportionality rates tell us how much higher the special education rate for each of these placements is compared to the all-students rate, rather than just telling us the absolute difference between the two rates.
34
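To illustrate the note above, here is a small hypothetical calculation contrasting an absolute gap in percentage points with a disproportionality-style ratio; the numbers are made up, and the exact PBMAS formula is defined in the PBMAS Manual, not here.

```python
# Hypothetical numbers only, to contrast the two ways of describing the gap
# between special education and all-students discipline rates.

sped_iss_rate = 12.0          # % of students served in special education with a discretionary ISS placement
all_students_iss_rate = 6.0   # % of all students with a discretionary ISS placement

absolute_gap = sped_iss_rate - all_students_iss_rate        # 6.0 percentage points
disproportionality = sped_iss_rate / all_students_iss_rate  # 2.0 times the all-students rate

print(f"Absolute gap: {absolute_gap:.1f} percentage points")
print(f"Disproportionality: {disproportionality:.1f}x the all-students rate")
```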
2015 PBMAS Manual rule adoption is underway. Proposed Amendment to 19 TAC Chapter 97, Planning and Accountability, Subchapter AA, Accountability and Performance Monitoring, §97.1005, Performance-Based Monitoring Analysis System. Summary: The proposed amendment would adopt the Performance-Based Monitoring Analysis System 2015 Manual. 2015 PBMAS Manual 35
30-Day Public Comment Period: May 22, 2015 – June 22, 2015. The rule will be effective July 30, 2015. For current information on the rule adoption process: Visit http://tea.texas.gov/About_TEA/Laws_and_Rules/Commissioner_Rules_(TAC)/Commissioner_of_Education_Rules_-_Texas_Administrative_Code/ and/or subscribe to the Rules Listserv at https://public.govdelivery.com/accounts/TXTEA/subscriber/new 2015 PBMAS Manual (continued) 36
Once the rule becomes effective, the Manual will be posted on our web site, followed by a listserv notification of the posting. At that time, hard-copy versions will also be available from TEA’s Publications Office (see order form at the back of the posted Manual). 2015 PBMAS Manual (continued) 37
Effectiveness of PBMAS: Performance Gains and Positive Results for Students The Performance-Based Monitoring Analysis System (PBMAS) was first implemented in 2004. The most current data available from the PBMAS are data from the 2014 PBMAS. The following tables provide longitudinal data summarizing performance gains achieved through the PBMAS as shown in the changes in various indicators’ state rates over time. 38
Effectiveness of PBMAS: Performance Gains and Positive Results for Students (continued) The tables are summarized by years of comparable data available for a given indicator. As a result of several statutory and policy changes that occurred outside of the PBMAS (particularly changes to the state assessment system), some indicators have as few as three years of comparable data available while others have as many as ten. 39
Effectiveness of PBMAS (continued)
Table 1 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2004-2014)
RHSP/DAP Diploma Rate: 2004 state rate = 12.8%; 2014 state rate = 25.5%; change = +12.7
SPED Representation: 2004 state rate = 11.6%; 2014 state rate = 8.5%; change = -3.1
40
Effectiveness of PBMAS (continued)
Table 2 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2004-2013)
Less Restrictive Environments for Students (Ages 12-21): 2004 state rate = 46.8%; 2013 state rate = 63.6%; change = +16.8
41
Effectiveness of PBMAS (continued)
Table 3 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2005-2014)
Less Restrictive Environments for Students (Ages 3-5): 2005 state rate = 9.6%; 2014 state rate = 16.7%; change = +7.1
Discretionary DAEP Placement Rate: 2005 = 1.5 percentage points higher than all students; 2014 = 0.8 percentage points higher than all students; change = -0.7
Discretionary ISS Placement Rate: 2005 = 23.2 percentage points higher than all students; 2014 = 12.3 percentage points higher than all students; change = -10.9
42
Effectiveness of PBMAS (continued)
Table 4 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2007-2013)
Less Restrictive Environments for Students (Ages 6-11): 2007 state rate = 35.5%; 2013 state rate = 39.6%; change = +4.1
43
Effectiveness of PBMAS (continued)
Table 5 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2007-2014)
Annual Dropout Rate (Grades 7-12): 2007 state rate = 3.2%; 2014 state rate = 2.3%; change = -0.9
Graduation Rate: 2007 state rate = 72.7%; 2014 state rate = 77.8%; change = +5.1
44
Effectiveness of PBMAS (continued)
Table 6 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2008-2014)
Discretionary OSS Placements: 2008 = 12.7 percentage points higher than all students; 2014 = 8.1 percentage points higher than all students; change = -4.6
45
Effectiveness of PBMAS (continued)
Table 7 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2009-2011)
SPED TAKS Passing Rate (Mathematics): 2009 = 59.5%; 2011 = 68.2%; change = +8.7
SPED TAKS Passing Rate (Reading): 2009 = 68.1%; 2011 = 75.4%; change = +7.3
SPED TAKS Passing Rate (Science): 2009 = 51.1%; 2011 = 59.9%; change = +8.8
SPED TAKS Passing Rate (Social Studies): 2009 = 69.9%; 2011 = 77.5%; change = +7.6
SPED TAKS Passing Rate (Writing): 2009 = 70.3%; 2011 = 76.6%; change = +6.3
46
Effectiveness of PBMAS (continued)
Table 8 – PBMAS Performance Gains and Positive Results for Students: SPED Program (2009-2011)
SPED YAE TAKS Passing Rate (Mathematics): 2009 = 77.5%; 2011 = 83.4%; change = +5.9
SPED YAE TAKS Passing Rate (Reading): 2009 = 83.3%; 2011 = 86.8%; change = +3.5
SPED YAE TAKS Passing Rate (Science): 2009 = 73.4%; 2011 = 81.0%; change = +7.6
SPED YAE TAKS Passing Rate (Social Studies): 2009 = 90.2%; 2011 = 94.3%; change = +4.1
SPED YAE TAKS Passing Rate (Writing): 2009 = 88.1%; 2011 = 89.8%; change = +1.7
*YAE = Year-After-Exit
47
Contact Information for Slides 3-48 Performance-Based Monitoring Phone: (512) 936-6426 Email: pbm@tea.texas.gov Copyright © Texas Education Agency 2015. All rights reserved. 48
Why is it important to have effective programs? 49
Interventions to Improve Effectiveness of Programs Intervention: the act of becoming involved in something in order to have an influence on what happens Copyright © Texas Education Agency 2015. All rights reserved. 50
Staging for Interventions Based on the results of PBMAS reports, districts are assigned a stage of intervention, determined by the number of performance levels of 2 or 3 assigned to performance indicators. The more PL 2s and 3s, the higher the stage (Stages 1-4). The higher the stage, the more engagement with the agency. 51
How My LEA Was Selected for 2014-2015 Interventions
Stage 1: from one individual SPED PBMAS indicator = 3, up to two individual SPED PBMAS indicators = 3 and up to two individual SPED PBMAS indicators = 2
Stage 4: six or more individual SPED PBMAS indicators = 3
52
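As a rough illustration of the staging idea, the sketch below counts how many indicators received a PL 2 or PL 3; it is not TEA’s staging algorithm, and only the Stage 1 and Stage 4 criteria shown above come from the slide.

```python
# Illustrative only: staging is driven by how many SPED PBMAS indicators
# received a performance level of 2 or 3 (more PL 2s/PL 3s -> higher stage).

def count_pl_2s_and_3s(indicator_pls):
    """indicator_pls maps an indicator name to its assigned performance level."""
    pl2 = sum(1 for pl in indicator_pls.values() if pl == 2)
    pl3 = sum(1 for pl in indicator_pls.values() if pl == 3)
    return pl2, pl3

example = {"SPED #1(i)": 3, "SPED #4": 2, "SPED #10": 0, "SPED #12": 2, "SPED #13": 1}
pl2_count, pl3_count = count_pl_2s_and_3s(example)
print(f"PL 3 count: {pl3_count}; PL 2 count: {pl2_count}")  # -> PL 3 count: 1; PL 2 count: 2
```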
Integrated Interventions From 2004-2005 until 2011-2012, interventions were conducted by individual program areas. Now districts and charter schools look at all programs exhibiting areas of need in an integrated system. Copyright © Texas Education Agency 2015. All rights reserved. 54
What Will Interventions Do for the Charter School? The PBMAS report tells you where there are concerns about the effectiveness of programs. Intervention activities help you to see why, and what you can do about these concerns! Copyright © Texas Education Agency 2015. All rights reserved. 55
What Activities Are Required? What do you become involved in so that you have influence on what happens? Analyzing data; assessing needs; conducting a compliance review; and continuous improvement planning through development of a targeted improvement plan. If the charter school has multiple issues, the agency could conduct an on-site review. Copyright © Texas Education Agency 2015. All rights reserved. 56
Who engages in these activities? District Leadership Team (7-9 members). Recommended members: LEA administrator; special education administrator; parent(s) of student(s) with disabilities; general education teacher; special education teacher; campus administrator; representative from the DAEP if the DAEP indicator is to be analyzed; and a secondary guidance counselor or person knowledgeable about dropout information if the dropout indicator is to be analyzed. Copyright © Texas Education Agency 2015. All rights reserved. 57
Other optional members: community stakeholders; related service providers; speech therapist; special education evaluation personnel; student(s) with disabilities; JJAEP representative; and others as determined by local needs. Copyright © Texas Education Agency 2015. All rights reserved. 58
Focused Data Analysis A focused review of data indicators for which a higher level of performance concern has been identified. Requires a specified team of individuals to gather, disaggregate, and review data to determine possible causes for the performance concern as reflected in PBMAS indicators. Results of the analysis generally are reflected as problem statements. Copyright © Texas Education Agency 2015. All rights reserved. 59
Focused Data Analysis (continued) Completed in all stages of intervention. Must be completed on each PBMAS indicator assigned a performance level (PL) of 2 or 3. Other indicators may be reviewed at the LEA’s discretion. Copyright © Texas Education Agency 2015. All rights reserved. 60
61 Copyright © Texas Education Agency 2015. All rights reserved. PBMAS is a district/charter school report. However, it is important to know how each campus contributes to it!
SPED STAAR Alternate Participation Rate Calculation: (number of students in Grades 3-9 served in special education tested on STAAR Alternate for all subjects applicable to the student’s grade level) divided by (number of students in Grades 3-9 served in special education for whom any STAAR assessment was submitted) Copyright © Texas Education Agency 2015. All rights reserved. 62
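The calculation above is a simple rate; the sketch below computes it with hypothetical counts (the real counts come from the district’s assessment data for students in Grades 3-9 served in special education).

```python
# Hypothetical counts; the numerator and denominator follow the slide above.
tested_on_staar_alternate_all_subjects = 38   # numerator
any_staar_assessment_submitted = 412          # denominator

participation_rate = 100.0 * tested_on_staar_alternate_all_subjects / any_staar_assessment_submitted
print(f"SPED STAAR Alternate participation rate: {participation_rate:.1f}%")  # -> 9.2%
```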
63 Copyright © Texas Education Agency 2015. All rights reserved. So now I know that special education students contributed to the campus’ low performance. Now what?
64 Copyright © Texas Education Agency 2015. All rights reserved. You cannot fix a charter school’s or campus’ issues until you know exactly who is contributing to the issues!
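One common way to see who is contributing to a district-level concern is to disaggregate the underlying student data by campus and grade; the pandas sketch below is illustrative only, and the column names and counts are hypothetical.

```python
import pandas as pd

# Hypothetical student-level records for students with disabilities.
students = pd.DataFrame({
    "campus": ["BFMS", "BFMS", "BFMS", "Mayberry Elem", "Mayberry Elem"],
    "grade": [8, 8, 7, 4, 5],
    "removed_to_iss_or_daep": [1, 1, 0, 0, 0],  # 1 = discretionary removal
})

# Removal rate by campus, then by campus and grade (in percent).
by_campus = students.groupby("campus")["removed_to_iss_or_daep"].mean() * 100
by_campus_and_grade = students.groupby(["campus", "grade"])["removed_to_iss_or_daep"].mean() * 100

print(by_campus.round(1))           # e.g., BFMS 66.7, Mayberry Elem 0.0
print(by_campus_and_grade.round(1))
```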
Needs Assessment: After analysis of data and development of problem statements, identify the root causes of areas of concern. Development of Targeted Improvement Plan: Determine annual goals, strategies, and interventions to accomplish the goals. Copyright © Texas Education Agency 2015. All rights reserved. 65
Let’s see how this all works. Based on the data analysis, the problem statements for Mayberry ISD are: Students with disabilities who are removed to ISS and DAEP are all enrolled at Barney Fife Middle School. BFMS’s rate of removal to ISS and DAEP for all students is 30%; the rate for students with disabilities is 45%. 75% of the students removed were in the 8th grade. Students removed to ISS and DAEP were failing the classes from which referrals to the office were made. Copyright © Texas Education Agency 2015. All rights reserved. 66
Results of Needs Assessment BFMS does not have a PBIS (positive behavioral interventions and supports) framework in place. Teachers do not differentiate instruction. There is not a continuum of special education services at BFMS. No student with disabilities receives more than 15 minutes per class period of inclusion support services. Based on these results, strategies/interventions are identified in the targeted improvement plan. Copyright © Texas Education Agency 2015. All rights reserved. 67
68 Copyright © Texas Education Agency 2015. All rights reserved. Change just does not happen. You have to create it. You have to do something to get something different.
69 Copyright © Texas Education Agency 2015. All rights reserved. Effective programs don’t just happen. They require knowledge of the program, data to know if the program is working, and strategies to change outcomes if they are not what you want.
Contact Information for Slides 48-70 Program Monitoring and Interventions Phone: (512) 463-5226 Email: pmi@tea.texas.gov Copyright © Texas Education Agency 2015. All rights reserved. 70