1
Unified Improvement Planning:
Analyzing Data Version 2.0 Hosted by: Colorado Department of Education Provided by: Center for Transforming Learning and Teaching Version 1.3
2
Introductions
Center for Transforming Learning and Teaching: Julie Oxenford-O’Brian, Mary Beth Romke
Colorado Department of Education: Lindsey Dulin, Judy Huddleston, Christina Larson, Erin Loften, Lisa Medler, Alyssa Pearson Version 1.3
3
Session Purpose Ensure school planning teams are prepared to identify notable trends and prioritize performance challenges as part of the unified improvement plan Data Narrative. Version 1.4
4
Introductions Share: Name, Job Title, School/District
Your role in facilitating unified improvement planning Your most important outcome for this session Version 1.4
5
Materials Version 1.4
6
The materials used during this session were developed in partnership with the Center for Transforming Learning and Teaching located in the School of Education and Human Development at the University of Colorado Denver. Version 1.3
7
Norms The standards of behavior by which we agree to operate while we are engaged in learning together.
8
Session Outcomes Explain how unified improvement planning (UIP) will improve student learning and system effectiveness. Identify the data analysis process included in UIP and how the results will be captured in Data Narrative. Determine what data reports/views will be used. Interpret required performance metrics. Review current school or district performance. Describe notable trends (over at least 3 years). Determine which performance challenges will focus school/district improvement activity for the coming year. Apply the UIP Quality Criteria to evaluate trend statements and priority performance challenges. Document the process used to identify trends and prioritize performance challenges for the Data Narrative. Develop a plan for completing data analysis for the school or district UIP. Engage in hands-on learning activities and dialogue with colleagues. Access additional resources. Complete follow-up activities. Version 1.3
9
Agenda UIP & Data Narrative Overview Interpret Performance Metrics
Review Current Performance Identify Notable Trends Prioritize Performance Challenges Plan Data Analysis Version 1.4
10
Purposes of Unified Improvement Planning
Provide a framework for performance management. Support school and district use of performance data to improve system effectiveness and student learning. Shift from planning as an event to continuous improvement. Meet state and federal accountability requirements. Give external stakeholders a way to learn about how schools and districts are making improvements. Version 1.4
11
How will engaging in unified improvement planning result in improvements in performance?
Version 1.4
12
Theory of Action: Continuous Improvement
[Diagram: continuous improvement cycle of Plan, Implement, and Evaluate around a central FOCUS; Monitor Progress at least quarterly.] Version 1.4
13
Performance Indicators
Achievement: Percent proficient and advanced in Reading (TCAP, Lectura, and CoAlt), Writing (TCAP, Escritura, and CoAlt), Math (TCAP and CoAlt), and Science (TCAP and CoAlt).
Growth: Normative and criterion-referenced growth; Median Student Growth Percentiles in Reading, Writing, Mathematics, and English Language Proficiency; Median Adequate Growth Percentiles.
Growth Gaps: Median Student Growth Percentiles and Median Adequate Growth Percentiles for disaggregated groups: Poverty, Race/Ethnicity, Disabilities, English Learners, Below proficient.
Postsecondary and Workforce Readiness: Colorado ACT, Graduation Rate, Disaggregated Graduation Rates, Dropout Rate
14
Planning Terminology Consider the Unified Improvement Planning Terminology (in the Unified Improvement Planning Handbook, Appendix A) Work in a triad to answer the following questions: What is the relationship between performance indicators, measures, metrics, expectations and targets? What is the difference between a measure and a metric? Version 1.4
15
Unified Improvement Planning Processes
Preparing to Plan: Gather and Organize Data.
Section III: Data Narrative (today): Review Performance Summary; Describe Notable Trends; Prioritize Performance Challenges; Identify Root Causes.
Section IV: Target Setting: Set Performance Targets.
Section IV: Action Planning: Identify Major Improvement Strategies.
Ongoing: Progress Monitoring: Identify Interim Measures; Identify Implementation Benchmarks. Version 1.4
16
Colorado Unified Planning Template
Major Sections: Summary Information about the School or District; Improvement Plan Information; Narrative on Data Analysis and Root Cause Identification; Action Plan(s) Version 1.4
17
Section I: Summary Information about the School/District: Student Performance Measures for State and Federal Accountability; Accountability Status and Requirements for Improvement Plan.
Section II: Improvement Plan Information: Additional Information about the School/District.
Section III: Data Narrative: Description of School/District and Process for Data Analysis; Review Current Performance; Trend Analysis; Progress Monitoring of Prior Year’s Targets; Data Worksheet (Notable Trends, Priority Performance Challenges, Root Causes).
Section IV: Action Plan: School Target Setting Form (Targets, Interim Measures, Major Improvement Strategies); Action Planning Form (Research Supporting Associated Root Causes, Action Steps, Timeline, Key People, Resources, Implementation Benchmarks, Progress). Version 1.4
18
Updates to UIP Data Analysis
Clarification regarding the role of the Data Narrative Two additional metrics on the SPF/DPF and UIP Template Removal of AYP and Educator Qualification from UIP Template Additional reports required for UIP Version 1.3
19
Planning and Accountability Timeline
When should local teams engage in developing or revising unified improvement plans? Review the Planning Timeline (UIP Handbook, p. 38) and Sample Planning Calendar for Developing/Revising UIP (Toolkit, p. 5) Consider: How do these calendars compare to the timeline in which your schools engaged in planning for the school year? Will you submit your UIP for one of the early posting dates? Version 1.3
20
The Role of the Data Narrative
Turn to: Narrative on Data Analysis and Root Cause Identification (UIP Handbook, p. 11) Work with a partner to explain: What is the role of the Data Narrative? Why were two additional worksheets included in this section of the UIP template? Version 1.3
21
Capturing Notes Today Capture notes for the UIP Data Narrative in the Data Narrative Outline. Plan for completing the Data Narrative using the Planning Data Analysis note catcher. Bookmark the Data Narrative Outline (Toolkit, p. 11) and the Planning Data Analysis (Toolkit, p. 79). Version 1.3
22
Agenda UIP Processes Overview Interpret Performance Metrics
Review Current Performance Identify Notable Trends Prioritize Performance Challenges Plan Data Analysis Version 1.4
23
Data are like ___________ because ______________.
24
Accountability Measures and Metrics
Consider the table of performance indicators, measures, metrics and expectations (UIP Handbook, p. 8-11). What measures are required? What metrics are required? What are minimum state and federal expectations for each metric? Version 1.3
25
Metrics included in the SPF
Take out your SPF/DPF and turn to the detailed reporting by performance indicator (p. 2) Identify which metrics are included for each performance indicator: Academic Achievement Academic Growth Academic Growth Gaps Postsecondary and Workforce Readiness (secondary only) Version 1.3
26
Indicators and Metrics
Indicate your current level of comfort explaining each of the following metrics to a colleague (on a scale of 1 to 5).
Academic Achievement: % Proficient/Advanced; School’s Percentile.
Academic Growth: Median Growth Percentile; Median Adequate Growth Percentile.
Academic Growth Gaps: Subgroup Median Growth Percentile; Subgroup Median Adequate Growth Percentile.
Postsecondary and Workforce Readiness: Graduation Rate; Disaggregated Graduation Rate; Dropout Rate; Colorado ACT Composite Score. Version 1.3
27
Reviewing SPF and Required UIP Metrics
Growth: Median Growth Percentiles Median Adequate Growth Percentiles (catch-up and keep-up growth) Growth Gaps Growth in English Language Proficiency (CELApro growth) Disaggregated Graduation Rates NEW Version 1.3
28
Percentage vs. Percentile
Version 1.4
29
Percentiles
Percentiles: range from 1-99. Indicate the standing of a student’s score relative to the norm group (i.e., how a particular student compares with all others who took the same test).
Growth Percentiles: range from 1-99. Indicate the standing of a student’s progress relative to his/her academic peers, or students with a similar score history (i.e., how his/her recent change in scores compares to the change in scores of others who started at the same level). Version 1.4
30
4th Grade Students
[Chart: 4th grade scale scores for students grouped by their 3rd grade score. Low 3rd grade score (295): 363, 575, 481, 358, 599. Medium 3rd grade score (540): 563, 575, 581, 458, 699. High 3rd grade score (671): 663, 575, 681, 558, 749.] Version 1.4
31
[Chart: the same 4th grade scale scores, ordered within each 3rd grade peer group (low 295, medium 540, high 671).] Version 1.4
32
Student Growth Percentiles
[Chart: 4th grade scores and the resulting student growth percentiles within each 3rd grade peer group. Low 3rd grade score (295): scores 358, 363, 481, 575, 599 with growth percentiles 35, 39, 61, 82, 95. Medium 3rd grade score (540): scores 458, 563, 575, 581, 699 with growth percentiles 11, 31, 50, 58, 86. High 3rd grade score (671): scores 558, 575, 663, 681, 749 with growth percentiles 19, 24, 52, 64, 99.] Version 1.4
33
Student Growth Percentiles
Require 2 consecutive years of state assessment results. Calculated for individual students (reading, writing, math, English proficiency). Compare individual student’s change in performance to that of his/her academic peers (statewide). Are based on all of the sequential years for which prior state assessment results are available. Provide a normative basis for asking about how much growth a student could make.
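To make the mechanics concrete, here is a minimal Python sketch (not the actual Colorado Growth Model, which fits quantile regressions across all available prior state assessment scores) that simply ranks each student’s 4th grade score among academic peers who started from the same 3rd grade score, using the peer groups from the example slides above. The function name and the rounding rule are illustrative assumptions, and the resulting values will not exactly match the official growth percentiles shown earlier.

```python
def percentile_rank(score, peer_scores):
    """Percent of peers scoring at or below `score`, clamped to the 1-99 scale."""
    at_or_below = sum(s <= score for s in peer_scores)
    return min(max(round(100 * at_or_below / len(peer_scores)), 1), 99)

# 4th grade scale scores, grouped by each student's 3rd grade score (academic peers).
peer_groups = {
    "low (295)":    [358, 363, 481, 575, 599],
    "medium (540)": [458, 563, 575, 581, 699],
    "high (671)":   [558, 575, 663, 681, 749],
}

for group, scores in peer_groups.items():
    # Because every peer started from the same prior score, ranking current
    # scores orders the students by how much they grew.
    print(group, [percentile_rank(s, scores) for s in scores])
```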
34
Mountain School and Valley School
[Chart: 4th grade scale scores and student growth percentiles for students at two schools, Mountain School and Valley School.] Version 1.4
35
Mountain School and Valley School
[Chart: the same students at each school, now ordered by student growth percentile.] Version 1.4
36
Median Growth Percentile
Mountain School: student growth percentiles (with 4th grade scores) 11 (458), 19 (558), 24 (575), 31 (563), 35 (358), 50 (575), 58 (581), 64 (681); median growth percentile 33.
Valley School: student growth percentiles (with 4th grade scores) 39 (363), 52 (663), 61 (481), 82 (575), 86 (699), 95 (599), 99 (749); median growth percentile 82. Version 1.4
37
Median Growth Percentile
Aggregate measure of the growth of a group of students: District/ School Grade-Level Disaggregated Group (ELL, IEP, FRL, Minority) Middle (median) growth percentile for the students in the group. “Typical” student growth for the group. Version 1.3
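The median growth percentile itself involves no modeling: it is simply the middle student growth percentile for whichever group is being summarized. A minimal sketch using the Mountain School and Valley School percentiles from the preceding slides:

```python
from statistics import median

mountain_sgps = [11, 19, 24, 31, 35, 50, 58, 64]  # Mountain School students
valley_sgps = [39, 52, 61, 82, 86, 95, 99]        # Valley School students

print(median(mountain_sgps))  # 33.0 -> Mountain School median growth percentile
print(median(valley_sgps))    # 82   -> Valley School median growth percentile
```

The same calculation applies to any group: a grade level, a disaggregated group, or an entire district.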
38
Adequate Growth (CSAP/TCAP)
What is adequate growth? Based on catch-up and keep-up growth So. . . a quick refresher on catch-up and keep-up growth. See Adequate Growth Basics (Toolkit, p. 19) Version 1.4
39
Catch-Up Growth To be eligible to make catch-up growth:
The student scores below proficient (unsatisfactory or partially proficient) in the previous year. To make catch-up growth: The student demonstrates growth adequate to reach proficient performance within the next three years or by 10th grade, whichever comes first.
40
Calculating Catch-Up Growth
[Chart: scale scores from 6th to 10th grade against the proficient cut score, for a student who scored below proficient; growth-percentile trajectories of 95 and 55 are marked.]
41
Calculating Catch-Up Growth
[Chart: growth at the 85th percentile in each of the next two years would bring this student up to proficient by 8th grade.]
42
Calculating Catch-Up Growth
[Chart: growth at the 80th percentile in each of the next three years would bring this student up to proficient by 9th grade.]
43
Calculating Catch-Up Growth
[Chart: growth at the 76th percentile in each year would bring this student up to proficient by 10th grade.]
44
Calculating Catch-Up Growth
76 is the minimum: this student’s adequate growth percentile. [Chart: the catch-up trajectories from the previous slides (95; 85, 85; 80, 80, 80; 76, 76, 76, 76) plotted against the proficient cut score.]
45
Adequate Growth Percentile for Catch Up
For students eligible to make catch-up growth (those who scored unsatisfactory or partially proficient in the previous year): Adequate Growth Percentile = the minimum growth percentile he/she would need to make catch-up growth. Version 1.3
46
Calculating Catch-Up Growth
[Chart: the 76th-percentile growth trajectory, the minimum this student needs each year to catch up.]
47
Calculating Catch-Up Growth
[Chart: the student’s actual growth was at the 55th percentile each year, below the required 76th-percentile trajectory.] 55th percentile growth will not be enough for this student to catch up – she did not make catch-up growth.
48
Keep-Up Growth To be eligible to make keep-up growth:
The student scores at the proficient or advanced level in the previous year. To make keep-up growth: The student demonstrates growth adequate to maintain proficiency for the next three years or until 10th grade, whichever comes first.
49
Calculating Keep-Up Growth
[Chart: scale scores from 6th to 10th grade for a student who scored proficient; growth-percentile trajectories of 79 and 12 are marked against the proficient cut score.]
50
Calculating Keep-Up Growth
[Chart: growth at the 25th percentile in each of the next two years would keep this student at or above proficient through 8th grade.]
51
Calculating Keep-Up Growth
[Chart: growth at the 38th percentile each year would keep this student proficient through 9th grade.]
52
Calculating Keep-Up Growth
[Chart: growth at the 50th percentile each year would keep this student proficient through 10th grade.]
53
Calculating Keep-Up Growth
50 is the maximum: this student’s adequate growth percentile. [Chart: the keep-up trajectories from the previous slides (12; 25, 25; 38, 38, 38; 50, 50, 50, 50).]
54
Adequate Growth for Keep-Up
For students eligible to make keep-up growth (those who scored proficient or advanced in the previous year): Adequate Growth Percentile = the maximum of the growth percentiles he/she would need in each of the next three years (or until 10th grade) to score at least proficient. Version 1.3
55
Calculating Keep-Up Growth
[Chart: the 50th-percentile growth trajectory, the maximum this student needs to keep up through 10th grade.]
56
Calculating Keep-Up Growth
[Chart: the student’s actual growth was at the 79th percentile each year, above the required 50th-percentile trajectory.] 79th percentile growth will be enough for this student to keep up – he made keep-up growth.
57
Calculating Median Adequate Growth Percentiles for CSAP/TCAP
[Diagram: list the adequate growth percentiles for all catch-up and keep-up students, sort them, and search for the middle value. The median adequate growth percentile for this example school is 55.] Version 1.4
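The per-student values that feed this calculation come from the Colorado Growth Model, which determines how much growth each student would need over each allowable horizon. Given those required growth percentiles, the rules above reduce to a minimum (catch-up) or a maximum (keep-up), and the school’s median AGP is the middle of the per-student AGPs. A minimal sketch; the student data are hypothetical, chosen so the median works out to the 55 shown in the example:

```python
from statistics import median

def adequate_growth_percentile(required_by_horizon, catch_up):
    """required_by_horizon: growth percentile needed each year to reach (catch-up)
    or maintain (keep-up) proficiency within 1, 2, 3, ... years.
    Catch-up AGP is the minimum of those values; keep-up AGP is the maximum."""
    return min(required_by_horizon) if catch_up else max(required_by_horizon)

# Hypothetical students: (required growth percentiles by horizon, eligible for catch-up?)
students = [
    ([95, 85, 80, 76], True),   # below proficient last year -> catch-up AGP 76
    ([12, 25, 38, 50], False),  # proficient last year -> keep-up AGP 50
    ([60, 48, 41, 39], True),   # catch-up AGP 39
    ([20, 33, 47, 55], False),  # keep-up AGP 55
    ([55, 61, 66, 70], False),  # keep-up AGP 70
]

agps = sorted(adequate_growth_percentile(req, cu) for req, cu in students)
print(agps)          # [39, 50, 55, 70, 76]
print(median(agps))  # 55 -> this school's median adequate growth percentile
```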
58
Move-Up Growth To be eligible to make move-up growth:
The student scores at the proficient level in the previous year. To make move-up growth: The student demonstrates enough growth to move up to advanced within the next three years or by 10th grade, whichever comes first. Version 1.3
59
Catch-up ● Keep-up ● Move-up
Check your understanding. . . Which students could make catch-up growth? Which students could make keep-up growth? Which students could make move-up growth? Draw a Venn diagram to show if/how these groups overlap. Version 1.3
60
Catch-up ● Keep-up ● Move-up
Eligible to make Catch-Up Growth Eligible to make Keep-Up Growth Eligible to make Move-Up Growth Version 1.3
61
Percent Making Catch-Up Growth
Denominator: The number of students who scored below proficient (unsatisfactory or partially proficient) in the previous year (i.e. students eligible for catch-up growth). Numerator: The number of students who made catch-up growth (i.e. demonstrated enough growth to reach proficient performance within the next three years or by 10th grade, whichever comes first). Performance is improving if: The denominator is getting smaller (approaching zero) The numerator is increasing The percent is increasing (approaching 100)
62
Percent Making Keep-Up Growth
Denominator: The number of students who scored proficient or advanced in the previous year (i.e. students eligible to make keep-up growth). Numerator: The number of students who made keep-up growth (i.e. demonstrated enough growth to maintain proficiency for the next three years or until 10th grade, whichever comes first). Performance is improving if: The numerator is increasing The percent is increasing (approaching 100)
63
Percent Making Move-Up Growth
Denominator: The number of students who scored proficient in the previous year (i.e. students eligible to make move-up growth). Numerator: The number of students who made move-up growth (i.e. demonstrated enough growth to move up to advanced within the next three years or by 10th grade, whichever comes first). Performance is improving if: The numerator is increasing. The percent is increasing (approaching 100)
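Putting the three metrics above together, each percentage has its own eligibility pool (denominator), which is why they are reported separately. A minimal sketch over hypothetical student records (the field names and values are illustrative):

```python
# Previous-year level: "U" unsatisfactory, "PP" partially proficient,
# "P" proficient, "A" advanced.
students = [
    {"prior": "PP", "made_catch_up": True},
    {"prior": "U",  "made_catch_up": False},
    {"prior": "P",  "made_keep_up": True, "made_move_up": False},
    {"prior": "P",  "made_keep_up": True, "made_move_up": True},
    {"prior": "A",  "made_keep_up": True},   # advanced students are keep-up only
]

def percent_making(flag, eligible):
    """Percent of eligible students whose record shows the given growth flag."""
    return round(100 * sum(1 for s in eligible if s.get(flag)) / len(eligible)) if eligible else None

catch_up_eligible = [s for s in students if s["prior"] in ("U", "PP")]
keep_up_eligible  = [s for s in students if s["prior"] in ("P", "A")]
move_up_eligible  = [s for s in students if s["prior"] == "P"]   # proficient, not advanced

print(percent_making("made_catch_up", catch_up_eligible))  # 50
print(percent_making("made_keep_up",  keep_up_eligible))   # 100
print(percent_making("made_move_up",  move_up_eligible))   # 50
# The three percentages use different denominators, so they need not sum to 100.
```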
64
Catch-up ● Keep-up ● Move-up
Does the sum of these percentages add up to 100? The percent of students making catch-up growth The percent of students making keep-up growth The percent of students making move-up growth Why Not? Version 1.3
65
Catch-Up in Different Contexts
School or District Growth Summary Reports: The percent of students in the school/district making catch-up growth Number of students making catch-up growth/ the number of students eligible to make catch-up growth SPF or DPF For students eligible to make catch-up growth Median Growth Percentile Median Adequate Growth Percentile Version 1.3
66
Comparing SGP & CUKUMU
Student Growth Percentiles: normative; compare student progress to that of their academic peers.
Adequate Growth / Catch-up, Keep-up, Move-up: growth to standard; compare student growth to how much growth they need to reach or stay proficient. Version 1.3
67
Academic Growth Gaps Consider the definition of “Academic Growth Gaps” in the Planning Terminology (UIP Handbook p. 28) Talk with a partner: Is this definition consistent with the interpretation of “growth gaps” used in your district? If not, how is it different? How could trends in growth gaps be described using this definition? What data is needed? Version 1.3
68
Adequate Growth Percentiles Over Time
Used in conjunction with median growth percentiles to describe growth gap trends. Accessed through: data lab (see Accessing Median Adequate Growth Percentiles over Time); SPF reports over time. How will you access adequate growth percentiles over time for disaggregated groups? [Planning Data Analysis note catcher] Version 1.3
69
New Measures and Metrics
Indicator: Student Academic Growth Sub-Indicator: English Language Proficiency Measure: CELApro Metrics: Median Student Growth Percentile, Median Adequate Growth Percentile (calculated differently) Indicator: Postsecondary and Workforce Readiness Sub-Indicator: Graduation Rate Measure/Metrics: Disaggregated 4-,5-,6-,7-year graduation rates Disaggregated groups: Minority, FRL, ELL, IEP
70
Measuring Growth of English Language Development
Uses CELApro as the measure (instead of TCAP/CSAP) Applies the Colorado Growth Model methodology to CELApro results Reported only for schools/districts with 20 or more ELLs Measures how much normative growth a student has made towards attaining English proficiency (MGP) Measures how much growth would be adequate to attain the desired level of English language proficiency within a given timeframe (AGP) Version 1.3
71
Disaggregated Graduation Rates
Consider the definition of Graduation Rate in the Planning Terminology (UIP Handbook, Appendix A, p. 28) and the SPF Scoring Guides and Reference Data (Toolkit, p. 27) How are 4,5,6,7 year graduation rates calculated? Which disaggregated groups are included in the SPF/DPF disaggregated graduation rates? What disaggregated graduation rate meets expectations? Version 1.3
72
Disaggregated Graduation Rates
All four rates share the same denominator: the number of students in 9th grade in the base year + transfers in - transfers out.
4-year rate numerator: the number of students graduating in 4 years + the number of students from the base year who graduated early.
5-year rate numerator: the 4-year numerator + the number of students graduating in 5 years.
6-year rate numerator: the 5-year numerator + the number of students graduating in 6 years.
7-year rate numerator: the 6-year numerator + the number of students graduating in 7 years. Version 1.3
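In code form, the rates are cumulative counts of graduates over the same fixed denominator. A minimal sketch with hypothetical counts for one disaggregated group:

```python
# Denominator is fixed for the cohort: base-year 9th graders, adjusted for transfers.
base_9th_graders, transfers_in, transfers_out = 100, 12, 8
denominator = base_9th_graders + transfers_in - transfers_out   # 104

graduated_early = 2                          # base-year students who graduated early
new_graduates = {4: 60, 5: 10, 6: 5, 7: 3}   # additional graduates in each year span

numerator = graduated_early
for years in (4, 5, 6, 7):
    numerator += new_graduates[years]
    print(f"{years}-year graduation rate: {100 * numerator / denominator:.1f}%")
# 4-year: 59.6%, 5-year: 69.2%, 6-year: 74.0%, 7-year: 76.9%
```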
73
Disaggregated Achievement Data
Version 1.3
74
Accessing Disaggregated Achievement Data
Most districts already use this data and access it through a local data tool. Also available through: Data Center Job-aide: Accessing Disaggregated Achievement Data (UIP Data Analysis Toolkit, p.) Version 1.3
75
Small N? What if summary reports have little or no data?
CDE does not report data for small N to protect student privacy. Options? Student-Level Data Summary statistics for smaller N Accessed through District data reporting tool Downloading student-level records from CEDAR The Colorado Growth Model web-based application (student-level) Version 1.3
76
Accessing Data Reports/Views
Turn to the Planning for Data Analysis note catcher. Make notes about how you will access required state metrics to finalize your data analysis. Include CELApro Growth if appropriate. Make notes about how you will access local performance data. Version 1.3
77
Agenda UIP Processes Overview Interpret Performance Metrics
Review Current Performance Identify Notable Trends Prioritize Performance Challenges Plan Data Analysis Version 1.4
78
Reviewing Current Performance
Use the SPF to identify and describe: School or District accountability status Indicators (and sub-indicators) where performance did not at least meet state/federal expectations Magnitude of the over-all school/district performance challenge Describe how current performance compares to the prior year’s plan (using the Progress Monitoring of Prior Year’s Performance Targets Worksheet) Version 1.3
79
Review SPF Report Capture your answers to the following questions in the Data Narrative Outline: What was the school’s plan type assignment? In which indicator areas did school performance not at least meet state and federal expectations? In which sub-indicators did school performance not at least meet state and federal expectations? In which indicators and sub-indicators did school performance not at least meet local expectations? Version 1.3
80
Magnitude . . . From the UIP Quality Criteria: Schools/districts must identify priority performance challenges and root causes that reflect the magnitude of the overall performance challenge. What does this mean? Version 1.3
81
Identifying the magnitude of the performance challenge
Do the school’s performance challenges include: 80% or more of the students or closer to 15% of the students? All students or only some disaggregated groups of students? Which ones? All content areas? One or two content areas? Which ones? Version 1.3
82
Determining Magnitude
Use the Identifying the Magnitude of the Performance Challenge Worksheet; in the 3rd column, answer each question in reference to your school (or a school in your district). Describe the magnitude of your performance challenge in your Data Narrative Outline (Toolkit, p. 12). Version 1.3
83
Describing Performance in Relationship to Prior Year’s Targets
Consider: Progress Monitoring of Prior Year’s Performance Targets Worksheet. Use your prior UIP (School Target Setting Form) and your 2012 SPF to answer the following questions: Which annual targets were met? Which were not met? For targets that were met: Is this worth celebrating? Were the target(s) rigorous enough? For targets that were not met: Should this continue to be a priority for the current year? Version 1.3
84
Reflecting on Prior Year’s Targets
Brainstorm answers to the following questions: Why were the school’s performance targets met? Why were the school’s performance targets not met? Select one or two explanations to share. Capture your “best thinking” on your Data Narrative Outline Version 1.3
85
Data Analysis Planning
Turn to the Planning for Data Analysis note catcher. Make notes about how you will complete the following: Review Current Performance (Toolkit, p. 80) Progress Monitoring of Prior Year’s Targets (Toolkit, p. 81) Version 1.3
86
Agenda UIP Processes Overview Interpret Metrics
Review Current Performance Identify Notable Trends Prioritize Performance Challenges Plan Data Analysis Version 1.4
87
Collaborative Inquiry for Data Analysis
Choose a partner. Take out: Guiding Assumptions for Collaborative Inquiry (Toolkit, p. 35). Individually, read one row in the chart. When each partner has completed a row, look up and “say something.” The something might be a question, a brief summary, a key point, an interesting idea, or a personal connection to the text. Continue until you complete all of the rows in the table.
88
What are notable trends?
Review Step Two: Identify Notable Trends (UIP Handbook, p 13-15). Discuss: What are the most critical things to remember about performance trends? How can we determine if a trend is notable? What are some examples of “notable” performance trends? Version 1.3
89
Trends Include all performance indicator areas.
Include at least three years of data. Consider data beyond that included in the school performance framework (grade-level data, K-2). Consider local performance data. Include positive and negative performance patterns. Identify where the school did not at least meet state and federal expectations. Include information about what makes the trend notable. Version 1.4
90
Inventory Local Performance Data
Consider the following tool: Inventory of Performance Data Sources (Toolkit, p. 17) Components (see Legend) Content Area Assessment Grade Levels Which Students Content Focus Metrics Questions Determine how you will complete the inventory of locally available performance data. Capture notes in the Planning Data Analysis note catcher (Toolkit, p. 79). Version 1.3
91
Trend Statements Include
Measure/Metric Content Area Which students (grade-levels, disaggregated groups) Direction (stable, increasing, decreasing) Amount (percentages, percentiles, rates, scores) Time period (years) What makes the trend notable Version 1.3
92
How to Describe Notable Trends
Determine what metrics will be considered and what questions will guide analysis. Make predictions about performance. Interact with data (at least 3 years). Look for things that pop out, with a focus on patterns over time (at least three years). List positive and negative facts about the data (with a focus on patterns over time, or trends). Identify which trends are notable (narrow) and which require additional analysis. Write notable trend statements.
93
Levels of Performance Data
School Grade-Level Disaggregated group Standard/Sub-content Area Classroom Student work System Program (Tier I) Program (Tier II/ Tier III) Individual Version 1.3
94
Levels and Performance Metrics
Performance metric examples by level:
Aggregate school or district level: % and number scoring at each performance level, MGP, and AGP (overall and by grade level).
Standard/strand: number and % meeting standard.
Disaggregated group: % and number (within group) scoring at each performance level, MGP, AGP (overall and by grade level).
Classroom (formal)/Individual: scale score, individual performance rating, student growth percentile. Version 1.3
95
Questions Different metrics make it possible to answer different questions. For example: Could you determine which students were likely to be proficient within the next three years if the metric you are considering is the % of students who scored proficient or better this year? Version 1.3
96
Organizing Data for Continuous Improvement
Consider Organizing Data for Continuous Improvement (Toolkit, p. 41) Components: Path through the data Measures and metrics Critical questions for each metric Associated data reports (or views) Version 1.3
97
A path through the data. . . Select one content area on which to focus Performance (achievement/growth) by grade level for 3+ years Performance by disaggregated groups by grade level for 3+ years Look for and describe positive and negative trends Disaggregate groups further Within grade-levels achievement by standard/sub-content area Look across groups Cross-content area performance (3+ years) Post-Secondary and Workforce Readiness metrics (3+ years) Version 1.3
98
Performance Metrics Academic Achievement (overall and by grade-level)
% proficient or better % and number scoring at each performance level (unsatisfactory, partially proficient, proficient, and advanced) Academic Growth (overall and by grade-level) Median Student Growth Percentiles Median Adequate Growth Percentiles % catch-up % keep-up % move-up Version 1.3
99
Metrics for Achievement at the Standard/Sub-Content Area Level
TCAP Achievement by Standard or Sub-Content Area by grade-level % proficient and above Version 1.3
100
Disaggregated Group Metrics
Disaggregated Groups: Minority (combines: Asian, Black, Hispanic, Native American) Free/Reduced ELL IEP Below Proficient Academic Achievement Metrics (%P/A, % and N for each achievement level) Academic Growth Metrics (MGP, AGP, % catch-up, keep-up, move-up) Version 1.3
101
Disaggregating Disaggregated Groups
Minority (Asian, Black, Hispanic, Native American) ELL (FEP, LEP, NEP, monitoring status) IEP (limited intellectual capacity, emotional disability, specific learning disability, hearing disability, visual disability, physical disability, speech/language disability, deaf-blind, multiple disabilities, infant disability, autism, traumatic brain injury) Version 1.3
102
Post-Secondary and Workforce Readiness Metrics
Graduation Rate Disaggregated Graduation Rates Drop-out Rate Average Colorado ACT Composite Score Version 1.3
103
Identifying questions to guide analysis
Use Organizing Data for Continuous Improvement and Data Analysis Questions. Consider the magnitude of the performance challenge and make-up of the student population to determine which disaggregated data will be considered. Determine which local performance data will be used. Capture the questions that will guide the analysis for each metric on the Data Analysis Questions chart. Version 1.3
104
Some Questions for Academic Achievement
Over-All Aggregated and by Grade level Achievement What are trends in % proficient and advanced over the last 3-5 years? What are the trends in % proficient and advanced by grade level for the last 3-5 years? How do our trends compare to the state trends for the same time period? Version 1.3
105
Some Questions for Academic Growth
Overall and Grade-Level Growth What has been the school-level trend in median growth percentiles over the last 3-5 years? What has been the trend in median growth percentiles by grade level for the last 3-5 years? How do the MGPs for the last 3-5 years compare to minimum state expectations? What has been the trend in % of students making catch-up growth overall and by grade level? What has been the trend in % of students making keep-up growth overall and by grade level? How do the school’s trends in CUKU compare to the state? Version 1.3
106
Some Questions for Disaggregated Group Performance
What have been the trends in %proficient and advanced for each disaggregated group present at our school over the last 3 years? What have been the trends in median growth percentiles for each disaggregated group present at our school over the last 3 years? How does the MGP compare to the median AGP for each disaggregated group at our school for the last 3 years? Version 1.3
107
Focus and Reports In what content area will you focus your initial analysis? Organize your data reports for that content area, including: TCAP/CSAP Summary by grade level (at least 3 years) Growth Summary by grade level Achievement and Growth by disaggregated groups Achievement at the standard and sub-content area level Version 1.3
108
How to Describe Performance Trends
Determine what metrics will be considered and identify questions to guide analysis. Make predictions about performance. Interact with data (at least 3 years). Look for things that pop out, with a focus on patterns over time (at least three years). List positive and negative facts about the data (observations). Identify which trends are notable (narrow) and which require additional analysis. Write trend statements.
109
Why Predict? Access prior learning
Name the frames of reference through which we view the world. Make the assumptions underlying our predictions explicit, trying to understand where they came from. Activate our engagement with the data.
110
Preparing to Predict Select a recorder for your table.
On a piece of flip-chart paper, create a T-chart. Put “predictions” on one side and “assumptions” on the other side of the T-chart. The recorder will capture predictions on the left side of this chart. [T-chart columns: Predictions | Assumptions]
111
Questions Guide Predictions
Take out your Data Analysis Questions chart. Use your questions to make predictions about what you will see in your data. Capture predictions and assumptions on the T-chart. Post Predictions and Assumptions on your data wall. Version 1.3
112
How to Describe Performance Trends
Determine what metrics will be considered and identify questions to guide analysis. Make predictions about performance. Interact with data (at least 3 years). Look for things that pop out, with a focus on patterns over time (at least three years). List positive and negative facts about the data (observations). Identify which trends are notable (narrow) and which require additional analysis. Write trend statements.
113
Analyzing Data Be patient and hang out in uncertainty
Don’t try to explain the data. Observe what the data actually shows. No “because” statements yet.
115
Interacting with data Consider strategies for interacting with data:
Highlight (color code) based on a legend. Do origami – fold the paper so you can compare columns. Create graphic representations. Agree on an approach: How will you interact with your data? Plan to include a visual representation (consider the Interacting with Data Job Aide, Toolkit, p. 65); one possible sketch follows below.
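For the “graphic representations” option, a quick plot of a multi-year pattern is often enough to see a trend. A minimal sketch, assuming matplotlib is available; the years, values, and the 71% expectation line are placeholders to replace with your own data:

```python
import matplotlib.pyplot as plt

years = [2010, 2011, 2012]                 # hypothetical three-year window
pct_proficient = {"Grade 4": [55, 52, 48],  # hypothetical % proficient/advanced
                  "Grade 5": [61, 63, 64]}

for grade, values in pct_proficient.items():
    plt.plot(years, values, marker="o", label=grade)
plt.axhline(71, linestyle="--", color="gray", label="State expectation (example)")
plt.xticks(years)
plt.ylabel("% proficient or advanced")
plt.title("Math achievement trend (example)")
plt.legend()
plt.savefig("math_trend.png")
```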
116
Capture your Observations
Consider the questions to guide your analysis. Identify things that “pop out”. Note patterns over time (3-5) years. Include both strengths and challenges. Capture observations about your data on a flip chart. Version 1.3
117
How to Describe Performance Trends
Determine what metrics will be considered and identify questions to guide analysis. Make predictions about performance. Interact with data (at least 3 years). Look for things that pop out, with a focus on patterns over time (at least three years). List positive and negative facts about the data (observations). Identify which trends are notable (narrow) and which require additional analysis. Write trend statements.
118
What makes a trend notable?
Consider the UIP Handbook, What makes a trend notable? (p. 14) With a partner discuss. . . to what could we compare our performance trends? How did our performance compare to a specific expectation (criterion)? How did our performance compare to others (groups of students within the school, district, state)? Use CSAP/TCAP Historical Trends (Toolkit, p. 69) as reference for trends in % proficient and advanced. Version 1.3
119
Trend Statement Example
Measure/Metric: Percent of students proficient or advanced on TCAP/CSAP.
Content Area: Math.
Which students (grade levels, disaggregated groups): 4th grade (all students in school).
Direction: Declined.
Amount: 70% to 55% to 48%.
Time period: 2009 to 2011.
What makes the trend notable? This was well below the minimum state expectation of 71%. CASE July DPF/SPF
120
Examples of Notable Trends
The median growth percentile of English language learners in writing increased from 28 to 35 to 45 over the period ending in 2011, meeting the minimum expectation of 45 in 2011 and exceeding the district trend over the same time period. The dropout rate has remained relatively stable (15, 14, 16) and much higher than the state average for each year between 2009 and 2011. Version 1.3
121
Identify Notable Trends
Consider your observations. Compare school performance trends to other points of reference (criterion, others’ performance over the same time period). Determine which of the identified patterns in school performance are notable. Continue analysis until at least 8 notable trends (positive and negative) are identified.
122
How to Describe Performance Trends
Start with a performance focus and relevant data report(s) and identify questions to guide analysis. Make predictions about performance. Interact with data (at least 3 years). Look for things that pop out, with a focus on patterns over time (at least three years). List positive and negative facts about the data (observations). Identify which trends are notable (narrow) and which require additional analysis. Write trend statements.
123
Write Observations as Trend Statements
Use the “Developing Trend Statements” template Specify the measure/metrics and for which performance indicator the trend applies. Describe for which students the trend applies (grade level and disaggregated group). Describe the time period. Describe the trend (e.g. increasing, decreasing, stable). Determine if the trend is notable and describe why. Version 1.3
124
Checking our Thinking Work with your “partner table”. Assign an ‘A’ and a ‘B’ table. Take turns presenting trends and providing/receiving feedback: Table A facilitator presents their team’s notable trends explaining why each was identified as “notable” Table B team members ask clarifying questions. Table A facilitator responds. Table B team members provide warm and cool feedback about Table A notable trends. Switch roles Version 1.3
125
Capturing Trends in the UIP Template
Capture notable trends (positive and negative) in the Data Analysis Worksheet, (Toolkit, p. 75 excerpted from the UIP template). Note: this worksheet is organized by performance indicator. Version 1.3
126
Make Notes for Data Narrative
Take out the Data Narrative Outline. What data did the planning team review to identify notable trends? Capture this information. Describe the process in which your team engaged to analyze the school’s data and identify notable trends. What were the results of the analysis (which trends were identified as notable)? Version 1.3
127
Completing Trend Analysis
Take out Planning for Data Analysis Make notes on how you will complete your trend analysis. . . Who will participate? When? What materials and tools will you use? Version 1.3
128
Agenda UIP Processes Overview Interpret Metrics
Review Current Performance Identify Notable Trends Prioritize Performance Challenges Plan Data Analysis Version 1.4
129
Priority Performance Challenges
Review Step Four: Prioritize Performance Challenges in the UIP Handbook, p. 15. Discuss: What are the most critical things to remember about priority performance challenges? Why do we prioritize performance challenges? How do performance challenges relate to trends? How do priority performance challenges relate to the magnitude of the over-all school challenges? Version 1.3
130
Priority Performance Challenges
Priority performance challenges are. . . Specific statements about performance; a strategic focus for the improvement efforts; about the students. Priority performance challenges are NOT. . . Explanations of what caused the performance challenge or why we have it; action steps that need to be taken; concerns about budget, staffing, curriculum, or instruction; about the adults. Version 1.4
131
Priority Performance Challenges Non-Examples
To review student work and align proficiency levels to the Reading Continuum and Co. Content Standards Provide staff training in explicit instruction and adequate programming designed for intervention needs. Implement interventions for English Language Learners in mathematics. Budgetary support for para-professionals to support students with special needs in regular classrooms. No differentiation in mathematics instruction when student learning needs are varied. Version 1.4
132
Prioritizing Performance Challenges
Review for which performance indicators priorities must be identified and the magnitude of the over-all performance challenge. Consider notable trends. Focus the list, combining related trends. Identify trends that are most urgent to act on. Do a reality check (initial prioritization). Evaluate the degree to which the proposed priorities reflect the magnitude of the over-all performance challenge. Achieve consensus on the top three (or four) priorities. Version 1.3
133
What guides our prioritization?
Take out the Data Narrative Outline, consider: In which indicator areas (Academic Achievement, Academic Growth, Academic Growth Gaps, Postsecondary and Workforce Readiness) did school/district performance not at least meet state/federal expectations? Review the magnitude of the school’s over-all performance challenge. Version 1.3
134
Prioritizing Performance Challenges
Review for which performance indicators priorities must be identified and the magnitude of the over-all performance challenge. Consider notable trends. Focus the list, combining related trends. Identify trends that are most urgent to act on. Do a reality check (initial prioritization). Evaluate the degree to which the proposed priorities reflect the magnitude of the over-all performance challenge. Achieve consensus on the top three (or four) priorities. Version 1.3
135
Combine Related Trends
Consider your notable trend statements. Do any of these trends address the same performance challenge (e.g. growth and achievement trends for the same students in the same content area)? Combine related trend statements. Note: a combined trend statement can include more than one metric (MGPs and % proficient/advanced) for the same students. Capture combined trend statements (and those that could not be combined) on a flip chart. Version 1.3
136
Prioritizing Performance Challenges
Review for which performance indicators priorities must be identified and the magnitude of the over-all performance challenge. Consider notable trends. Focus the list, combining related trends. Identify trends that are most urgent to act on. Do a reality check (initial prioritization). Evaluate the degree to which the proposed priorities reflect the magnitude of the over-all performance challenge. Achieve consensus on the top three (or four) priorities. Version 1.3
137
Initial Prioritization
Identify trends that are urgent to act on (those that represent performance challenges). Do a preliminary check on team priorities using “dot voting”: each person gets 2 (or 3) votes. Team members can spend their votes on different performance challenges or all on one. Identify the performance challenges with the highest number of votes (“proposed priorities”). Version 1.3
138
Prioritizing Performance Challenges
Review for which performance indicators priorities must be identified and the magnitude of the over-all performance challenge. Consider notable trends. Focus the list, combining related trends. Identify trends that are most urgent to act on. Do a reality check (initial prioritization). Evaluate the degree to which the proposed priorities reflect the magnitude of the over-all performance challenge. Achieve consensus on the top three (or four) priorities. Version 1.3
139
Aligning Priorities to Magnitude
Review, “How to determine the appropriate level for a priority performance challenge”, (UIP Handbook, p ) Work with a partner: What does it mean to say the priority performance challenge is aligned to the magnitude of the overall performance challenges for the school? Identify an example of a priority performance challenge that would not be aligned to the magnitude of the school or district’s over-all performance challenge. Version 1.3
140
Evaluating Proposed Priorities
As a team, consider all of the proposed priority challenges. Eliminate priorities that do not reflect the over-all magnitude of the performance challenge for the school or district. Identify remaining priority performance challenges. Version 1.3
141
Prioritizing Performance Challenges
Review for which performance indicators priorities must be identified and the magnitude of the over-all performance challenge. Consider notable trends. Focus the list, combining related trends. Identify trends that are most urgent to act on. Do a reality check (initial prioritization). Evaluate the degree to which the proposed priorities reflect the magnitude of the over-all performance challenge. Achieve consensus on the top three (or four) priorities. Version 1.3
142
Capturing Priority Performance Challenges in the UIP Template
Capture priority performance challenges by performance indicator in the Data Analysis Worksheet (Toolkit, p. 75 excerpted from the UIP template). Some priority performance challenges may be listed by more than one performance indicator. Version 1.3
143
Apply Quality Criteria Section III: Priority Performance Challenges
Use the Quality Criteria for Unified Improvement Planning, Trends and Priority Performance Challenges Consider: How are the trends and priority performance challenges similar and/or different from that reflected in quality criteria? How could these sections be improved upon? Version 1.3
144
Data Narrative Notes Take out the Data Narrative Outline. (Toolkit, p. 14) Describe the process in which your team engaged to prioritize your performance challenges. What were the results? Which performance challenge(s) were selected as priorities for the current school year? Why was each prioritized? List your priority performance challenges. Version 1.3
145
Completing Prioritization of Performance Challenges
Take out Planning for Data Analysis note catcher (Toolkit, p. 84). Make notes on how you will complete your prioritization of performance challenges. . . Who will participate? When? What materials and tools will you use? Version 1.3
146
Agenda UIP Processes Overview Interpret Metrics
Review Current Performance Identify Notable Trends Prioritize Performance Challenges Plan Data Analysis Version 1.4
147
Data Narrative Notes In the Planning for Data Analysis/Data Narrative note catcher (Toolkit p ) Make any final notes about the following components of the data narrative: Review of Current Performance Trend Analysis Priority Performance Challenges Consider the tasks involved in completing the Data Analysis Portion of the Data Narrative. Make notes about how these tasks will be completed, when, and by whom. Version 1.3
148
Next Steps Bring Prioritized Performance Challenges to the Root Cause Analysis session. Version 1.3
149
Give us Feedback!! Written: Use sticky notes. + The aspects of this session that you liked or that worked for you. The things you will change in your practice or that you would change about this session. ? Questions that you still have or things we didn’t get to today. Ideas, ah-has, innovations. Oral: Share one ah-ha!