1
PVAAS & School Improvement: Using PVAAS for Annual School Level Planning, Grades 4-8 and 11
PVAAS Statewide Core Team, August 2008
Welcome; Team/Session Facilitator Introductions. Focus for Today’s Session: Using PVAAS for school improvement planning. We will spend this session answering questions that schools address as they go through annual school level planning. The questions and process addressed in this session are based upon the process in Getting Results!™ Generation 5. We will walk through this process together using the data packet in front of you. You will also have opportunities to examine your own data and formulate questions and directions for next steps. PA SAS: Standards-Aligned Systems
2
PDE Provides The Data Tools…
PA AYP Public website that reports all AYP-related data on district, school, grade and subgroup. -- Subject area information is also available. PSSA Data Interactive by eMetric Interactive web-based tool for analyzing PSSA data by district, school, grade, subgroup and student. -- Reporting category information is also available. PVAAS Web-based value-added tool that reports growth of cohorts of students and projections for individual students for performance on future PSSA assessments. 4Sight Member Center Interactive web-based tool for benchmark assessment data by grade, class, subgroup and student. -- Intended to be formative and used often throughout the school year. Before we begin looking at the school improvement, or annual school level, process, let’s take a step back and review the four data tools provided by PDE for districts – the first three are available to all districts and 4Sight Member Center is available to all districts that have elected to participate in the 4Sight Benchmark Assessment System. Our focus for today’s session is PVAAS, but the real power of data is the interaction of multiple data tools with the local assessment data that districts collect routinely.
3
PDE Provides The Framework…
“Getting Results!™” A Framework for Continuous School Improvement Planning The framework provides guiding questions for root cause analyses in the following data environments: PSSA and AYP inquiry PVAAS 4Sight Benchmark Data Local Assessment Data Sets PDE has a statewide framework for school improvement planning called Getting Results. PVAAS is one of several data tools that schools are asked to use when analyzing their school’s results, determining root cause and developing an action plan for improvement. There are specific inquiries defined in Getting Results for each of the data tools. PVAAS provides a Pre-Defined Printing package for all reports needed to complete the Getting Results Continuous School Improvement Planning process.
4
Getting Results!™ Getting Results! is a TWO-year plan for continuous school improvement. The questions concerning data analysis that are posed in Getting Results!, Generation 5 will be addressed in this session. These are the questions schools answer in Year One of the two-year plan. For those schools involved in the formal school improvement process, Getting Results!, remember this is a two-year plan! The questions concerning data analysis that we will be addressing in this session are the same questions schools need to address in Year One of the two-year plan.
5
Year 1 Year 2 Continuous Improvement Process
Phase 1 Organize and Review the Data; Phase 2 Analyze Data and Discover Root Cause; Phase 3 Plan Solution; Phase 4 Implement the Plan. Year 2: Phase 5 Analyze Evidence of Effectiveness; Phase 6 Revise the Plan; Phase 7 Implement the Revision. Continuous Improvement Process. NCLB clearly says “The school plan shall cover a 2-year period” §1116(b)(3). By moving to a two-year cycle, we can have schools assess the status of implementation of the Action Sequence in the first year of the plan and make revisions for the second year. The hope is to make the school improvement plan a living, breathing document that is routinely revisited and monitored by the administration and leadership team of the school. This slide begins with the original 3 phases that were in older versions of Getting Results. The goal of each phase is listed below: Phase 1 – ORGANIZE and REVIEW DATA – emphasizes the need for multiple data sources, including summative, formative, and perceptual. Phase 2 – ANALYZE DATA and DISCOVER “root cause” – offers worksheets for analyzing data from multiple data sources and finding the underlying causes of the state of student achievement. This phase is based on the six components of Pennsylvania’s standards-aligned instructional system (i.e., clear standards; fair assessments; standards-aligned curriculum; instruction; instructional materials & resources; interventions), and the four “lenses” of Quality Teaching, Quality Leadership, Artful Use of Infrastructure, and Continuous Learning Ethic. Phase 3 – PLAN SOLUTION – aligns analysis of data and root cause with strategic action planning. Phase 4 – IMPLEMENT the PLAN – The school improvement plan must be a living, breathing document that is routinely revisited and monitored by the administration and leadership team of the school. Year 2: Phase 5 – ANALYZE EVIDENCE of EFFECTIVENESS – guides reflection on plan implementation. How was the plan implemented? How do you know if it was effective? Phase 6 – REVISE the PLAN – makes refinements and revisions after a status review of the two-year plan.
Phase 7 – IMPLEMENT the REVISION – The revised School Improvement Plan is an addendum to the two-year plan and refines and focuses school improvement efforts.
6
On page 8 of Getting Results, there are explicit directions for each of the three data tools on how to access and print the appropriate reports needed in this process. This page identifies which data sets must be included with the plan. It gives a direct link to the websites and contact information if help is needed. When in the Word document, place the mouse on the link to the website and a box will appear that says to hold the control key and click. This will not work in the PowerPoint; however, screen captures are included to demonstrate. The PowerPoint walks through how to access and print each data packet. Later in the PowerPoint, examples of the data packets are shown. The next slide shows what happens when the mouse is placed on the link.
7
Annual School Level Planning
For those schools not involved in the formal school improvement process, Getting Results! provides a good framework and guiding questions to assist your school in annual school level planning. The questions that we will address should still be the focus of the Analysis/Discovery phase of the Data-Informed Inquiry cycle that follows on the next slide. Even schools not involved in the formal school improvement process benefit greatly from the Getting Results! framework. This framework provides ALL schools with guiding questions that assist schools in their annual school level planning. The questions addressed in this session should be the focus of the Analysis/Discovery phase of the Data-Informed Inquiry cycle shown on the next slide.
8
Context: 3-Phase Data-Informed Inquiry Cycle
Analysis Discovery Data Solutions What data do we have regarding achievement, growth and positive results for students? What do the data tell us about the areas of strength and areas of concern? and Why do the data look that way? What are the “root causes”? What are we going to do about it all? Which evidence-based strategies must we consider in our improvement plan? (ex. What Works Clearinghouse) (ex. SAS-Math) Through Pennsylvania’s three-phase data-informed inquiry cycle, schools can (1) define the data available on both achievement and growth, (2) discover the areas of strength and concern, (3) examine why the data looks like it does, and (4) identify appropriate evidence-based solutions that address the real problem, or “root cause”. In the first phase we call “Data,” schools define all data available including demographic, perceptual, achievement, and growth data. In the second phase called “Data Analysis and Discovery,” school improvement teams collect and analyze data to provide a clearer picture of a school’s student performance and growth, and prioritized areas of instructional need. Schools should analyze data from PSSA, PVAAS, and 4Sight (if available), as well as other pertinent sources. School teams identify areas of strength and areas for improvement in student learning. Schools should also examine why the data looks like it does. Schools ask themselves probing questions about Clear Standards, Fair Assessments, Big Ideas (Curriculum), Instruction, Interventions (Safety Nets) and Instructional Materials and Resources. Schools work to discover the contributing factors and root causes that are impeding student achievement and/or growth in a specific area. This leads to the last of the three phases called “Solutions”, where schools identify appropriate evidence-based solutions that address the real needs of a school. Examples of sources for decisions regarding Solutions include What Works Clearinghouse and the new Standards-Aligned System in Mathematics.
9
School Structures for Data-Informed Decision Making
District-Level Support (Budgetary Support, Professional Development, Resources and Time) Demographic l Perceptual l Process Data Student Learning Data Building Level School Demographic Data PennData Discipline Data Attendance Data Mobility Rate Data Parent Surveys Annual Building-Wide Planning Process Focus: All Students Who: School-Wide Team How: PDE Getting Results, Data Retreat, School/Continuous Planning Process Building Level PSSA & PVAAS Final 4Sight Benchmark Test Standardized Assessments District End-of-Year Tests EAP/Tutoring Assessments Grade/Course Level Initial: PSSA/PVAAS/Final Tests Class/Subgroup Levels Cyclical: 4Sight Benchmark Data – Grade Level District Quarterly Assessments Common Classroom Data Classroom Summaries EAP/Tutoring Assessments Periodic Grade-Level Planning Process Focus: Groups of Students Who: Teacher Teams How: Regular 1-2 Hour meetings Grade/Course Level Class Demographic Data Class Engagement Data Satisfaction Data Attendance Data Walk Through Data The graphic on this slide illustrates how schools can organize their structures to support a data-informed culture. A truly data-informed district recognizes that data analysis occurs at multiple levels and can serve different purposes for a school district. The foundation of this structure is the clear support from the District Office. It may be productive to have the Superintendent articulate a clear commitment for budgetary, professional development, resources and time to accomplish this paradigm shift. Schools should consider three levels of planning: The Annual Building-wide Planning Process, The Periodic Grade Level Planning Process, and The Student Planning Process. Each of these levels has its own focus and team: The Annual Building-wide process focuses on the entire building utilizing a school-wide team. Outcomes from this level should be building-wide year-long goals. The Periodic Grade Level team should consist of teams of teachers teaching the same grade or course.
The focus of the meetings is the students whom the teachers have in common. This is generally the hardest level to create, since it potentially has a significant impact on scheduling. The Student Planning Process reflects the practices of excellent teachers, who continually monitor and adjust instruction on a daily basis with their children. In addition, teachers will have the information from the Periodic Grade-Level Planning Process meetings that will help to provide a focus for the delivery of their individual instructional plans with their own students. Trainers should emphasize the two-sided arrows and how the data and conclusions flow between the levels on a routine basis. This communication flow enhances the effectiveness and impact of each planning process. Each level has different data that it considers: The data for the Annual Process has been characterized as an autopsy or audit of the previous year’s experience, since the students have moved on to the next grade. It is still valuable for analyzing the “big picture” and for informing the process of creating building-wide goals. PSSA and PVAAS fit into this category since the results are not published until the students have moved on to the next grade or course. The Demographic, Perceptual and Process data for the Periodic Level is more specific to the group of students of interest. The achievement data now focus on more cyclical and regular uniform assessments that “take the pulse” of the students. It is most important that the Periodic meetings occur shortly after the administration of the assessments so that the data will be as timely as possible. At the student planning level, individual teachers are encouraged to document student learning and collect data that can be used to diagnose skills and monitor student progress.
Student-Planning Process Focus: Classroom of Students Who: Teacher Classroom Level Initial: PSSA/PVAAS/Final Tests Student-Level Achievement and Growth Data Cyclical: 4Sight Benchmark Data – Student Level Continuous Individual Classroom Assessments EAP/Tutoring Assessments Progress Monitoring Classroom Level Qualitative Data Student Historical Information Student Medical Information Student Learning Information
10
Annual School Improvement Planning Meetings
Unit of Analysis: The School
11
Unit of Analysis: The School
Purpose of Analysis: Plan to Improve Student Achievement and Meet AYP Targets Annual School-Wide Goal Setting Ensure instructional program coherence (IPC) Curriculum Instruction Assessment Supports (infrastructure including scheduling, staffing, etc) Monitoring Implementation and Effectiveness of School Plan Each unit of analysis (school, grade, and student) has a specific purpose or focus. At the school level, the purpose is to: Plan for improvements in student achievement and to meet AYP targets. Set annual goals for the entire school and groups of students. Ensure there is alignment between curriculum, instruction, assessments, and interventions/supports (i.e., are they addressing the needs of all students?) Monitor the implementation and effectiveness of the overall school plan.
12
What Does Annual School-Level Planning Look Like Now?
Does it occur annually? Who is at the table? Principals Teachers Others What data are used? Does this result in an action plan? Is the action plan followed? Are you focused on the whole school or on particular students? At this level the focus should be on whole school…ALL students. Focus is on systems, curriculum, supports and resources for all students.
13
Data Packet: Annual School-Level Analysis
Question – Tool/Report:
1) School Met AYP? – PA AYP Performance Chart
2) Every Grade Met AYP? – eMetric 3-Year Portrait
3) Every Grade Met Growth Standard? – PVAAS School Value-Added Report
4) Every Grade 3-Year Trend a Change in Proficiency?
5) Every Predicted Performance Level Met Growth Standard? – Performance Diagnostic Report
6) Every Subgroup Met AYP?
7) Achievement Gap Narrower?
8) Every Grade Projected To Meet AYP Target? – Grade Projection Summary
9) School and Subgroups Met Participation Targets? – Data Table
10) Every Subgroup Met Growth Standard?
11) Subgroups Performed Similarly in Reporting Categories?
These questions can be found in Getting Results!™ and reflect the types of questions schools should be addressing as they focus on school-level analysis. Along with each question are the data tool(s) and report(s) that are suggested to address those questions. For the rest of the session, we will walk through this process and these questions, focusing specifically on the questions related to growth highlighted in red. These questions can be addressed by using PVAAS reports. For purposes of time, we will concentrate on Reading in the example that follows.
14
1) Has the school met its AYP targets?
Data tool: PA AYP Performance Chart Use the PA AYP Performance Chart to review if your school has met its AYP targets.
15
2) Has every grade met its AYP targets?
Data tool: eMetric Organization & Analysis of Your PSSA Data - 3-year Portrait Percent Scoring At/Above Proficient PSSA Reading Results for Most Recent Year relative to NCLB/AYP Target Use the eMetric 3-Year Portrait to review if every grade level has met its AYP targets.
16
3) Has every grade met or exceeded a year’s worth of growth?
Data tool: PVAAS School Value-Added Report Use the PVAAS School Value-Added Report to review if every grade level has met or exceeded a year’s worth of growth. The Value-Added reports for grades 4 through 8 and the ones for grade 11 are slightly different. Let’s review quickly on the next several slides (1) what to examine on these reports and (2) how to interpret it for both grades 4 through 8 and for grade 11.
17
PVAAS Value-added Cohort Growth Descriptors for Grades 4-8
Favorable Indicator: Estimated gain at or above growth standard. Students in this cohort have made at least one year's growth. All schools can achieve this rating.
Caution Indicator: Estimated gain below growth standard, but by less than one standard error. Students in this cohort have grown less than the standard.
Stronger Caution: Estimated gain below growth standard by more than one but less than two standard errors. Students in this cohort have fallen behind their peers.
Strongest Warning: Estimated gain well below growth standard, by more than two standard errors. Students in this cohort have made little progress.
The ratings on the PVAAS School Report and on the District Value-Added Report are color coded to assist with quick recognition of the rating. The growth descriptors for grade 4 through 8 reporting are: [CLICK] Green - Estimated gain at or above growth standard. Students in this cohort have made at least one year of growth. All schools can achieve this rating. Yellow - Estimated gain below growth standard but by less than one standard error. Students in this cohort have grown less than the standard. Light Red - Estimated gain below growth standard by more than one but less than two standard errors. Students in this cohort have fallen behind their peers. Red - Estimated gain well below growth standard by more than two standard errors. Students have made little progress. Emphasize that the Value-Added side of PVAAS is about the growth of cohorts of students – not about the growth of individual students.
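The four descriptors above boil down to comparing the estimated gain with the growth standard in units of the gain's standard error. As a quick illustration (a sketch only; the function name and example numbers are ours, not PVAAS output), the banding rule could be expressed as:

```python
def growth_descriptor(gain, std_err, growth_standard=0.0):
    """Classify a cohort's estimated mean gain using the grades 4-8
    descriptors described on this slide. Hypothetical helper for
    illustration; PVAAS reports supply the gain and standard error."""
    shortfall = growth_standard - gain   # how far the gain falls below the standard
    if shortfall <= 0:
        return "Green"   # Favorable: at or above the growth standard
    if shortfall < std_err:
        return "Yellow"  # Caution: below, but by less than one standard error
    if shortfall < 2 * std_err:
        return "Rose"    # Stronger Caution: below by one to two standard errors
    return "Red"         # Strongest Warning: below by more than two standard errors

print(growth_descriptor(1.2, 0.8))   # Green
print(growth_descriptor(-1.1, 0.8))  # Rose
```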
18
PVAAS School Value-Added Report, Grades 4-8
Every grade in the school has met or exceeded a year’s worth of growth? (“Getting Results!”™, p. 12) This is an example of a Value-Added Report that is available at both the district and school level. The top circle indicates the gain in 2008 for each grade level 4 through 8. In this example, grades 4 and 6 received a ‘green’ rating indicating that students in those grades made at least one year of growth. These students as a cohort met or exceeded the growth standard. However, there is slight evidence that students in grades 5 and 8 grew less than the growth standard, as indicated by the yellow color. The rose indicator in grade 7 tells us that there is stronger evidence that this cohort of 7th grade students fell behind their peers during this past school year. [CLICK] In 4th and 6th grade, there is now sufficient history to calculate the 3-year Mean gain for the respective grade. These values are useful when looking for patterns of growth within the same grades. Users should always keep in mind that when you are considering one grade’s values over several years, the student cohorts changed each year. However, if you begin to see patterns like those displayed in 4th grade, you may want to investigate, as three different groups of 4th graders met the Growth Standard with high gain values. The pattern over the 3 years for 6th grade indicates consistent growth for all three cohorts, but somewhat less dramatically than the cohorts in 4th grade.
19
PVAAS Value-added Cohort Growth Descriptors for Grade 11
Below: Observed performance is significantly BELOW the expected performance. NDD: Observed performance is NOT DETECTABLY DIFFERENT from the expected performance. Above: Observed performance is significantly ABOVE the expected performance. PVAAS growth indicators for cohorts of high school students are slightly different from those for grades 4 through 8. The high school growth indicators are based on a comparison of the predicted, or expected, performance and the observed performance on the actual PSSA exam. The three descriptors indicate the outcome of the comparison: Red -- Observed performance is BELOW expected performance. Green -- Observed performance is ABOVE expected performance. Yellow -- Observed performance is NOT DETECTABLY DIFFERENT from expected performance. NOT DETECTABLY DIFFERENT is best considered in the context of a criminal jury trial. A jury can return only two verdicts: Guilty – the evidence is sufficient to remove reasonable doubt; Not Guilty – the evidence is not sufficient to be sure of guilt. Not Detectably Different is like a Not Guilty verdict – we do not have sufficient evidence to conclude that the observed performance is above or below expectations.
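Under the hood, the grade 11 indicator is a comparison of observed and expected mean performance relative to sampling error. A rough sketch follows; the two-standard-error significance band and the function name are our illustrative assumptions, not documented PVAAS rules:

```python
def hs_growth_indicator(observed_mean, expected_mean, std_err):
    """Grade 11 style comparison: observed vs. expected (predicted) mean
    PSSA performance. 'NDD' = Not Detectably Different. The two-standard-
    error band used here is an illustrative assumption."""
    diff = observed_mean - expected_mean
    if diff > 2 * std_err:
        return "Above"   # green: significantly above expected performance
    if diff < -2 * std_err:
        return "Below"   # red: significantly below expected performance
    return "NDD"         # yellow: not detectably different from expected
```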
20
PVAAS School Value-Added Report, Grade 11
This is a copy of the PVAAS high school report for 2008. [CLICK] Key concepts from this report include the mean student score and mean predicted score, which are used to determine the school’s ranking as compared to the state average. This is a profile for a school that indicates that the achievement score is well above average levels (74th percentile) AND the school effect indicates above-expected progress.
21
Check for Understanding: Value-Added Report
What do the four colors (green, yellow, rose/pink, and red) mean for grades 4-8? How is that different from the three colors (green, yellow, red) for grade 11? Why does the State Average NCE have a value of 50 every year on the grades 4-8 reports? What does the Growth Standard mean for the grades 4-8 reports? Why does it have a value of 0 every year? What is the mean predicted score on the grade 11 report? Should a school dig deeper only when a gain is NOT green? Is it appropriate to look at growth for the same grade level across the last several years (looking vertically at growth across years in a particular grade level)? This slide should be used when providing professional development to trainers. If time allows, this can also be used with district trainings. Take a few minutes for participants to think, pair and share on how they would address answering these questions.
22
Hands-On with Value-added Reports
Review by Grade Level Review by Subject Discuss Meaning as It Relates to AYP Targets Discuss as It Relates to Other Building/District Goals Discuss When These Would Be Used by Grade Level Teams Provide time for hands-on with the grade-level summary reports.
23
4) Is the trend over the past 3 years for each grade an increase in percent Proficient or Advanced?
Data tool: eMetric Organization & Analysis of Your PSSA Data – 3-year Portrait Percent Scoring At/Above Proficient Use the eMetric 3-Year Portrait to review your school’s trend at each grade level in increasing the percent of students at the proficient/advanced levels.
24
Data tool: PVAAS School Performance Diagnostic Report
5) Has every performance level in each grade met or exceeded a year’s worth of growth? Data tool: PVAAS School Performance Diagnostic Report Use the PVAAS Performance Diagnostic Report to examine if every predicted performance level has met or exceeded a year’s worth of growth. The grade 11 performance diagnostic report looks similar to the grades 4-8 version and is interpreted in the same way. Let’s review on the next three slides the keys to interpreting this report.
25
PVAAS Performance Diagnostic Report The Key is the RED Whisker!
The District and School Value-added Reports provide the estimated mean gains of cohorts of students. Means tend to hide extreme values in any data set, so it is prudent to disaggregate the growth data. PVAAS provides the only opportunity to disaggregate data based on predicted performance levels! This report details the mean gain of students predicted to be Below Basic, predicted to be Basic, predicted to be Proficient, and predicted to be Advanced. As will be displayed on the following screen, the red whisker is the key!
26
What the whiskers tell us…
Exceeded Growth Standard; More than One Year’s Growth (Green). Met the Growth Standard; Made One Year’s Growth (Yellow). Growth Standard (One Year’s Growth). [CLICK] If the red whisker contains the green line (the Growth Standard line at 0, or a year’s worth of growth), then we can conclude that the cohort has met the Growth Standard and, therefore, that the students have experienced one year’s worth of growth. If the red whisker is entirely below the green line, then we have sufficient evidence that the cohort in question did not meet the Growth Standard and, therefore, its position in the statewide distribution is lower than it was last year. If the red whisker is entirely above the green line, then we have sufficient evidence that the cohort in question exceeded the Growth Standard and, therefore, its position in the statewide distribution is higher than it was last year. Below Growth Standard; Less than One Year’s Growth (Rose).
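In code form, the whisker rule amounts to checking whether the interval from the gain minus one standard error to the gain plus one standard error lies above, below, or across the Growth Standard line at 0. A sketch only; the function name is ours:

```python
def whisker_reading(gain, std_err):
    """Interpret the red whisker (estimated gain +/- one standard error)
    against the Growth Standard line at 0, as described on this slide.
    Illustrative helper, not PVAAS code."""
    low, high = gain - std_err, gain + std_err
    if low > 0:
        return "Exceeded Growth Standard"   # whisker entirely above the green line
    if high < 0:
        return "Below Growth Standard"      # whisker entirely below the green line
    return "Met Growth Standard"            # whisker contains the green line
```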
27
Performance Diagnostic Report
Has every performance level (Below Basic, Basic, Proficient, Advanced) in each grade met or exceeded a year’s worth of growth? (“Getting Results!”™, p. 12) In the school Performance Diagnostic report, students’ mean performances are disaggregated by predicted proficiency groupings based on students’ prior histories. Since this part of the report is to be used only for diagnostic purposes, one standard error is used in interpreting the significance of results. This report addresses the displayed Getting Results question directly. PVAAS data can also be disaggregated into quintile groupings in which students are grouped relative to performances of all students in the state for that subject and grade level. The report provides a visual representation of the school’s Performance Diagnostic report. It shows the most recent information as well as the previous cohort’s information. Again, this report does not tell us WHY progress is being made or not made. It tells us about the amount of progress made for different groups of students. The following slide aids in the interpretation of the standard error bars in the display. [CLICK]
28
Check for Understanding: Performance Diagnostic Report
What do the Gain and Standard Error values mean? What do the colors mean on the pie chart version of this report? What does the red whisker, or red line, mean? How does it help in the interpretation of this report? What do the blue bars represent versus the gold bars? This slide should be used when providing professional development to trainers. If time allows, this can also be used with district trainings. Take a few minutes for participants to think, pair and share on how they would address answering these questions.
29
Hands-On with Performance Diagnostic Reports
Review by Grade Level Review by Subject Be Sure You Know How to Locate Subgroup Reports as well Discuss Patterns and Meaning of Patterns
30
6) Have subgroups met or exceeded the NCLB target of Proficient or Advanced?
Suggested data tool: eMetric Organization & Analysis of Your PSSA Data - 3-year Portrait Percent Scoring At/Above Proficient PSSA Reading Results for Most Recent Year relative to NCLB/AYP Target Use the eMetric 3-Year Portrait to examine if all subgroups have met or exceeded AYP targets.
31
Percent Scoring At/Above Proficient
7) Has the achievement gap between the entire student group and relevant subgroups become narrower this year? Data tool: eMetric Organization & Analysis of Your PSSA Data - 3-year Portrait Percent Scoring At/Above Proficient Use the eMetric 3-Year Portrait to examine if the achievement gap between all students and the relevant subgroups has become narrower this past year.
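Checking whether a gap narrowed is simple arithmetic on the percent proficient/advanced values from the 3-Year Portrait: subtract the subgroup's percent from the all-students percent for each year and compare. A minimal sketch, with made-up example numbers:

```python
def gap_narrowed(all_pct, subgroup_pct, prior_all_pct, prior_subgroup_pct):
    """True if the achievement gap (all-students minus subgroup percent
    proficient/advanced) shrank from last year to this year.
    Illustrative only; plug in values from the eMetric 3-Year Portrait."""
    this_gap = all_pct - subgroup_pct
    last_gap = prior_all_pct - prior_subgroup_pct
    return this_gap < last_gap

# Gap shrank from 18 points (70 - 52) to 14 points (72 - 58)
print(gap_narrowed(72.0, 58.0, 70.0, 52.0))  # True
```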
32
Data Tool: PVAAS Grade Projection Summary Report
8) Is each grade level on a trajectory to reach the AYP goal at the end of this school year? Data Tool: PVAAS Grade Projection Summary Report Use the PVAAS Projection Summaries to examine if each grade level is on a trajectory, or path, to reach AYP at the end of this school year. Let’s take a moment to review what the projection summary reports tell us.
33
Projection Summary Reports
What is it? Report that summarizes the numbers and percents of students in likelihood ranges of performing at the proficient level on a future PSSA exam. How might a school use this report? Intervention Planning Resource Allocation Strategic Planning School Improvement Planning Cautions This report provides another indicator about likelihood of future performance. PVAAS now includes Projection Summary Reports – a report that summarizes the numbers and percents of students in likelihood ranges of performing at the proficient level on a future PSSA exam. Remind audience of these uses when they see the reports on the following slides. Always remind the audiences that any tool is only ANOTHER INDICATOR… no silver bullets.
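The tallying behind such a summary can be sketched as follows. The three likelihood bands match the 0-40%, 40-70%, and 70-100% ranges this session discusses; the boundary handling (40 and 70 fall in the higher band) and the function name are our assumptions:

```python
from collections import Counter

def summarize_projections(probabilities):
    """Tally students into three likelihood bands for scoring Proficient
    on a future PSSA, given each student's projection probability (0-100).
    Illustrative only; PVAAS produces this summary for you."""
    def band(p):
        if p < 40:
            return "0-40%"
        if p < 70:
            return "40-70%"
        return "70-100%"
    return Counter(band(p) for p in probabilities)

print(summarize_projections([12, 45, 83, 70, 39]))
```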
34
Check for Understanding: PVAAS Projections
Can you get projections for just one year into the future? Or can you get projections for other years? How should a school interpret the three ranges of probabilities (i.e., 0-40%, 40-70%, 70-100%)? How reliable are the projections? This slide should be used when providing professional development to trainers. If time allows, this can also be used with district trainings. Take a few minutes for participants to think, pair and share on how they would address answering these questions.
35
Hands-On with Grade Projection Summaries
Review by Grade Level Review by Subject Discuss Meaning as It Relates to AYP Targets Discuss as It Relates to Other Building/District Goals Discuss When These Would Be Used by Grade Level Teams Provide time for hands-on with the grade-level summary reports.
36
9) Has the school met its AYP participation targets for all relevant subgroups?
Data tool: PA AYP Data Table Use the PA AYP Data Table to examine if all groups of students have met the AYP target(s) for participation.
37
Data tool: PVAAS School Performance Diagnostic Report (by subgroup)
10) Has every subgroup in each grade met or exceeded a year’s worth of growth? Data tool: PVAAS School Performance Diagnostic Report (by subgroup) Use the PVAAS Performance Diagnostic Report (selecting appropriate subgroups) to examine if students within every subgroup are making one year’s worth of growth or more. These reports are interpreted the same way as the Performance Diagnostic Report for the entire cohort of students at that grade level.
38
Data tool: eMetric Group Summary – Reporting Categories
11) Have subgroups performed similarly to all the student groups in each reporting category? Data tool: eMetric Group Summary – Reporting Categories Use the eMetric Reporting Categories Group Summary report to examine if all subgroups of students are performing similarly to all students at the grade level in each reporting category.
39
Summarize the patterns and trends….
Summarize the patterns and trends in your data. What insights have you gained about the achievement and growth of students? Let’s think back to all of the data you analyzed at the school-level. Remember the purpose of analysis at this level was to: Plan for improvements in student achievement and to meet AYP targets. Set annual goals for the entire school and groups of students. Ensure there is alignment between curriculum, instruction, assessments, and interventions/supports (i.e., are they addressing the needs of all students?) Monitor the implementation and effectiveness of the overall school plan. What were the overall patterns or trends you discovered in your data? What insights have you gained about the achievement and growth of students? We will now walk through several activities together that will help you summarize the patterns and trends in achievement and growth.
40
Activity #1 Plot each grade, considering achievement (eMetric) and growth (PVAAS). Into what quadrant does each grade fall? Where are the strengths? Where are the areas for improvement? What questions does this raise about: Curriculum Instruction Interventions Professional Development Resources This activity focuses on both achievement and growth. It will help you determine not only where your students at each grade level are in relation to the AYP targets (where they are now) but also how much progress they are making (where they are headed). Trainers should walk through each question with school teams, having them use a blank scatter plot to place each grade level according to achievement and growth.
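The quadrant exercise above can be sketched in code. This is a hypothetical illustration only: the AYP target, grade-level percentages, and growth measures below are invented for the example and are not real PVAAS or eMetric output. The sketch assumes the common PVAAS convention that a growth measure of 0 represents the growth standard (one year's growth).

```python
# Hypothetical sketch of Activity #1: classify each grade into one of four
# quadrants by achievement (percent proficient/advanced, from eMetric) and
# growth (PVAAS growth measure vs. the growth standard). Illustrative data.

AYP_TARGET = 56.0  # assumed AYP proficiency target, in percent (illustrative)

grades = {
    # grade: (percent proficient or advanced, PVAAS growth measure)
    "4": (72.0, 1.8),
    "5": (61.0, -2.1),
    "6": (48.0, 3.0),
    "7": (59.1, 2.4),
    "8": (51.0, -0.9),
}

def quadrant(achievement, growth):
    """Name the quadrant for one grade: achievement vs. target, growth vs. standard."""
    high_ach = achievement >= AYP_TARGET
    high_growth = growth >= 0  # 0 = one year's growth (the growth standard)
    if high_ach and high_growth:
        return "higher achievement / at-or-above a year's growth"
    if high_ach:
        return "higher achievement / below a year's growth"
    if high_growth:
        return "lower achievement / at-or-above a year's growth"
    return "lower achievement / below a year's growth"

for grade, (ach, gro) in grades.items():
    print(f"Grade {grade}: {quadrant(ach, gro)}")
```

A grade in the "lower achievement / at-or-above a year's growth" quadrant is headed in the right direction even though it has not yet reached the target, which mirrors the 7th-grade example discussed on the next slide.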
41
Activity #1 Using the eMetric 3-Year Portrait, what percent of 7th graders were proficient or advanced? You should see 59.1%. Looking at the PVAAS School Value-Added Report, did students meet or exceed one year's growth? What growth descriptor is indicated? In this school, we can see that although only 59% of 7th grade students are proficient/advanced and the grade did not meet AYP (without provisions), we have significant evidence that they made more than a year's growth.
42
Activity #2 Chart, for each grade, the predicted performance levels with more than one year's growth (star) and those with less than one year's growth (circle). Which groups are showing positive growth? Which groups are losing ground? Is this consistent across grade levels? What questions does this raise about: Curriculum Instruction Interventions Professional Development Resources This activity again focuses on both achievement and growth. However, in this activity schools will be looking at PVAAS performance diagnostic reports to help them determine which groups of students (grouped by predicted achievement or performance level) are making more or less than one year's growth. (Are higher performing students losing ground compared to our lowest achieving students who are making more than a year's growth?) Trainers should walk through each question with school teams, having them use a blank diagnostic report to plot, for each grade level, the predicted performance groups that are making more or less than one year's growth.
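The star/circle charting described above can also be sketched programmatically. Again, this is a hypothetical illustration: the growth measures below are invented (they are chosen to mirror the 7th/8th-grade pattern discussed on the next slide), and 0 is assumed to represent the growth standard.

```python
# Hypothetical sketch of Activity #2: for each grade, mark each predicted
# performance level with a star (*) if it exceeded the growth standard
# (more than one year's growth) or a circle (o) if it fell short.
# All growth measures below are illustrative, not real PVAAS data.

diagnostic = {
    # grade: {predicted performance level: growth measure}
    "7th": {"Below Basic": 2.1, "Basic": 1.5, "Proficient": 0.8, "Advanced": 0.4},
    "8th": {"Below Basic": 1.9, "Basic": 1.2, "Proficient": 0.6, "Advanced": -1.3},
}

def mark(growth_measure):
    # star = exceeded the growth standard; circle = did not exceed it
    return "*" if growth_measure > 0 else "o"

for grade, levels in diagnostic.items():
    row = "  ".join(f"{level}: {mark(g)}" for level, g in levels.items())
    print(f"{grade}  {row}")
```

With these illustrative numbers, all four 7th-grade groups earn a star, while the 8th-grade Advanced group earns a circle — the same "higher-performing students losing ground" pattern the activity asks teams to look for.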
43
Growth Standard (One Year’s Growth)
Activity #2 Looking at the PVAAS Performance Diagnostic Reports, which students met or exceeded one year's growth? Which students made less than one year's growth? In 7th grade, we can see that all four groups (those predicted to be Below Basic, Basic, Proficient, and Advanced) exceeded the growth standard; they made more than one year's growth. In 8th grade, we see that three groups (those predicted to be Below Basic, Basic and Proficient) are making more than one year's growth. However, students predicted to be Advanced are falling farther behind and not making one year's growth.
44
Activity #3 Chart, for each grade and subgroup, the predicted performance levels with more than one year's growth (star) and those with less than one year's growth (circle). Which groups are showing positive growth? Which groups are losing ground? Is this consistent across grade levels? How does the growth of the subgroups compare to that of all students? Trainers should again walk through each question with school teams, having them use a blank diagnostic report to plot, for each subgroup AND grade level, the predicted performance groups that are making more or less than one year's growth. NOTE: This activity can be skipped if time is limited. Indicate to the participants, however, that this step is important for them to do in order to determine if patterns and trends are different for their subgroups than for all students. Using another blank performance diagnostic grid in front of you, chart for each grade and subgroup the predicted performance levels with more than one year's growth (using a star) and those with less than one year's growth (using a circle). Which groups are showing positive growth? Which groups are losing ground? Is this consistent across grade levels? How does the growth of the subgroups compare to that of all students?
45
Growth Standard (One Year’s Growth)
Activity #3 Looking at the PVAAS Performance Diagnostic Reports for subgroups, which students met or exceeded one year's growth? Which students made less than one year's growth? In the 7th grade IEP subgroup, we can see that those predicted to be Basic made more than one year's growth. This is consistent with the growth occurring for all students. In the 8th grade IEP subgroup, we see that students predicted to be Below Basic are making more than one year's growth. This is particularly good news: these students may not have reached proficiency yet, but they are making more than one year's worth of growth. This is the kind of pattern that will need to continue to help close the achievement gap for these students. This school may want to take a closer look at the IEP subgroup predicted to be Basic; these students are meeting the growth standard. However, if these students are to move toward proficiency, we need to accelerate their growth. What can this school learn from its IEP subgroup of students who were predicted to be Below Basic and made more than a year's worth of growth?
46
Discussion Is the school projected to meet AYP?
What is the percent of students with a high likelihood of reaching proficiency? Is each grade projected to meet AYP? At each grade level, what is the percent of students with a high likelihood of reaching proficiency? How does this compare to the data you have just reviewed that "looks back"? Are students moving towards proficiency? Let's take a look at your school's data. Is the school projected to meet AYP? What is the percent of students highly likely to reach proficiency? Is each grade projected to meet AYP? At each grade level, what is the percent of students highly likely to reach proficiency? How does this compare to the data you have just reviewed that "looks back"? Are students moving towards proficiency?
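The "looking ahead" calculation in this discussion can be sketched as follows. This is a hypothetical illustration: the projection probabilities and the 70% "high likelihood" cutoff are invented for the example and are not taken from PVAAS reporting, which presents projections through its own web reports.

```python
# Hypothetical sketch of the projection discussion: given each student's
# projected probability of scoring proficient on a future PSSA, compute the
# percent of students with a high likelihood of reaching proficiency.
# Probabilities and the 0.70 threshold are illustrative assumptions.

projected_prob = [0.91, 0.84, 0.40, 0.72, 0.55, 0.95, 0.30, 0.78]
HIGH_LIKELIHOOD = 0.70  # assumed cutoff for "high likelihood" (illustrative)

high = sum(1 for p in projected_prob if p >= HIGH_LIKELIHOOD)
pct = 100.0 * high / len(projected_prob)
print(f"{pct:.1f}% of students have a high likelihood of reaching proficiency")
```

Comparing this forward-looking percent against the "looking back" achievement and growth data is what lets a team judge whether students are actually moving toward proficiency.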
47
Discussion What insights have you gained about the achievement and growth of students? What questions are you left with concerning: Curriculum Instruction Interventions Professional Development Resources What insights have you gained about the achievement and growth of students? What questions do you still have concerning: Curriculum Instruction Interventions Professional Development Resources
48
Next Steps 3-Phase Data-Informed Inquiry Cycle
Data Discovery: What data do we have regarding achievement, growth and positive results for students? Analysis: What do the data tell us about the areas of strength and areas of concern, and why do the data look that way? What are the "root causes"? Solutions: What are we going to do about it all? Which evidence-based strategies must we consider in our improvement plan? (ex. What Works Clearinghouse; ex. SAS-Math) The next steps in this process involve further discovery, root cause analysis and solutions/goal setting, highlighted in red on this slide.
49
Continuous Improvement Process (Year 1 and Year 2)
Phase 1 Organize and Review the Data. Phase 2 Analyze Data and Discover Root Cause. Phase 3 Plan Solution. Phase 4 Implement the Plan. Phase 5 Analyze Evidence of Effectiveness. Phase 6 Revise the Plan. Phase 7 Implement the Revision. For those schools involved in the Getting Results! formal school improvement process, this is indicated as Phase 3 in the continuous improvement process.
50
Additional Professional Development
Each IU has a PVAAS contact. Fall 2008 Professional Development: 21 statewide webinars for districts (dates on the PDE website); 42+ days of hands-on professional development offered at IUs (contact your IU for registration); follow-up support; and professional development materials on the PDE website. For continuous support and professional development opportunities, various webinars and mini-modules have been developed and archived on the PDE website. These include: The Use and Interpretation of PVAAS Reporting (in grades 4 through 8; in grades 9 through 11; for ESL students; for Special Education students; for Title I students); Understanding the Growth Standard Methodology; Understanding the Projection Methodology; The Use and Interpretation of Scatter Plots; and The Use and Interpretation of the Projection Summary Tool.
51
Statewide PVAAS Professional Development, Support & Materials
Contact Information If you have any questions about statewide PVAAS support and materials, feel free to contact the statewide core team using this email address or phone number. To access reports, please use the PVAAS Website URL.
52
Local PVAAS Professional Development, Support & Technical Assistance
Contact Information [Name] [Email] [Phone] If you have any specific needs for technical assistance or support, please contact your local Intermediate Unit at………
53
PVAAS is provided by the Department of Education as a data tool for continuous school improvement. Gerald L. Zahorchak, D.Ed., Secretary of Education, Commonwealth of Pennsylvania