1
PVAAS Overview: Special Education. A Session for District Special Education Administrators
Kristen Lewald, Ed.D., Lancaster-Lebanon IU 13, PVAAS Statewide Project Director for PDE. PVAAS Overview: Special Education. A Session for District Special Education Administrators. Welcome to PVAAS 2010! This webinar provides an overview of PVAAS reporting for Grades 4 through 8 Mathematics and Reading. The intended audience is district special education administrators. More information on any PVAAS topic is available on the PDE website or by contacting the PVAAS Core Team; contact information is provided at the end of this presentation.
2
Today’s Session 4 Key PVAAS Reports for Special Education
Performance Diagnostic Reports (90 min)
BREAK (10 min)
Custom Diagnostic Reports (40 min)
Student Projections (40 min): Student Projection Reports and PVAAS Projection Summary Reports
3
PVAAS PERFORMANCE DIAGNOSTIC REPORT
PVAAS and Special Education PVAAS PERFORMANCE DIAGNOSTIC REPORT
4
Regional Sharing
Which districts/LEAs have an indicator in PVAAS reporting that students with IEPs exceeded the standard for PA Academic Growth?
A regional forum and a collective document to share this information, as decided by local districts.
The IU secures permission from LEA administrators for regional sharing and a plan/agreed-upon rules for regional sharing.
5
How students grow depends on what schools do with them!
Review of Literature. Is there a relationship between demographics and achievement? YES. Between demographics and growth? NO. How students grow depends on what schools do with them! Since the 1960s, it has been established that there is a relationship between student demographics and achievement results. This is NOT the case when considering growth. A review of the literature indicates there is NO relationship between growth/progress and demographic factors, such as socioeconomic status and racial/ethnic background. Optional citations: the following are additional literature reviews on these relationships if questions should arise or to strengthen the point being made with the slide. A single measure of student achievement has inherent limitations because achievement is correlated to a student's socioeconomic status and past performance (Hershberg, et al.; Olson, 2007; Sanders, 2000). Fallon (2003) reports that the importance of value-added assessment lies in a design that removes virtually all influence of genetics and socioeconomic factors, providing a measure of the direct effect of school effectiveness. One of the advantages of value-added assessment systems is that they calculate the growth of an individual not relative to some generic cohort average but against that student's own achievement levels in the previous year (Callendar, 2004; Drury & Doran, 2003; Hershberg, 2005; Hershberg, Lea-Kruger, & Simon, 2004). [CLICK] [Value-added assessment systems] can remove the effects of factors not under the control of the school, such as prior performance and socioeconomic status, and thereby provide a more accurate indicator of school or teacher influence than is possible when these factors are not controlled (McCaffrey, Lockwood, Koretz & Hamilton, 2003; Ross, Wang, Sanders, Wright & Stringfield, 1999a; Wright, Horn & Sanders, 1997).
6
TWO Types of PVAAS Information
TWO Types of PVAAS Information. Looking Forward/Planning: PVAAS Projection Reports, for individual students and cohorts of students. Looking Back/Evaluation: Value-added growth reports, for groups of students. The two PVAAS methodologies look at two different issues. The Value-added, or growth, information "looks back": it helps schools evaluate the effectiveness of the school. How much growth did students make in the past school year? The Projection information "looks forward": it helps schools plan for the future. Are students on a path to proficiency or higher? Both serve different purposes, and both are equally important for continuous school improvement. Again, it is important to be clear: PVAAS does not provide a growth estimate for individual students. We will share why in the next section. [CLICK]
7
Building and Grades Question for Educators:
Building and Grades. Here is another example of a Value-added Report. As you can see, grades 4 and 6 demonstrated positive growth (green); grades 5 and 8 have growth values that provide a slight warning indicator; and 7th grade's growth value indicates a more serious warning. In this report, 4th and 6th grades have indicated positive growth over the past three years, while 5th and 7th have indicated negative values over the past two years. The growth history in 8th grade is mixed. These data tend to suggest that there are both strengths and weaknesses that need to be investigated further in this school. We have included a possible question that an administrator might want to ask of staff when reviewing this report. We will do so on all other reports in this presentation. [CLICK] Question for Educators: Are all grade levels showing a year's worth of growth?
8
Going Deeper Performance Diagnostic Report
Going Deeper: Performance Diagnostic Report
Blue Bar: Current Cohort
Gold Bar: Previous Cohorts
Missing Bar: Insufficient Numbers
Whisker: Margin of Error on Growth Value
This is an example of a grade-level Performance Diagnostic Report. This report can be generated at a district or school level. The purpose of this report is to provide insight into the growth of students who are predicted to perform at the Below Basic, Basic, Proficient, or Advanced levels on the current year's exam in this subject. This prediction is based on the students' previous performances. The report consists of two sections, a graph and a table. On the graph, the blue bars represent the growth of each predicted PSSA performance subgroup over the last tested year. The gold bars represent the growth of the three prior years' subgroups (not including the current year) in that predicted PSSA performance category. You will notice that we are missing a blue bar for students predicted to be Basic. This is because PVAAS will provide an estimate only for a subgroup of 5 or more students; otherwise the error becomes too large for the estimate to be meaningful. The error is represented by the red whisker (or I-bar) at the end of each bar. This margin-of-error whisker indicates that the true value of growth is likely to be within the whisker. The whisker plays a critical part in the interpretation of the growth, as indicated in the next slide. [CLICK] Question for Educators: Are all types of learners showing a year's worth of growth?
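To make the reporting rule above concrete, here is a minimal sketch, not the actual PVAAS computation, of reporting a mean gain for each predicted PSSA performance group while suppressing any group with fewer than 5 students (the missing-bar case). The function name and data layout are illustrative assumptions.

```python
# Minimal sketch only; not the actual PVAAS computation. It reports a mean
# gain per predicted PSSA performance group and suppresses (returns None for)
# any group with fewer than 5 students, mirroring the "missing bar" rule.
from statistics import mean

def diagnostic_bars(students):
    """students: list of (predicted_level, gain) pairs."""
    levels = ["Below Basic", "Basic", "Proficient", "Advanced"]
    bars = {}
    for level in levels:
        gains = [gain for lvl, gain in students if lvl == level]
        # With fewer than 5 students the error is too large to be meaningful.
        bars[level] = round(mean(gains), 1) if len(gains) >= 5 else None
    return bars
```

For example, a group of only three students predicted to be Basic would produce no bar at all on the graph.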
9
Links to students in selected predicted PSSA performance category
Links to students in selected predicted PSSA performance category. Link to all students reported for this subject. There are a number of hyperlinks, indicated in blue and underlined, on this report. We will address a few key links here. To access a list of the students predicted to be in each PSSA performance level, users should click on the hyperlink (or "hot key") for the particular subgroup to see all students in that predicted PSSA performance level. This will produce a list of students, their state NCE scores, and actual PSSA performance levels. (Note: This must be done from a school-level report. You cannot access student-level data from a district-level report. Additionally, you must have permission to access student-level data.) If you click on the hyperlink (or "hot key") for the subject area (in this example, Math), you will receive a list of ALL students (regardless of predicted PSSA performance level) included in the report and their state NCE scores and actual PSSA performance levels. [CLICK]
10
Student Lists from Performance Diagnostic Reports
Student Lists from Performance Diagnostic Reports. Current PSSA performance category for each student. This slide displays the student list in the selected categories from the Performance Diagnostic Report with their individual 2008 state NCE scores and their observed 2008 PSSA performance levels. All data in this report are sortable by clicking on the column title of the field for which the sort is desired. In addition, by clicking on an individual student name, you will be directed to the Individual Student Report, which we will discuss shortly. [CLICK]
11
How to interpret? Question for Educators:
How to interpret? This is the same grade-level Performance Diagnostic Report described on the previous slides; the key to interpreting it is where each whisker falls relative to the green zero line, as shown on the next slide. Question for Educators: Are all types of learners showing a year's worth of growth?
12
PVAAS Performance Diagnostic Growth Descriptors - Interpretation
What the whiskers tell us:
Blue: Exceeded the standard for PA Academic Growth
Green: Met the standard for PA Academic Growth
Pink: Did not meet the standard for PA Academic Growth
The green zero (0) line represents the standard for PA Academic Growth. The position of the whisker relative to the green zero (0) reference line is the key to the interpretation of the Performance Diagnostic Report. On this slide, we display the interpretation of the whiskers for this report. The key is where the red whiskers, or I-bars, are located relative to the green line. If the whisker or I-bar intersects the green line at all, we conclude that the group of students has met the standard for PA Academic Growth. If the whisker or I-bar is completely below the green line, we conclude that the group of students did not meet the standard for PA Academic Growth. If the whisker or I-bar is completely above the green line, we conclude that the group of students exceeded the standard for PA Academic Growth. So for grades 4-8 in Reading and Math, the whiskers tell us whether the group made more than a year of growth, a year of growth, or less than a year of growth. For Science, Writing, and grades 9-11 Reading and Math, the whiskers tell us whether the group performed better than expected, as expected, or less than expected.
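As a rough illustration of the whisker rule just described, the sketch below classifies a group's growth estimate by comparing the whisker (the estimate plus or minus its margin of error) to the zero reference line. This is an illustration of the interpretation rule, not the PVAAS software; the function name and the example numbers are made up.

```python
# Illustrative sketch of the whisker rule described above (not PVAAS code):
# compare the whisker (estimate +/- margin of error) to the zero reference line.

def interpret_growth(estimate, margin_of_error):
    lower = estimate - margin_of_error
    upper = estimate + margin_of_error
    if lower > 0:
        # Whisker completely above the green line.
        return "Exceeded the standard for PA Academic Growth"
    if upper < 0:
        # Whisker completely below the green line.
        return "Did not meet the standard for PA Academic Growth"
    # Whisker crosses the green line.
    return "Met the standard for PA Academic Growth"

# Hypothetical example: an estimate of 1.8 with a margin of error of 2.1
# crosses zero, so the group met the standard.
print(interpret_growth(1.8, 2.1))
```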
13
Quick Interpretation: Flip Chart
14
Your Turn! Performance Diagnostic Report
Your Turn! Performance Diagnostic Report. This is another grade-level Performance Diagnostic Report; use the position of each whisker relative to the green zero line to interpret the growth of each predicted performance group. Question for Educators: Are all types of learners showing a year's worth of growth?
15
Your Turn! Performance Diagnostic Report
Your Turn! Performance Diagnostic Report. This example shows only the graph. Notice that the blue bar is missing for students predicted to be Below Basic, because PVAAS provides an estimate only for subgroups of 5 or more students. Question for Educators: Did all types of learners perform as expected?
16
Patterns of Growth
The POWER of the PVAAS Performance Diagnostic Reports is the PATTERNS that they can display about your school/district. These patterns serve as an indicator of where you are headed in the right direction in terms of growing students and where you may have areas for improvement. The meaning of the pattern of growth should be viewed through the context of the achievement goals of your school. What are your school's goals for achievement? And, as a result, where do you need to make more than a year's worth of academic growth or make at least a year's worth of academic growth? This slide displays some common patterns that we see in Performance Diagnostic reports:
A: Tent Pattern. This pattern occurs when school learning experiences benefit students in the middle of the population more than their lower-achieving or higher-achieving peers. This trend can be reversed by attending to students at the extremes of the grade level's population. [CLICK]
B: Uniform Pattern. This pattern occurs when school learning experiences benefit all students in approximately the same way. In this example, all students in all predicted PSSA performance levels demonstrated positive and significant growth. A uniform pattern with all bars below the green zero (0) reference line would obviously indicate that none of the learners demonstrated positive growth.
C: Downward Shed Pattern. This pattern occurs when school learning experiences benefit lower-achieving students more than their higher-achieving peers. For many reasons, the system is not appropriately addressing the needs of all students, but rather has a focus on the lower-achieving students.
17
Performance Diagnostic Subgroup Reports
Performance Diagnostic Subgroup Reports. PVAAS also provides Performance Diagnostic Reports for PSSA demographic subgroups. This is a display of the new subgroup selection box as of fall 2010. The categories on this screen reflect the categories on the statewide PSSA student-level file. [CLICK]
18
Compare Subgroup to Entire Grade Level
Compare Subgroup to Entire Grade Level: same school, same grade, same subject. The real power of the Performance Diagnostic for subgroups becomes clear when you compare the growth of the whole grade level to the growth of the subgroup. You should be aware that the report on the whole group DOES also include the students in the subgroup. As you can see in this report, the growth pattern of the Special Education subgroup is similar to that of the entire grade except in the case of students predicted to be Advanced. This result might suggest that some further investigation is warranted. [CLICK] Question for Educators: How do our subgroups compare to the entire grade level?
19
Today's Session: > 1 Year
Create a Collective Document
Groups that exceeded (>) the standard for PA Academic Growth
Whole grade, grades 4-8 and 11, Reading and Math only
Special Education subgroup
20
Your Materials: District Level Packet
Today: district level (you could do this by school level if there are large enough numbers of students)
Math and Reading, grades 4-8 and 11
Whole grade and Special Education reports
24 reports in all
21
Your Materials Grade Level and Subgroup Reports Highlighted on Packet
ONLY those groups that made more than (>) the standard for PA Academic Growth
We are looking at PREVIOUS years only (the GOLD bars on the report)
You may want to look at previous groups on your own
22
Your Materials 24 Blank Performance Diagnostic Charts Plan:
District Code:
Indicate on EACH chart IF the group exceeded (>) the standard for PA Academic Growth: Below Basic, Basic, Proficient, Advanced
Find Your Starting Point!
23
Your Role: > standard for PA Academic Growth
Your Role: > standard for PA Academic Growth (One Year's Growth). Below Basic / Basic / Proficient / Advanced. The interpretation of the position of the whisker relative to the green zero (0) line is slightly different depending on the methodology (grades/subjects) considered. On this slide, we display the interpretation of the whiskers for the Growth Standard methodology reports. The green line represents a gain value of 0, indicating that the cohort has experienced one year's growth for the academic year just completed. The key is where the red whiskers or I-bars are located relative to the green line. If the whisker or I-bar intersects the green line at all, we conclude that the group of students has met the Growth Standard and has experienced one year's growth. If the whisker or I-bar is completely below the green line, we conclude that the group of students did not meet the Growth Standard and therefore experienced less than one year's growth. If the whisker or I-bar is completely above the green line, we conclude that the group of students exceeded the Growth Standard and experienced more than one year's growth. [CLICK]
24
After Today’s Session We will create a summary document
Whole grade and subgroup, grades 4-8 and 11
Return to your IU PVAAS contact
The IU gets approval from district administration to share regionally
25
Disclaimers
If a district is not included on a grade/subject/subgroup report, it may be because there were no students or too few students to report a measure of growth; this cannot be determined from this report.
This report reflects only:
Districts that had at least 5 students in a performance-level group, and
Groups of students that exceeded the standard for PA Academic Growth.
26
After Today's Session
Then, you can make connections with each other and ask questions:
Did the whole grade level grow, or just the subgroup?
How many students were in the group(s)?
Are the results for the current year only, or for previous groups as well, indicating a sustained pattern of growth?
What may be contributing to these results?
27
BREAK!!!
28
Accessing the Performance Diagnostic Reports
29
CUSTOM DIAGNOSTIC REPORT
PVAAS and Special Education CUSTOM DIAGNOSTIC REPORT
30
Custom Diagnostic Report
What is it? A customized report that allows a user to examine growth for groups of students that are defined by the user. The report enables users to look at the growth of the lowest, middle, and highest achieving students in the user-defined group. This does not mean they are low, middle, and high achieving in absolute terms; it is relative to the group selected by the user. The Custom Diagnostic Report provides procedures for examining the growth of user-defined subgroups of 15 or more students. Educators may choose to use this report to explore the effects of particular intervention programs, varied curricular experiences, and/or varied instructional practices. Schools should use caution when interpreting results from this report, as it does not establish any causal relationships between educational variables and student growth. Additionally, this report should NOT be used to estimate teacher effects on learning.
31
Custom Diagnostic Report
How might a school use this report?
Explore the effects of intervention programs, varied curricular and/or instructional experiences, etc.
Identify patterns or trends of progress among students with similar educational opportunities.
32
Custom Diagnostic Report
Cautions:
This report does not establish any causal relationships between any educational variables and student growth.
This report cannot and should not be used to estimate teacher effects on learning; a different statistical methodology, as well as additional data, would be needed for teacher-level reporting.
33
Access to the Custom Diagnostic Report?
By clicking on the subject in the table, you will be brought to a student list on a Student Selection Screen for the Custom Diagnostic Report. [CLICK]
34
Custom Diagnostic Report Student Selection Screen
There must be at least 15 students selected, within the same grade level and with two consecutive years of data, for a report to be generated! This slide displays the Custom Diagnostic Report Student Selection Screen. Remember that there must be a minimum of 15 students selected in order to create a report. The criteria for the selection are user-defined and should be established before selecting the students. Select students by checking the box next to their names, or press the Select All button at the bottom of the screen. Once the students have been selected, press the Submit button at the bottom of the screen.
35
Why must at least 15 students be selected?
The Custom Diagnostic Report provides the user with a growth report based on a user-defined group of students. This user-defined group is split into three approximately equal-sized subgroups based on the selected students' PSSA performance history (lowest, middle, and highest achieving subgroups relative to the user-defined group).
36
Why do at least 15 students need to be selected?
PVAAS cannot estimate growth/gain for fewer than five (5) students. In order to provide comparison groups, the report splits the selected students into 3 groups. Therefore, there must be a minimum of 15 students so that each comparison group has a minimum of 5 students.
37
Custom Diagnostic Report
Here is an example of a Custom Diagnostic Report for a selected group of students. Selected students are split into 3 approximately equal-sized subgroups based on their previous performances. Placement in the lower third, middle third, or upper third is based on the scores relative to the scores of only the selected students. The blue bars display the mean gain for each of the three groups. Emphasize that placement in the lower, middle, or high group is relative only to the set of selected students and not any larger set of students. The report also provides the mean gain for each of the disaggregated groups and a list of the students in each group.
38
How are students placed into Low, Middle, and High Groups?
A student is placed in the Low, Middle, or High subgroup based on the average of his or her current and previous year PSSA performance. The selected students are placed in one of the three groups based only on the students the user selected for this report!
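The grouping logic described above can be pictured with a short sketch: average each selected student's current and previous scores, rank the students relative only to the selected group, and split them into three approximately equal groups, which is why at least 15 students (5 per group) are required. This is an illustration under stated assumptions, not the PVAAS implementation; the function and data format are hypothetical.

```python
# Illustrative sketch only (not the PVAAS implementation) of the grouping
# logic described above: rank selected students by the average of their
# current and previous scores and split them into three roughly equal groups.

def split_into_thirds(students):
    """students: list of (name, current_score, previous_score) tuples."""
    if len(students) < 15:
        raise ValueError("At least 15 students are needed (5 per comparison group).")
    # Ranking is relative only to the selected group, not to any larger set.
    ranked = sorted(students, key=lambda s: (s[1] + s[2]) / 2)
    third = len(ranked) // 3
    low = ranked[:third]
    middle = ranked[third:len(ranked) - third]
    high = ranked[len(ranked) - third:]
    return low, middle, high
```

With 39 selected students, for example, this yields three groups of 13, matching the tutoring example later in this session.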
39
Considerations: It is important to first think about the students you are selecting in a Custom Diagnostic Report! The criteria for the selection of students are defined by the user and should be established before selecting the students! Think carefully about the educational programming or opportunities that are being investigated. The more clearly the criteria are defined, the clearer the interpretation!
40
“See” the Impact
Patterns on the Custom Diagnostic provide the meaning for an evaluation process!
41
Examples of Growth Patterns
Tent Pattern, Uniform Pattern, Downward Shed Pattern. These are the same common growth patterns described earlier for the Performance Diagnostic Reports, and they can be seen on the Custom Diagnostic as well: a tent pattern benefits students in the middle of the group more than those at the extremes; a uniform pattern benefits all students in approximately the same way; a downward shed pattern benefits lower-achieving students more than their higher-achieving peers.
42
Let’s consider an example…
Jazzberry Jam Elementary School offers an after-school tutoring program for students in grades 4-5. In the 2010-11 school year, using multiple selection criteria, the district invited 75 students to enroll in the program. Beginning October 2010, 45 grade 5 students began after-school tutoring in Reading. By the end of the school year, it was determined that only 39 of the 45 grade 5 students received tutoring support throughout the school year and had similar educational experiences within the classroom.
43
The school creates a Custom Diagnostic Report for 5th Grade Reading by selecting all 39 grade 5 students who participated in the After-School Reading Tutoring Program. The resulting report indicates that the top third of students with regard to achievement exceeded the standard for PA Academic Growth, while the other two thirds of students (low and middle with regard to achievement) met the standard for PA Academic Growth.
44
Next Steps. The school may now wish to look at the list of 13 students in the 'High' group and ask relevant questions such as:
Did these 13 students have a different tutoring experience than the other 26 students?
Were different instructional strategies used with the 13 students in the 'High' group vs. the other 26 students?
Were the needs of the 13 students in the 'High' group different from those of the other 26 students?
How many hours of tutoring did students receive? Was there large variability in those hours?
Who provided the services? What was the quality of the service?
45
Next Steps. Based on answers to the previous questions, Jazzberry Jam Elementary staff decide to refine their list and create a 'new' Custom Diagnostic Report that investigates the growth of only those students receiving 45 hours or more of tutoring instruction. As long as a minimum of 15 students meet this additional criterion, a Custom Diagnostic Report can be produced for these students.
46
Next Steps The school may also wish to create a Custom Diagnostic Report based on those students who were eligible to participate in the Tutoring Program, but did not participate. This report would serve as a ‘control’ or comparison.
47
A 'new' Custom Diagnostic Report was created for the 24 grade 5 students who participated in the After-School Reading Tutoring Program for a minimum of 45 hours throughout the school year. The 'new' report indicates that all groups of students (Low, Middle, and High) exceeded the standard for PA Academic Growth! Together, the two reports provide additional evidence that not only is the Reading tutoring benefiting grade 5 students, but the largest impact on growth is occurring for those grade 5 students who participate in the program for a minimum of 45 hours during the school year!
48
CUSTOM DIAGNOSTIC: LET’S GO LIVE!!
49
REMEMBER: TWO Types of PVAAS Information
REMEMBER: TWO Types of PVAAS Information. Looking Forward/Planning: PVAAS Projection Reports, for individual students and cohorts of students. Looking Back/Evaluation: Value-added growth reports, for groups of students. In summary, we revisit this slide from earlier in the presentation. Remember, the two PVAAS methodologies look at two different issues. The Value-added, or growth, information "looks back": it helps schools evaluate the effectiveness of the school. How much growth did students make in the past school year? The Projection information "looks forward": it helps schools plan for the future. Are students on a path to proficiency or higher? Both serve different purposes, and both are equally important for continuous school improvement. Again, PVAAS does NOT provide a growth estimate for individual students. [CLICK]
50
STUDENT PROJECTION REPORTS
PVAAS and Special Education STUDENT PROJECTION REPORTS
51
Student Projections
Student Projections. Wouldn't it be great to know the likelihood that a student will be Basic, Proficient, and/or Advanced on a future PSSA? The answer to this, of course, is YES!!!! PVAAS estimates the chance that a student's performance on a future PSSA will reach the Basic, Proficient, or Advanced performance levels. Projections provide realistic insights into possible future performance if conditions remain the same for the student(s). While not everything stays the same, this is a highly reliable indicator of future performance.
52
Student Projection Methodology
Is this the same methodology as the one used to estimate growth? No, the projection methodology is a separate modeling process that focuses on individual students. What data are used in this methodology? ALL available longitudinal data in both Reading and Math are used in the projection calculations for all four subjects. Which students have projections, and in what subjects? Projections are available for all students who have a minimum of 2 years of historical data. The next slide details the grade levels and subjects in which projections are reported. The Student Projection Methodology is a separate modeling process that focuses on individual students, although projections are available for both individual students and groups of students. Most important to consider is that the projection methodology uses ALL of the available data in both Reading and Math in its projection calculations. This can be done because students' performance in Reading is highly correlated to their performance in Math (and the reverse is also true). Projections are available for individual students who have a minimum of 2 years of historical data, as well as summaries of projections for groups of students.
53
PVAAS Fall 2011 Reporting: Projections to Basic, Proficient, Advanced
Math and Reading (grade PSSA last taken → grade projected to):
3 → 4 or 5
4 → 5 or 6
5 → 6 or 7
6 → 7 or 8
7 → 8
8 → 11
Writing (grade PSSA last taken → grade projected to):
3 → 5
4 → 5
5 → 8
6 → 8
Science (grade PSSA last taken → grade projected to):
3 → 4
4 → NA
This slide shows what projections are available for students with sufficient data in fall 2011 reporting. Note that the projections available depend on the grade level in which the PSSA was last taken. This is important to remember for middle and high schools, which will need to access projections for students last tested in grade levels not in their building. For example, high schools serving students in grades 9-12 will need to access middle school data to obtain projections for those students last tested in grade 8 in the middle school(s), unless your district has submitted currently-enrolled information on its students. In that case, users can retrieve projections on students in the current grade level in which they are enrolled instead of where they were last tested. Information and instructions on doing this will be sent to districts in the fall.
54
PVAAS Projection Reporting
Student Projection Report. District/School Projection Summary Report. Here is a list of the PVAAS projection reports that we will discuss in this webinar. We recommend that educators consider these reports in this order when reviewing PVAAS data to be integrated with other data sources for instructional decision-making.
55
PVAAS Student Projection Report
This is what a Student Projection Report looks like. You access this from the Projections tab when you are on the website. The red dots show the student's observed performance in Reading at grade levels 3, 4, 5, 6, and 7. The yellow square to which the red arrow points indicates the projection for 8th grade Math. This says that, based on this student's history and his/her achievement pattern (taking both Reading and Math into account), the student is projected to be at the 44th state percentile on the 8th grade Math assessment. The calculation of this projection assumes two important criteria: (1) conditions that were in place for the data used to create the model continue to be in force; and (2) where possible, the calculation adjusts the probabilities based on the school the particular student is most likely to attend. If there are not sufficient data to make this adjustment, the projected value will assume the average school experience. The red, blue, and green lines on the graph indicate the percentile boundaries for the Basic, Proficient, and Advanced categories for the specified subject area based on the most recent PSSA testing. These values are utilized in the calculation of the probability of a student score being in an indicated category. The center table highlights the probabilities that this projected score will be in the Basic category or above (92.9%), in the Proficient category or above (72.0%), or in the Advanced category (26.5%). A table of interpretations and recommendations based on the probabilities is also provided. For your information: some districts are sharing these reports with parents. PDE has a parent letter template available to accompany this report if your district or school chooses to do that systematically. We will move to the next slide to enlarge the sections at the bottom of the report for better viewing.
56
PVAAS Student Projection Report
The first table in the center of the report highlights the probabilities that this projected score will be in the Basic category or higher (92.8%), in the Proficient category or higher (72.0%), or in the Advanced category (26.5%). These are risk indicators for the educational team. A table of interpretations and recommendations based on the probabilities is also provided at the bottom of the report.
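To give a feel for how a projected score and its uncertainty can translate into the kinds of probabilities shown on this report, here is a simplified sketch. PVAAS's actual projection model is a regression that uses all of a student's Reading and Math history; the sketch below only assumes, for illustration, a normal distribution around a projected state NCE score, and all cut scores and numbers are hypothetical.

```python
# Simplified illustration only: PVAAS's actual model is a regression over all
# Reading and Math history. Under a normal-distribution assumption, a projected
# score and its standard error can be turned into probabilities of reaching
# each performance-level cut score. All numbers below are hypothetical.
from statistics import NormalDist

def prob_at_or_above(projected_score, standard_error, cut_score):
    """P(future score >= cut_score) under a normal assumption."""
    return 1 - NormalDist(mu=projected_score, sigma=standard_error).cdf(cut_score)

# Hypothetical NCE cut scores and a projected score of 55 with a standard error of 8.
cuts = {"Basic": 44, "Proficient": 50, "Advanced": 60}
for level, cut in cuts.items():
    print(f"{level}: {prob_at_or_above(55, 8, cut) * 100:.1f}%")
```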
57
More examples Report A Report B
Here are two examples of PVAAS Student Projection Reports. Report A: this student has a high probability of at least reaching a proficient level. Report B: this student is at risk of not reaching a proficient level of performance.
58
Quality of Projections for Decision Making
Quality of Projections for Decision Making. The projection is precise and is created using a model that has been reviewed and approved by four different peer review panels and the GAO (US Government Accountability Office). The most recent 2008 growth model proposal to USDOE includes information regarding the statistical model and a projection reliability study. Recent studies have confirmed that the PVAAS projections (even as far as 3 years into the future) are more reliable for looking at the future performance of a student than the most recent PSSA score. Discuss the QUALITY of the projections: Recent studies have confirmed that the PVAAS projections (even as far as three years into the future) are more reliable for looking at the future performance of a student than the most recent PSSA (state assessment) score. Repeat this sentence TWICE! The projection that is estimated is precise and is created using a model that has been reviewed and approved by four different federal peer review panels and the GAO (US Government Accountability Office). The 2008 growth model proposal to USDOE includes information regarding the statistical model and a projection reliability study. This proposal can be found on the USDOE website. As you are viewing and using Student Projection Reports, keep in mind that the projection is an estimate with an associated probability of success for reaching a PSSA performance level of at least Basic, Proficient, or Advanced, based on the conditions that created the projection remaining the same. It is one indicator and should be used with other data such as PSSA, 4Sight, and other benchmark and diagnostic assessment data. Remember the question to be asked when using the Student Projection Report: "Are you, as a teacher, a principal, a tutor, a coach, etc., satisfied with the probability that this student may reach the Proficient level or higher on a future PSSA?" [CLICK]
59
Student Projection Summaries
Wouldn't it be great to have a summary of all of the probabilities that students in one school, or in the entire district, will be Proficient on a future PSSA? And for subgroups? The answer to this question, of course, is YES!!!! PVAAS provides Projection Summary Reports at both the district and school levels!
60
Projection Summary Reports
What are they? A report that summarizes the numbers and percentages of students in various likelihood ranges of performing at the Proficient level on a future PSSA exam.
How might a school use these reports? Intervention planning, resource allocation, strategic planning, and school improvement planning.
Cautions: This report provides ONE indicator of the likelihood of future performance. Additional data should be used with the projection for better decision-making.
PVAAS includes Projection Summary Reports, which summarize the numbers and percentages of students in likelihood ranges of performing at the Proficient level on a future PSSA exam. This gives you a big-picture view of where students are headed on the PSSA in the future.
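As a rough sketch of what summarizing "numbers and percentages of students in likelihood ranges" could look like computationally, the snippet below buckets students by their probability of reaching Proficient and reports counts and percentages. The band boundaries are assumptions made for illustration, not the official PVAAS ranges.

```python
# Illustrative sketch of a projection summary: bucket students by their
# probability of reaching Proficient and report counts and percentages.
# The band boundaries are assumptions, not the official PVAAS ranges.

def summarize_projections(probabilities):
    """probabilities: each student's probability (0-100) of reaching Proficient."""
    bands = [("Less than 40%", 0, 40), ("40% to 69%", 40, 70), ("70% or higher", 70, 101)]
    total = len(probabilities)
    summary = {}
    for label, low, high in bands:
        count = sum(1 for p in probabilities if low <= p < high)
        summary[label] = (count, round(100 * count / total, 1))
    return summary

print(summarize_projections([22, 55, 71, 88, 93, 35, 64]))
```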
61
Quick Access to Student Projections
Projection Summary Reports
Proficient and Advanced
Whole grade and subgroups
62
District (Single Grade) Projection Summary Report
District (Single Grade) Projection Summary Report. The District (Single Grade) Projection Summary Report summarizes the probabilities for all students at the selected grade level within the entire district for whom projections can be calculated, based on the grade last tested in that district. Other Projection Summary Reports are available from PVAAS; participants are encouraged to explore them. [CLICK]
63
PROJECTION SUMMARY REPORTS: LET’S GO LIVE!!
64
PVAAS Help Menus. Please be aware that you can access the PVAAS Help Menus if you need assistance with any report within the system. The Help Menus have been redesigned around Frequently Asked Questions (FAQs) that the PVAAS Core Team and IU staff often receive. There are over 400 pages of Help Menus that are specific to the report you are viewing at that time. They can be printed and used during a team meeting or when you are individually reviewing a report. [CLICK]
65
PVAAS Podcasts on iTunes U
PVAAS Podcasts on iTunes U. The PVAAS Statewide Team for PDE has developed several short podcasts to assist in making PVAAS reporting clear and meaningful. They are available through iTunes U and are offered for both the public reporting site and the password-protected site. You can access these podcasts by clicking on the link on the sign-in page of the PVAAS website.
66
You will need to download the free iTunes software prior to downloading and viewing the podcasts. All podcasts are free to download.
67
Questions: PVAAS Materials or Statewide Implementation; PVAAS Report Web Site. Questions related to the statewide implementation of PVAAS or PVAAS materials can be directed to the contacts shown on this slide. Remember to go to the PVAAS reporting website to access your district and school PVAAS reports. [CLICK]
68
333 Market Street Harrisburg, PA 17126