The Power of EVAAS Making the Most of Your Data To Inform and Improve Teaching and Learning Jackson County Schools March 11, 2013

Today's Presenters: Joyce Gardner, Professional Development Consultant, Region 8, Joyce.gardner@dpi.nc.gov; Becky Pearson, Professional Development Consultant, Region 8, becky.pearson@dpi.nc.gov; Jason Rhodes, Professional Development Consultant, Region 8, email@dpi.nc.gov

Agenda: Welcome, Introductions, Agenda Overview; Pre-Assessment; EVAAS Basics; Diving into Reports; Leveraging EVAAS to Change Instruction

Outcomes: Explore and understand the EVAAS philosophy; understand and use various EVAAS reports; use report data to drive changes in teaching that will impact student learning.

Resources Facilitator preference: you may want a presenter and a driver, or use split screens.

Virtual Resources: Data Literacy Module – https://center.ncsu.edu/nc This is the landing page for all DPI wikis. The regional wikis are in the bottom right-hand corner and the NCEES wiki is in the top left. NC Education, where the modules are housed, is also in the top left. If time allows, you could have participants explore this resource. Discuss the EVAAS home page and the NCDPI SAS login with the resources there. EVAAS is one point of data; data literacy is a critical skill for educators. This module provides an introduction to data literacy. It includes information on types of data, strategies for analyzing and understanding data, and processes for determining how these can influence instructional practices. The module aims to provide learning experiences that develop or enhance abilities to find, evaluate, and use data to inform instruction. The purpose of the Data Resource Guide is to provide information and resources to help administrators, data coaches, teachers, and support staff with data-driven decision making for their schools and districts. Districts and charter schools are encouraged to use this guide to design and implement training for data teams, with the goal of increased data literacy and student achievement.

Pre-Assessment The following slides are Poll Everywhere slides to get a feel for what participants already know. The Poll Everywhere questions are hyperlinked on the agenda on the wiki. Facilitators should clear poll results after the presentation and/or check to see whether the group before them cleared their results.

Poll: I am very familiar with the Educator Val... http://www.polleverywhere.com/multiple_choice_polls/ClXW6pEwZ1i38mu

Poll: I know how to log in to the EVAAS websit... http://www.polleverywhere.com/multiple_choice_polls/CzLR9RuU4H4ldqh

Poll: I know how to navigate the EVAAS website... http://www.polleverywhere.com/multiple_choice_polls/lFYpvgtwoYG51M4

Poll: I understand EVAAS report names http://www.polleverywhere.com/multiple_choice_polls/bFqepkNUn3S2BgZ

Poll: I know how to use the EVAAS website to g... http://www.polleverywhere.com/multiple_choice_polls/ZEptyKAfYGg8Jlf

Poll: I know how to access EVAAS reports for i... http://www.polleverywhere.com/multiple_choice_polls/xSVQdDdIFC5f9Uk

Poll: I am able to analyze the metrics in EVAA... http://www.polleverywhere.com/multiple_choice_polls/ZsKeU9SKdOJvyPb

Poll: I know how to collect evidence from EVAA... http://www.polleverywhere.com/multiple_choice_polls/vr6HNQnkZlLBegc

Poll: I know how to interpret the following re... http://www.polleverywhere.com/multiple_choice_polls/IqHRcV9uqKMtEnX

Poll: I am able to communicate the findings of... http://www.polleverywhere.com/multiple_choice_polls/xAbvO8Ls9V6RzC3

Poll: I am able to use data analysis to initia... http://www.polleverywhere.com/multiple_choice_polls/f8iVX49gkClzxwW

What is Data Literacy? The understanding needed to find, evaluate, and utilize data to inform instruction. A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short- and long-term decision-making.

A Data Literate Person Can… A data literate person possesses the knowledge to gather, analyze, and graphically convey information to support short- and long-term decision-making. Data literacy refers to the understanding needed to find, evaluate, and utilize data to inform instruction.

Table Talk What is EVAAS? How are you currently using EVAAS? What benefits/difficulties have you experienced? What have you learned?

Benefits and Considerations for Teachers: Professional Development is the Key; Data Conversations / True PLCs; Culture of School; Sensitivity of Data; Finger Pointing and the Blame Game; Window vs. Mirror. Understand academic preparedness of students before they enter the classroom. Monitor student progress, ensuring growth opportunities for all students. Modify curriculum, student support, and instructional strategies to address the needs of all students. Participants turn and talk: benefits and considerations? "Popcorn out." All educators must learn to use the data. Data conversations are imperative. The principal's role is to prepare teachers to work with the data. The purpose of EVAAS is to support student understanding and to make appropriate instructional, logistical, and professional decisions to support student achievement and growth.

NC Professional Teaching Standards Standard I: Teachers demonstrate leadership.  Take responsibility for the progress of all students  Use data to organize, plan, and set goals  Use a variety of assessment data throughout the year to evaluate progress  Analyze data Standard IV: Teachers facilitate learning for their students.  Use data for short and long range planning Standard V: Teachers are reflective on their practice.  Collect and analyze student performance data to improve effectiveness Connection to NC Professional Teaching Standards Rationale for Data Literacy

Standard 6 for Teachers: Teachers contribute to the academic success of students. The work of the teacher results in acceptable, measurable progress for students based on established performance expectations, using appropriate data to demonstrate growth.

Benefits for Principals Gain a consolidated view of student progress and teacher effectiveness, as well as the impact of instruction and performance. Bring clarity to strategic planning and function as a catalyst for conversations that must take place to ensure that all students reach their potential. Understand and leverage the strengths of effective teachers. Use the valuable resource of effective teaching to benefit as many students as possible. Often in middle schools, we test students by homeroom, and when our data returns, we must, at the school level, redistribute the students to align with the teacher who taught those students. EVAAS, however, is sophisticated enough to put kids in the classes where they belong because it pulls the student assignment from NCWISE. If the school's NCWISE data is correct, the student will be properly placed with the appropriate teacher for each course. Talk about working with Kim to pull prediction data and using it in data conversations with every single EOC teacher. She provided each teacher with this information. Planning and discussion sessions. Used data in scheduling as well.

NC Standards for School Executives Standard 2: Instructional Leadership. Focuses his or her own and others' attention persistently and publicly on learning and teaching by initiating and guiding conversations about instruction and student learning that are oriented towards high expectations and concrete goals; creates processes for collecting and using student test data and other formative data from other sources for the improvement of instruction; ensures that there is an appropriate and logical alignment between the curriculum of the school and the state's accountability program. But it also relates to Standard 3, Cultural Leadership – fair and consistent evaluations of teachers, provides for differentiated PD according to teachers' needs, etc. And to Standard 7, Micropolitical Leadership – allocation of resources, communication within the school.

Standard 8 for School Executives Academic Achievement Leadership School executives will contribute to the academic success of students. The work of the school executive will result in acceptable, measurable progress for students based on established performance expectations using appropriate data to demonstrate growth.

Changes in Reporting for 2012-13 What was labeled Above in 2011-12 is now Exceeds Expected Growth in 2012-13; Not Detectably Different is now Meets Expected Growth; Below is now Does Not Meet Expected Growth. 2011-12 color coding and descriptors: Above (Green) – students in the district made significantly more progress in this subject than students in the average district in NC; progress was at least two standard errors above average. Not Detectably Different (Yellow) – not detectably different from students in the average district; less than two standard errors above average and no more than two standard errors below it. Below (Light Red) – students in the district made significantly less progress in this subject than students in the average district in NC; progress was more than two standard errors below average. 2012-13 color coding and descriptors: Exceeds Expected Growth (Blue) – estimated mean NCE gain is above the growth standard by at least 2 standard errors. Meets Expected Growth (Green) – estimated mean NCE gain is no more than 2 standard errors below the growth standard and less than 2 standard errors above it. Does Not Meet Expected Growth (Red) – estimated mean NCE gain is below the growth standard by more than 2 standard errors. The descriptors in EVAAS now match the Standard 6 ratings.
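For facilitators who want to make the two-standard-error rule concrete, here is a minimal sketch in Python (hypothetical numbers; not SAS's implementation, which estimates the mean NCE gain and its standard error from the full value-added model):

```python
# Minimal sketch: classify an estimated mean NCE gain into the 2012-13
# descriptors using the two-standard-error rule described above.
# The gain and standard error values below are hypothetical.

def growth_status(mean_nce_gain: float, std_error: float, growth_standard: float = 0.0) -> str:
    diff = mean_nce_gain - growth_standard
    if diff >= 2 * std_error:
        return "Exceeds Expected Growth"        # at least 2 SEs above the standard
    if diff < -2 * std_error:
        return "Does Not Meet Expected Growth"  # more than 2 SEs below the standard
    return "Meets Expected Growth"              # within 2 SEs of the standard

print(growth_status(mean_nce_gain=1.8, std_error=0.7))   # Exceeds Expected Growth
print(growth_status(mean_nce_gain=-0.5, std_error=0.6))  # Meets Expected Growth
```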

Teacher Rating Categories (2012-13): Standards 1-5 (Demonstrate Leadership, Establish Environment, Know Content, Facilitate Learning, Reflect on Practice) use 5 rating categories: Not Demonstrated, Developing, Proficient, Accomplished, Distinguished. Standard 6 (Contribute to Academic Success) uses 3 rating categories: Does Not Meet Expected Growth, Meets Expected Growth, Exceeds Expected Growth. Let's take a few seconds to look at the different rating categories presented on this slide. For teachers, nothing changes about the first five standards. For Standard 6, the rating options are: Does Not Meet Expected Growth, Meets Expected Growth, and Exceeds Expected Growth.

Table Talk How do you explain the concept of Achievement vs. Growth?

Student Achievement A focus on achievement or proficiency looks like this: is the student proficient at the end of the school year? The student is able to meet specific standards ("I can do it"). Students fall into a limited range or band of achievement. It does not account for change outside of that range, and it does not account for student ability before they came to class.

Student Growth A focus on student growth or progress looks at change over time, from the start of the school year to the end of the school year ("improvement or progression"), whether or not the student reaches proficiency. It takes into account student achievement within or beyond the proficiency range, compares student achievement to how students were predicted to achieve, discerns between teacher impact and student ability, and accounts for student ability before they came to class.

Achievement vs. Growth Student Achievement: Where are we? Highly correlated with demographic factors Student Growth: How far have we come? Highly dependent on what happens as a result of schooling rather than on demographic factors By concentrating on the growth students make, EVAAS puts the emphasis on something educators can influence.

The EVAAS Philosophy All students deserve opportunities to make appropriate academic progress every year. There is no “one size fits all” way of educating students who enter a class at different levels of academic achievement. EVAAS value-added modeling is based on the philosophy that all kids count and that schools should not be held responsible for the things they cannot change, like a child’s socio-economic status, and that schools should be responsible for the things they can change, like a child’s growth during a year of schooling. We believe that: --All kids count --All kids can learn --All kids deserve opportunities to make appropriate academic progress every year --Educators can manage their effectiveness to improve student opportunities.

The EVAAS Philosophy Adjustments to instruction should be based on the students' academic needs, not on socio-economic factors. "What teachers know and can do is the most important influence on what students learn." (National Commission on Teaching and America's Future, 1996)

Achievement and Poverty Some people believe that this type of analysis is unfair because it penalizes economically disadvantaged students. This scatterplot illustrates the strong correlation between student achievement and poverty. How is it fair to judge schools on achievement alone? Now let's look at the correlation between student growth and poverty. Discuss 'fair' in relation to the data and to teacher evaluation/accountability vs. 'fair' delivery of curriculum and instruction to subgroups.

Academic Growth and Poverty No one is doomed to failure.

High-Achieving Students and Progress All schools in Tennessee in 2011 - Math students in grades 4 through 8. Districts, schools, and teachers that serve high achieving students can make excellent progress, just as easily as those that serve low achieving students.

Proficiency vs. Growth Scenario. Scenario 1: A 5th grader begins the year reading at a 1st-grade level and ends the year reading at a 4th-grade level. Proficient? NO. Growth? YES. Scenario 2: A 5th grader begins the year reading at a 7th-grade level and ends the year reading at the 7th-grade level. Proficient? YES. Growth? NO. Which scenario is a better indicator of the effectiveness of the teacher? (Click on slide for answers.)

Table Talk How could you use the concept of achievement vs. growth when speaking with parents? How does the achievement vs. growth conversation guide PLCs?

EVAAS Overview

SAS EVAAS Analyses: What is EVAAS? Looking Back (Reflective) – Evaluating Schooling Effectiveness: Value-Added & Diagnostic Reports, drawing on End of Grade, End of Course, Writing, and ACT results. Looking Ahead (Proactive) – Planning for Students' Needs: Student Projections to Future Tests.

How can EVAAS help me? Improving the education program draws on EVAAS Looking Back (past program effectiveness), local knowledge & expertise, and EVAAS Looking Ahead (incoming student needs). EVAAS helps by allowing educators to analyze past program performance for trends and make informed projections for current and incoming students.

EVAAS answers the question of how effective a schooling experience is for learners. It produces reports that predict student success, show the effects of schooling at particular schools, and reveal patterns in subgroup performance. EVAAS extracts data after DPI collects it through the secure shell. DPI runs processes and checks for validity. Once DPI has completed its processes with the data, it presents the results to the SBE. At that point, the data is sent to EVAAS.

Test Your Knowledge of EVAAS Reports At your tables, you will find copies of a variety of reports available from EVAAS and labels for each report. Working with your group, match the report label with the appropriate report.

Reflective Assessments

Value-Added Reporting

District Value Added Report Use to evaluate the overall effectiveness of a district on student progress Compares each district to the average district in the state for each subject tested in the given year Indicates how a district influences student progress in the tested subjects We will look at three kinds of reports - value-added, diagnostic, and performance diagnostic - at the district level and review how to read them, because this is the same way you will read your school data. The reports have elements in common, and once you can interpret the district reports, you'll be able to read your school reports easily.

Use this report to evaluate the overall effectiveness of a school on student progress. The School Value Added Report compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects. * Facilitator preference for the next few slides until break – use only power point or use live site to model – more questions pop up when using the live site

The School Value Added Report compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects.

Value-Added Reporting Scores from the EOG tests are converted to State NCEs (Normal Curve Equivalent scores) for the purpose of these analyses. NCE scores have the advantage of being on an equal-interval scale, which allows for a comparison of students' academic attainment level across grades. NCE scores remain the same from year to year for students who make exactly one year of progress after one year of instruction, even though their raw scores would be different. Their NCE gain would be zero. If the Mean NCE Gain is greater than or equal to zero, the average student in this school has achieved a year’s worth of academic growth in a year

Mean NCE Gain If the Mean NCE Gain is greater than or equal to zero, the average student in this school has achieved a year's worth of academic growth in a year. If the Mean NCE Gain is less than zero, the average student in this school has achieved less growth than expected.

Value-Added Reporting The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide. Student achievement levels appear at the bottom of the report in the Estimated School Mean NCE Scores section. Compare the estimated grade/year mean for a school to the NCE Base: if the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state.
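To make the NCE arithmetic concrete, here is a minimal sketch with hypothetical scores (EVAAS's actual estimates come from its statistical model, not a simple average, but the interpretation of the numbers is the same):

```python
# Minimal sketch with hypothetical NCE scores: the mean NCE gain is the average
# of (current-year NCE - prior-year NCE) across students. A mean gain at or
# above zero suggests the average student made about a year's growth in a year.

prior_year_nce   = [38.0, 52.5, 61.0, 45.5, 70.0]  # hypothetical entering scores
current_year_nce = [41.0, 51.0, 64.5, 47.0, 72.5]  # hypothetical exiting scores

gains = [cur - pri for pri, cur in zip(prior_year_nce, current_year_nce)]
mean_nce_gain = sum(gains) / len(gains)
print(f"Mean NCE gain: {mean_nce_gain:+.1f}")       # +1.8 here

# The NCE Base is 50.0 (the statewide average attainment level), so an
# estimated school mean above 50.0 means the average student in the school
# is performing above the average student in the state.
estimated_school_mean = sum(current_year_nce) / len(current_year_nce)
print("Above state average" if estimated_school_mean > 50.0 else "At or below state average")
```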

District Diagnostic Reports Use to identify patterns or trends of progress among students expected to score at different achievement levels Caution: subgroup means come from “a liberal statistical process” that is “less conservative than estimates of a district’s influence on student progress in the District Value Added Report”

District Diagnostic Report This report is intended for diagnostic purposes only and should not be used for accountability.

What do you see? Use this report to identify patterns or trends of progress among students expected to score at different achievement levels. This report is intended for diagnostic purposes only and should not be used for accountability. Explain that on this report the students are grouped into quintiles. Students are assigned to groups on a statewide basis. The assignment pattern shows schools how their students are distributed compared to other students in the same grade across the state, based on their performance as it compares to similar students throughout the state.

Features of the Diagnostic Report: Quintiles, Green Zero Line, Previous Cohort(s), Confidence Band, Whiskers, 2 Standard Errors

Features of the Diagnostic Report: Clickable Information, Reference Gain, Standard Error

District Performance Diagnostic Reports Use to identify patterns or trends of progress among students predicted to score at different performance levels as determined by their scores on NC tests. Students are assigned to Projected Performance Levels based on their predicted scores. The report shows the number (Nr) and percentage of students in the district that fall into each Projected Performance Level. Click on the underlined number in the Mean or Nr of Students row for a subgroup to see the names of the students assigned to the subgroup. Click on the % of Students for the current year or for Previous Cohort(s) to see the data in Pie Chart format. Mean Differences: The Mean of the difference between the students' observed test performance and their predicted performance appears for each Projected Performance Level, along with the Standard Error associated with the Mean. The Standard Error allows you to establish a confidence band around the Mean. A large negative mean indicates that students within a group made less progress than expected. A large positive mean indicates that students within a group made more progress than expected. A mean of approximately 0.0 indicates that a group has progressed at an average rate in the given subject. When the means among groups vary markedly, districts may want to explore ways to improve the instruction for students making less progress.

Interpreting the Pie Chart The Pie Chart shows the percent of students in each subgroup and compares their progress to the Growth Standard. Green: the progress of students in this group was more than one standard error above that of students in the average district in the state. Yellow: students in this group progressed at a rate similar to that of students in the average district in the state. Light Red: students in the group made more than one standard error less progress in this subject than students in the average district in the state.

Your Turn to Interpret What do you see? What do you want to know? 15 elementary schools

Your Turn to Interpret

The Power of Patterns

Diagnostic Reports Looking for Patterns The green line on the chart is the Reference line, or the amount of progress students need to make to maintain their entering achievement level. Bars above the line indicate that students in that group made good progress. Bars below the line indicate that students left this grade at a lower achievement level than when they started.   Blue bars show the progress of students in the most recent year. Gold bars show the progress of students in up to three previous cohorts, when data are available. No bar is presented for groups with fewer than five students. The red vertical line that intersects each bar indicates one standard error above and below the progress measure. The standard error allows you to establish a confidence band around the estimate. In this school, some subgroups of students are not making sufficient gain. Students in the lowest subgroups have not made sufficient progress while high achieving students are making excellent gain. The lack of appropriate progress among low achieving students is a pattern that has been repeated from previous years, indicating a persistent lack of effectiveness with lower achieving students. This is one of the most intriguing components of EVAAS. Common Diagnostic Patterns activity *Pattern Slides are on wiki

School Diagnostic Shed Pattern In this example, the lowest achieving students are making sufficient progress. Students at an average achievement level are making expected progress. However, the highest achieving students appear to be losing ground. Teachers and administrators will want to find ways to create more progress opportunities for high achieving students.

School Diagnostic Reverse Shed Pattern In this example, high achieving students are making excellent progress. Students who are average in achievement also are making sufficient progress. In contrast, the lowest achieving students are not making as much progress as they should. A pattern like this one will widen the achievement gap. Teachers and administrators should consider how to help lower achieving students gain more ground.

School Diagnostic Tent Pattern In this example, the students in the middle of the achievement distribution are making sufficient progress, but both lower achieving and higher achieving students are falling behind their peers. In this case, teachers and administrators will want to consider both how to support low-achieving students and how to challenge high-achieving students.

School Diagnostic V Pattern In this example, the opposite of the Tent Pattern, only the lowest and the highest achieving students are making good progress. Students in between have not had enough opportunities for academic growth.

School Diagnostic Opportunity Gap Pattern In this example, the students in every achievement group are making sufficient progress in the most recent year, except for the second group. Teachers and administrators will want to consider how to adjust the classroom instruction to meet these students’ needs. In addition, what approaches that are successful with the lowest achieving students could be expanded to include students in the second achievement group?

What would an ideal pattern on a Diagnostic Report look like for closing the achievement gap? Have participants draw an ideal pattern: on an index card, make a box with 1-5 at the bottom. Draw the ideal diagnostic report; discuss. Ideal to narrow the achievement gap: bar 1 is highest, descending to 5. Common Diagnostic Patterns activity – look at common patterns; taking turns, explain the pattern to your partner.

Diagnostic Reports – Desirable Pattern Print and hand out this slide for the next activity on drawing a desirable pattern, or have participants use plain paper.

Diagnostic Report Desirable Pattern In this example, all bars are above the green line, indicating the district was highly effective with students in all achievement groups. Additionally, students in the lowest quintile made more progress than students in the other quintiles. Effectively, these students are starting to catch up with their peers; the gap is closing because they are increasing their performance by more than a year's worth of growth.

Diagnostic Reports – Whiskers In this diagram, the two bars have the same height, but the whiskers extend to different lengths. On the left, the whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓). On the right, the whiskers contain the green line, so the group represented by the bar made average progress (-). The red whisker represents the confidence interval due to the standard error for the mean. There is a high probability that the actual mean falls somewhere within the whiskers. The size of the confidence interval is determined by the sample size: the larger the number of data points, the smaller the standard error, and therefore the smaller the whisker and the confidence interval. If the whisker passes over the green (reference) line, the data shows expected growth, since there is a chance the mean is actually on the other side of the line. When the whisker crosses the green reference line, it is not certain that the group or teacher is exclusively above or below it.
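The whisker logic above (and the arrow notation used later in the Overview of School Effects activity) comes down to comparing the confidence band to the reference line. A minimal sketch with hypothetical numbers, not EVAAS's actual code:

```python
# Minimal sketch: decide whether a bar and its whiskers sit above, below, or
# across the green reference line. The whisker here spans one standard error
# above and below the estimated mean gain, as described for the diagnostic report.

def bar_status(mean_gain: float, std_error: float, reference: float = 0.0) -> str:
    lower, upper = mean_gain - std_error, mean_gain + std_error
    if lower > reference:
        return "above (up arrow)"     # whole whisker above the line
    if upper < reference:
        return "below (down arrow)"   # whole whisker below the line
    return "crosses (dash)"           # whisker contains the line: not distinguishable from average

print(bar_status(mean_gain=1.2, std_error=1.5))  # crosses (dash)
print(bar_status(mean_gain=1.2, std_error=0.8))  # above (up arrow): same mean, smaller whisker
```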

1. Go to the website www.ncdpi.sas.com

1. Go to ncdpi.sas.com 2. BOOKMARK IT! 3. Secure & Convenient Online Login

Do you see this?

Reality Check Activity Using reports from your school, choose one grade level and subject area. Go to the School Diagnostic Report. Identify the pattern found in that report. What does this data pattern tell you as a teacher? What are your next steps? *Handout: "Interpreting Your School's Results"

The Power of the HELP Button Activity Refer back to your Diagnostic Reports. Using the HELP button, find information about the red lines called whiskers running vertically through the bars on the graph. Share with a partner your explanation of the Diagnostic Report "whiskers." With so many reports and options, EVAAS can become overwhelming! Remember that HELP is always nearby: press the HELP button in the upper right-hand corner.

Overview of School Effects Use the Ian Middle School Value-Added and Diagnostic reports (from the wiki) to complete the table. Navigate to a Value-Added Report and enter the Tested Subject/Grade name in the Overview of School Effectiveness table below. Use a separate row for each grade for EOG reporting. If your school tests in both EOG and EOC subjects, record the EOG subjects and grades and then choose EOC from the Tests tab. For each test, subject, and/or grade, note the color for the most recent year. If the color is RED, place an "X" in the Overall Results column.

Overview of School Effects (sample data)

Overview of School Effects (sample data) These documents will be uploaded to the wiki

Overview of School Effects (sample data) Double check for correct answers

Overview of School Effects It's Your Turn! Find the blank table. Do this by yourself, using your data, and fill in your table. Participants will examine sample data located on the wiki site to complete this activity. Locate the blue bars on the graph for each of the 5 Achievement Groups. Also note the red whiskers. For any blue bars above the green line (where the whiskers are also completely above), place an up arrow (↑) in the appropriate cell of the Overview of School Effectiveness table. For any blue bars below the green line (where the whiskers are also completely below), place a down arrow (↓) in the table. For any blue bars at or near the green line (the whiskers cross the green line), place a horizontal dash (–) in the table.

Overview of School Effects What did you find? Interesting Patterns Insights Areas of Concern Areas of Celebration District chart paper activity – share data in any way you want….

Student Pattern Report This report is a customized Diagnostic report where you can examine progress for groups of students of your choice. It is only available from the school level and only to users w/access to student reports.

Student Pattern Report Key points to remember: The report shows growth for the lowest, middle, and highest achieving students within the chosen group. The report can be used to explore the progress of students with similar educational opportunities. Like all diagnostic reports, this report is for diagnostic purposes only. A minimum of 15 students is needed to create a Student Pattern Report. It enables you to see how effective the school has been with the lowest, middle, and highest achieving students; at least 15 students with predicted and observed scores are required.

Student Pattern Report Take a look at this data: what do you notice? What are your thoughts? Higher achieving students did better than expected. Our ML students did not do as well as predicted. The groups (low, middle, high) are placed in thirds based on their predicted scores and where they fall in the distribution. Teacher self-reflection: how effective were you in teaching these students, based on their predicted scores?

Student Pattern Report This teacher may have had 100% proficiency, but we are looking at growth; this teacher did not contribute to the students' learning. Students are listed by name within subgroups, so that you can identify which group each child falls in. Look at individual students to see if we note any patterns. We can also look at race or gender and see if some teachers are teaching better to a certain sub-population.

Key Questions We need to ask some key questions to find out why some students had better growth than others. We could even look at each subgroup individually and think about what contributes to negative and positive growth in the classroom.

Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? In this case we are comparing students in the same subgroup, the group that is considered the high (H) group. After asking these questions, we find that in this particular report the number of hours a student participated in a program made a positive difference in growth. At that point we looked at the number of hours and ran another report of all 31 students to see if it had a large effect on student growth. We looked at all students who had over 40 hours of enrichment/remediation, etc.

Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? YES! Rerun the report with new criteria.

Student Pattern Report – Next Steps All 31 students in the program vs. the 16 students who attended for 40+ hours. The 16 students who had over 40 hours in the program showed far greater growth than their counterparts who attended for fewer hours. The 15 who didn't reach 40+ hours pulled the overall growth down. If you run a report and this is your result, think about the next step: figure out what the numbers actually mean. This shows the program did what you wanted.
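The kind of follow-up comparison described here is easy to reproduce outside EVAAS once the student list is exported. A minimal sketch with made-up students and numbers (the 40-hour threshold is the one named on the slide; everything else is hypothetical):

```python
# Minimal sketch: compare mean growth for students who attended a program for
# 40+ hours versus those who attended for fewer hours. All data are made up.

students = [
    {"name": "Student A", "hours": 55, "nce_gain": 4.2},
    {"name": "Student B", "hours": 48, "nce_gain": 3.1},
    {"name": "Student C", "hours": 12, "nce_gain": -1.5},
    {"name": "Student D", "hours": 30, "nce_gain": 0.4},
    {"name": "Student E", "hours": 62, "nce_gain": 5.0},
    {"name": "Student F", "hours": 8,  "nce_gain": -2.2},
]

def mean_gain(group):
    return sum(s["nce_gain"] for s in group) / len(group)

forty_plus  = [s for s in students if s["hours"] >= 40]
under_forty = [s for s in students if s["hours"] < 40]

print(f"40+ hours  (n={len(forty_plus)}):  mean gain {mean_gain(forty_plus):+.1f}")   # +4.1
print(f"< 40 hours (n={len(under_forty)}): mean gain {mean_gain(under_forty):+.1f}")  # -1.1
```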

Less Informed Conclusion: We need to change the selection criteria for this program. More Informed Conclusion: We need to adjust the recommended hours for participants.

Custom Student Report Have participants visit the wiki to download step-by-step instructions.

Custom Student Report HANDOUT Post directions on the EVAAS Wiki

Academic At-Risk Reports These reports may be used to determine local policy for providing targeted intervention and support to students who are at risk of not meeting future academic milestones. At-Risk reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range; the range for writing is 0-80%. The reports are presented in 3 categories. AYP At Risk – at risk for not meeting the academic indicators for AYP (EOG Math & Reading, grades 4-8; EOC Algebra I and English I). For EOG tests, students with at least 3 prior data points (or test scores) will have projections in Math & Reading for the next grade. These scores are not content specific. Projections for Algebra I and English I may be made as early as 6th grade with sufficient data. Graduation at Risk – reports for students at risk of not making a Level III on EOC subjects like Algebra II, Chemistry, Geometry, Physical Science, and Physics. Students who have taken these tests but have not scored at least Level III will still have projections to these subjects. Under Reports, click Academic At Risk Reports. These are reports that you will want to spend some time really poring through.

Academic At-Risk Reports: 3 Categories. At Risk – at risk for not meeting the annual federal academic indicators. Graduation at Risk – reports for students at risk for not making a Level III on EOC subjects required for graduation. Other at Risk – reports for students at risk for not making Level III on other EOC subjects. Same report.

Academic At-Risk Reports Be Proactive Use these reports for discussing, developing, and implementing targeted intervention and support to students who are at risk for not meeting future academic milestones. Reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range, or 0-80% for writing.

Making Data-Driven Decisions A 2% probability of achieving a Level III on the Algebra I EOC: EVAAS can show growth, so teachers may want to take on this child to show some serious growth. Have programs in place that show great growth for these students. Every kid matters; we are measuring growth, not proficiency. Talk about the "clickables" and ways to disaggregate this data: students are listed alphabetically with demographic and other info. You can sort the report by clicking on an underlined column heading. A key to the column headings appears below the report. To see a student report, click on the student's name. All students in the report have a 0-70% probability of scoring Level III in the subject you have chosen (0-80% for writing), assuming they have the average schooling experience in NC. These students will need support and intervention to provide them with a better-than-average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.

What Are Projections?

What Are Projections Anyway? Given a specific set of circumstances… …what’s the most likely outcome?

What Are Projections Anyway? Given this student’s testing history, across subjects… …what is the student likely to score on an upcoming test, assuming the student has the average schooling experience?

EVAAS Projections What are they based on? Expectations based on what we know about this student and about other students who have already taken this test: prior test scores (EOC/EOG) across subjects, and their scores on the test we're projecting to.

What's the Value of the Projections? Projections are NOT about predicting the future. They ARE about assessing students' academic needs TODAY. Although projections indicate how a student will likely perform on a future test, their real value lies in how they can inform educators today. By incorporating the projections into their regular planning, teachers, administrators, and guidance counselors can make better decisions about how to meet each student's academic needs now.

Assessing Students' Needs What are this student's chances for success? What goals should we have for this student this year? What goals should we have for this student in future years? What can I do to help this student get there? When assessing students' academic needs, educators will want to keep these key questions in mind.

Using Projections to Take Action Identify students who need to participate in an academic intervention. Assess the level of risk for students who may not reach the Proficient mark. Plan schedules and resources to ensure that you can meet students' needs. Identify high-achievers who will need additional challenges. Assess the opportunities for high-achieving students who are at risk of not reaching Advanced. Inform course placement decisions.

Making Data-Driven Decisions Have participants access their academic at-risk report. Select a grade level and subject to view achievement probability. These students will need support and intervention to provide them with a better-than-average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.

Data Mining Data mining is sometimes referred to as data or knowledge discovery. Have participants access their academic at risk report. Select a grade level and subject to view achievement probability. Answer the following questions based on your data.

Reflection + Projection = TODAY

Student Projection Report Red dot: the student's testing history. Roll over a dot to see the school and district in which the student was tested. Yellow box: the student's Projected State Percentile, assuming average progress. Performance Level Indicators: the cut score required to be successful at different performance levels, expressed in State Percentiles. See the key below the graph.

Student Projection Report Reading left to right: the student's projected State Percentile for the chosen test, and the probability of success at different performance levels.

Student Projection Report The table shows the student's testing history, across grades, in State NCEs (EOG Math and Reading) or scale score points (all other tests). For EOC tests, the season in which the test was administered, Fall (F), Spring (Sp), or Summer (Su), is indicated. The year of the test refers to the school year to which the test is attributed. For example, EOC tests administered in the summer and fall of 2010 will be labeled 2011 because they are attributed to the 2010-2011 school year. 3rd grade pretests are considered to measure 2nd grade achievement and are therefore attributed to the previous school year and labeled (2) for 2nd grade.

Thinking of the State Distribution by QUINTILES Identify each student's achievement quintile based on his/her Projected State Percentile.

Note the Student's Projected QUINTILE Notice where each student falls in the state distribution. That is, identify each student's achievement quintile based on his/her Projected State Percentile.

Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction Entering Achievement Use this report to identify past patterns or trends of progress among students expected to score at different achievement levels

Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction QUINTILE 2 Past Effectiveness Entering Achievement How effective was your school with the lowest two quintiles?

Academic Preparedness Report

Academic Preparedness Report Activity: Use the Bridge to Differentiated Instruction Document This report shows the probability that students within a grade will score at or above Level III on future tests. The table shows the number and percentage of students in each of three probability groups, as well as the number and percentage of students who have already passed the test with a Level III or higher and those with insufficient data for a projection. Green: Students whose probability of proficiency on the chosen test is greater than or equal to 70% Yellow: Students whose probability of proficiency on the chosen test is between 40% and 70% Light Red: Students whose probability of proficiency on the chosen test is less than or equal to 40% Blue: Students who have already passed the test with a Level III or higher. White: Students who do not have a projection, due to lack of sufficient data.
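For data teams who want to replicate the grouping logic, here is a minimal sketch with hypothetical probabilities (the color bands follow the slide text; the EVAAS report itself handles the exact cutoffs and display):

```python
# Minimal sketch: group students by their probability of scoring at or above
# Level III, using the color bands described for the Academic Preparedness Report.

from collections import Counter

def preparedness_band(probability, already_passed=False):
    if already_passed:
        return "Blue"        # already passed with Level III or higher
    if probability is None:
        return "White"       # insufficient data for a projection
    if probability >= 0.70:
        return "Green"
    if probability > 0.40:
        return "Yellow"
    return "Light Red"

# Hypothetical roster: (probability of Level III or above, already passed?)
roster = [(0.85, False), (0.55, False), (0.22, False), (None, False), (0.10, True)]

counts = Counter(preparedness_band(p, passed) for p, passed in roster)
for band in ["Green", "Yellow", "Light Red", "Blue", "White"]:
    print(f"{band:10s} {counts[band]} ({100 * counts[band] / len(roster):.0f}%)")
```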

Teacher Value-Added Report

Why should you care about your EVAAS Teacher Value Added Report? Beginning with your 2013 report, it becomes part of your evaluation. Standard 6 – Teachers contribute to the academic success of their students. (Measurable Progress) Standard 4 – Teachers facilitate learning for their students Teachers plan instruction appropriate for their students Use data for short and long range planning Standard 5 – Teachers reflect on their practice. Teachers analyze student learning. But your report is not just an evaluation component. It is also a powerful tool for improving your effectiveness as a teacher. So why else should you care?

Why should you care about your EVAAS Teacher Value Added Report? You care about your students.

Achievement vs. Progress Student Progress – How far have I come? Highly dependent on what happens as a result of schooling rather than on demographic factors.

Achievement vs. Progress Focus on progress Educators can influence this Minimum expectation = one year of academic gain By concentrating on the progress students make, EVAAS puts the emphasis on something educators are responsible for and can do something about. Average progress (one year of academic gain) is the minimum expectation. In other words, it is expected that students will not lose ground, relative to their peers, in the course of the year

Understanding Value Added The projection report looks at past testing information and projects how a student will perform, drawing on the student's own past performance and the performance of students who have taken the test previously. Students must have three prior test scores to be included in the teacher's predictive report. The whole cohort of students is analyzed.

Improve the Education Program EVAAS can tell you WHAT happened. It's up to YOU to determine WHY it happened and what you want to do about it. Improving the education program combines EVAAS with local knowledge & expertise.

Info about Teacher Reports State Growth Standard/State Average = 0.0 Standard Error = a measure of uncertainty Usually, the more data you have, the smaller the standard error. Index = Teacher Estimate divided by Standard Error

Effectiveness Categories

Effectiveness Level Determination Exceeds Expected Growth: Teachers whose students are making substantially more progress than the state average Index is 2 or greater

Effectiveness Level Determination Meets Expected Growth: Teachers whose students are making the same amount of progress as the state average Index is equal to or greater than -2 but less than 2

Effectiveness Level Determination Does Not Meet Expected Growth: Teachers whose students are making substantially less progress than the state average Index is less than -2

Evaluation Composite Index: Teacher Estimate Divided by Standard Error Courses included in calculation Statewide distribution of teacher status.

Understanding Teacher Value-Added Reports Teacher Estimate: How much progress did this teacher’s students make compared to other students across the state? Index: Teacher estimate divided by the standard error. Index is the basis by which teachers are assigned to effectiveness levels.
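Since the index is simply the teacher estimate divided by its standard error, the effectiveness-level assignment can be sketched in a few lines (hypothetical numbers; not SAS EVAAS code):

```python
# Minimal sketch: compute the index (teacher estimate / standard error) and map
# it to the effectiveness levels described on the previous slides.

def effectiveness_level(teacher_estimate: float, std_error: float) -> str:
    index = teacher_estimate / std_error
    if index >= 2:
        return "Exceeds Expected Growth"
    if index < -2:
        return "Does Not Meet Expected Growth"
    return "Meets Expected Growth"   # index from -2 up to (but not including) 2

# The same estimate can land in different levels depending on its standard error:
print(effectiveness_level(teacher_estimate=3.0, std_error=2.0))  # index 1.5 -> Meets
print(effectiveness_level(teacher_estimate=3.0, std_error=1.2))  # index 2.5 -> Exceeds
```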

EVAAS Teacher Value Added Report

EVAAS Teacher Value Added Report Supplemental Information Table

Student Teacher Linkages

EVAAS Student Report

Teacher Diagnostic Report

Making Generalizations

Making Generalizations What generalizations can we make? What do we not know? How do we find out?

EVAAS Teacher Diagnostic Report

School Composites

Using Teacher Reports to Improve Student Progress Identify highly effective teachers Identify teachers who need support Identify strengths and areas for improvement of individual teachers Identify school-wide strengths and weaknesses to inform and provide professional development opportunities Facilitate powerful, crucial conversations between teachers and administrators Impact scheduling decisions *See “Using Teacher Data”

Role Play Activity

PLC Predictions and Possibilities

Exit Tickets and Feedback As you reflect on today's session, use two sticky notes to capture your thoughts on these topics: Greatest Take-Away; Now, I Need… Survey: http://go.ncsu.edu/ncdpi-resa_survey Place links on the EVAAS Wiki. Need a Google form with short questions specifically about this session; it can be embedded into the wiki page.