Reflective Assessments

Presentation transcript:

Reflective Assessments Participants should go to the EVAAS wiki. Under the agenda section, have users click the Reflective Assessments link. Files to support this portion of the presentation can be found there.

Value-Added Reports Use this report to evaluate the overall effectiveness of a school on student progress. It compares each school to the average school in the state. Comparisons are made for each subject tested in the given year and indicate how a school influences student progress in those subjects. A school's growth index must be lower than -1.8 to be classified as below average, and more than 2 standard errors above the average to be classified as above. (Provide a definition of value-added.) Note that "like" students are grouped into subgroups. Show how to read the report. Explain that 0 is the equivalent of one year of growth. Looking at the bottom of the page, review the green, yellow, and red descriptors. Explain that there is a blue descriptor when there is not enough data to make a distinction. If your LEA uses DIBELS, the reports have a similar color design. Have the participants look at the data and talk about what they can see here. Look at trends in the same grade. Look at the students who moved from 6th grade in 2011 to 7th grade in 2012 and then to 8th grade in 2013. Talk about what you can take away from this report.

Diagnostic Reports – the whiskers In this diagram, the two bars have the same height, but the whiskers extend to different lengths. On the left, the whiskers lie completely below the green line, so the group represented by the bar made less than average progress (↓). On the right, the whiskers contain the green line, so the group represented by the bar made average progress (–). The red whisker represents the confidence interval due to the standard error of the mean. There is a high probability that the actual mean falls somewhere within the whiskers. The size of the confidence interval is determined by the sample size: the larger the number of data points, the smaller the standard error, and therefore the narrower the whisker and the confidence interval. If the whisker passes over the green line (the reference line), the data shows expected growth, since there is a chance the mean is actually on the other side of the green line. When the whisker crosses the green reference line, it is not certain that the teacher is above or below that line.
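The whisker logic described above can be sketched in a few lines. This is a minimal illustration, assuming the whisker spans the estimate plus or minus two standard errors and the reference line sits at 0 (one year of expected growth); the function name and the two-standard-error multiplier are assumptions for the sketch, not taken from EVAAS documentation.

```python
# Minimal sketch of the whisker classification described above.
# Assumption: the whisker is the estimate +/- 2 standard errors,
# and the green reference line is at 0 (one year of growth).

def classify_growth(estimate, std_error, reference=0.0, multiplier=2.0):
    """Classify a group's growth relative to the reference line.

    Returns "above" if the whole whisker lies above the reference,
    "below" if it lies entirely below, and "expected" when the
    whisker crosses the reference line.
    """
    lower = estimate - multiplier * std_error
    upper = estimate + multiplier * std_error
    if lower > reference:
        return "above"
    if upper < reference:
        return "below"
    return "expected"

# A larger sample shrinks the standard error, so the same estimate
# can move from "expected" (whisker crosses 0) to "above".
print(classify_growth(1.0, 0.8))   # wide whisker crosses the line
print(classify_growth(1.0, 0.4))   # narrow whisker sits above it
```

This mirrors the note in the slide: the bar height (estimate) alone does not decide the classification; the whisker width, driven by sample size, does.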

Diagnostic Reports Looking for Patterns In this school, some subgroups of students are not making sufficient gain. Students in the lowest subgroups have not made sufficient progress while high achieving students are making excellent gain. The lack of appropriate progress among low achieving students is a pattern that has been repeated from previous years, indicating a persistent lack of effectiveness with lower achieving students. This is one of the most intriguing components of EVAAS. Common Diagnostic Patterns activity *Pattern Slides are on wiki

School Diagnostic Shed Pattern In this example, the lowest achieving students are making sufficient progress. Students at an average achievement level are making expected progress. However, the highest achieving students appear to be losing ground. Teachers and administrators will want to find ways to create more progress opportunities for high achieving students.

School Diagnostic Reverse Shed Pattern In this example, high achieving students are making excellent progress. Students who are average in achievement also are making sufficient progress. In contrast, the lowest achieving students are not making as much progress as they should. A pattern like this one will widen the achievement gap. Teachers and administrators should consider how to help lower achieving students gain more ground.

School Diagnostic Tent Pattern In this example, the students in the middle of the achievement distribution are making sufficient progress, but both lower achieving and higher achieving students are falling behind their peers. In this case, teachers and administrators will want to consider both how to support low-achieving students and how to challenge high-achieving students.

School Diagnostic V Pattern In this example, the opposite of the Tent Pattern, only the lowest and the highest achieving students are making good progress. Students in between have not had enough opportunities for academic growth.

School Diagnostic Opportunity Gap Pattern In this example, the students in every achievement group are making sufficient progress in the most recent year, except for the second group. Teachers and administrators will want to consider how to adjust the classroom instruction to meet these students’ needs. In addition, what approaches that are successful with the lowest achieving students could be expanded to include students in the second achievement group?

What would an ideal pattern on a Diagnostic Report look like for closing the achievement gap? Have participants draw an ideal pattern. On an index card, make a box with 1-5 at the bottom. Draw the ideal diagnostic report; discuss. Ideal to narrow the achievement gap: bar 1 is highest, descending to bar 5. Common Diagnostic Patterns activity: look at common patterns; taking turns, explain each pattern to your partner.

Diagnostic Reports – Desirable Pattern Print and handout this slide for the next activity on drawing a desirable pattern or have participants use plain paper.

Diagnostic Report Desirable Pattern In this example, all bars are above the green line, indicating the district was highly effective with students in all achievement groups. Additionally, students in the lowest quintile made more progress than students in the other quintiles. Effectively, these students are starting to catch up with their peers; the gap is closing because they are increasing their performance by more than a year's worth of growth.

Diagnostic & Performance Diagnostic Reports (Part 2)

Overview of School Effects (sample data) Place activity instructions on the wiki. Use the Value-Added and Diagnostic reports to complete the table. Navigate to a Value-Added Report and enter the Tested Subject/Grade name in the Overview of School Effectiveness table below. Locate the color for the most recent year. If the color is RED, place an “X” in the Overall Results column. Use a separate row for each grade for EOG reporting. If your school tests in both EOG and EOC subjects, record the EOG subjects and grades and then choose EOC from the Tests tab. For each test, subject, and/or grade, note the color for the most recent year. If the color is RED, place an “X” in the Overall Results column.

Overview of School Effects (sample data) Drill down to the Diagnostic Report for each Tested Subject/Grade. Locate the blue bars on the graph for each of the 5 Achievement Groups. Also note the red whiskers. For any blue bars above the green line (where the whiskers are also completely above), place an up arrow (↑) in the appropriate cell of the Overview of School Effectiveness table. For any blue bars below the green line (where the whiskers are also completely below), place a down arrow (↓) in the table. For any blue bars at or near the green line (the whiskers cross the green line), place a horizontal dash (–) in the table.
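The drill-down steps above amount to turning each achievement group's bar-and-whisker into a symbol and writing one row per subject/grade. A small sketch of that bookkeeping, with invented sample numbers (the subject name and all estimates are hypothetical, not real EVAAS output):

```python
# Hypothetical sketch of filling one row of the Overview of School
# Effectiveness table. Numbers are invented sample data.

def whisker_symbol(estimate, std_error, reference=0.0):
    """Map a bar-and-whisker to the table symbol from the activity."""
    lower, upper = estimate - 2 * std_error, estimate + 2 * std_error
    if lower > reference:
        return "↑"   # whisker entirely above the green line
    if upper < reference:
        return "↓"   # whisker entirely below the green line
    return "–"       # whisker crosses the green line

# One (estimate, std_error) pair per achievement group, lowest to highest.
sample = {
    "Math Grade 6": [(1.2, 0.4), (0.1, 0.5), (-1.5, 0.5), (0.3, 0.6), (2.0, 0.7)],
}
for row, groups in sample.items():
    symbols = [whisker_symbol(e, se) for e, se in groups]
    print(row, " ".join(symbols))
```

Each printed row corresponds to one line of the paper table participants fill in by hand.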

Overview of School Effects (sample data) These documents will be uploaded to the wiki An extra set of sample data has been loaded on the wiki if participants can’t log into EVAAS

Overview of School Effects (sample data) These documents will be uploaded to the wiki

Overview of School Effects (sample data) These documents will be uploaded to the wiki

Overview of School Effects (sample data) Double check for correct answers

1. Go to the website www.ncdpi.sas.com Copyright © 2010, SAS Institute Inc. All rights reserved.

1. Go to ncdpi.sas.com 2. BOOKMARK IT! 3. Secure & Convenient Online Login

Do you see this? Then Sit Tight!

Overview of School Effects It’s Your Turn! Find the blank table. Do this by yourself, using the sample data, and fill in your table. Participants will examine sample data located on the wiki site to complete this activity. Locate the blue bars on the graph for each of the 5 Achievement Groups. Also note the red whiskers. For any blue bars above the green line (where the whiskers are also completely above), place an up arrow (↑) in the appropriate cell of the Overview of School Effectiveness table. For any blue bars below the green line (where the whiskers are also completely below), place a down arrow (↓) in the table. For any blue bars at or near the green line (the whiskers cross the green line), place a horizontal dash (–) in the table.

Overview of School Effects What did you find? Interesting Patterns Insights Areas of Concern Areas of Celebration This is a sharing activity – Think-Pair-Share, TTP, Tea Party, etc.

1. Go to the website ncdpi.sas.com Log back in.

Finding Your Patterns Have participants find their school or a school in their district. Identify the patterns for your subjects and grade levels Patterns and explanations can be located on the wiki in the Reflective Assessments portion of the agenda.

Interpreting Your Results Complete the Interpreting Your Results downloadable take-home form on the wiki. Have participants go to the Reflective Assessments section of the wiki and download the Interpreting Your Results documents.

Student Pattern Report This report is a customized Diagnostic report in which you can examine progress for groups of students of your choice. It is available only at the school level and only to users with access to student reports.

Student Patterns Report Key points to remember: The report shows growth for the lowest, middle, and highest achieving students within the chosen group. The report can be used to explore the progress of students with similar educational opportunities. Like all diagnostic reports, this report is for diagnostic purposes only. A minimum of 15 students is needed to create a Student Pattern Report. It enables you to see how effective the school has been with the lowest, middle, and highest achieving students (at least 15 with predicted and observed scores).

Student Pattern Report Take a look at this data. What do you notice? What are your thoughts? Higher students did better than expected. Our ML did not do as well as predicted. The groups (low, middle, high) are placed in thirds based on their predicted scores and where they fall in the distribution. Teacher self-reflection: how effective were you in teaching, based on the students' predicted scores?

Student Pattern Report This teacher may have had 100% proficiency, but we are looking at growth; this teacher did not contribute to the students' learning. Students are listed by name within subgroups, so that you can identify which group each child falls into. Look at individual students to see if you note any patterns. You can also look at race and gender to see if some teachers are teaching better to a certain sub-population.

Key Questions We need to ask some key questions to find out why some students had better growth than others. We could even look at each subgroup individually and think about what contributes to negative and positive growth in the classroom.

Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? In this case we are comparing students in the same subgroup, the group that is considered the H group. These are some key questions we might want to ask. After asking them, we find that in this particular report the number of hours a student participated in a program made a positive difference in growth. Since we had looked at number of hours, we then ran another report of all 31 students to see if it had a large effect on student growth. We looked at all students who had over 40 hours of enrichment/remediation, etc.

Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? YES! Rerun the report with new criteria.

Student Pattern Report – Next Steps All 31 Students in the Program 16 Students who attended for 40+ hours The 16 students who had over 40 hours in the program showed far greater growth than their counterparts who did not. The 15 who didn't reach 40+ hours really negatively affected the overall growth. If you run a report and this is your result, think through the next step to figure out what the numbers actually mean. This shows the program did what you wanted.
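The comparison above amounts to splitting the enrolled students by dosage (hours of participation) and comparing mean growth for each split. A small sketch with invented numbers (the growth values, the 40-hour cutoff as a field, and the student records are all hypothetical sample data, not actual report figures):

```python
# Hypothetical sketch of the dosage comparison described above.
# All numbers are invented sample data, not real EVAAS output.

def mean_growth(students):
    """Average growth (observed minus predicted score) for a group."""
    return sum(s["growth"] for s in students) / len(students)

# Each record: hours in the program and growth relative to prediction.
students = [
    {"hours": 55, "growth": 4.0}, {"hours": 48, "growth": 3.5},
    {"hours": 42, "growth": 2.8}, {"hours": 20, "growth": -1.0},
    {"hours": 10, "growth": -0.5}, {"hours": 35, "growth": 0.2},
]

high_dose = [s for s in students if s["hours"] >= 40]
low_dose = [s for s in students if s["hours"] < 40]

# Splitting by dosage shows whether hours of participation, rather
# than mere enrollment, is associated with the growth difference.
print(round(mean_growth(high_dose), 2))
print(round(mean_growth(low_dose), 2))
```

This is why rerunning the report with the new criteria matters: averaging all 31 students together hides the split that the hours cutoff reveals.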

Less Informed Conclusion: We need to change the selection criteria for this program. More Informed Conclusion: We need to adjust the recommended hours for participants.