Proactive Assessments

Academic At-Risk Reports These reports may be used to determine local policy for providing targeted intervention and support to students who are at risk of not meeting future academic milestones. At-risk reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range; the range for writing is 0-80%. The reports are presented in three categories. AYP at Risk: students at risk of not meeting the academic indicators for AYP (EOG Math and Reading in grades 4-8; EOC Algebra I and English I). For EOG tests, students with at least three prior data points (test scores) will have projections in Math and Reading for the next grade; the prior scores are not content specific. Projections for Algebra I and English I may be made as early as 6th grade with sufficient data. Graduation at Risk: students at risk of not making a Level III on EOC subjects such as Algebra II, Chemistry, Geometry, Physical Science, and Physics. Students who have taken these tests but have not scored at least Level III will still have projections to these subjects. Under Reports, click Academic At-Risk Reports. These are reports that you will want to spend some time really poring over.

Academic At-Risk Reports 3 Categories AYP at Risk: students at risk of not meeting the academic indicators for AYP. Graduation at Risk: students at risk of not making a Level III on EOC subjects required for graduation. Other at Risk: students at risk of not making a Level III on other EOC subjects. All three appear in the same report.

Academic At-Risk Reports Be Proactive Use these reports to determine local policy for providing targeted intervention and support to students who are at risk of not meeting future academic milestones. Reports for EOG and EOC subjects include students with a 0-70% probability of scoring in the Level III range, or 0-80% for writing.
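The at-risk cutoffs above amount to a simple threshold rule. The sketch below is purely illustrative, not EVAAS functionality; the function name and student records are invented, and the only facts taken from the report description are the 0-70% cutoff (0-80% for writing).

```python
# Hypothetical sketch of the at-risk flagging rule described above.
# Field names and data are invented for illustration.

def is_at_risk(probability, subject):
    """True if a student's probability of reaching Level III falls in
    the at-risk range: 0-70% for most subjects, 0-80% for writing."""
    cutoff = 0.80 if subject == "writing" else 0.70
    return probability <= cutoff

students = [
    {"name": "A", "subject": "math", "probability": 0.65},
    {"name": "B", "subject": "writing", "probability": 0.75},
    {"name": "C", "subject": "math", "probability": 0.90},
]

flagged = [s["name"] for s in students
           if is_at_risk(s["probability"], s["subject"])]
# flagged == ["A", "B"]: A is under the 70% cutoff, B is under the
# 80% writing cutoff, and C is above both.
```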

Making Data-Driven Decisions Consider a student with a 2% probability of achieving a Level III on the Algebra I EOC. EVAAS can show growth, so teachers may want to take on this child to show some serious growth, and have programs in place that produce great growth for the students. EVERY kid matters: we are measuring growth, not proficiency. Talk about the "clickables" and ways to disaggregate this data: students are listed alphabetically with demographic and other information. You can sort the report by clicking on an underlined column heading. A key to the column headings appears below the report. To see a student report, click on the student's name. All students in the report have a 0-70% probability of scoring Level III in the subject you have chosen (0-80% for writing), assuming they have the average schooling experience in NC. These students will need support and intervention to provide them with a better-than-average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.

What Are Projections?

What Are Projections Anyway? Given a specific set of circumstances… …what’s the most likely outcome?

What Are Projections Anyway? Given this student’s testing history, across subjects… …what is the student likely to score on an upcoming test, assuming the student has the average schooling experience?

EVAAS Projections What are they based on? Expectations based on what we know about this student and about other students who have already taken this test: this student's prior test scores (EOC/EOG) across subjects, and those students' scores on the test we're projecting to.

What’s the Value of the Projections? Projections are NOT about predicting the future. They ARE about assessing students’ academic needs TODAY. Although projections indicate how a student will likely perform on a future test, their real value lies in how they can inform educators today. By incorporating the projections into their regular planning, teachers, administrators, and guidance counselors can make better decisions about how to meet each student’s academic needs now. Copyright © 2010, SAS Institute Inc. All rights reserved.

Assessing Students’ Needs What are this student’s chances for success? What goals should we have for this student this year? What goals should we have for this student in future years? What can I do to help this student get there? When assessing students’ academic needs, educators will want to keep these key questions in mind.

Using Projections to Take Action Identify students who need to participate in an academic intervention. Assess the level of risk for students who may not reach the Proficient mark. Plan schedules and resources to ensure that you can meet students’ needs. Identify high-achievers who will need additional challenges. Assess the opportunities for high-achieving students who are at risk of not reaching Advanced. Inform course placement decisions.

Making Data-Driven Decisions Have participants access their Academic At-Risk Report. Select a grade level and subject to view achievement probabilities. These students will need support and intervention to provide them with a better-than-average schooling experience if they are to be successful. Consider different strategies. Talk about the defaults.

Data Mining Data mining is sometimes referred to as data or knowledge discovery. Have participants access their Academic At-Risk Report. Select a grade level and subject to view achievement probabilities. Answer the following questions based on your data.

Reflection + Projection = TODAY

Student Projection Report Red dot: the student's testing history. Roll over a dot to see the school and district in which the student was tested. Yellow box: the student's Projected State Percentile, assuming average progress. Performance Level Indicators: the cut scores required to be successful at different performance levels, expressed in State Percentiles. See the key below the graph.

Student Projection Report Reading left to right: the student's projected State Percentile for the chosen test, then the probability of success at different performance levels.

Student Projection Report The table shows the student's testing history across grades, in State NCEs (EOG Math and Reading) or scale score points (all other tests). For EOC tests, the season in which the test was administered, Fall (F), Spring (Sp), or Summer (Su), is indicated. The year of the test refers to the school year to which the test is attributed. For example, EOC tests administered in the summer and fall of 2010 are labeled 2011 because they are attributed to the 2010-2011 school year. 3rd-grade pretests are considered to measure 2nd-grade achievement and are therefore attributed to the previous school year and labeled (2) for 2nd grade.
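The year-attribution rule described above can be sketched as a small helper. This is an illustrative function, not an EVAAS API: tests given in the summer or fall of calendar year Y belong to the Y-to-(Y+1) school year and are labeled Y+1, while a spring test already carries its school-year label.

```python
def school_year_label(calendar_year, season):
    """Return the school-year label for an EOC administration.
    Summer (Su) and Fall (F) tests start a new school year, so they
    are labeled with the following calendar year; a Spring (Sp) test
    closes out the school year that began the prior fall."""
    if season in ("Su", "F"):
        return calendar_year + 1
    return calendar_year  # Spring is already labeled with its school year

# An EOC taken in Fall 2010 belongs to 2010-2011, so it is labeled 2011.
```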

Thinking of the State Distribution by QUINTILES Identify each student's achievement quintile based on his/her Projected State Percentile.

Note the Student’s Projected QUINTILE Notice where each student falls in the state distribution. That is, identify each student’s achievement quintile based on his/her Projected State Percentile.
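The percentile-to-quintile mapping is simple arithmetic. A minimal sketch, assuming quintile 1 covers State Percentiles 1-20, quintile 2 covers 21-40, and so on (the slides do not state the exact cut points, so these boundaries are an assumption):

```python
def quintile(state_percentile):
    """Map a Projected State Percentile (1-99) to an achievement
    quintile: 1 = lowest fifth of the state, 5 = highest.
    Assumed boundaries: 1-20, 21-40, 41-60, 61-80, 81-99."""
    return min((state_percentile - 1) // 20 + 1, 5)

# quintile(15) == 1, quintile(35) == 2, quintile(85) == 5
```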

Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction Entering Achievement Use this report to identify past patterns or trends of progress among students expected to score at different achievement levels

Reflecting on Past Effectiveness to Plan for Differentiating Student Instruction (chart: Past Effectiveness by Entering Achievement quintile) How effective was your school with the lowest two quintiles?

Academic Preparedness Report

Academic Preparedness Report Activity: Use the Bridge to Differentiated Instruction Document This report shows the probability that students within a grade will score at or above Level III on future tests. The table shows the number and percentage of students in each of three probability groups, as well as the number and percentage of students who have already passed the test with a Level III or higher and those with insufficient data for a projection. Green: Students whose probability of proficiency on the chosen test is greater than or equal to 70% Yellow: Students whose probability of proficiency on the chosen test is between 40% and 70% Light Red: Students whose probability of proficiency on the chosen test is less than or equal to 40% Blue: Students who have already passed the test with a Level III or higher. White: Students who do not have a projection, due to lack of sufficient data.
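The color grouping above is a threshold rule over each student's proficiency probability. A hedged sketch (the function name is invented; the boundaries follow the slide's wording, with exactly 70% resolved to green and exactly 40% to light red, since the slide's ranges overlap at the endpoints):

```python
def preparedness_group(probability, already_passed=False):
    """Return the Academic Preparedness color group for a student.
    probability is the chance of scoring at or above Level III on the
    chosen test, or None when there is insufficient data to project."""
    if already_passed:
        return "blue"        # already passed with Level III or higher
    if probability is None:
        return "white"       # no projection: insufficient data
    if probability >= 0.70:
        return "green"
    if probability > 0.40:
        return "yellow"      # between 40% and 70%
    return "light red"       # 40% or less
```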

Custom Student Report Have participants visit the wiki to download step-by-step instructions.

Custom Student Report HANDOUT Post directions on the EVAAS Wiki

Questions?