
Custom Reports: SCGs and VCGs

Standard Comparison Group (SCG)

What is an SCG?
– Custom report
– Evaluates your students' observed growth against a normative group
  – Same grade and initial score
  – Same number of instructional weeks
– Allows you to compare growth patterns across different groups of students
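To make the comparison concrete, here is a minimal sketch of how an observed-vs-normative growth comparison could work. The norm table, column names, and banding scheme are illustrative assumptions, not NWEA's actual norms or file layout.

```python
# Minimal sketch of an SCG-style comparison (hypothetical data, not NWEA's norms).
# For each (grade, starting-score band, instructional-weeks band) cell, a norming
# study supplies a mean and standard deviation of growth; a student's observed
# growth is compared against the matched cell.
from dataclasses import dataclass

@dataclass
class NormCell:
    mean_growth: float  # average score growth for this cell in the norming study
    sd_growth: float    # spread of growth in this cell

# Hypothetical norm table keyed by (grade, score_band, weeks_band).
NORMS = {
    (4, "190-200", "32-36"): NormCell(mean_growth=8.0, sd_growth=4.5),
    (4, "200-210", "32-36"): NormCell(mean_growth=7.0, sd_growth=4.2),
}

def growth_index(grade: int, score_band: str, weeks_band: str,
                 observed_growth: float) -> float:
    """Observed growth minus normative growth, in standard-deviation units."""
    cell = NORMS[(grade, score_band, weeks_band)]
    return (observed_growth - cell.mean_growth) / cell.sd_growth

# A 4th grader starting in the 190-200 band who grew 10 points:
print(growth_index(4, "190-200", "32-36", observed_growth=10.0))  # ~0.44 SDs above
```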

Digging Deeper into Growth Data
Regular reports give you a high-level view of growth data. SCGs break the growth data down for deeper analysis. Pivot tables allow you to sort the data by:
– Gender
– Ethnicity
– School
– Subject
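As an illustration, a flat export of student growth records could be pivoted along these dimensions with pandas. The column names here are assumptions, not the actual SCG file layout.

```python
# Illustrative pivot over a flat export of growth records (column names assumed).
import pandas as pd

records = pd.DataFrame({
    "school":    ["Elem 1", "Elem 1", "Elem 2", "Elem 2"],
    "subject":   ["Math", "Reading", "Math", "Reading"],
    "ethnicity": ["Asian", "White", "Hispanic", "Asian"],
    "gender":    ["F", "M", "F", "M"],
    "growth_percentile": [63, 41, 38, 55],
})

# Median growth percentile broken down by school and subject.
pivot = records.pivot_table(values="growth_percentile",
                            index="school", columns="subject",
                            aggfunc="median")
print(pivot)
# The same call with index="ethnicity" or index="gender" produces the
# subgroup breakdowns described above.
```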

SCG Reports
Six pivot reports:
– Growth by Achievement
– Growth by School
– Growth by Subgroup (ethnicity)
– Growth Figure (bar graph)
– Testing Conditions
– Testing Conditions & Growth
Reports are best used in relation to a research question that needs to be answered with data.

Using Questions to Guide Data Collection
Dylan Wiliam calls it "question-driven data collection," not "data-driven decision-making."

Report Tabs

Questions to Drive the Data
Tab: Growth by Achievement
Question: How did our kids grow in various percentile ranges of achievement compared to other kids with similar scores and similar amounts of instructional time?
– Or: "Are our high-achieving students growing as much as our low-achieving students?"

Growth by Achievement

Filter Your Data
Filters help administrators dig into the data.

Questions to Drive the Data
Tab: Growth by Achievement
Question: How did our kids grow in various percentile ranges of achievement compared to other kids with similar scores and similar amounts of instructional time?
Answer: Most students in grades 1-8 grew below average compared to other students with similar amounts of instructional time. Students in grades K, 9, and 10 grew average or above average compared to students with similar amounts of instructional time.

Questions to Drive the Data
Tab: Growth by School
Question: How did students in my building grow compared to other students nationally, by grade, with similar amounts of instructional time?

Growth by School

Questions to Drive the Data
Tab: Growth by School
Question: How did students in my building grow compared to other students nationally, by grade, with similar amounts of instructional time?
Answer: In elementary building #1, all grades except grade 6 grew below average compared to other students with similar starting scores and instructional time. Grade 6 showed above-average growth.

Questions to Drive the Data
Tab: Growth by Ethnicity
Question: Which subgroups of students grew average or above average compared to other subgroups nationally with similar scores and similar amounts of instructional time?

Growth by Ethnicity
District data; filters always available.

Questions to Drive the Data
Tab: Growth by Ethnicity
Question: Which subgroups of students grew average or above average compared to other subgroups nationally with similar scores and similar amounts of instructional time?
Answer: Native Hawaiian and Pacific Islander students and Asian students showed average growth compared to other students, while all other groups showed below-average growth.

Questions to Drive the Data
Tab: Growth Figure
Question: What was the actual observed growth of our students compared to students nationally with similar scores and instructional time?

Growth Figure

Questions to Drive the Data
Tab: Growth Figure
Question: What was the actual observed growth of our students compared to students nationally with similar scores and instructional time?
Answer: Grades K, 4, 7, 9, and 10 showed more observed growth than the average normative growth of students with similar scores and instructional time. All other grades showed below-average growth.

Growth Graph
Grades K, 4, 7, 9, and 10 showed higher growth than the average normative growth.

Questions to Drive the Data
Tab: Testing Conditions
Shows the percentage of students within each school whose MAP assessments triggered one or more "flags." A flag indicates that the student may not have been fully engaged when taking the assessment:
– Low accuracy = significantly less than 50% correct (higher SEM)
– Short duration = very little time spent on the test
– Unusual time increase/decrease = large differences between fall and spring test duration
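A sketch of how such flags could be computed from fall/spring test records follows. The exact thresholds NWEA uses are not given in the slides, so the column names and cutoffs below are illustrative assumptions.

```python
# Illustrative flagging of fall-to-spring test pairs (one row per student);
# column names and cutoffs are assumptions, not NWEA's actual rules.
import pandas as pd

def flag_testing_conditions(tests: pd.DataFrame) -> pd.DataFrame:
    out = tests.copy()
    # Low accuracy: far below the ~50% correct an adaptive test aims for,
    # which also inflates the standard error of measurement (SEM).
    out["flag_low_accuracy"] = out["spring_pct_correct"] < 40
    # Short duration: unusually little time spent on the spring test.
    out["flag_short_duration"] = out["spring_minutes"] < 15
    # Unusual time change: spring duration differs from fall by more than half.
    ratio = out["spring_minutes"] / out["fall_minutes"]
    out["flag_time_change"] = (ratio < 0.5) | (ratio > 1.5)
    return out

# School-level percentages like those shown in the report:
# flagged = flag_testing_conditions(tests)
# print(flagged.groupby("school")[["flag_low_accuracy",
#                                  "flag_short_duration",
#                                  "flag_time_change"]].mean() * 100)
```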

Questions to Drive the Data
Tab: Testing Conditions
Question: Throughout the district, are there any flags related to testing conditions that might be cause for further investigation?

Testing Conditions

Questions to Drive the Data
Tab: Testing Conditions
Question: Throughout the district, are there any flags related to testing conditions that might be cause for further investigation?
Answer: Only 9 of 52 buildings showed no flags for testing conditions. Many buildings had flags in multiple categories, especially low accuracy and short duration. Schools with multiple flags may want to investigate the reasons behind them.

Questions to Drive the Data
Tab: Testing Conditions & Growth
Question: What impact did low accuracy and short test duration have on the growth rates of these students?

Testing Conditions and Growth

Pricing and Support
– 30¢ per student
– $1,000 minimum
– Includes one hour of live online support

Questions to Drive the Data
Tab: Testing Conditions & Growth
Question: What impact did low accuracy and short test duration have on the growth rates of these students?
Answer: The district-wide aggregate growth percentile is the 41st percentile. For schools with significant flags for low accuracy and short test duration, the aggregate growth percentiles were significantly lower. This suggests that large numbers of disengaged students may have depressed the district-wide growth percentile.

Virtual Comparison Group (VCG)

That's great, but…
"Since my students and my school aren't typical, how can you expect my students to make typical progress?"
– Need an apples-to-apples comparison
– A proof point to demonstrate what is possible

What is a VCG?
– Custom report
– Evaluates individual students' observed growth against a virtual comparison group
  – Same grade and initial score
  – Same number of instructional weeks
  – Similar free/reduced-price lunch (F/R) eligibility
  – Same urban/rural classification

What is a VCG? (continued)
– One step further: we apply the filters to create a group that matches each of your students
– Virtual Comparison Group = virtual peer group

How We Create a VCG
1. Identify the students included in your report.
2. Identify all matching students from the Growth Research Database (GRD), matching on:
   – School income and urban vs. rural classification
   – Grade, subject, starting achievement, and instructional weeks
3. Randomly select the comparison group.
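A minimal sketch of the matching step, under stated assumptions: hypothetical column names, exact matching on grade and subject, simple banding of starting scores and instructional weeks, and an assumed comparison-group size. It illustrates the shape of the procedure, not NWEA's actual implementation.

```python
# Illustrative VCG matching (hypothetical columns; not NWEA's actual procedure).
import pandas as pd

VCG_SIZE = 51  # assumed comparison-group size

def build_vcg(student: pd.Series, grd: pd.DataFrame,
              rng_seed: int = 0) -> pd.DataFrame:
    """Randomly sample a virtual peer group for one student from a
    research-database extract `grd`."""
    candidates = grd[
        (grd["grade"] == student["grade"])
        & (grd["subject"] == student["subject"])
        # Starting achievement within a small band (assumed: +/- 2 points).
        & ((grd["start_score"] - student["start_score"]).abs() <= 2)
        # Instructional weeks within an assumed +/- 1 week band.
        & ((grd["weeks"] - student["weeks"]).abs() <= 1)
        # School-level matches: income proxy and locale classification.
        & (grd["fr_lunch_band"] == student["fr_lunch_band"])
        & (grd["urban_rural"] == student["urban_rural"])
    ]
    return candidates.sample(n=min(VCG_SIZE, len(candidates)),
                             random_state=rng_seed)

# The student's result is then their observed growth compared with the
# distribution of growth in the sampled peer group.
```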

What Do You Get?
– Class reports (include questions to guide interpretation)
– School reports
– District reports with pivot tables

Class Report

Questions to guide interpretation

School Report: 6th Grade, 7th Grade, 8th Grade

District Report

VCG & Hybrid Growth Pivot Tables
– Growth by Achievement
– Growth by School
– Growth by Ethnicity
– Testing Conditions and Growth
– Hybrid Success Metric

Hybrid Success Metric
Hybrid success goal:
– For students above the state proficiency target: the average growth from the norming study
– For students below the state proficiency target: the amount of growth needed to be proficient by 10th grade
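A sketch of the two-branch goal, assuming a simple straight-line path to a grade-10 proficiency cut for below-target students; the slides do not specify the actual trajectory calculation, so the function and its parameters are illustrative.

```python
# Illustrative hybrid success goal (straight-line catch-up is an assumption).

def hybrid_growth_goal(current_score: float, grade: int,
                       proficiency_cut: float, norm_growth: float,
                       grade10_cut: float) -> float:
    """Return this year's growth goal for one student."""
    if current_score >= proficiency_cut:
        # Above the state target: expect the norming-study average growth.
        return norm_growth
    # Below the target: spread the gap to the grade-10 cut evenly over
    # the years remaining (a simple linear assumption).
    years_left = max(10 - grade, 1)
    return (grade10_cut - current_score) / years_left

# A 6th grader at 205 against a 215 proficiency cut and a 235 grade-10 cut
# needs (235 - 205) / 4 = 7.5 points of growth this year under this sketch.
print(hybrid_growth_goal(205, 6, 215, norm_growth=5.0, grade10_cut=235))
```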

Pricing and Support
– 90¢ per student
– $1,500 minimum
– Includes two hours of live online support