August 13, 2012 8:00 a.m. – Noon by Doug Greer & Laurie Smith.



1. RELATIONSHIPS 2. RELEVANCE 3. RIGOR 4. RESULTS

Please take a moment during review sessions and at designated times to briefly describe your answers to the following questions:
 What is working to help struggling math students?
 What is working to help readers who struggle?
 What is working to help writers who struggle?
 What is working to help struggling science students?
 What is working to help students who struggle with social studies (content and skills)?

 What are others doing to help struggling learners? Take minutes …
 What does the new accountability data mean for our schools and our students?
 How do we use the tools provided by MDE to improve teaching and learning?
 What other data should we consider when closing the achievement gaps?

 Tuesday, July 31: “Embargoed” notice to district superintendents of Priority and Focus schools
 Thursday, August 2: Public release likely of the following:
◦ Ed YES! Report Card (old letter grade)
◦ AYP Status (old pass-or-fail system)
◦ Top-to-Bottom Ranking, and possibly:
 Reward schools (Top 5%, top improvement, BtO)
 Focus schools (largest achievement gap, top vs. bottom)
 Priority schools (Bottom 5%)
Doug Greer x4109

Principle 2 of 4 – Accountability & Support
1. Top-to-Bottom Ranking given to all schools with 30 or more students tested, full academic year (0–99th percentile, where 50th is average)
2. NEW designation for some schools:
 Reward schools (Top 5%, Significant Improvement, or Beating the Odds)
 Focus schools (10% of schools with the largest achievement gap between the top and bottom)
 Priority schools (Bottom 5%; replaces the PLA list)
3. NEW in 2013: AYP Scorecard based on a point system, replacing the “all or nothing” of NCLB.

Understanding the TWO Labels

Priority/Focus/Reward (Top-to-Bottom List) | AYP Scorecard (Need > 50%), Green-Yellow-Red
Normative: ranks schools against each other | Criterion-referenced: are schools achieving a certain PROFICIENCY level?
Focuses attention on a smaller subset of schools; targets resources | Given to all schools; acts as an “early warning” system; easy indicators
The primary mechanism for sanctions and supports | Used primarily to identify areas of intervention and differentiate supports
Fewer schools | All schools

 Z-scores are centered around zero, the “state average”
 Positive is ABOVE the state average
 Negative is BELOW the state average
[Figure: normal curve with z-score = 0 at the state average; approximate percentile equivalents shown: z = -2 → 2nd, z = -1 → 16th, z = +0.5 → 69th, z = +1 → 84th, z = +2 → 98th]
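The standardization described above can be sketched in a few lines of Python. This is an illustration only, not MDE's actual calculation; the school and state figures are hypothetical, and the percentile conversion just uses the standard normal distribution that the slide's curve depicts:

```python
# Sketch: standardizing a school's average scale score into a z-score and
# converting it to a percentile on the statewide distribution.
# All numeric inputs are hypothetical, for illustration only.
from statistics import NormalDist

def z_score(value, state_mean, state_sd):
    """Positive = above the state average; negative = below."""
    return (value - state_mean) / state_sd

school_avg, state_mean, state_sd = 634.0, 628.0, 12.0   # hypothetical
z = z_score(school_avg, state_mean, state_sd)           # 0.5
percentile = NormalDist().cdf(z) * 100                  # ~69th percentile
print(round(z, 2), round(percentile))
```

The output matches the curve on the slide: a z-score of +0.5 lands at roughly the 69th percentile, +1 at the 84th, +2 at the 98th.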


 In terms of achievement gaps, how well do you think your school (or the schools in your district) compares to all schools in the state?
 Specifically, which content areas do you feel will have the smallest gaps versus the largest gaps relative to the state average?

Top-to-Bottom Individual School Lookup … download the MS Excel file from the Accountability page of the OAISD.

 Some schools may be exempt from the Focus school designation in year 2 IF they are deemed Good-Getting-Great (G-G-G):
◦ Overall achievement is above the 75th percentile
◦ The Bottom 30% meets Safe Harbor improvement (or possibly AYP differentiated improvement)
 G-G-G schools will be exempt for 2 years, then will need to reconvene a similar deep diagnostic study in year 4.
Note: See the approved ESEA Waiver, pp.

 Unlike the Priority label, the Focus label may last only one year. (The Title I set-aside lasts 4 years.)
NOTE: The AYP Scorecard, Top-to-Bottom Ranking, and Reward/Focus/Priority designations for August 2013 are determined by the Fall 2012 MEAP and the Spring 2013 MME.

Requirements for all Focus schools:
◦ Notification of Focus status by August 21, 2012 via the Annual Ed Report
◦ Quarterly reports to the district board of education
◦ Deep diagnosis of data prior to SIP revision (if Title I, by Oct 1)
◦ Professional Dialogue; toolkit available to all (if Title I, requires a DIF, in the Oct – Jan time range)
◦ Revision of the School Improvement Plan with activities focused on the Bottom 30% (if Title I, additional revisions to the Cons App; both by Jan 30)
◦ NOTE: Additional requirements for Title I schools regarding set-asides and specific uses of Title I funds.

 Supports available:
◦ OAISD work session on August 13
◦ OAISD follow-up session, TBD
◦ OAISD work session on “Data Utilization driving Instruction and School Improvement,” October 25
◦ “Defining the Problem (Data → Planning)” work session at OAISD on January 22, 2013
◦ “SIP Planning Session” at OAISD on March 22, 2013
◦ Individualized support by OAISD per request
◦ MDE Toolkit available in September 2012
◦ September: MDE assigns a DIF for Title I schools only
◦ MDE Regional meeting on September 11 in GR


Top-to-Bottom Ranking: 95th percentile
2 points possible:
2 = Achievement above the linear trajectory toward 85% by 2022 (10 years from the 2011/12 baseline)
1 = Achievement target NOT met; met Safe Harbor
0 = Achievement target NOT met; Safe Harbor NOT met

Top-to-Bottom Ranking: 75th percentile
STATUS: Lime Green

Top-to-Bottom Ranking: 50th percentile
STATUS: Orange


[Figure: normal “bell-shaped” curve, with the Average at center, Above Avg. and the Top 10% target to the right, and Below Avg. and the Bottom 10% target to the left]

◦ % at Level 1 (or Levels 1 & 2) at or above a set %
◦ % at Level 4 (or Levels 3 & 4) at or below a set %
◦ Average scale score or average % correct

SMART Measurable Objective: All students will increase skills in the area of math on MEAP and local assessments:
◦ The average scale score for all students in math on the MEAP will increase from 622 (10/11) to 628 by the 2013/14 school year (2 points per year)
◦ The percentage of all students reaching Level 1 on the math portion of the MEAP will increase from 28% to 40% by the 2013/14 school year (4% per year)
◦ The percentage of all students at Level 4 on the math portion of the MEAP will decrease from 18% (10/11) to 6% by the 2013/14 school year (4% per year)
◦ The average proficiency across the grade levels on the Winter Benchmark in Delta Math will increase from 74% to 85% by January
◦ The number of students identified as “At Risk” on the Delta Math Fall screener will be reduced from 58 to 40 by the Fall of
Goal: All students will be proficient in math.
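The per-year increments quoted in these objectives are simply the baseline-to-goal gap spread evenly across the years (622 to 628 over three school years is 2 points per year, 28% to 40% is 4% per year). A quick arithmetic check:

```python
# Checking the per-year increments behind the SMART objectives above.
def annual_increment(baseline, goal, years):
    """Evenly spread the baseline-to-goal change across the given years."""
    return (goal - baseline) / years

# From the slide: 10/11 baseline to the 2013/14 target is three school years.
assert annual_increment(622, 628, 3) == 2.0    # scale score: 2 points per year
assert annual_increment(28, 40, 3) == 4.0      # % at Level 1: 4% per year
assert annual_increment(18, 6, 3) == -4.0      # % at Level 4: down 4% per year
```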

SMART Measurable Objective: All students will increase skills in the area of math on MEAP and local assessments:
◦ The average percentage correct for all students in math on the MEAP will increase from 52% (10/11) to 61% by the 2013/14 school year (3% per year)
◦ The percentage of all students reaching 80% accuracy on the math portion of the MEAP will increase from 28% to 40% by the 2013/14 school year (4% per year)
◦ The percentage of all students reaching 40% accuracy on the math portion of the MEAP will increase from 82% (10/11) to 94% by the 2013/14 school year (4% per year)
Percent Correct example from 2010/11; New Cut Score Proficiency and Scale Score on the previous slide are from 2011/12
Goal: All students will be proficient in math.

Take a break then discuss (or vice versa) the following two questions:  Why should MDE use Full Academic Year (FAY) students (those who have 3 counts in the year tested) to hold schools accountable?  Why should local school districts NOT use FAY student data to set goals to improve instruction?

TOP TO BOTTOM RANKING
Ranks all schools in the state with at least 30 full-academic-year students in at least two tested content areas (Reading, Writing, Math, Science, and Social Studies weighted equally, plus graduation).
Each content area is “normed” in three categories:
◦ 2 years of Achievement (50–67%)
◦ 3–4 years of Improvement (0–25%)
◦ Achievement gap between top and bottom (25–33%)
Graduation rate (10%, if applicable):
◦ 2-year rate (67%)
◦ 4-year slope of improvement (33%)

HOW IS THE TOP TO BOTTOM RANKING CALCULATED
[Diagram: the Two-Year Average Standardized Student Scale (Z) Score feeds the School Achievement Z-Score (weight 1/2); the Four-Year Achievement Trend Slope feeds the School Performance Achievement Trend Z-Score (weight 1/4); the Two-Year Average Bottom 30% – Top 30% Z-Score Gap feeds the School Achievement Gap Z-Score; together these form the School Content Area Index, standardized into a Content Index Z-score]
 For science, social studies, writing, and grade 11 all tested subjects …
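The weighting diagrammed above can be sketched as a single formula. The 1/2 and 1/4 weights come from the slide; the 1/4 weight on the gap component is inferred from the category ranges on the previous slide, and all component values here are hypothetical:

```python
# Sketch (not MDE's actual code) of combining the three component z-scores
# into a content-area index: achievement weighted 1/2, trend and gap 1/4 each.
# The gap weight of 1/4 is inferred; the input values are hypothetical.
def content_area_index(achievement_z, trend_z, gap_z):
    return 0.5 * achievement_z + 0.25 * trend_z + 0.25 * gap_z

idx = content_area_index(achievement_z=0.8, trend_z=0.2, gap_z=-0.4)
print(round(idx, 2))
```

In the real ranking the resulting index is standardized once more into a Content Index Z-score before schools are compared, and the weights shift for subjects (science, social studies, writing) where a four-year trend is unavailable.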


Top-to-Bottom Individual School Lookup … download the MS Excel file from the OAISD Accountability page.

[Worksheet legend: z-score bands of -.4 to .4, .5 to …, and … to -.5]
When finished with the worksheet, please add to the Google “Chalk Talk” about what works.


Suppose there are 20 students (most of whom are shown) and the average z-score of all 20 is 0.28; this represents the Achievement Score before it is standardized again into a z-score.

The top 30% of students (n=6) has an average score of 1.62; the middle 40% (n=8) has an average score of …; the bottom 30% (n=6) has an average score of …. Gap = … – 1.62, then standardized.
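The bottom-30%-minus-top-30% gap described on this slide can be sketched as follows. The 20 z-scores are made up (the slide's own student values were lost in transcription, apart from the top-30% average of 1.62, which these invented scores happen to reproduce):

```python
# Sketch of the achievement-gap computation: sort the 20 students' z-scores,
# average the top 6 (top 30%) and bottom 6 (bottom 30%), then subtract.
# The z-scores below are hypothetical.
def achievement_gap(z_scores):
    ranked = sorted(z_scores, reverse=True)
    n30 = round(len(ranked) * 0.30)          # 6 of 20 students
    top_avg = sum(ranked[:n30]) / n30
    bottom_avg = sum(ranked[-n30:]) / n30
    return bottom_avg - top_avg              # negative: bottom trails top

gap = achievement_gap([1.9, 1.7, 1.6, 1.6, 1.5, 1.4, 0.9, 0.5, 0.3, 0.1,
                       -0.1, -0.2, -0.4, -0.6, -1.0, -1.2, -1.3, -1.5, -1.6, -1.8])
print(round(gap, 2))
```

In the ranking itself this raw gap would then be standardized against all schools' gaps, as the slide notes.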

Performance Level Change (“growth”) matrix: rows are the Year X, Grade Y MEAP performance level (Not Proficient: Low/Mid/High; Partially Proficient: Low/High; Proficient: Low/Mid/High; Advanced: Mid); columns are the Year X+1, Grade Y+1 performance level on the same scale. Each cell assigns a change category (SI = Significant Improvement, I = Improvement, M = Maintained, D = Decline, SD = Significant Decline): staying at roughly the same level falls on the diagonal (M), cells above the diagonal grade out as I or SI, and cells below as D or SD, with the “significant” labels reserved for the larger moves.
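The logic of the matrix above can be approximated in code. Note this is a simplified, hypothetical step-count rule for illustration; the actual MEAP table assigns categories cell by cell and does not reduce to a pure step count:

```python
# Simplified sketch of performance-level-change classification: order the
# sub-levels, count how many steps a student moved year over year, and map
# the move to a growth category. Thresholds here are illustrative only.
STEPS = ["NP-Low", "NP-Mid", "NP-High", "PP-Low", "PP-High",
         "P-Low", "P-Mid", "P-High", "Adv"]

def growth_label(year1, year2):
    moved = STEPS.index(year2) - STEPS.index(year1)
    if moved >= 3:  return "SI"   # Significant Improvement
    if moved >= 1:  return "I"    # Improvement
    if moved == 0:  return "M"    # Maintained
    if moved <= -3: return "SD"   # Significant Decline
    return "D"                    # Decline

print(growth_label("NP-Low", "PP-Low"))   # a three-step gain
print(growth_label("P-Mid", "P-Low"))     # a one-step drop
```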

 GLOBAL data
◦ District level → School level → Grade level
◦ Best used to study trends of student performance over time (3–5 years) and across different subgroups.
◦ Informs school-wide focus; must drill deeper
 STUDENT-level data
◦ Use only when timely reports (less than 2 weeks) are available at a more specific diagnostic level.
 DIAGNOSTIC levels
◦ Cluster (formerly Strands in HS/GLCEs)
◦ Standards (formerly Expectations in HS/GLCEs)
◦ Learning Targets

 Have you seen this new IRIS report?
◦ What are your predictions around what the historic cut scores will look like?
◦ Do you have assumptions about strengths and weaknesses at certain grade levels and content areas?

 You may have noticed many of the green lines are stagnant. Did you notice any bright spots with a steady increase and separation from state & county average?

 Surfacing experiences and expectations
 Making predictions, recognizing assumptions and possible learning
 Analyzing the data in terms of observable facts
 Searching for patterns and “ah-ha”s; be specific (avoid judging and inferring)

 Within the Google Doc Collection:
◦ Dialogue in small groups and record what is observable in the district data at ALL grade levels.
◦ Do NOT judge, conjecture, explain, or infer.
◦ Make statements about quantities (e.g., 3rd grade math fluctuated between 57–72%; however, the past three years have been stagnant around 64%)
◦ Look for connections across grade levels (e.g., a sharp increase was seen in 5th grade math in 2009 (53% → 80%), then the same group of students increased in 7th grade math in 2011 (54% → 76%))

School Year | % Adv + Prof | % Adv | % Prof | % Partial | % Not Prof | Number Assessed | Mean Scale Score
… | … | 22.00% | 42.50% | 19.70% | 15.80% | … | …
… | … | 20.80% | 39.60% | 24.80% | 14.90% | … | …
… | … | 19.80% | 40.00% | 23.10% | 17.10% | … | …
… | … | 12.50% | 46.80% | 24.70% | 16.10% | … | …
… | … | 9.00% | 37.00% | 35.00% | 18.00% | … | …
OAISD

REPEAT Activate & Explore until the data is drilled down to the diagnostic level.

Diagnostic …NOT Timely

 Dig DEEPER than just proficiency by looking at trends at both the strand and GLCE level.  Triangulate, i.e.

 What are some of the advantages of the ACT Explore Item Analysis and released items?

Once you have dug deeper and looked at multiple types of data, then ask:
 What conclusions can be drawn?
 Is our current focus addressing the issues?
 What theories do you have, supported by data, about why deficiencies exist?

 Develop an action plan:
◦ WHO should explore this data? WHO are the experts able to make instructional changes? WHO needs to be empowered?
◦ WHEN will time be given to dialogue about data that will impact instruction and ultimately make a difference for students?
◦ WHAT data have you filtered that will be useful in a data dialogue? WHAT four steps will you use to facilitate a data dialogue?
◦ To truly have a balanced assessment system, WHAT data is missing or underutilized?

“There exists a vast sea of information … As leaders, you must filter this information and select small, critical components for the practitioners to draw solid conclusions that will result in improved teaching and learning.” Doug Greer