Groton Data Day Accountability, Performance, and Balanced Assessments Facilitated by: Neal Capone District Data Coordinator CNYRIC

Agenda
– Grounding: Spring Synectic
– Data Literacy: Accountability and Assessment, 3-8 ELA/Math
– Collaborative Learning Cycle: Score Trend Comparison; Cohort Trend and Subgroup Performance
– Balanced Assessment: Rick Stiggins; Self-Evaluation

The Region Serviced by the CNYRIC

Data Flow: SIS (Student Management System); PD Data System; IEP Direct; NutriKids/Transfinder; Level 2 Repository (SED); Data Warehouse (Level 1) / Level 1 Container; COGNOS; DataMentor; nySTART

Synectic What are some popular Spring Activities?

SYNECTIC Data Analysis is like … because... Syn (bring together) Ectic (diverse elements)

Grounding Exercise Name Position Share your Synectic

“Using data effectively does not mean getting good at crunching numbers. It means getting good at working together to gain insights from student-assessment results and to use the insights to improve instruction.” - Kathryn Boudett, Elizabeth City, & Richard Murnane, “When 19 Heads Are Better Than One,” Education Week, December 7, 2005.

Word Splash Work with a partner to define as many terms as you can on the Word Splash

Data Warehouse SIRS NYSSIS Continuous Enrollment Performance Index AYP AMO Effective AMO NYSAA Participation Rate Accountability Subgroups Safe Harbor BEDS NYSESLAT Accountability Cohort AVR Graduation Cohort COGNOS Differentiated Accountability Triangulating Data Word Splash Sampling Principle Summative Assessment Formative Assessment Scale Score --- Raw Score Standards-Referenced Test

Calculation of the Performance Index (PI)
A Performance Index (PI) is a value from 0 to 200 that is assigned to an accountability group, indicating how that group performed on a required State test (or approved alternative) in English language arts, mathematics, or science. PIs are determined using the following equations:
Elementary/Middle Levels: PI = [(number of continuously enrolled tested students scoring at Levels 2, 3, and 4 + the number scoring at Levels 3 and 4) ÷ number of continuously enrolled tested students] × 100
Secondary Level: PI = [(number of cohort members scoring at Levels 2, 3, and 4 + the number scoring at Levels 3 and 4) ÷ number of cohort members] × 100

Example: Level 1: 5 students; Level 2: 15 students; Level 3: 45 students; Level 4: 10 students (75 continuously enrolled tested students)
PI = [(15 + 45 + 10) + (45 + 10)] ÷ 75 × 100 = (70 + 55) ÷ 75 × 100 = 167
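To make the arithmetic above concrete, here is a minimal sketch of the PI calculation (the function name and dictionary input are illustrative, not part of the original slides):

```python
def performance_index(level_counts):
    """Performance Index (0-200):
    PI = [(students at Levels 2-4) + (students at Levels 3-4)] / total tested x 100
    """
    total = sum(level_counts.values())
    levels_2_4 = level_counts[2] + level_counts[3] + level_counts[4]
    levels_3_4 = level_counts[3] + level_counts[4]
    return round((levels_2_4 + levels_3_4) / total * 100)

# The worked example from the slide: 5/15/45/10 students at Levels 1-4 gives PI = 167
print(performance_index({1: 5, 2: 15, 3: 45, 4: 10}))  # 167
```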

Elementary/Middle Level Accountability
Subgroup Accountability Performance
– Cohort used: all grade 3-8 students (or designated ungraded students) reported in the repository as continuously enrolled (one-year continuous enrollment = enrolled BEDS day through assessment dates)
– Standard/AMO: English: PI of 167; Math: PI of 152; Science: PI of ___
– Minimum group size: ___ or more students for ELA or Math
Participation Rate
– Cohort used: all grade 3-8 students (or designated ungraded students) reported in the repository as enrolled during assessment administration and make-up dates
– Standard: ELA and Math: 95%; Science: 80% for “all students”
– Minimum group size: 40 or more students for ELA or Math
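A minimal sketch of the participation-rate check in the table above, assuming a simple tested-to-enrolled ratio; the function name, the handling of groups below the minimum size, and the example numbers are illustrative:

```python
def meets_participation(tested, enrolled, standard=0.95, min_group_size=40):
    """Participation-rate check: 95% for ELA and math, 80% for science (per the table).
    Groups smaller than the minimum size are simply not evaluated in this sketch."""
    if enrolled < min_group_size:
        return True  # simplification: small groups are not held to the standard
    return tested / enrolled >= standard

print(meets_participation(tested=96, enrolled=100))                # True: 96% >= 95%
print(meets_participation(tested=93, enrolled=100))                # False: 93% < 95%
print(meets_participation(tested=33, enrolled=40, standard=0.80))  # True: 82.5% >= 80% (science)
```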

High School Accountability
Subgroup Accountability: English and Math Performance
– Cohort used: 2007 Accountability Cohort (one-year continuous enrollment in fourth year of HS = enrolled BEDS day through June 30, 2011)
– Standard/AMO: English: PI of 183; Math: PI of ___
– Minimum group size: ___ or more students for ELA or Math
English and Math Participation
– Cohort used: all students reported in the State Repository as enrolled in grade 12 on June 30, 2011 and students who graduated between July 1, 2010 and June 30, 2011
– Standard: 95%
– Minimum group size: 40 or more students
Graduation Rate
– Cohort used: 2006 Graduation-Rate Cohort (five months’ enrollment), including transfers to GED
– Standard: 80% for “all students”

Effective AMOs
An Effective AMO is the lowest PI that an accountability group of a given size can achieve in a subject for the group’s PI not to be considered significantly different from the AMO for that subject. If an accountability group’s PI equals or exceeds the Effective AMO, the group is considered to have made AYP. Further information about confidence intervals and Effective AMOs is available at:

2010–11 Safe Harbor Calculation for ELA and Math
Safe Harbor is an alternate means to demonstrate AYP for accountability groups whose PI is less than their Effective AMO. The Safe Harbor Target calculation for 2010–11 ELA and math, using the 2009–10 PI, is:
Safe Harbor Target = 2009–10 PI + [(200 – 2009–10 PI) × 0.10]
For a group to make safe harbor in English or math, it must meet its Safe Harbor Target and also meet the science (at the elementary/middle level) or graduation-rate (at the secondary level) qualification for safe harbor. To qualify at the elementary/middle level, the group must make the State Standard or its Progress Target in science in grades 4 and/or 8. At the secondary level, it must make the State Standard or its Progress Target for graduation rate.
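A minimal sketch combining the Effective AMO check and the Safe Harbor Target calculation above; the function names and all numeric inputs are illustrative, actual Effective AMOs come from SED tables, and the science/graduation-rate qualification is reduced to a single boolean:

```python
def safe_harbor_target(prior_year_pi):
    # Safe Harbor Target = prior-year PI + 10% of the gap between that PI and 200
    return prior_year_pi + (200 - prior_year_pi) * 0.10

def made_ayp(current_pi, effective_amo, prior_year_pi, met_qualification):
    """A group makes AYP if its PI meets or exceeds its Effective AMO, or, failing that,
    if it meets its Safe Harbor Target and the science (elementary/middle) or
    graduation-rate (secondary) qualification for safe harbor."""
    if current_pi >= effective_amo:
        return True
    return current_pi >= safe_harbor_target(prior_year_pi) and met_qualification

# Illustrative numbers: a PI of 140 misses an Effective AMO of 150 but exceeds the
# Safe Harbor Target of 137 (prior-year PI of 130), and the qualification was met.
print(made_ayp(current_pi=140, effective_amo=150, prior_year_pi=130, met_qualification=True))  # True
```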

Differentiated Accountability Model
Schools enter the model after failing to make AYP for 2 years; the intensity of intervention increases across the three categories.
Improvement (phases: Basic, Focused, Comprehensive)
– Diagnostic: School Quality Review
– Plan/Intervention: create and implement an Improvement Plan
– Oversight & Support: SED empowers districts, giving them the support and assistance necessary to take primary responsibility for developing and implementing improvement strategies
Corrective Action (phases: Focused, Comprehensive)
– Diagnostic: Curriculum Audit
– Plan/Intervention: Corrective Action Plan and implementation of the Curriculum Audit
– Oversight & Support: SED provides technical assistance to districts, sustaining greater latitude and more responsibility for addressing schools
Restructuring (phases: Focused, Comprehensive, SURR)
– Diagnostic: assignment of a Joint Intervention Team and Distinguished Educator
– Plan/Intervention: external personnel to revise and assist the school in implementing the most rigorous plan or, as necessary, phase-out/closure
– Oversight & Support: SED and its agents work in direct partnership with the district

Data flow into the Data Warehouse (Level 1) / Level 1 Container:
– Sources: Student Management System (SIS) and Special Education Package (IEP Direct)
– Data elements: Active Enrollment; Program Services (LEP, CTE, Summer School, Poverty, Free and Reduced); Demographics; Program Start and End Dates; Process Log; Disability Code; NYSAA Eligibility; 504; Safety Net Eligibility; AIS 209 Code (RTIm); Title I 0286 Code (RTIm)
– Refresh timing: Demographics via nightly Centris sync; Disability Code via monthly DW refresh; other elements at Data Warehouse refresh

Data Warehouse SIRS NYSSIS Continuous Enrollment Performance Index AYP AMO Effective AMO NYSAA Participation Rate Accountability Subgroups Safe Harbor BEDS NYSESLAT Accountability Cohort AVR Graduation Cohort COGNOS Differentiated Accountability Triangulating Data Word Splash Sampling Principle Summative Assessment Formative Assessment Scale Score --- Raw Score Standards-Referenced Test

District Report Card

Data-Driven Dialogue: The Collaborative Learning Cycle
Managing, Modeling, Mediating, Monitoring

"He uses statistics as a drunken man uses lamp-posts... Andrew Lang ( ) In reference to an individual who misuses data: …for support rather than illumination."

Data-Driven Dialogue: The Collaborative Learning Cycle (Managing, Modeling, Mediating, Monitoring)
Activating and Engaging: What are some predictions we are making? With what assumptions are we entering? What are some questions we are asking? What are some possibilities for learning that this experience presents to us?

What is a prediction you made? What might be some assumptions that influenced your prediction?

Data-Driven Dialogue: The Collaborative Learning Cycle (Managing, Modeling, Mediating, Monitoring)
Exploring and Discovering: What important points seem to “pop out”? What are some patterns, categories, or trends that are emerging? What seems to be surprising or unexpected? What are some things we have not yet explored?

Principles of Data-Driven Dialogue: Importance of Predictions; Conscious Curiosity; Purposeful Uncertainty; Visually Vibrant Information; Third Point

Data-Driven Dialogue: The Collaborative Learning Cycle (Managing, Modeling, Mediating, Monitoring)
Exploring and Discovering: What important points seem to “pop out”? What are some patterns, categories, or trends that are emerging? What seems to be surprising or unexpected? What are some things we have not yet explored?

Data-Driven Dialogue: The Collaborative Learning Cycle (Managing, Modeling, Mediating, Monitoring)
Organizing and Integrating: What inferences, explanations, or conclusions might we draw? What additional data sources might we explore to verify our explanations? What are some solutions we might explore...? What data will we need to collect to guide implementation?

“My team has created a very innovative solution, but we’re still looking for a problem to go with it.”

Causal Arenas: Curriculum; Instructional methods and materials; Teacher knowledge and skills; Student readiness; Infrastructure

Theories of Causation. Observation: Record three possible theories of causation regarding your observation. Circle one theory. In the space provided, record at least three sources of data you could use to confirm this theory.

Data-Driven Dialogue: The Collaborative Learning Cycle (Managing, Modeling, Mediating, Monitoring)
Organizing and Integrating: What inferences, explanations, or conclusions might we draw? What additional data sources might we explore to verify our explanations? What are some solutions we might explore...? What data will we need to collect to guide implementation?

Time to Share Share ONE observation Share ONE theory of causation Share additional data sources that you would want to explore to confirm or disprove your theory

Balanced Assessment