New Mexico Principal Support Network: Helping Leaders Use Accountability Data Effectively

Presented By:
Beata I. Thorstensen, NM Office of Education Accountability
Jan Sheinker, Sheinker Education Services
Judy Englehart, Aztec Municipal Schools
Tania Prokop, Aztec Municipal Schools
Presented At: CCSSO's Education Leaders Conference, September, Hilton St. Louis at the Ballpark

NM Principal Support Network
- Began in 2005 with 40 principals and assistant principals in 8 districts.
- Expanded in 2006 to superintendents, principals, assistant principals, and district staff from 28 districts.
- Expanding again in 2007 to 43 districts.
- Purpose: to provide comprehensive assessment data, data-analysis tools, and professional development on data-based decision making.
- Funded through the generous support of The Wallace Foundation.

What the PSN Does
- Provides a network of training and support for assistant principals, principals, superintendents, and other school district leaders.
- Focuses on the analysis, interpretation, and use of high-stakes accountability data.
- Goal: to help educational leaders use data for school improvement, communication, and advocacy.

PSN Curriculum: Data Analysis and Use
- Analyzing high-stakes assessment data to pinpoint areas for school improvement.
- Communicating data to school boards, superintendents, teachers, students, and the community.
- Using evidence in conjunction with best practices for data-based decision making.

PSN Curriculum: Data-Based Decision Making
- Providing best-practice information on the facets of data-based decision making.
- Informing comprehensive school improvement plans utilizing New Mexico's Educational Plan for Student Success (EPSS).

PSN Curriculum: Connecting with Peers
Members work with peers both within their district and outside of their district to:
- Analyze student data
- Build comprehensive school improvement plans
- Share promising practices for interventions

Outcomes of PSN: Data Use
- Used to communicate data with district staff, teachers, and the community.
- Used to facilitate visits with school improvement teams from the Public Education Department.
- Used to inform the development of the statewide data warehouse reporting system.
- Used to inform decisions about instructional and curriculum interventions.

PSN Membership & Format
- Superintendents, principals, assistant principals, and other district data staff.
- Focus on Schools in Need of Improvement.
- 3-4 meetings per year.

Excel Pivot Tables for District & School Level Data Analysis:
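The original slide displayed an Excel pivot-table screenshot that is not reproduced in this transcript. As a rough stand-in for the same kind of district- and school-level summary, here is a minimal sketch in Python with pandas; the file name and column names (school, grade, content_area, proficient) are hypothetical, not taken from the PSN materials:

import pandas as pd

# Hypothetical student-level assessment file; column names are illustrative only.
scores = pd.read_csv("nmsba_results.csv")   # one row per student per content area

# District/school-level summary similar to an Excel pivot table:
# percent proficient by school and grade, with district totals in the margins.
summary = pd.pivot_table(
    scores,
    values="proficient",        # 1 if the student scored proficient or above, else 0
    index=["school", "grade"],
    columns="content_area",     # e.g., reading, mathematics
    aggfunc="mean",
    margins=True,               # adds an "All" row/column for district-level rates
)

print((summary * 100).round(1))  # display as percentages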

Assessing Student Performance Beyond the Report Card: Allowing Drill-Down to Student Level

Example of Student Level Drill-Down:
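The drill-down example on this slide was also a screenshot. A minimal sketch of the same idea, continuing the hypothetical columns used above and adding assumed student_id and scale_score fields:

import pandas as pd

# Hypothetical student-level file (same assumed columns as the pivot-table sketch,
# plus student_id and scale_score); names are illustrative only.
scores = pd.read_csv("nmsba_results.csv")

# "Drill down" from a school-level rate to the individual students behind it.
not_yet_proficient = scores[
    (scores["school"] == "Example Elementary")
    & (scores["content_area"] == "mathematics")
    & (scores["proficient"] == 0)               # students not yet proficient
]

# Student-level view a principal might review with the improvement team.
print(not_yet_proficient[["student_id", "grade", "scale_score"]].sort_values("scale_score"))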

Example of Data Display: Simple Bar Chart
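For the bar-chart display, a minimal matplotlib sketch; the proficiency values and the dashed target line are placeholders, not actual New Mexico results:

import matplotlib.pyplot as plt

# Hypothetical percent-proficient figures for one school; values are placeholders.
content_areas = ["Reading", "Writing", "Mathematics"]
percent_proficient = [62, 48, 41]

plt.bar(content_areas, percent_proficient)
plt.axhline(50, linestyle="--", label="Target (illustrative)")
plt.ylabel("Percent proficient")
plt.title("Example School: Percent Proficient by Content Area")
plt.legend()
plt.show()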

Logging into NM DBDM

Surfing NM DBDM: Start Anywhere

Purpose of DBDM for PSN
A tool for leaders (helping their schools):
- Multiple entry points for differentiated support
- Specific, real-time explanations, examples, links, school stories, and resources
- Ongoing, job-embedded assistance
- Continuous monitoring, collaboration, and feedback
- Embedding a continuous data-based school improvement system into routine practice

Establish a school improvement team
- What is a school improvement team?
- How do we establish a school improvement team?
- Who is on our team?
- What do we do?
- How does the school improvement team make time to do its work?

Develop a hypothesis
- What information does our school or district need to make decisions that will improve student achievement?
- How is our school doing compared to the standard?

Develop a hypothesis
- What information does our school or district need to make decisions that will improve student achievement?
  - Baseline: What do you already know about your school?
  - What learning strengths and weaknesses are evident in the school data?
  - Which subgroups of students are having difficulty learning?
  - What instructional changes might improve student learning in the areas of weakness?
  - What professional development is needed to improve student learning in the areas of weakness?
  - What materials and equipment are needed to support changes in instruction?
- How is our school doing compared to the standard?
  - Baseline: What do we think we know about how we are doing?
  - Overall school results
  - Disaggregated by gender
  - Disaggregated by disability status
  - Disaggregated by ethnicity
  - Disaggregated by English proficiency
  - Disaggregated by income status
  - Disaggregated by migrant status
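As one way to produce the disaggregated baseline described above, a hedged pandas sketch; the subgroup column names are assumptions rather than the state's actual field names:

import pandas as pd

# Hypothetical student-level file; subgroup column names below are assumptions.
scores = pd.read_csv("nmsba_results.csv")

subgroup_columns = [
    "gender", "disability_status", "ethnicity",
    "english_proficiency", "income_status", "migrant_status",
]

print(f"Overall percent proficient: {scores['proficient'].mean() * 100:.1f}")

for column in subgroup_columns:
    rates = scores.groupby(column)["proficient"].agg(["mean", "count"])
    rates["percent_proficient"] = (rates["mean"] * 100).round(1)
    print(f"\nDisaggregated by {column}:")
    print(rates[["percent_proficient", "count"]])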

Gather data to assess needs
- What are the most useful sources of student data?
- Why use multiple measures?
- What are the most useful sources of direct student achievement data?
- What are the most useful sources of indirect student achievement data?
- What are the most useful sources of subgroup student achievement data?
- What are the most useful sources of demographic data?
- How do context variables impact the validity of our interpretation?
- What do we have? What do we need?

Gathering Data
Direct Measures
- NMSBA (from pivot tables)
- Other assessments: DIBELS, short-cycle, etc.
- Teacher-made assessments
Indirect Measures
- Attendance and graduation rates
- Information about curriculum
Demographics
- Ethnicity and race proportions
- Gender proportions
- Socio-economic percentages
- Language status proportions
- Disability status percentages
- Migrant status proportions
Other
- Subgroups
- Class size
- Teacher training
- Student mobility

Use data
- How do we organize the data to help us answer important questions?
- What do different sources tell us?
- What do different displays tell us?
- How do we display the data?
- What patterns exist in the data?
- How do we present data to the school and examine it?
- What are the tests designed to measure?
- Is there confirmation across data?
- How should we present data and conclusions to the school community?
- How do we formulate data-based goals?
- Does our interpretation raise new questions?
- What is our level of confidence in our interpretation?

Formulating Data-Based Goals
When preparing to set goals based on the data, schools clarify the results to determine the area of greatest concern for setting one or two important goals for improvement.
The school:
- uses the information about differences in achievement across content areas (reading, writing, mathematics) to pinpoint the goal for improvement
- uses the clarification of specific standards (basic reading, reading comprehension, math computation, geometry, math problem solving) or benchmarks to help plan the strategies and interventions
The goals for improvement are:
- specific to the content area of greatest concern (reading, writing, mathematics)
- sometimes related to strategies for improving specific standards (reading for comprehension, use of literature and media) or benchmarks (reading for information, reading strategies, literature)
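A tiny illustration of pinpointing the area of greatest concern from differences across content areas; the percentages and the target below are placeholders, not PSN or New Mexico accountability figures:

# Hypothetical school results: percent proficient by content area.
results = {"reading": 62.0, "writing": 48.0, "mathematics": 41.0}
annual_target = 50.0   # illustrative target, not an actual NM figure

# Area of greatest concern = largest shortfall against the target.
gaps = {area: annual_target - pct for area, pct in results.items()}
focus_area = max(gaps, key=gaps.get)

print(f"Area of greatest concern: {focus_area}, {gaps[focus_area]:.1f} points below the target")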

Does our interpretation raise new questions?
- Which specific standards or benchmarks are students farthest from achieving? (Impact of benchmarks on the overall standard)
- Which specific subgroups of students are failing to achieve the standards? (Impact of a benchmark on scores for all subgroups)
- How have past changes affected student performance? (Data over time)
What is our level of confidence in our interpretation?
- Error of measurement
  - Distance from cut points
  - Reliability of scores/proficiency classification
- Different scores on two reading tests
  - Differences in what is measured
  - Differences in how it is measured
  - Differences in degree of alignment with standards
- Different results for state and local tests
  - Differences between SBA and short-cycle results
  - Differences in specificity
  - Differences in sample size
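One way to make the sample-size and measurement-error points concrete is a normal-approximation confidence interval for a percent-proficient estimate. The sketch below is a generic statistics illustration, not the calculation New Mexico actually uses:

import math

def proficiency_confidence_interval(n_proficient, n_students, z=1.96):
    """Approximate 95% confidence interval (as percentages) for percent proficient."""
    p = n_proficient / n_students
    se = math.sqrt(p * (1 - p) / n_students)   # standard error of a proportion
    return (p - z * se) * 100, (p + z * se) * 100

# A small subgroup's rate is far less certain than the whole school's.
print(proficiency_confidence_interval(18, 30))    # e.g., a 30-student subgroup
print(proficiency_confidence_interval(180, 300))  # e.g., the whole school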

Develop a data-based plan
- What must be considered when setting data-based goals?
- How do we set data-based goals?
- How can additional data help us identify the interventions we need?
- How do we select interventions?
- How do we select interventions for targeted subgroups?
- How do we plan to include parents in interventions?
- What staff development and support are necessary?
- How does our plan impact our budget?
- What is our timeline?
- What assignments are necessary?

Develop a Plan
- Is the goal clear to everyone?
- Do strategies/activities address specific data-based needs?
- Are the activities specific and sequential?
- Are persons responsible stated (who will do what, when, how)?
- Are resources specifically stated?
- Are the selected interventions scientifically research-based/research-proven?
- Are task and evaluation deadlines clearly stated?
- Do specific professional development activities support full implementation of the strategies and interventions?
- Are activities to involve parents and community specific to the strategies and interventions to be implemented?
- Is evidence of completion specifically identified?

Monitor progress and document success
- How do we monitor implementation of the plan?
- How do we use data to monitor progress toward our goals?
- How do we know if we made the right decisions?
- How do we use data to document success in meeting goals?
- What should we report to the public?
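As a small illustration of using short-cycle data to monitor progress toward a goal, a Python sketch in which every name and number is a placeholder:

# Hypothetical short-cycle checkpoints for the goal area (percent proficient).
baseline = 41.0
goal = 50.0
checkpoints = {"fall": 42.5, "winter": 45.0, "spring": 47.5}

for window, pct in checkpoints.items():
    gain = pct - baseline
    remaining = goal - pct
    status = "on track" if remaining <= 0 else f"{remaining:.1f} points to go"
    print(f"{window}: {pct:.1f}% proficient (gain {gain:+.1f}); {status}")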

Other information about purpose and use
Explanations and search tools
- Overview
- "How to" guide for various audiences
- Glossary of terms
- Annotated bibliography of school improvement publications
- Key word search
Links to state-customized version
- Online data and school improvement resources
- Examples of school, district, and state data use
- State school improvement documents
Password-protected school accounts
- For tracking progress on improvement implementation
- For on-time interactions with state support teams, district support personnel, and technical assistance providers

Avoiding Pitfalls: Common Reasons Why Plans Fail
- The plan is never fully implemented.
- Timelines are not met for each activity.
- Interventions are not evident in all classrooms.
- Tasks are not all completed on time.
- Resources are not acquired or deployed in accordance with the plan.
- Next steps are not articulated.

New Mexico Principal Support Network
Data-Based Decision Making Website found at:
Office of Education Accountability, New Mexico Department of Finance & Administration
Contact Information: