Data Driven Decisions for School Improvement


Data Driven Decisions for School Improvement

AGENDA
Disaggregating Current Data: Examining your data through the lens of accountability
Finding Glows and Grows: Identifying campus strengths to leverage resources more effectively
Exploring Value of Staff and Programs: Evaluating programs and staff for alignment to results, with consideration of why things are or are not effective

Framing our Lesson
We Will: Examine processes to collect and analyze data in order to increase accountability ratings and improve our schools.
I Will: Create one goal for the 2017-2018 school year that will have a positive impact on school improvement and accountability.

Setting the Stage for Improvement

Critical Success Factors What data sources do you have influence or control of? Based on audience, have teams create lists of data sources for each CSF; circle the one that seems to be the most important or most discussed by all educational stakeholders.

Compare this list to the generated list. Which data sources provide the most bang for the buck?

Disaggregating Current Data What key data source, which is critical to accountability, do schools have great influence on? How can we use this knowledge to enact significant change from one year to the next?

Disaggregating Current Data How do we make sense of the STAAR results? Understand what the report means. Know the terminology used by the state. Use the STAAR performance summary guide. Look at the Performance Level Summary for the assessments given at your campus. What are your initial thoughts regarding the performance levels and student achievement?

Disaggregating Current Data Data analysis involves decoding skills as well as understanding and knowing what you are looking at and why. Step 1: Why are you looking at a data set? Step 2: What do you hope to learn from the data?

Disaggregating Current Data Step 3: What are the headings, axes, or labels? Step 4: What important information stands out to you? Step 5: What conclusions have you arrived at? Step 6: How would you share this information with someone else?

Disaggregating Current Data The data is more than just numbers. It’s the story of a child who may or may not have been successful. It has to “STICK” with you. taisresources.net | Quality Data to Drive Instruction

Disaggregating Current Data

Disaggregating Current Data Were you surprised by the results? Do you have any systems or processes to help predict results or progress? How is this data driving your school improvement plan for the new school year?

Disaggregating Current Data What must occur to change your Index scores? How many students (not percentages) did not “Approach Grade Level?”

Disaggregating Current Data Of the students who did Approach Grade Level, how many reached "Meets Grade Level"? Are your GT students achieving the "Masters Grade Level" standard?

Disaggregating Current Data How did your accountability groups do? Do you know which ones count for your campus? What is the difference between Eco Dis students and non-Eco Dis students at your school? What are the differences with your ELLs?

Disaggregating Current Data When examining your data, what are the differences between accountability groups? How can this information be used with Teachers? With Students? With Instructional Practices?

Let’s look at school performance and compare it to state and federal accountability targets.

Accountability Targets

INDEX 1 Formula
Index 1 = (# of Tests "Approaching Grade Level") ÷ (Total # of Tests Taken)
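The Index 1 arithmetic above is simple enough to sketch in a few lines of Python. The counts below are illustrative, not real campus data:

```python
# Sketch of the Index 1 calculation described on the slide:
# the percent of all tests taken that reached "Approaching Grade Level."

def index_1_score(tests_approaching: int, total_tests: int) -> int:
    """Index 1 = # of approaching tests / total # of tests, as a whole-number percent."""
    if total_tests == 0:
        return 0
    return round(100 * tests_approaching / total_tests)

print(index_1_score(312, 500))  # 312 of 500 tests approaching -> 62
```

Working with raw counts rather than percentages makes the later question ("how many students, not percentages, did not Approach Grade Level?") straightforward to answer.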

INDEX 2 Formula Progress Measure for ELA/Reading and Mathematics
Student groups: ALL, African American, Hispanic, White, American Indian, Asian, Pacific Islander, Two or More Races, SPED, ELL
Tracked per group: Maximum # of Tests; # Met or Exceeded Progress; # Exceeded Progress; Percent of Tests Met or Exceeded Progress; Percent of Tests Exceeded Progress; All Subjects Weighted Progress Rate; Total Points
INDEX 2 SCORE = Total Points ÷ Maximum Total Points
200 points maximum for each group; MSC = 25 students, 7 for ALL
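A hedged sketch of the Index 2 weighting described on the slide: each student group earns the percent of its tests that Met or Exceeded Progress, plus the percent that Exceeded Progress counted again, against a 200-point maximum per group; the index is total points over maximum points. The group counts below are illustrative:

```python
# Illustrative Index 2 arithmetic: two student groups, 50 tests each.

def index_2_group_points(met_or_exceeded: int, exceeded: int, total_tests: int) -> float:
    """Points earned by one student group (maximum 200):
    percent Met/Exceeded Progress + percent Exceeded Progress."""
    if total_tests == 0:
        return 0.0
    return 100 * met_or_exceeded / total_tests + 100 * exceeded / total_tests

def index_2_score(group_points: list) -> float:
    """Index 2 score = total points / maximum total points (200 per evaluated group)."""
    return sum(group_points) / (200 * len(group_points))

points = [index_2_group_points(40, 10, 50),  # 80% met/exceeded + 20% exceeded = 100 pts
          index_2_group_points(32, 8, 50)]   # 64% + 16% = 80 pts
print(index_2_score(points))                 # 180 points out of a 400-point maximum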

INDEX 3 Formula

INDEX 4 Formula

Disaggregating Current Data After looking at the data, and reviewing how to calculate accountability scores, what are some critical areas to focus on for success?

Disaggregating Current Data Have these been priorities at your campus? If so, how are these priorities reflected in your campus improvement plan? If not, what would be necessary in order to effect change in these areas?

A Quick Look at HB 22 Changes to the accountability formula calculation. A-F ratings will continue. Actual methodology is still pending. Schools receive ratings beginning in 2019; districts in 2018.

Data and School Improvement Knowing that accountability is driven by standardized test scores, how are initiatives prioritized at your school/district so that your greatness is manifested on paper?

Data and School Improvement If we repeat what we did last year, will we experience growth and improvement this year? Thinking about the question above, what does your campus improvement plan look like?

Data Sources Aside from STAAR test results, what other data sources do you have? Refer back to your CSF data sources. How often do you look at these sources? If your teachers were asked for data sources, how do you think they would respond?

Finding Glows and Grows Why are we looking at data? Dr. Victoria Bernhardt - Six Reasons to Use Data

Finding Glows and Grows Looking at your CSF data sources, identify 3 areas where your school is "GLOWING." What evidence do you have to support your conclusion? Pass out a chart tablet page with markers; make a T-chart with Glows on one side and Grows on the other.

Finding Glows and Grows If you do not have any evidence, what can you do to generate information that provides you with evidence to validate your statement? With your teams, explain why these areas are glowing. Look for commonalities.

Finding Glows and Grows Looking at your CSF data sources, identify 3 areas where your school needs to "GROW." What evidence do you have to support your conclusion?

Finding Glows and Grows Have these “GROWS” been “GROWS” for a while? What are you doing differently to achieve your goals? With your teams, examine why these areas are in need. Look for commonalities.

What support do you need to achieve your goal and GROW into another GLOW?

What support do your teachers need to achieve your goal?

Leveraging Strengths How can you leverage your strengths to better address your areas of need? How will these strengths assist you in setting and meeting improvement goals for the year?

Value of Staff and Programs If high quality/effective teaching has the greatest effect on student outcomes, how much time is spent supporting teachers? How do you know what support teachers need? How do you identify your most and least effective teachers?

Value of Staff and Programs Making Time for Observations Dr. Paul Bambrick-Santoyo - Responsibility to Build Teacher Quality

Value of Staff and Programs Have you looked at your school appraisal data? How many teachers were distinguished? How many teachers were accomplished? How many teachers were proficient? How many teachers were developing? How many teachers need improvement? Do the results of teacher evaluations match the campus accountability results?

Value of Staff and Programs How many walkthroughs do teachers receive? Are the walkthroughs random? Targeted? Distributed evenly? Do you calibrate results with your instructional leadership team? How often do you look at observation data?

Value of Staff and Programs Do the results of teacher evaluations match the campus accountability results? Why or why not? What resources or processes are needed to align student outcomes with educator observations and evaluations?

Value of Staff and Programs Feedback and Observation for Teacher Growth Dr. Paul Bambrick-Santoyo - Developing Teacher Quality through Bite-sized Feedback

Value of Staff and Programs Learning Walks Professional development as a leadership team Calibration exercises Cognitive coaching

Value of Staff and Programs Does your master schedule leverage the strengths of your staff to its fullest? Is data used to create the master schedule? How are students distributed to teachers? Are your classes balanced according to student need?

Value of Staff and Programs Make a list of the various programs at your school. What effect are these programs having? Have you compared your school to another of similar demographics not using your program?

Value of Staff and Programs What effect are these programs having? Do you see a significant difference? What is meant by statistical significance? Are there any programs that may not be the best use of resources at your school?
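One concrete way to ask "is the difference significant?" is a two-proportion z-test comparing a campus passing rate to a comparison group's rate. STAAR summaries report percentages, so the test counts below (250 campus tests, 100,000 state tests) are hypothetical assumptions for this sketch:

```python
# Illustrative two-proportion z-test; counts are hypothetical, not real campus data.
import math

def two_prop_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test comparing two proportions; returns (z, p_value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal-approximation p-value via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# A campus 62.4% Approaches rate vs. a 58.79% state rate:
z, p = two_prop_z(156, 250, 58790, 100000)
print(round(z, 2), round(p, 3))
```

With only a few hundred campus tests, a 3-4 point advantage can easily be noise (p above 0.05 here); pooling several years of results narrows the standard error and makes the comparison more trustworthy.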

Value of Staff and Programs Why are you still employing them? What are the consequences of making a change to your program? How do you proceed with changing the program?

Value of Staff and Programs Program Evaluation Example. Campus A is spending $20,000 per year to purchase the SpringBoard ELAR curriculum for all English 1 students and teachers.

            Approaches GL   Meets GL   Masters GL
Campus A        62.38         48.63       3.17
Region 19       59.12         47.58       2.98
Texas           58.79         44.19       3.64

Value of Staff and Programs The results for the last 3 years at Campus A have followed a similar trend, always 1-3 percentage points above the regional and state averages. Should the campus continue to spend money to purchase this program?

            Approaches GL   Meets GL   Masters GL
Campus A        62.38         48.63       3.17
Region 19       59.12         47.58       2.98
Texas           58.79         44.19       3.64

Value of Staff and Programs What additional information do you need to arrive at a decision? What might you expect to occur if a change is made? If no change is made?

            Approaches GL   Meets GL   Masters GL
Campus A        62.38         48.63       3.17
Region 19       59.12         47.58       2.98
Texas           58.79         44.19       3.64

Value of Staff and Programs Have you paused to consider the validity and/or veracity of statements made on your campus? How would you rate your school’s data literacy level? Is data used for all decisions?

Value of Staff and Programs Data Driven Decisions Example I want to have 9 weeks testing on Wednesday and Thursday, not Thursday and Friday, before ending the 9 weeks grading period and starting spring break. I claim it's hurting student grades and adding too much work for teachers because students all start spring break early. How can we use data to arrive at a decision?

Value of Staff and Programs If I know the value of my staff and programs, have I reflected that value in my master schedule for the new year? What data am I using to create the master schedule? How is the master schedule important to school improvement and accountability?

Setting Goals

Setting Goals Create one goal you have for the new school year that will have a measurable impact on school improvement using the data you have reviewed and discussed today.

Setting Goals Use the SMART goal acronym to further explain how you will measure your progress and determine when you should achieve your goal.

Professional Development
Texas Accountability Intervention System: September 8, 2017; September 29, 2017
Transformational Teacher Institute: September 14, 2017; October 11, 2017; November 1, 2017

Resources Visit our website: www.esc19.net. Check out new sessions in "Click N Learn." Visit the Research and Analysis page.

Glenn A. Nathan, Research Analyst – ESC Region 19, 915.780.6517 (o), ganathan@esc19.net