Using Data to Improve Student Achievement Summer 2006 Preschool CSDC.


Outcomes
- Know why we need to look at data
- Identify two types of tests
- Understand three types of scores
- Understand Summative & Formative Assessments
- Be able to interpret Summative Assessment Reports
- Know how to use data in instructional planning for increased student learning

Why Look at Data? The purpose of data is to give educators INSIGHT!

Types of Tests Norm-Referenced Test (NRT) Criterion-Referenced Test (CRT)

What is a Norm-Referenced Test (NRT)? A standardized assessment in which all students perform under the same conditions. It compares the performance of a student or group of students to a national sample of students at the same grade and age, called the norm group.

What is a Criterion-Referenced Test (CRT)? An assessment comparing one student's performance to a specific learning objective or performance standard rather than to the performance of other students. It tells us how well students are performing on specific goals or content standards rather than how their performance compares to a national or local norming group.

Summary NRT and CRT

Physical Education Examples
- NRT: President's Challenge Physical Fitness Test
- CRT: Fitnessgram

Types of Scores

Raw Score (RS) The number of items a student answers correctly on a test. PE Example: curl-ups = 32, push-ups = 3, mile = 7:35, s/r = 26 (the actual number/time achieved using the correct form on a sub-test of the fitness assessment).

Scale Score (SS) Mathematically converted raw scores based on the level of difficulty of each question. For FCAT-SSS, a computer program is used to analyze student responses and to compute the scale score. Scale scores reflect a more accurate picture of the student's achievement level.

Gain Scores Commonly referred to as “Learning Gains” The amount of progress a student makes in one school year.

Physical Education Example: Mile Run
- Pre-test = 10:00, Post-test = 8:00 (improvement)
- Pre-test = 10:00, Post-test = 12:00 (decline)
Learning Gain:
- Reflected as a percentage: 20% gain or loss
- Reflected as a number: -2:00 or +2:00
The mile-time learning gain is the only sub-test score that shows improvement as a negative number. If the mile time increases, the score will be recorded as a positive number.
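The mile-run arithmetic above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the times are the slide's sample values; the function names are invented for the example), computing the gain as both a signed time difference and a percentage, where negative means improvement.

```python
# Hypothetical sketch: a mile-run learning gain as a signed time difference
# and a percentage. Negative values mean improvement (a faster mile).

def parse_time(mmss):
    """Convert an 'M:SS' string to total seconds."""
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

def mile_gain(pre, post):
    """Return (gain_in_seconds, gain_in_percent) from pre- to post-test."""
    pre_s, post_s = parse_time(pre), parse_time(post)
    diff = post_s - pre_s
    return diff, round(100 * diff / pre_s)

print(mile_gain("10:00", "8:00"))   # pre 10:00 -> post 8:00: -120 s, -20%
print(mile_gain("10:00", "12:00"))  # pre 10:00 -> post 12:00: +120 s, +20%
```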

Learning Gains: Who Qualifies?
- All students with a pre- and post-test, including all subgroups (ESE, LEP, etc.).
- All students with matched, consecutive-year (i.e., 2005 & 2006) FCAT SSS results, grades 4-10, who were enrolled in the same school in FTE surveys 2 & 3.

Learning Gains: Which Scores? Gains apply in reading and math, not writing or science. Pre-test may be from same school, same district, or anywhere in the state.

Learning Gains: What Equals Adequate Yearly Progress (AYP)?
A. Improve FCAT Achievement Levels from 2005 to 2006 (e.g., 1-2, 2-3, 3-4, 4-5), OR
B. Maintain "satisfactory" Achievement Levels from 2005 to 2006 (e.g., 3-3, 4-4, 5-5), OR
C. Demonstrate more than one year's growth within Level 1 or Level 2, as determined by DSS Cut Points (not applicable for retained students).
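The three qualifying conditions can be sketched as a small decision function. This is a hypothetical illustration, not the state's actual implementation; the DSS cut point is passed in as an assumed parameter rather than looked up from the real gains table.

```python
# Hypothetical sketch of the three learning-gain rules (A, B, C) described
# above. The dss_cut value is an assumed placeholder, not a real cut point.

def learning_gain_reason(pre_level, post_level, pre_dss=None, post_dss=None,
                         dss_cut=None, retained=False):
    """Return 'A', 'B', or 'C' if the student made a learning gain, else None."""
    if post_level > pre_level:                        # A: moved up a level
        return "A"
    if post_level == pre_level and post_level >= 3:   # B: maintained 3, 4, or 5
        return "B"
    # C: more than a year's growth within Level 1 or 2 (not for retainees)
    if (not retained and post_level == pre_level and post_level in (1, 2)
            and None not in (pre_dss, post_dss, dss_cut)
            and post_dss - pre_dss > dss_cut):
        return "C"
    return None

print(learning_gain_reason(1, 2))   # 'A': improved a level
print(learning_gain_reason(4, 4))   # 'B': maintained satisfactory
# Retained student stuck within Level 1 earns no gain, even with DSS growth:
print(learning_gain_reason(1, 1, 1486, 1653, dss_cut=100, retained=True))
```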

Electronic Score Sheet

Developmental Scale Score Gains Table (DSS Cut Points) Students achieving within Level 1 (or within Level 2) for two consecutive years must gain at least one point more than those listed in the table in order to satisfy the "making annual learning gains" component of the school accountability system.

Grade Level Change | Reading | Mathematics
3 to 4 | … | …
4 to 5 | … | …
5 to 6 | … | …
6 to 7 | … | …
7 to 8 | … | …
8 to 9 | … | …
9 to 10 | … | …

Learning Gains: Retainees A retained student can only be counted as making adequate progress if he/she: Moves up one level. (e.g. 1-2, 2-3, 3-4, 4-5) Maintains a level 3, 4, or 5.

Learning Gains: Activity Using the data in the following table, determine which students made a learning gain and the reason (A, B, or C).

Data Display for FCAT Reading Results

Student | 04/05 Grade | 05/06 Grade | Pre-test Level | Pre-test DSS | Post-test Level | Post-test DSS | Learning Gain? | Reason
A | 7 | 8 | Level 1 | … | Level 2 | … | Yes or No | A, B, or C
B | 7 | 8 | Level 4 | … | Level 4 | … | Yes or No | A, B, or C
C | 7 | 8 | Level 2 | 1598 | Level 2 | 1743 | Yes or No | A, B, or C
D | 8 | 8 | Level 1 | … | Level 2 | … | Yes or No | A, B, or C
E | 8 | 8 | Level 3 | … | Level 3 | … | Yes or No | A, B, or C
F | 8 | 8 | Level 1 | 1486 | Level 1 | 1653 | Yes or No | A, B, or C
G | 7 | 8 | Level 5 | … | Level 4 | … | Yes or No | A, B, or C

Teacher Learning Gains Based on Data Display 5 out of 7 students made learning gains. 71% of this teacher's students made learning gains and add points toward the school's grade. No points are given to the school for Student F because he was retained and stayed within Level 1, even though he made significant gains in DSS points. No points are given for Student G because he decreased a level.

Total students with a pre- and post-test who qualify for learning gain calculations: 7
- Reason A (increased 1 or more Achievement Levels): 2
- Reason B (maintained "satisfactory" levels 3, 4, or 5): 2
- Reason C (DSS target gain; more than a year's growth): 1

Class Record Sheet for Learning Gains

Types of Data Results (Summative) Data used to make decisions about student achievement at the end of a period of instruction. Process (Formative) Data gathered at regular intervals during the instructional period; used to provide feedback about student progress and to provide direction for instructional interventions.

A Closer Look at Results Data Examples:
- President's Challenge
- Fitnessgram

President’s Challenge Results 05/06

FitnessGram Pilot-HFZ Mile

FitnessGram Pilot-HFZ Push-ups

FitnessGram Pilot-HFZ Sit-n-Reach

FitnessGram Pilot-HFZ Curl-ups

A Closer Look at Formative Data Quizzes Chapter Tests DIBELS District Math Assessments

What tools do we have?
- FCAT Inquiry (Summative)
- Teacher Tools for Data Collection (can be Summative or Formative): Histogram, Pareto Chart, Run Chart, Scatter Diagram, Item Analysis

Histogram A bar chart representing a frequency distribution of student scores. The height of each bar represents the number of students scoring at the same level/score. Used to monitor progress.

Histogram: Minutes to Run 1 Mile (x-axis: time; y-axis: frequency)

Histogram: Score Distribution in 7th Grade Physical Education (x-axis: push-ups; y-axis: frequency)
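A histogram like these can be tallied with a few lines of code. This is a minimal sketch using assumed class data: it counts how many students fall at each whole-minute mile time and prints each bar's height as a row of asterisks.

```python
# Minimal text-histogram sketch with assumed data: tally students at each
# whole-minute mile time and draw each bar's height with asterisks.
from collections import Counter

mile_times = [7, 8, 8, 9, 9, 9, 10, 10, 11, 13]  # assumed class data (minutes)
freq = Counter(mile_times)

for minute in sorted(freq):
    print(f"{minute:>2} min | {'*' * freq[minute]}")
```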

Run Chart Use to: Monitor progress over time Display data in simplest form

Run Chart: Number of Curl-Ups per Week (x-axis: week; y-axis: number of curl-ups)

Class Goal: By the end of 9 weeks, 100% of our class will have an average of at least 80% on our weekly personal fitness class quizzes. Class Run Chart: Percent of Students Averaging at Least 80% on Weekly Quizzes (x-axis: week; y-axis: percent with an average of at least 80%)
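Tracking a class goal week by week can be sketched as a simple text run chart. The weekly percentages below are assumed sample data, not real results.

```python
# Run-chart sketch with assumed data: percent of students averaging at
# least 80% on the weekly quizzes, tracked toward a 100% class goal.
weekly_pct = [55, 60, 62, 70, 68, 75, 82, 88, 95]  # weeks 1-9 (assumed)
GOAL = 100

for week, pct in enumerate(weekly_pct, start=1):
    print(f"Week {week}: {'#' * (pct // 10)} {pct}%")

print("Goal met!" if weekly_pct[-1] >= GOAL
      else f"{GOAL - weekly_pct[-1]} points to go.")
```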

Scatter Diagram: Quiz Average vs. Test Average (axes: quiz average, test average)

Scatter Diagram: Hours of Sleep vs. Mile Run Times (axes: hours of sleep, mile run time)

Item Analysis Use to: Determine mastered content Determine most common mistakes

Classroom Test Analysis Columns: Benchmark Assessed, Item #, Number Correct, Number Incorrect, Number Partial Credit, Number Distractor A/1, Number Distractor B/2, Number Distractor C/3, Number Distractor D/4, Number No Answer
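The tally behind a classroom test analysis like the one above can be sketched as follows. The answer key and student responses are assumed sample data; the function counts correct answers, incorrect answers, blanks, and picks of each distractor for one item.

```python
# Item-analysis sketch with assumed data: per item, count correct answers,
# incorrect answers, blanks (None), and how often each distractor was chosen.
from collections import Counter

answer_key = {1: "B", 2: "D"}                  # assumed answer key
responses = {                                   # assumed student answers
    1: ["B", "B", "A", "C", "B", None],
    2: ["D", "A", "A", "D", "B", "D"],
}

def analyze_item(item, picks):
    """Tally correct/incorrect/blank and per-distractor counts for one item."""
    tally = Counter(picks)
    correct = tally[answer_key[item]]
    blank = tally[None]
    distractors = {opt: n for opt, n in tally.items()
                   if opt not in (answer_key[item], None)}
    return {"correct": correct, "incorrect": len(picks) - correct - blank,
            "blank": blank, "distractors": distractors}

for item, picks in responses.items():
    print(f"Item {item}:", analyze_item(item, picks))
```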

ITEM ANALYSIS ACTIVITY

Pareto Chart Use to: Rank issues in order of occurrence Decide which problems need to be addressed first Find the issues that have the greatest impact Monitor impact of changes

Pareto Chart: Types of Mistakes in Division Problems (bars: incorrect multiplication, incorrect subtraction, no decimal, other; y-axes: percent and cumulative percentage)
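The ordering and cumulative percentages behind a Pareto chart can be sketched in a few lines. The mistake counts below are assumed sample data; a real chart would use actual class tallies.

```python
# Pareto-chart sketch: sort mistake categories by descending frequency and
# compute each category's percent and cumulative percent. Counts are assumed.

def pareto(counts):
    """Return (category, percent, cumulative_percent) sorted by frequency."""
    total = sum(counts.values())
    rows, cumulative = [], 0
    for category, count in sorted(counts.items(), key=lambda kv: -kv[1]):
        cumulative += count
        rows.append((category, 100 * count / total, 100 * cumulative / total))
    return rows

mistakes = {"Incorrect multiplication": 18, "Incorrect subtraction": 12,
            "No decimal": 6, "Other": 4}        # assumed sample counts

for category, pct, cum in pareto(mistakes):
    print(f"{category:<25} {pct:5.1f}%  cumulative {cum:5.1f}%")
```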

Females Age 12: President's Challenge Pre- to Post-Test Gains
- Curl-ups: +3.59 (avg)
- Mile Run: -38 sec (avg)
- Push-ups: … (avg)
- S/R: +1.11 (avg)

Data analysis provides: Insight and Questions

Questions to Ponder… (adapted from Getting Excited About Data by Edie Holcomb)
- What question are we trying to answer?
- What can we tell from the data?
- What can we NOT tell from the data?
- What else might we want to know?
- What good news is here for us to celebrate?
- What opportunities for improvement are suggested by the data?

Action Provides Answers!

Steps to Improvement (Plan, Do, Study, Act)
- PLAN: What information have I gained from my data? What interventions can I put in place?
- DO: Implement the plan.
- STUDY: Analyze the results.
- ACT: Make improvements.

Personal Action Plan (P-D-S-A)
- What data can I access?
- What tools can I use to help me monitor progress toward our class goals?
- What/who else do I need to help me?
- What is my start date?
- How will I evaluate the results?