Using Data to Improve Student Achievement Summer 2006 Preschool CSDC

Outcomes
 Know why we need to look at data
 Identify two types of tests
 Understand three types of scores
 Understand Summative and Formative Assessments
 Be able to interpret Summative Assessment Reports
 Know how to use data in instructional planning for increased student learning

Why Look at Data? The purpose of data is to give educators INSIGHT!

Types of Tests NNorm-Referenced Test (NRT) CCCCriterion-Referenced Test (CRT)

What is a Norm-Referenced Test (NRT)? A standardized assessment in which all students perform under the same conditions. It compares the performance of a student or group of students to a national sample of students at the same grade and age, called the norm group.

What is a Criterion-Referenced Test (CRT)? An assessment comparing one student's performance to a specific learning objective or performance standard and not to the performance of other students. It tells us how well students are performing on specific goals or content standards rather than how their performance compares to a national or local norming group.

Summary NRT and CRT

Types of Scores

Raw Score (RS) The number of items a student answers correctly on a test. John took a 20-item mathematics test (where each item was worth one point) and correctly answered 17 items. His raw score for this assessment is 17.
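A raw score can be computed by a direct count. A minimal sketch in Python, using an invented answer key and response set (not actual FCAT items):

```python
# A raw score is just the count of correct answers.
# The key and responses below are made up for illustration.
answer_key = ["B", "D", "A", "C", "B", "A", "D", "C", "B", "A",
              "C", "D", "B", "A", "C", "D", "A", "B", "C", "D"]
responses  = ["B", "D", "A", "C", "B", "A", "D", "C", "B", "A",
              "C", "D", "B", "A", "C", "D", "A", "C", "A", "B"]

raw_score = sum(given == correct for given, correct in zip(responses, answer_key))
print(raw_score)  # 17 of 20 items correct, matching John's example above
```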

Scale Score (SS) Raw scores mathematically converted based on the level of difficulty of each question. For the FCAT-SSS, a computer program is used to analyze student responses and to compute the scale score. Scale scores give a more accurate picture of the student's achievement level.
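The actual FCAT-SSS conversion is done by the scoring program using statistical item models; the toy sketch below, with invented difficulty weights, only illustrates the general idea that harder items can count for more than easier ones:

```python
# Toy illustration only: the real FCAT scale score comes from a
# statistical item-response model, not a simple weighted sum.
# difficulty_weight values are invented for demonstration.
item_correct      = [True, True, False, True, False]   # hypothetical results
difficulty_weight = [1.0,  1.5,  2.0,   1.2,  1.8]     # harder items weigh more

raw_score   = sum(item_correct)                        # 3 items correct
scale_score = sum(w for ok, w in zip(item_correct, difficulty_weight) if ok)
print(raw_score, scale_score)  # 3 vs. 3.7: same raw count, difficulty-adjusted scale
```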

Gain Scores Commonly referred to as “Learning Gains”: the amount of progress a student makes in one school year.

Learning Gains: Who Qualifies?
 All students with a pre- and post-test, including all subgroups (ESE, LEP, etc.).
 All students with matched, consecutive-year (i.e., 2005 & 2006) FCAT SSS results, grades 4-10, who were enrolled in the same school for FTE surveys 2 & 3.

Learning Gains: Which Scores?
 Gains apply in reading and math, not writing or science.
 The pre-test may be from the same school, the same district, or anywhere in the state.

Learning Gains: What Equals Adequate Yearly Progress (AYP)?
A. Improve FCAT Achievement Levels from 2005 to 2006 (e.g., 1-2, 2-3, 3-4, 4-5), OR
B. Maintain “satisfactory” Achievement Levels from 2005 to 2006 (e.g., 3-3, 4-4, 5-5), OR
C. Demonstrate more than one year’s growth within Level 1 or Level 2, as determined by the DSS Cut Points (not applicable for retained students).

Developmental Scale Score Gains Table (DSS Cut Points) Students achieving within Level 1 (or within Level 2) for two consecutive years must gain at least one point more than the cut points listed in the table in order to satisfy the “making annual learning gains” component of the school accountability system.

Grade Level Change    Reading    Mathematics
3 to 4                …          …
4 to 5                …          …
5 to 6                …          …
6 to 7                …          …
7 to 8                …          …
8 to 9                …          …
9 to 10               …          …

Learning Gains: Retainees A retained student can only be counted as making adequate progress if he/she:
 Moves up one level (e.g., 1-2, 2-3, 3-4, 4-5), OR
 Maintains a Level 3, 4, or 5.
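Taken together, reasons A, B, and C and the retainee restriction form a small decision procedure. A hypothetical Python sketch follows; the DSS cut point must come from the table above, and the value used in the example call is a placeholder, not a real cut point:

```python
SATISFACTORY_LEVELS = {3, 4, 5}

def made_learning_gain(pre_level, post_level, pre_dss, post_dss,
                       retained, dss_cut_point=None):
    """Return (gain?, reason) under rules A, B, and C described above."""
    # Reason A: improved at least one FCAT Achievement Level.
    if post_level > pre_level:
        return True, "A"
    # Reason B: maintained a "satisfactory" level (3, 4, or 5).
    if post_level == pre_level and post_level in SATISFACTORY_LEVELS:
        return True, "B"
    # Reason C: more than a year's growth within Level 1 or Level 2,
    # i.e., a DSS gain of at least one point more than the table's
    # cut point. Not available to retained students.
    if (not retained and post_level == pre_level
            and pre_level in (1, 2)
            and None not in (pre_dss, post_dss, dss_cut_point)
            and post_dss - pre_dss > dss_cut_point):
        return True, "C"
    return False, None

# Student C from the activity below: within Level 2 both years,
# DSS 1598 -> 1743. The cut point (100) is a made-up placeholder.
print(made_learning_gain(2, 2, 1598, 1743, retained=False, dss_cut_point=100))
# (True, 'C') under the placeholder cut point
```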

Learning Gains: Activity Using the data in the following table, determine:
 which students made a learning gain
 what percentage of the teacher’s students made a learning gain

Data Display for FCAT Reading Results

Student | 04/05–05/06 Grade | Pre-test Level (DSS) | Post-test Level (DSS) | Learning Gain? | Reason
A       | 7 → 8             | Level 1              | Level 2               | Yes or No      | A, B, or C
B       | 7 → 8             | Level 4              | Level 4               | Yes or No      | A, B, or C
C       | 7 → 8             | Level 2 (1598)       | Level 2 (1743)        | Yes or No      | A, B, or C
D       | 8 → 8             | Level 1              | Level 2               | Yes or No      | A, B, or C
E       | 8 → 8             | Level 3              | Level 3               | Yes or No      | A, B, or C
F       | 8 → 8             | Level 1 (1486)       | Level 1 (1653)        | Yes or No      | A, B, or C
G       | 7 → 8             | Level 5              | Level 4               | Yes or No      | A, B, or C

Teacher Learning Gains Based on Data Display 5 out of 7 students made learning gains: 71% of this teacher’s students made learning gains and add points toward the school’s grade. No points are given to the school for Student F because he was retained and stayed within Level 1, even though he made significant gains in DSS points. No points are given for Student G because he decreased a level.

Total students with a pre- and post-test who qualify for learning gain calculations: 7
Reason A (increased one or more Achievement Levels): 2
Reason B (maintained a “satisfactory” level of 3, 4, or 5): 2
Reason C (met the DSS target gain: more than a year’s growth): 1

Class Record Sheet for Learning Gains

Types of Data
Results (Summative): Data used to make decisions about student achievement at the end of a period of instruction.
Process (Formative): Data gathered at regular intervals during the instructional period; used to provide feedback about student progress and to provide direction for instructional interventions.

A Closer Look at Summative Data Examples:

FCAT Parent Report

A Closer Look at Formative Data
 Quizzes
 Tests
 Homework
 Essays
 Classwork

What tools do we have?
 FCAT Inquiry (Summative)
 Teacher Tools for Data Collection (can be Summative or Formative): Histogram, Run Chart, Scatter Diagram, Item Analysis, Pareto Chart

Histogram
 Bar chart representing a frequency distribution of student scores
 Heights of the bars represent the number of students scoring at the same level/score
 Used to monitor progress
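A sketch of how such a histogram can be built, using invented scores and the grade bands from the next slide (matplotlib assumed available):

```python
import matplotlib.pyplot as plt

# Hypothetical class scores; the bands match the slide's categories.
scores = [42, 47, 55, 58, 61, 64, 68, 72, 75, 77, 78, 81, 83, 85, 88, 91, 94, 98]
bands  = ["<50%", "50-59%", "60-69%", "70-79%", "80-89%", "90-100%"]
edges  = [0, 50, 60, 70, 80, 90, 101]

# Count how many students fall in each band.
counts = [sum(lo <= s < hi for s in scores) for lo, hi in zip(edges, edges[1:])]

plt.bar(bands, counts)
plt.xlabel("Grades")
plt.ylabel("Number of Students")
plt.title("Midterm Grade Distribution - 11th Grade American History")
plt.show()
```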

[Histogram: Midterm Grade Distribution, 11th Grade American History. X-axis: Grades (<50%, 50-59%, 60-69%, 70-79%, 80-89%, 90-100%); Y-axis: Number of Students.]

[Histogram: Final Exam Grade Distribution, 11th Grade American History. X-axis: Grades; Y-axis: Number of Students.]

[“L to J”: the midterm and final exam histograms shown side by side, with the grade distribution shifting from an “L” shape (most students scoring low) toward a “J” shape (most students scoring high).]

Run Chart Use to:
 Monitor progress over time
 Display data in simplest form

[Run Chart: Number of Words Defined Correctly on Weekly Quiz. X-axis: Week; Y-axis: Number of Words, with an “All-Time Best” reference line.]
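A run chart is just a line plot of one measure over time. A minimal sketch of the weekly-quiz chart above, with invented data:

```python
import matplotlib.pyplot as plt

# Invented weekly results: words defined correctly on each quiz.
weeks = list(range(1, 9))
words = [4, 6, 5, 8, 9, 9, 11, 12]

plt.plot(weeks, words, marker="o")
plt.axhline(max(words), linestyle="--", label="All-Time Best")  # reference line
plt.xlabel("Week")
plt.ylabel("Number of Words")
plt.title("Words Defined Correctly on Weekly Quiz")
plt.legend()
plt.show()
```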

Scatter Diagram Use to:
 Show relationships
 Test for possible cause/effect
[Scatter Diagram: Quiz Average vs. Test Average. X-axis: Quiz Average; Y-axis: Test Average.]

[Scatter Diagram: Hours of Sleep vs. Mistakes on Test. X-axis: Hours of Sleep; Y-axis: Mistakes.]
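A minimal sketch of a scatter diagram, using invented sleep-versus-mistakes data:

```python
import matplotlib.pyplot as plt

# Invented data pairs: (hours of sleep, mistakes on the test).
hours    = [5, 6, 6, 7, 7, 8, 8, 9, 9, 10]
mistakes = [9, 8, 7, 6, 5, 4, 4, 3, 2, 2]

plt.scatter(hours, mistakes)
plt.xlabel("Hours of Sleep")
plt.ylabel("Mistakes")
plt.title("Hours of Sleep vs. Mistakes on Test")
plt.show()
```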

Item Analysis Use to:
 Determine mastered content
 Determine most common mistakes

CLASSROOM TEST ANALYSIS (column headings): Benchmark Assessed | Item # | Number Correct | Number Incorrect | Number Partial Credit | Number Distractor A/1 | Number Distractor B/2 | Number Distractor C/3 | Number Distractor D/4 | Number No Answer
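The tallies on such a sheet can be generated automatically from answer sheets. A sketch with a hypothetical key and responses; it covers the multiple-choice columns only (partial credit is not modeled):

```python
from collections import Counter

# Hypothetical answer key and student responses ("" = no answer).
answer_key = {1: "B", 2: "D"}
responses = {
    1: ["B", "B", "A", "C", "B", "", "B", "D"],
    2: ["D", "C", "D", "D", "", "C", "A", "D"],
}

for item, key in answer_key.items():
    tally = Counter(responses[item])
    correct   = tally[key]
    no_answer = tally[""]
    incorrect = len(responses[item]) - correct - no_answer
    # Distractor counts: every wrong choice that was actually marked.
    distractors = {c: n for c, n in tally.items() if c not in (key, "")}
    print(f"Item {item}: correct={correct}, incorrect={incorrect}, "
          f"no answer={no_answer}, distractors={distractors}")
```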

Pareto Chart Use to:
 Rank issues in order of occurrence
 Decide which problems need to be addressed first
 Find the issues that have the greatest impact
 Monitor the impact of changes

[Pareto Chart: Types of Mistakes on a Geography Quiz. Categories: Spelling, Location, Capitals, Other. Bars show the percent of mistakes in each category; a line shows the cumulative percentage.]
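A sketch of how a Pareto chart orders categories and accumulates percentages, using invented mistake counts for the geography-quiz example:

```python
import matplotlib.pyplot as plt

# Invented mistake counts for the geography quiz example.
mistakes = {"Spelling": 18, "Location": 11, "Capitals": 7, "Other": 4}

# Rank categories by frequency (the defining step of a Pareto chart).
ordered = sorted(mistakes.items(), key=lambda kv: kv[1], reverse=True)
labels  = [k for k, _ in ordered]
counts  = [v for _, v in ordered]
total   = sum(counts)

percents   = [100 * c / total for c in counts]
cumulative = [sum(percents[: i + 1]) for i in range(len(percents))]

fig, ax = plt.subplots()
ax.bar(labels, percents)
ax.plot(labels, cumulative, marker="o", color="black")  # cumulative % line
ax.set_xlabel("Mistake")
ax.set_ylabel("Percent")
ax.set_title("Types of Mistakes on a Geography Quiz")
plt.show()
```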

Data analysis provides: Insight and Questions

Questions to Ponder… (adapted from Getting Excited About Data, Edie Holcomb)
 What question are we trying to answer?
 What can we tell from the data?
 What can we NOT tell from the data? What else might we want to know?
 What good news is here for us to celebrate?
 What opportunities for improvement are suggested by the data?

Action Provides Answers!

Steps to Improvement
PLAN: What information have I gained from my data? What interventions can I put in place?
DO: Implement the plan.
STUDY: Analyze the results.
ACT: Make improvements.

Personal Action Plan (PDSA)
 What data can I access?
 What tools can I use to help me monitor progress toward our class goals?
 What/who else do I need to help me?
 What is my start date?
 How will I evaluate the results?