Using Data to Improve Student Achievement Summer 2006 Preschool CSDC.

Outcomes
Know why we need to look at data
Identify two types of tests
Understand three types of scores
Understand Summative & Formative Assessments
Be able to interpret Summative Assessment Reports
Know how to use data in instructional planning for increased student learning

Why Look at Data? The purpose of data is to give educators INSIGHT!

Types of Tests Norm-Referenced Test (NRT) Criterion-Referenced Test (CRT)

What is a Norm-Referenced Test (NRT)? A standardized assessment in which all students perform under the same conditions. It compares the performance of a student or group of students to a national sample of students at the same grade and age, called the norm group.

What is a Criterion-Referenced Test (CRT)? An assessment comparing one student's performance to a specific learning objective or performance standard and not to the performance of other students. It tells us how well students are performing on specific goals or content standards rather than how their performance compares to a national or local norming group.

Summary NRT and CRT

Types of Scores

Raw Score (RS) The number of items a student answers correctly on a test. Example: John took a 20-item mathematics test (where each item was worth one point) and correctly answered 17 items. His raw score for this assessment is 17.

Scale Score (SS) A raw score mathematically converted to account for the difficulty of each question. For the FCAT-SSS, a computer program analyzes student responses and computes the scale score. Scale scores give a more accurate picture of a student's achievement level.

Gain Scores Commonly referred to as "Learning Gains," the amount of progress a student makes in one school year.

Learning Gains: Who Qualifies?
All students with a pre- and post-test, including all subgroups (ESE, LEP, etc.).
All students with matched, consecutive-year (i.e., 2005 & 2006) FCAT SSS results, grades 4-10, who were enrolled in the same school in surveys 2 & 3 (FTE).

Learning Gains: Which Scores? Gains apply in reading and math, not writing or science. Pre-test may be from same school, same district, or anywhere in the state.

Learning Gains: What Equals Adequate Yearly Progress (AYP)?
A. Improve FCAT Achievement Levels from 2005 to 2006 (e.g. 1-2, 2-3, 3-4, 4-5), OR
B. Maintain "satisfactory" Achievement Levels from 2005 to 2006 (e.g. 3-3, 4-4, 5-5), OR
C. Demonstrate more than one year's growth within Level 1 or Level 2, as determined by DSS Cut Points (not applicable for retained students)

Developmental Scale Score Gains Table (DSS Cut Points)
Students achieving within Level 1 (or within Level 2) for two consecutive years must gain at least one point more than the cut points listed in the table in order to satisfy the "making annual learning gains" component of the school accountability system.
Grade Level Change: 3 to 4, 4 to 5, 5 to 6, 6 to 7, 7 to 8, 8 to 9, 9 to 10 (with a Reading and a Mathematics cut point for each grade-level change)

Learning Gains: Retainees
A retained student can only be counted as making adequate progress if he/she:
Moves up one level (e.g. 1-2, 2-3, 3-4, 4-5), or
Maintains a Level 3, 4, or 5.
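The A/B/C criteria and the retainee restriction above can be sketched as a small decision function. This is an illustrative sketch only, not the state's official scoring code; the `cut_point` parameter is a hypothetical stand-in for the value looked up in the DSS Cut Points table for the student's grade-level change.

```python
def made_learning_gain(pre_level, post_level, retained=False,
                       pre_dss=None, post_dss=None, cut_point=None):
    """Return (gain, reason) following the A/B/C criteria sketched above."""
    # A: improved at least one FCAT Achievement Level (e.g. 1-2, 3-4)
    if post_level > pre_level:
        return True, "A"
    # B: maintained a "satisfactory" level (3, 4, or 5);
    #    this is also the only maintain-rule available to retainees
    if post_level == pre_level and post_level >= 3:
        return True, "B"
    # C: more than one year's DSS growth within Level 1 or Level 2,
    #    not available to retained students; cut_point is the table value
    if (not retained and post_level == pre_level and post_level <= 2
            and None not in (pre_dss, post_dss, cut_point)
            and post_dss - pre_dss > cut_point):
        return True, "C"
    return False, None
```

Under these assumptions, a student moving from Level 1 to Level 2 counts under reason A, and a retained student repeating Level 1 can only qualify by moving up a level, no matter the DSS growth.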

Learning Gains: Activity
Using the data in the following table, determine:
which students made a learning gain
what percentage of the teacher's students made a learning gain

Data Display for FCAT Reading Results
Student | 04/05 Grade | 05/06 Grade | Pre-test Level | Pre-test DSS | Post-test Level | Post-test DSS | Learning Gain Determination
A | 7 | 8 | Level 1 | | Level 2 | | Yes or No (Reason: A, B, or C)
B | 7 | 8 | Level 4 | | Level 4 | | Yes or No (Reason: A, B, or C)
C | 7 | 8 | Level 2 | 1598 | Level 2 | 1743 | Yes or No (Reason: A, B, or C)
D | 8 | 8 | Level 1 | | Level 2 | | Yes or No (Reason: A, B, or C)
E | 8 | 8 | Level 3 | | Level 3 | | Yes or No (Reason: A, B, or C)
F | 8 | 8 | Level 1 | 1486 | Level 1 | 1653 | Yes or No (Reason: A, B, or C)
G | 7 | 8 | Level 5 | | Level 4 | | Yes or No (Reason: A, B, or C)

Types of Data
Results (Summative): Data used to make decisions about student achievement at the end of a period of instruction.
Process (Formative): Data gathered at regular intervals during the instructional period; used to provide feedback about student progress and to provide direction for instructional interventions.

A Closer Look at Results Data Examples:

FCAT Parent Report

A Closer Look at Formative Data
Quizzes
Chapter Tests
DIBELS
District Math Assessments

What tools do we have?
FCAT Inquiry (Summative)
Teacher Tools for Data Collection (can be Summative or Formative):
Histogram
Pareto Chart
Run Chart
Scatter Diagram
Item Analysis

Histogram
A bar chart representing a frequency distribution of student scores.
The height of each bar represents the number of students scoring at the same level/score.
Used to monitor progress.

Histogram: Minutes to Run 1 Mile (x-axis: time, y-axis: frequency)

Histogram: Grade Distribution in 8th Grade English (x-axis: grade, y-axis: frequency)
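The bar heights in a histogram are just frequency counts per score or level. A minimal sketch of that tallying step, using Python's standard library and made-up grades for illustration:

```python
from collections import Counter

def histogram(scores, bins):
    """Return the bar height (frequency) for each bin, in bin order."""
    counts = Counter(scores)
    return {b: counts.get(b, 0) for b in bins}

# Hypothetical 8th-grade English grades, as in the example slide.
grades = ["A", "B", "B", "C", "C", "C", "D", "F"]
print(histogram(grades, ["A", "B", "C", "D", "F"]))
# {'A': 1, 'B': 2, 'C': 3, 'D': 1, 'F': 1}
```

The same tally works for the mile-run example by binning times (e.g. whole minutes) before counting.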

Run Chart
Use to:
Monitor progress over time
Display data in simplest form

Run Chart: Number of Words Spelled Correctly on Weekly Quiz (x-axis: week, y-axis: number of words)

Scatter Diagram: Quiz Average vs. Test Average

Scatter Diagram: Hours of Sleep vs. Mistakes on Test

Item Analysis
Use to:
Determine mastered content
Determine most common mistakes

CLASSROOM TEST ANALYSIS
Columns: Benchmark Assessed | Item # | Number Correct | Number Incorrect | Number Partial Credit | Number Distractor A/1 | Number Distractor B/2 | Number Distractor C/3 | Number Distractor D/4 | Number No Answer
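The counts in a classroom test analysis table can be tallied directly from raw student responses. A minimal sketch, assuming (hypothetically) that each student's answers are recorded as a list of choices with an empty string for a skipped item:

```python
from collections import Counter

def item_analysis(responses, key):
    """For each item, tally correct answers, distractor picks, and blanks."""
    report = []
    for i, correct in enumerate(key):
        picks = Counter(student[i] for student in responses)
        report.append({
            "item": i + 1,
            "correct": picks[correct],
            "incorrect": sum(n for choice, n in picks.items()
                             if choice not in (correct, "")),
            "distractors": {choice: n for choice, n in picks.items()
                            if choice not in (correct, "")},
            "no_answer": picks[""],
        })
    return report

# Three students, two items; "" means the student left the item blank.
answers = [["A", "C"], ["B", "C"], ["A", ""]]
key = ["A", "C"]
```

The `distractors` breakdown is what surfaces the most common mistakes: a distractor chosen by many students usually points to a shared misconception.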

ITEM ANALYSIS ACTIVITY

Pareto Chart
Use to:
Rank issues in order of occurrence
Decide which problems need to be addressed first
Find the issues that have the greatest impact
Monitor impact of changes

Pareto Chart: Types of Mistakes in Division Problems (categories: incorrect multiplication, incorrect subtraction, no decimal, other; y-axes: percent and cumulative percentage)
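The ranking and cumulative-percentage columns behind a Pareto chart can be computed directly from category counts. A minimal sketch, using made-up mistake counts for the division-problem example:

```python
def pareto(counts):
    """Rank categories by frequency; return (category, percent, cumulative %)."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    rows, cumulative = [], 0
    for category, n in ranked:
        cumulative += n
        rows.append((category, round(100 * n / total, 1),
                     round(100 * cumulative / total, 1)))
    return rows

# Hypothetical counts, for illustration only.
mistakes = {"Incorrect multiplication": 20, "Incorrect subtraction": 15,
            "No decimal": 10, "Other": 5}
print(pareto(mistakes))
# [('Incorrect multiplication', 40.0, 40.0), ('Incorrect subtraction', 30.0, 70.0),
#  ('No decimal', 20.0, 90.0), ('Other', 10.0, 100.0)]
```

Reading the cumulative column tells you which few categories account for most of the errors, which is exactly the "address these problems first" use the slide describes.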

Specific Examples
District math assessments
Writing prompts
DIBELS
Classroom assessments

Data analysis provides: Insight and Questions

Questions to Ponder… (adapted from Getting Excited About Data, Edie Holcomb)
What question are we trying to answer?
What can we tell from the data?
What can we NOT tell from the data?
What else might we want to know?
What good news is here for us to celebrate?
What opportunities for improvement are suggested by the data?

Action Provides Answers!

Steps to Improvement (Plan-Do-Study-Act)
PLAN: What information have I gained from my data? What interventions can I put in place?
DO: Implement the plan.
STUDY: Analyze the results.
ACT: Make improvements.

Personal Action Plan (P-D-S-A)
What data can I access?
What tools can I use to help me monitor progress toward our class goals?
What/who else do I need to help me?
What is my start date?
How will I evaluate the results?