Using Data to Improve Student Achievement & to Close the Achievement Gap: Tips & Tools for Data Analysis (Spring 2007)


1 Using Data to Improve Student Achievement & to Close the Achievement Gap: Tips & Tools for Data Analysis (Spring 2007)

2 Making Use of Data to Improve Student Performance
- Identify 1 effective strategy your district uses to make use of data.
- Move to a small group and share strategies.
- List ideas from others in the small group that you can use in your district.
- Identify 1 strategy from the small group to share with the large group.
- List ideas from the large group that you can use in your district.

3 The 4-Step DDDM (Data-Driven Decision-Making) Process

4 Looking at the BIG Picture

5 Multiple Measures
- Demographics: enrollment, attendance, drop-out rate, ethnicity, gender, grade level
- Perceptions: perceptions of the learning environment, values & beliefs, attitudes, observations
- Student Learning: standardized tests (NRT/CRT), teacher observations of abilities, authentic assessments
- School Processes: descriptions of school programs & processes
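
For districts that track these categories electronically, the taxonomy above can be expressed as a simple lookup structure. This is only an illustrative sketch: the dictionary entries just restate the slide, and the `categorize` helper is hypothetical, not part of any real system.

```python
# Illustrative only: the four "multiple measures" categories from this slide
# as a lookup table. Names and entries mirror the slide, not a real schema.
MULTIPLE_MEASURES = {
    "Demographics": ["enrollment", "attendance", "drop-out rate",
                     "ethnicity", "gender", "grade level"],
    "Perceptions": ["perceptions of learning environment", "values & beliefs",
                    "attitudes", "observations"],
    "Student Learning": ["standardized tests (NRT/CRT)",
                         "teacher observations of abilities",
                         "authentic assessments"],
    "School Processes": ["descriptions of school programs & processes"],
}

def categorize(source):
    """Return the measure category a named data source falls under, if any."""
    for category, sources in MULTIPLE_MEASURES.items():
        if source in sources:
            return category
    return None

print(categorize("attendance"))  # -> "Demographics"
```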

6 Criterion-Referenced Data: What's required?
- Proficiency percentages for the combined population & identifiable subgroups, by test year (for the latest 3 years)
- Analysis of the test by:
  - passage type & type of response (literacy)
  - writing domain & multiple choice (literacy)
  - strand & type of response (math)
...in order to identify trends and draw conclusions based on results over the 3-year period.
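
As a concrete illustration of the calculation this slide asks for, here is a minimal pandas sketch that computes proficiency percentages by year for the combined population and for each subgroup. The file name and column names (`year`, `subgroup`, `proficient`) are hypothetical; a real CRT export will use its own layout.

```python
import pandas as pd

# Hypothetical student-level CRT results: one row per student per year, with
# a boolean "proficient" flag and a subgroup label. All names are placeholders.
df = pd.read_csv("crt_results.csv")  # columns assumed: year, subgroup, proficient

# Keep the latest three test years, per the requirement above.
latest_three = sorted(df["year"].unique())[-3:]
df = df[df["year"].isin(latest_three)]

# Proficiency percentage by year for the combined population.
combined = df.groupby("year")["proficient"].mean().mul(100).round(1)

# Proficiency percentages by year for each identifiable subgroup.
by_subgroup = (
    df.groupby(["subgroup", "year"])["proficient"]
      .mean().mul(100).round(1)
      .unstack("year")   # rows: subgroups; columns: the three years
)

print(combined)
print(by_subgroup)
```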

7 Norm-Referenced Data: What's required?
- National percentile rank & standard score for the combined population & identifiable subgroups, by test year
- Analysis of the test by content subskill & skill cluster
...in order to identify trends, measure growth, and draw conclusions based on results over the 2-year period.
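
A companion sketch for the norm-referenced requirement: median national percentile rank and mean standard score by year, plus a simple two-year growth figure. Again, the file and column names (`year`, `npr`, `ss`) are assumptions, not a real report format.

```python
import pandas as pd

# Hypothetical NRT results: one row per student per year, with a national
# percentile rank (npr) and standard score (ss). Column names are assumptions.
df = pd.read_csv("nrt_results.csv")  # columns assumed: year, subgroup, npr, ss

# Median NPR and mean standard score by year (medians suit percentile ranks
# better than means, since percentile ranks are not an equal-interval scale).
summary = df.groupby("year").agg(median_npr=("npr", "median"),
                                 mean_ss=("ss", "mean"))

# Growth over the latest two-year window, measured on the standard score.
years = sorted(df["year"].unique())[-2:]
growth = summary.loc[years[1], "mean_ss"] - summary.loc[years[0], "mean_ss"]
print(summary)
print(f"Standard-score growth, {years[0]} to {years[1]}: {growth:+.1f}")
```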

8 Disaggregated Data Tools: CRT
- ACSIP Template: number and % of students non-proficient/proficient for combined and subgroup populations
- ACSIP Strand Performance Report: combined and subgroup performance averages by test, passage type/domain/strand, & type of response
- Data Analysis Set:

9 Disaggregated Data Tools: NRT (ITBS)
- ACSIP Report: number & % of students performing above the 50th percentile on each test and content subskill for combined & subgroup populations
- Performance Profile: standard score & NPR on each test and content subskill for the combined population
- School Coded Summary: standard score & NPR on each test for subgroup populations
- Data Analysis Set:
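
The ACSIP Report figures described above ("number & % above the 50th percentile") reduce to a count and a mean over a boolean flag. A minimal sketch, assuming a hypothetical student-level export with `subgroup`, `subskill`, and `npr` columns:

```python
import pandas as pd

# Hypothetical ITBS-style export: one row per student per test/subskill,
# with a national percentile rank. File and column names are placeholders.
df = pd.read_csv("itbs_results.csv")  # columns assumed: subgroup, subskill, npr

above_50 = df["npr"] > 50

# Number and percentage above the 50th percentile, combined population.
print(above_50.sum(), f"({above_50.mean():.1%})")

# The same breakdown per subgroup and content subskill.
report = (
    df.assign(above_50=above_50)
      .groupby(["subgroup", "subskill"])["above_50"]
      .agg(n_above="sum", pct_above="mean")
)
report["pct_above"] = (report["pct_above"] * 100).round(1)
print(report)
```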

10-14 (Image-only slides; no transcript text.)

15 Digging Deeper: CRT Item Analysis
- Content Standard
- Language of Question
- Level of Questioning
- Distracters

16 Content Standard
- What is it that the student must know or be able to do?
- When is this introduced in the curriculum? How is it paced?
- Is it a "power standard"?
- What instructional strategies are used to help students master this standard?
- Have I given students the "tools" (e.g., calculator skills, writing tips, test-taking skills) necessary to respond appropriately?
- Can this standard easily be integrated into other curricular areas?

17 Language of Question
- How is the question worded on the test?
- Are there vocabulary words used that may hinder comprehension?
- Do I teach and test using the same language?
- Do I have word/learning walls in my content area to support this standard and related vocabulary?

18 Level of Questioning
- According to Bloom's taxonomy, what is the level of questioning used to measure mastery of the standard?
- Highlight the verb(s) in the question. Do I use those same verbs in my teaching and testing?
- Have I taught "key" or "clue" words that will help students understand what is being asked of them?
- Is the question "multi-layered"?

19 Distracters
- Are there items that "distract" the student from identifying what is being asked, or items that may "confuse" the student as he/she makes an answer choice?
  - Labels
  - Additional information
  - Multi-layered tasks
  - Conversions
  - "Not"

20 (Image-only slide; no transcript text.)

21 Digging Deeper: NRT Item Analysis
Building Item Analysis:
- Identify items that have a negative value of 10 or more, as indicated by the bar falling to the left of the 0 mark.
- Analyze the results of all related items.
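
The flagging rule on this slide is simple arithmetic: an item is flagged when the building's percent-correct sits 10 or more points below the comparison value (the bar left of the 0 mark). A minimal sketch with invented item data:

```python
# Invented item data for illustration; a real report would supply these values.
items = [
    {"item": 1, "building_pct": 62, "national_pct": 68},
    {"item": 2, "building_pct": 41, "national_pct": 57},
    {"item": 3, "building_pct": 75, "national_pct": 71},
]

# Flag items whose building percent-correct is 10 or more points below the
# national percent-correct (the bar falling to the left of the 0 mark).
flagged = [i for i in items if i["building_pct"] - i["national_pct"] <= -10]

for i in flagged:
    diff = i["building_pct"] - i["national_pct"]
    print(f"Item {i['item']}: {diff:+d} points vs. national; review related items")
```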

22 (Image-only slide; no transcript text.)

23 (Grade Level & Name of Exam): three-year disaggregation grid. Each row is a student population: Combined, African American, Hispanic, Caucasian, Economically Disadvantaged, LEP, and Students with Disabilities. Each of the three year columns records the percentage proficient/advanced and the percentage at or above the 50th percentile, with weaknesses noted per the ACSIP rubric.
Trend Analysis: (Summarize 3-year findings from above. Include item analysis for further breakdown.)

24 Peeling the Data: Levels of Looking at Data
- District K-12
- Feeder patterns
- School levels
- Grade level
- Programs & tracks
- Classroom/teacher
- Student
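
In code, "peeling" amounts to recomputing the same metric at successively finer groupings. A minimal pandas sketch, assuming hypothetical `school`, `grade`, `program`, `teacher`, and `proficient` columns:

```python
import pandas as pd

# Hypothetical student-level records; every column name is a placeholder.
df = pd.read_csv("district_results.csv")
# columns assumed: school, grade, program, teacher, proficient (boolean)

# "Peel" the data: the same proficiency rate at successively finer levels.
levels = [["school"], ["school", "grade"],
          ["school", "grade", "program"],
          ["school", "grade", "program", "teacher"]]

print(f"District-wide: {df['proficient'].mean():.1%}")
for cols in levels:
    print(f"\nBy {' / '.join(cols)}:")
    print(df.groupby(cols)["proficient"].mean().mul(100).round(1))
```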

25 Peeling the Data: Questions to Ask
- Are there any patterns by racial/ethnic groups? By gender? By other identifiers?
- What groups are doing well? What groups are behind? What groups are on target? Ahead?
- What access and equity issues are raised?
- Do the data surprise you, or do they confirm your perceptions?
- How might some school or classroom practices contribute to successes and failures? For which groups of students?
- How do we continue doing what's working and address what's not working for students?

26 Peeling the Data: Dialogue to Have
- How is student performance described? (By medians, quartiles, levels of proficiency, etc.)
- How are different groups performing? Which groups are meeting the targeted goals?
- What don't the data tell you? What other data do you need?
- What groups might we need to talk to? (Students, teachers)
- What are the implications for:
  - developing or revising policies
  - revising practices and strategies
  - reading literature
  - visiting other schools
  - revising, eliminating, or adding programs
  - dialogues with experts
  - professional development goal setting and monitoring progress
- How do we share and present the data to various audiences?

27 Sample Questions from a School's Data Team
- Are there patterns of achievement based on Benchmark scores within subgroups?
- Are there patterns of placement for special programs by ethnicity, gender, etc.?
- What trends do we see with students who entered our school early in their education vs. later?
- Is there a relationship between the number of years at our school and our Benchmark scores?

28 Sample Questions from a School's Data Team (continued)
- Is there a relationship between attendance/tardiness and achievement?
- How do students who have been retained do later?
- How do our elementary students do in middle school?
- Do findings in our NRT results support findings in our CRT results?
- Can our findings be directly linked to curriculum? Instruction? Assessment?
- What are our next steps?
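
Several of these questions ("Is there a relationship between attendance/tardiness and achievement?") can get a first-pass answer from a simple correlation. A minimal sketch with hypothetical column names; a real analysis would control for confounders rather than stop at one coefficient.

```python
import pandas as pd

# Hypothetical merged export of attendance and achievement records;
# both column names are placeholders for the district's own fields.
df = pd.read_csv("student_year.csv")  # columns assumed: days_absent, scale_score

# First-pass check of the data team's question: one correlation coefficient.
r = df["days_absent"].corr(df["scale_score"])
print(f"Pearson r between absences and scale scores: {r:.2f}")
```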

29 Necessary Variables for Data-Driven Decision-Making
- Know-how
- Time
- Want-to

30 Candie Watts, Arch Ford Education Service Cooperative