Using the 2007 NECAP Reports, February 2008, New England Common Assessment Program

Similar presentations
Franklin Public Schools MCAS Presentation November 27, 2012 Joyce Edwards Director of Instructional Services.

Maryland School Assessment (MSA) 2012 Science Results Carolyn M. Wood, Ph.D. Assistant Superintendent, Accountability, Assessment, and Data Systems August.
Narragansett Elementary School Report Night Gail Dandurand, Principal Lisa Monahan, Assistant Principal.
Data Analysis Protocol Identify – Strengths – Weaknesses – Trends – Outliers Focus on 3 – Strengths – Weaknesses.
1 The New York State Education Department New York State’s Student Reporting and Accountability System.
Common Questions What tests are students asked to take? What are students learning? How’s my school doing? Who makes decisions about Wyoming Education?
Valentine Elementary School San Marino Unified School District Standardized Testing and Reporting (STAR) Spring 2009 California Standards Test.
Using Data to Improve Student Achievement Summer 2006 Preschool CSDC.
Copyright © 2006 Pearson Education, Inc. or its affiliate(s). All rights reserved.
A professional development seminar presented by: Michele Mailhot and Susan Smith March 21, 2012
Using Data to Identify Student Needs for MME Stan Masters Coordinator of Curriculum, Assessment, and School Improvement Lenawee ISD August 26, 2008.
Information on New Regents Examinations for SCDN Presentation September 19, 2007 Steven Katz, Director Candace Shyer, Bureau Chief Office of Standards,
Students in the Gap: Understanding Who They Are & How to Validly Assess Them.
MELROSE PUBLIC SCHOOLS MELROSE VETERANS MEMORIAL MIDDLE SCHOOL OCTOBER 2013 MCAS Spring 2013 Results.
Data for Student Success Regional Data Initiative Presentation November 20, 2009.
Identifying the gaps in state assessment systems CCSSO Large-Scale Assessment Conference Nashville June 19, 2007 Sue Bechard Office of Inclusive Educational.
Mathematics Indicators and Goals. Math Tier II Indicator Indicator 1.8: All junior high students will meet or exceed standards and be identified as proficient.
MI-Access Reports— What Good are They to Me? Prepared by Linda Headley, Headley Pratt Consulting Fall 2007.
Welcome to Applying Data!. Applying Data I want to be in your classroom!
Predicting Patterns: Lenawee County's Use of EXPLORE and PLAN DataDirector 2011 User Conference Dearborn, Michigan.
NECAP DATA ANALYSIS 2012 Presented by Greg Bartlett March, 2013.
DATA PRESENTATION February 5, 2008 ARE WE IMPROVING? WAS 2007 BETTER THAN 2006?
WELCOME TO PARK VIEW MIDDLE SCHOOL NECAP REPORT NIGHT.
U-32 Spring 2013 NECAP Presentation March 27,
MyData - MyData - MyData Presented by: Carol Alexander October 22, 2009.
MyData My School My Class MyData Modified for Robert Frost Middle School by: Tawna Montano.
1 New Hampshire Statewide Assessment The 2010 NECAP Reports February 2011.
1 Using Data to Improve Student Achievement & to Close the Achievement Gap Tips & Tools for Data Analysis Spring 2007.
CHOICES FOR NINE THE TRANSITION FROM GRADE 8 TO 9 Guidance Program & Services Department Success for all Learners Zion Heights JHS.
WELCOME EPAS Reports: Maximizing the Impact on Student Achievement in Mathematics.
1 Guide to Using the 2006 NECAP & NH-Alternate Reports: Companion PowerPoint Presentation February 2007 New Hampshire Alternate Assessment New Hampshire.
So Much Data – Where Do I Start? Assessment & Accountability Conference 2008 Session #18.
Longitudinal Data Systems: What Can They Do for Me? Nancy J. Smith, Ph.D. Deputy Director Data Quality Campaign November 30, 2007.
Smarter Balanced Assessment System March 11, 2013.
Copyright © 2006 Pearson Education, Inc. or its affiliate(s). All rights reserved.
Michigan Educational Assessment Program MEAP. Fall Purpose The Michigan Educational Assessment Program (MEAP) is Michigan’s general assessment.
11 New Hampshire Statewide Assessment Using the 2008 NECAP Science and NH Alternate Reports December 8 & 11, 2008.
Online Reporting System. Understand the role and purpose of the Performance Reports in supporting student success and achievement. Understand changes.
1 The New York State Education Department New York State’s Student Data Collection and Reporting System.
Guide to Test Interpretation Using DC CAS Score Reports to Guide Decisions and Planning District of Columbia Office of the State Superintendent of Education.
Using the 2009 NECAP Reports February 2010 New England Common Assessment Program.
Understanding Alaska Measures of Progress Results: Reports 1 ASA Fall Meeting 9/25/2015 Alaska Department of Education & Early Development Margaret MacKinnon,
Fall 2007 MEAP Reporting 2007 OEAA Conference Jim Griffiths – Manager, Assessment Administration & Reporting Sue Peterman - Department Analyst, MEAP.
Using the Fall 2011 NECAP Results New England Common Assessment Program.
11 Using the 2010 NECAP Reports February/March, 2011.
Mathematics ThinkLink Benchmark Assessment What is the data telling us?
1 New Hampshire Statewide Assessment Using the 2009 NECAP Reports January, 2010.
1 New Hampshire – Addenda Ppt Slides State Level Results (slides 2-7) 2Enrollment - Grades 3-8 for 2005 and Reading NECAP 4Mathematics
Using the 2011 NECAP Science Results New England Common Assessment Program.
NECAP Results and Accountability A Presentation to Superintendents March 22, 2006.
MCAS 2007 October 24, 2007 A Report to the Sharon School Committee and Dr. Barbara J. Dunham Superintendent of Schools Dr. George S. Anthony Director of.
1 New England Common Assessment Program Guide to Using the 2005 NECAP Reports: Companion PowerPoint Presentation March/April 2006.
1 AMP Results Overview for Educators October 30, 2015.
PED School Grade Reports (with thanks to Valley High School) ACE August 3, 2012 Dr. Russ Romans District Accountability Manager.
NECAP Presentation for School Year March 26,
Using the Grade 9 Applied Mathematics Assessment IIR Looking at the Item Information Report: Student Roster Files.
1 New Hampshire Statewide Assessment Using the 2007 NECAP and NH Alternate Reports February 2008.
Using the NECAP Analysis and Reporting System February 2013.
A Presentation to the USDOE January 13, 2010 Mary Ann Snider Chief of Educator Excellence and Instructional Effectiveness Race to the Top- Assessment Programs.
White Pages Team Grey Pages Facilitator Team & Facilitator Guide for School-wide Reading Leadership Team Meetings Elementary.
1 Using the 2009 NECAP Reports February 1-5, 2010.
2011 MEAP Results Board of Education Presentation | 07 May 2012 Romeo Community Schools | Office of Curriculum and Instruction.
Using Data to Improve Student Achievement Summer 2006 Preschool CSDC.
Franklin Public Schools MCAS and PARCC Results Spring 2015 Joyce Edwards Assistant Superintendent for Teaching and Learning December 8, 2015.
ACCESS for ELLs Score Changes
K-6 Benchmark Assessment Inservices
2015 PARCC Results for R.I: Work to do, focus on teaching and learning
Kentucky Alternate Assessment
Vision 20/20: Checks and Balances
EPAS Educational Planning and Assessment System By: Cindy Beals
Presentation transcript:

Using the 2007 NECAP Reports, February 2008, New England Common Assessment Program

2 Welcome and Introductions
Mary Ann Snider, Director of Assessment & Accountability, RI Department of Education

3 Welcome and Introductions
NECAP Service Center:
- Tim Crockett, Assistant Vice-President, x2106
- Harold Stephens, NECAP Program Director, x2235
- Amanda Smith, NECAP Program Manager, x2259
- Elliot Scharff, NECAP Program Manager – Science, x2126
- Josh Evans, NECAP Program Manager, x2244
- Tina Haley, NECAP Program Assistant, x2427
- Jennifer Varney, NECAP Program Assistant, x2115
- Mellicent Friddell, NECAP Program Assistant, x2355

4 Guide to Using the 2007 NECAP Reports

5 Purpose of the Workshop
- Review the different types of NECAP reports
- Discuss effective ways to analyze and interpret results data
- Provide schools and districts an opportunity to share how they have analyzed results data

6 Involvement of Local Educators
- Development of Grade Level Expectations
- Test item review committees
- Bias and sensitivity review committees
- Classroom teacher judgment data
- Standard setting panelists
- Technical Advisory Committee

7 FERPA
The Family Educational Rights and Privacy Act (FERPA)
Access to individual student results is restricted to:
- the student
- the student’s parents/guardians
- authorized school personnel
Superintendents and principals are responsible for maintaining the privacy and security of all student records. Authorized school personnel shall have access to the records of students to whom they are providing services when such access is required in the performance of their official duties.
FERPA website:

8 Types of NECAP Reports
- Student Report
- Item Analysis Report
- Results Report
- Summary Report
- Student Level Data Files

9 Student Report

10 Item Analysis Report

11 Results Report

12 Summary Report

13 Student Level Data Files
Contain:
- All demographic information for each student that was provided by the state
- The scaled score, achievement level, and subscores earned by each student in all content areas tested
Also contain:
- Performance on released items
- Student questionnaire responses
- Optional reports data
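For schools that want to work with these files directly, a minimal sketch of loading one for local analysis follows. The file name and column names are hypothetical; the actual layout is defined in the state's data file documentation.

```python
import pandas as pd

# Hypothetical file name and column names; the real NECAP student-level
# file layout is defined in the state's documentation.
df = pd.read_csv("necap_student_level.csv")

# Expected: one row per student per content area, with columns such as
#   student_id, grade, content_area, scaled_score, achievement_level,
#   plus demographic fields, released-item scores, questionnaire responses.

# Quick sanity check: how many students were tested in each content area?
print(df.groupby("content_area")["student_id"].nunique())
```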

14 Accessing Your Reports Schools and Districts can download multiple reports at once.

15 Using Your Data
Three essential questions…
- How did we do?
- What do the data tell us about parts of our program?
- What do the data tell us about parts of our population?
We will begin exploring these questions today by…
- Looking at the different school-level reports (group data)
- Looking at the Item Analysis Report (primarily individual student data)

16 Essential Question #1 for Interpreting School Status
How did we do?
- …compared to the district
- …compared to the state
- …compared to our own history (both total school and grade/cohort group)
- …compared to what we would have predicted knowing our school’s programs and students
Question #1

17 Essential Question #2 for Interpreting School Status
What do the data tell us about parts of our program?
- How did we perform across the content areas?
- How did we perform in the various sub-content areas?
- What does the Item Analysis Report tell us about sub-content areas?
- How did our sub-content area and item-level performance compare to the district and state?
Question #2

18 Essential Question #3 for Interpreting School Status
What do the data tell us about parts of our population?
- How did the various sub-groups perform relative to:
  a. the district?
  b. the state?
  c. the same sub-groups last year?
  d. what we would have predicted knowing the population?
- How do the percentages of students in the various sub-groups compare to the district and state?
- What do the questionnaire data tell us about the sub-populations?
Question #3

19 Before You Go Any Further…
- What questions will you answer, and for what audiences?
- Based on what you know about your school’s programs and students, and how they have changed, what do you expect to see? (For example, how would a specific year’s 5th graders perform relative to 5th graders from previous years?)
- What processes will you use to look at your reports?
- Will you look at teaching year or testing year reports?
- Who should participate in the discussions?
- How should you group the participants?

21 Looking at the Data There are many ways to look at reports… To simplify this presentation, we will show only one of the processes you might use.

21 Looking at the School-Level Reports Schools can view reports for Testing Year ( )

22 Looking at the School-Level Reports Or for Teaching Year ( )

23 Looking at the School-Level Reports 1A and 1B: How did we do compared to the district and the state?

24 Looking at the School-Level Reports 2A: How did we perform across the content areas?

25 Looking at the Results Report – Grade Level Summary 56% of the students in this school scored proficient or above on the grade 5 reading test. 52% of the students in this school scored proficient or above on the grade 5 mathematics test. 43% of the students in this school scored proficient or above on the grade 5 writing test. Do these data match what we know about the school’s program? 2A: How did we perform across the content areas?
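For readers working from the student-level data files rather than the printed report, percent proficient or above is simply the share of tested students at the top two achievement levels. A minimal sketch, assuming the same hypothetical file as above and a numeric achievement-level code (1 through 4, where 3 is Proficient):

```python
import pandas as pd

# Hypothetical coding: 1 = Substantially Below Proficient, 2 = Partially
# Proficient, 3 = Proficient, 4 = Proficient with Distinction.
df = pd.read_csv("necap_student_level.csv")

pct_prof_or_above = (
    (df["achievement_level"] >= 3)   # True when proficient or above
    .groupby(df["content_area"])     # one figure per content area
    .mean()
    .mul(100)
    .round(1)
)
print(pct_prof_or_above)  # e.g. reading 56.0, mathematics 52.0, writing 43.0
```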

26 Looking at the Results Report – Grade Level Summary 52% of the students in this school scored proficient or above on the grade 5 mathematics test. 48% of the students in this district scored proficient or above on the grade 5 mathematics test. 64% of the students in the state scored proficient or above on the grade 5 mathematics test. Do these data match what we know about the school’s program? 2A: How did we perform across a content area (compared to the district and the state)?

27 Looking at the Results Report – Content Area Results 1C: How did we do compared to our own history?

28 Looking at the Results Report – Content Area Results 52% of this year’s fifth grade students scored proficient or above on the mathematics test. Does this confirm what we know about this year’s cohort of fifth grade students compared with last year’s cohort? The big difference could be due to a cohort effect (stronger group vs. weaker group). 67% of last year’s fifth grade students scored proficient or above on the mathematics test. 1C: How did we do compared to our own history?

29 Looking at the Results Report – Content Area Results Cumulative Totals… provide information on multiple cohorts of students exposed to the program of instruction at a specific grade. Caution should be used if the program of instruction has changed significantly. This is the better indicator of how we’re doing. 1C: How did we do compared to our own history?
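A sketch of how a school might reproduce both the single-cohort comparison and the cumulative total from a hypothetical student-level file with one row per student per administration year; the file and column names are assumptions, not the actual NECAP layout.

```python
import pandas as pd

# Hypothetical file: one row per grade 5 student per administration year.
df = pd.read_csv("necap_grade5_math_by_year.csv")

proficient = df["achievement_level"] >= 3

# Year-to-year comparison of single cohorts (subject to cohort effects)
print(proficient.groupby(df["year"]).mean().mul(100).round(1))

# Cumulative total: all cohorts tested at this grade, pooled together,
# which smooths out single-cohort swings.
print(round(proficient.mean() * 100, 1))
```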

30 Looking at the Results Report – Content Area Results 2B: How did we perform in the various sub-content areas?

31 Looking at the Results Report – Content Area Results Total Possible Points includes both common and matrix items (but not field-test items). 2B: How did we perform in the various sub-content areas? Total Possible Points also represents the test’s balance of representation.
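Because Total Possible Points reflects the balance of representation, dividing each sub-content area's points by the test total shows the weight each area carries. The point values below are illustrative only; read the actual values from your Results Report.

```python
# Illustrative point totals by mathematics content strand (not the actual
# NECAP values -- take them from the Results Report for your grade).
possible_points = {
    "Numbers and Operations": 30,
    "Geometry and Measurement": 18,
    "Functions and Algebra": 12,
    "Data, Statistics, and Probability": 6,
}

total = sum(possible_points.values())
for strand, pts in possible_points.items():
    print(f"{strand}: {pts} of {total} points ({pts / total:.0%} of the test)")
```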

32 Looking at the Results Report – Content Area Results Please note: the Total Possible Points column is organized differently on the Reading Results Report. 106 possible points are represented here – they are sorted by “Type of Text.” The same 106 possible points are represented here – they are sorted by “Level of Comprehension.”

33 Looking at the School-Level Report 3B: How did the various sub-groups compare to the district and state?

34 Looking at the School-Level Report 3A: How did the various sub-groups perform? Important Note: Disaggregated results are not reported for sub-groups of fewer than 10 students.

35 Because this is a small school, and so many of the sub-groups are smaller than 10, this part of the report is not as useful. But we can still look at district and state disaggregated results. Looking at the School-Level Report 3A: How did the various sub-groups perform? 13% of the students with an IEP in this district scored proficient or above. 20% of the students with an IEP in the state scored proficient or above. Do these data match what we know about the district’s program?
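The same suppression rule can be applied when disaggregating the student-level files locally: report a sub-group's results only when at least 10 students were tested. A minimal sketch with hypothetical file and column names:

```python
import pandas as pd

MIN_N = 10  # sub-groups with fewer than 10 tested students are not reported

df = pd.read_csv("necap_student_level.csv")  # hypothetical file and columns

summary = df.groupby("iep_status").agg(
    n_tested=("student_id", "size"),
    pct_prof_or_above=("achievement_level",
                       lambda s: round((s >= 3).mean() * 100, 1)),
)
# Suppress small groups so individual students cannot be identified
summary.loc[summary["n_tested"] < MIN_N, "pct_prof_or_above"] = None
print(summary)
```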

36 Looking at the Item Analysis Report This part of the report gives specific information about the released items

37 Looking at the Item Analysis Report

38 Looking at the Item Analysis Report This part of the report represents all of the items used to compute student scores

39 Looking at the Item Analysis Report

40 Looking at the Item Analysis Report This part of the report does not represent all of the items used to compute student scores

41 Looking at the Item Analysis Report This school scored 33 percentage points lower than the state on item 3 – that’s probably significant and certainly worth a closer look. 2D: How did our item-level performance compare to the district and state?
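One way to surface items like this from the Item Analysis Report is to tabulate the school-versus-state difference for every released item and flag the largest gaps. The table layout, column names, and the 20-point threshold below are assumptions for illustration, not NECAP guidance.

```python
import pandas as pd

# Hypothetical table keyed from the Item Analysis Report: one row per
# released item, with school/district/state percent-correct columns.
items = pd.read_csv("item_analysis.csv")

items["gap_vs_state"] = items["school_pct_correct"] - items["state_pct_correct"]

# Flag items well below the state; adjust the cutoff to suit your school's size.
flagged = items[items["gap_vs_state"] <= -20].sort_values("gap_vs_state")
print(flagged[["item", "content_strand", "gap_vs_state"]])
```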

42 Looking at the Item Analysis Report Over 80% of the students who answered item 3 incorrectly chose option A. 2D: How did our item-level performance compare to the district and state?
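When a single item stands out, a distractor analysis shows whether the wrong answers cluster on one option, as they do here. A sketch assuming a hypothetical response file with one row per student:

```python
import pandas as pd

# Hypothetical file with columns: student_id, response (A/B/C/D), correct (True/False)
responses = pd.read_csv("item3_responses.csv")

wrong = responses[~responses["correct"]]
print(wrong["response"].value_counts(normalize=True).mul(100).round(1))
# If one distractor (here, option A) dominates, the wrong answers likely
# reflect a shared misconception rather than random guessing.
```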

43 Looking at the Item Analysis Report 2D: How did our item-level performance compare to the district and state? What do we know about this item? This information will help us use the Released Items Support Materials

44 Released Items Documents

45 Using the Released Items Support Materials Consider how the school’s curriculum and instructional practices address this GLE. Consider why so many students might have incorrectly selected option A.

46 Looking at the Item Analysis Report 2C: What does the Item Analysis Report tell us about sub-content areas?

47 Looking at the Item Analysis Report We can see that this school performed better than the district or the state on the “Geometry and Measurement” items throughout the test. 2C: What does the Item Analysis Report tell us about sub-content areas?

48 Looking at the Item Analysis Report Items 6, 11, and 13 all focus on the “Geometry and Measurement” content strand. 2C: What does the Item Analysis Report tell us about sub-content areas?

49 Looking at the Item Analysis Report This school scored higher than the district and the state on “Geometry and Measurement” items 6, 11, and 13. 2C: What does the Item Analysis Report tell us about sub-content areas?
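Grouping released items by content strand makes this kind of pattern easy to verify. A sketch using the same hypothetical item-analysis table as above:

```python
import pandas as pd

items = pd.read_csv("item_analysis.csv")  # hypothetical table, one row per item

by_strand = (
    items.groupby("content_strand")[
        ["school_pct_correct", "district_pct_correct", "state_pct_correct"]
    ]
    .mean()
    .round(1)
)
print(by_strand)  # e.g. Geometry and Measurement above district and state
```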

50 Using the Released Items Documents Consider why the students were more successful in answering questions related to the “Geometry and Measurement” content strand. What is different about the way “Geometry and Measurement” is taught? Can this information apply to areas of mathematics where students are not doing as well? What curriculum and instructional practices might have contributed to this success?

51 Looking at the Item Analysis Report 2D: How did our item-level performance compare to the district and state? Over one-third of the students received only partial credit for answering item 15
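For a constructed-response item, the distribution of score points is more informative than a simple percent correct. A sketch assuming a hypothetical score file and a 4-point item:

```python
import pandas as pd

# Hypothetical file: one row per student with the item 15 score
# (0-4 points assumed for a constructed-response item).
scores = pd.read_csv("item15_scores.csv")

dist = scores["score"].value_counts(normalize=True).sort_index().mul(100).round(1)
print(dist)  # a large share at 1-2 points signals partial understanding
```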

52 Using the Released Items Support Materials

53 Three Essential Questions Handout 1D: How did we do compared to what we would have predicted knowing our school’s students?

54 Looking at the Item Analysis Report 1D: How did we do compared to what we would have predicted knowing our school’s students?

55 Small Group Activity
1. Select at least one of the three essential questions
2. Select your target audience
3. Begin to answer the question by examining your data
4. Note key findings or conclusions
5. Begin to discuss strategies for improvement
6. Be prepared to share your findings with the large group

56 Supporting Materials
- Guide to Using the 2007 NECAP Reports
- Companion PowerPoint presentation
- Three Essential Questions handout
- Grade Level Expectations
- Test Specifications documents
- Released Items documents
- Preparing Students for NECAP: Tips for Teachers to Share with Students
- Technical Report

57 Conclusion “Not everything that can be counted counts, and not everything that counts can be counted.” ~ Albert Einstein