New Hampshire Statewide Assessment: Using the 2007 NECAP and NH Alternate Reports, February 2008

Presentation transcript:

1 New Hampshire Statewide Assessment Using the 2007 NECAP and NH Alternate Reports February 2008

2 Welcome and Introductions
Deb Wiswell, Administrator, Bureau of Accountability
Gaye Fedorchak, Interim Director of Assessment

3 Welcome and Introductions: NH DOE Assessment Staff
Tim Kurtz (returning June 12, 2008), Director of Curriculum and Assessment, Phone: (603)
Gaye Fedorchak, Interim Director of Assessment & Supervisor of NH Alternate & ACCESS, Phone: (603)
David Gebhardt, NAEP Coordinator, Phone: (603), E-mail:
Susan Morgan, ACCESS for ELLs® & NH-Alt Program Specialist, Phone: (603)
Carol Angowski, Assessment Program Specialist, Phone: (603)
Visit us on the Web:

4 Welcome and Introductions: NH DOE Curriculum Staff
Linda Stimson, ELA Supervisor, Phone: (603)
Christine Downing, Mathematics Coach, Phone: (603)
Jan McLaughlin, Science Supervisor, Phone: (603)
Deb Fleurant, Bias & Sensitivity & Title I, Phone: (603)
Jiffi Rainie, Math/Science Partnership Program Specialist, Phone: (603)
Gail Taylor, Math/Science Program Asst., Phone: (603)
Ken Relihan, Social Studies Supervisor, Phone: (603)

5 Welcome and Introductions
Tim Crockett, Vice President, x2106
Harold Stephens, NECAP Program Director, x2235
Shannan Douglas, NH Program Manager, x2139
Amanda Smith, NECAP Program Manager, x2259
Josh Evans, NECAP Program Manager, x2244
Amanda Breitmaier, NH-Alt Program Manager, x2251
Elliot Scharff, NECAP Program Manager (Science), x2126
Tina Haley, NECAP Program Assistant, x2427
Jennifer Varney, NECAP Program Assistant, x2115
Mellicent Friddell, NECAP Program Assistant, x2355
NECAP Service Center:

6 Guides to Using the 2007 NECAP and NH-Alt Assessment Reports

7 Purpose of the Workshop
Review the different types of NECAP and NH Alternate Assessment reports
Discuss effective ways to analyze and interpret results data
Provide schools and districts an opportunity to share how they have analyzed results data

8 Involvement of Local Educators
Development of Grade Level and Grade Span Expectations (NECAP)
Development of Alternate Achievement Standards Linked to Grade Level Expectations (NH-Alt)
Test Item Review Committees (NECAP)
Bias and Sensitivity Review Committees (NECAP)
Classroom Teacher Judgment Data (NECAP)
Standard Setting Panelists (NECAP and NH-Alt)
Technical Advisory Committee (NECAP and NH-Alt)
NH-Alt Advisory Task Force (NH-Alt)

9 The Family Educational Rights and Privacy Act (FERPA)
Access to individual student results is restricted to:
the student
the student’s parents/guardians
authorized school personnel
Superintendents and principals are responsible for maintaining the privacy and security of all student records. Authorized school personnel shall have access to the records of students to whom they are providing services when such access is required in the performance of their official duties.
FERPA website:

10 Types of NECAP Reports
Student Report (Confidential), with Information for Parents and Report Interpretation Guide
Item Analysis Report (Confidential), school level by student
Results Report (Public), school and district level
Summary Report (Public), school/district/state level
Student Level Data Files (Confidential), Excel/csv files by grade on the district confidential site

11 Student Report

12 Item Analysis Report

13 Results Report

14 Summary Report

15 NH Alternate Assessment Reports
Student Report (Confidential), with Information for Parents and Report Interpretation Guide
Student Roster Report (Confidential), school and district levels by student
District Student Level Data Files (Confidential), Excel/csv files by grade on the district confidential site
Disaggregated Results by Content Area (Public), district and state reported separately for each grade
State Summary Reports (Public): NH-Alt students are included in NECAP summary reports as “NT-Approved” and in Item Analysis Reports with an ‘A’ in the achievement level column

16 Student Reports were mailed in October to SAU offices and schools.

17 (image-only slide)

18 (image-only slide)

19 (image-only slide)

20 NH-Alt Achievement Levels
The names of the achievement levels are the same as in NECAP:
Level 4: Proficient with Distinction
Level 3: Proficient
Level 2: Partially Proficient
Level 1: Substantially Below Proficient
But they mean something a bit different…

21 NH-Alt Achievement Levels
Alternate achievement levels describe student performance in terms of the scoring rubric used to score all portfolios.
Portfolios show student performance in carefully targeted, ‘grade-linked’ academic skills as that performance develops over a full school year.
Each targeted skill is individualized for the student and is reduced in depth, breadth, or complexity from what grade-level peers are learning.

22 NH-Alt: Two Scoring Dimensions
1. The Performance Dimension
Most heavily weighted in the score; includes:
Student Progress (from the start-of-year baseline)
Connections and Access to the General Curriculum (student work samples must show linkage to grade-level content at a reduced level of depth, breadth, and complexity)

23 NH-Alt: Two Scoring Dimensions
2. The Program Dimension
Less heavily weighted in the score; includes:
Generalized Performance (student uses the skill across different settings and situations)
Self-Determination (student attempts to self-direct and monitor work)
Supports (does the level of assistance provided support growth of independence and match student need?)

24 Sample Portfolio Score Calculation

Scoring Sub-Dimension   Base Points (Range 1-4 pts.)   Weight Given   Weighted Sub-Score
Student Progress        4 pts                          x 4            16
Access to Curriculum    2.5 pts                        x 4            10
Skill Generalization    2 pts                          x 3            6
Self-Determination      4 pts                          x 1            4
Supports                4 pts                          x 1            4

Calculate Content Area Raw Score: 16 + 10 + 6 + 4 + 4 = 40
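
To make the slide-24 arithmetic concrete, here is a minimal sketch of the weighted-sum calculation in Python. The sub-dimension names, base points, and weights come from the sample table on the slide; the function name and data layout are illustrative only.

```python
# A minimal sketch of the slide-24 arithmetic. Base points and weights are the
# values shown in the sample table; the function name is illustrative.

def portfolio_raw_score(base_points, weights):
    """Multiply each sub-dimension's base points (range 1-4) by its weight and sum."""
    return sum(base_points[dim] * weights[dim] for dim in base_points)

weights = {
    "Student Progress": 4,
    "Access to Curriculum": 4,
    "Skill Generalization": 3,
    "Self-Determination": 1,
    "Supports": 1,
}
sample_base_points = {
    "Student Progress": 4,
    "Access to Curriculum": 2.5,
    "Skill Generalization": 2,
    "Self-Determination": 4,
    "Supports": 4,
}

print(portfolio_raw_score(sample_base_points, weights))  # 16 + 10 + 6 + 4 + 4 = 40.0
```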

25 Raw Score to Achievement Level Conversion Chart
If the total raw score for the content area falls in the corresponding range, then the achievement level for the content area is:
Level 4: Proficient with Distinction
Level 3: Proficient
Level 2: Partially Proficient
Level 1: Substantially Below Proficient
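
The raw-score ranges in the conversion chart did not survive in this copy of the deck, so the sketch below uses placeholder cut scores purely to illustrate the lookup; only the achievement-level names are from the slide.

```python
# The raw-score ranges were lost from this transcript, so these cut scores are
# PLACEHOLDERS for illustration only; the level names are from the slide.
HYPOTHETICAL_CUTS = [
    (60, "Level 4: Proficient with Distinction"),
    (40, "Level 3: Proficient"),
    (20, "Level 2: Partially Proficient"),
    (0,  "Level 1: Substantially Below Proficient"),
]

def achievement_level(raw_score):
    """Return the first level whose (placeholder) minimum the raw score meets."""
    for minimum, level in HYPOTHETICAL_CUTS:
        if raw_score >= minimum:
            return level

print(achievement_level(40))  # "Level 3: Proficient" under these placeholder cuts
```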

26 Questions and Answers
Questions about NH-Alternate reporting?

27 NECAP and NH-Alt Student Level Data Files
Contain:
All demographic information for each student that was provided by the districts to the state
The scaled score (or raw score for NH-Alt), achievement level, and subscores earned by each student in all content areas tested
NECAP files also contain:
Performance on released items
Student questionnaire responses
Optional reports data
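
As a sketch of how a school might start working with one of these student-level files: the snippet below assumes a csv with an achievement_level column coded 1-4. The file name and column names are assumptions, not the documented format; consult the Report Interpretation Guide for the actual layout.

```python
# A sketch of reading a district student-level data file. The file name and the
# "achievement_level" column (coded 1-4) are assumptions about the layout.
import pandas as pd

df = pd.read_csv("necap_grade5_students.csv")  # hypothetical file name

# Percent of students at proficient or above (levels 3 and 4).
pct_proficient = df["achievement_level"].isin([3, 4]).mean() * 100
print(f"{pct_proficient:.0f}% proficient or above")
```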

28 Accessing Your Confidential Reports
Schools and districts can download multiple reports at once.
This menu lets you choose between viewing NECAP or NH-Alt reports.

29 How Do I Find Public Assessment Reports?
Go to…

30 Click Here

31 Choose Test Year & Reporting Level… then take a look

32 (image-only slide)

33 (image-only slide)

34 (image-only slide)

35 Using Your Data
Three essential questions…
How did we do?
What do the data tell us about parts of our program?
What do the data tell us about parts of our population?
We will begin exploring these questions today by…
Looking at the different school-level reports (group data)
Looking at the Item Analysis Report (primarily individual student data)

36 Essential Question #1 for Interpreting School Performance
How did we do?
…compared to the district
…compared to the state
…compared to our own history (both total school and grade/cohort group)
…compared to what we would have predicted knowing our school’s programs and students

37 Essential Question #2 for Interpreting School Performance
What do the data tell us about parts of our program?
How did we perform across the content areas?
How did we perform in the various sub-content areas?
What does the Item Analysis Report tell us about sub-content areas?
How did our sub-content area and item-level performance compare to the district and state?

38 Essential Question #3 for Interpreting School Performance
What do the data tell us about parts of our population?
How did the various sub-groups perform relative to:
a. the district?
b. the state?
c. the same sub-groups last year?
d. what we would have predicted knowing the population?
How do the percentages of students in the various sub-groups compare to the district and state?
What do the questionnaire data tell us about the sub-populations?

39 Before You Go Any Further
What questions will you answer, and for what audiences?
Based on what you know about your school’s programs and students, and how they have changed, what do you expect to see? (For example, how would a specific year’s 5th graders perform relative to 5th graders from previous years?)
What processes will you use to look at your reports?
Will you look at teaching-year or testing-year reports?
Who should participate in the discussions? How should you group the participants?

40 Looking at the Data
There are many ways to look at reports… To simplify this presentation, we will show only some of the processes you might use.

41 Looking at School-Level Reports Schools can view reports for Testing Year ( )

42 Looking at School-Level Reports Or for Teaching Year ( )

43 Looking at the School-Level Reports 1A and 1B: How did we do compared to the district and the state?

44 Looking at the School-Level Reports 2A: How did we perform across the content areas?

45 Looking at the Results Report – Grade Level Summary
2A: How did we perform across the content areas?
56% of the students in this school scored proficient or above on the grade 5 reading test.
52% of the students in this school scored proficient or above on the grade 5 mathematics test.
43% of the students in this school scored proficient or above on the grade 5 writing test.
Do these data match what we know about the school’s program?

46 Looking at the Results Report – Grade Level Summary
2A: How did we perform in a content area (compared to the district and the state)?
52% of the students in this school scored proficient or above on the grade 5 mathematics test.
48% of the students in this district scored proficient or above on the grade 5 mathematics test.
64% of the students in the state scored proficient or above on the grade 5 mathematics test.
Do these data match what we know about the school’s program?

47 Looking at the Results Report – Content Area Results 1C: How did we do compared to our own history?

48 Looking at the Results Report – Content Area Results
1C: How did we do compared to our own history?
52% of this year’s fifth-grade students scored proficient or above on the mathematics test.
67% of last year’s fifth-grade students scored proficient or above on the mathematics test.
Does this confirm what we know about this year’s cohort of fifth-grade students compared with last year’s? The difference could be due to a cohort effect.

49 Looking at the Results Report – Content Area Results
1C: How did we do compared to our own history?
Cumulative totals provide information on multiple cohorts of students exposed to the program of instruction at a specific grade. This is the better indicator of how we’re doing. Use caution if the program of instruction has changed significantly.

50 Looking at the Results Report – Content Area Results 2B: How did we perform in the various sub-content areas?

51 Looking at the Results Report – Content Area Results
2B: How did we perform in the various sub-content areas?
Total Possible Points includes both common and matrix items (not field-test items). Total Possible Points also represents the test’s balance of representation across sub-content areas.

52 Looking at the Results Report – Content Area Results
Please note: the Total Possible Points column is organized differently on the reading Results Report. 106 possible points are represented sorted by “Type of Text,” and the same 106 possible points are also represented sorted by “Level of Comprehension.”

53 Looking at the School-Level Report 3B: How did the various sub-groups compare to the district and state?

54 Looking at the School-Level Report
3A: How did the various sub-groups perform?
Important note: disaggregated results are not reported for sub-groups of fewer than 10 students.
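
A sketch of honoring the same small-n rule when summarizing your own student-level file follows; the file and column names are assumptions, and the 10-student threshold is the one stated on the slide.

```python
# A sketch of suppressing subgroup results with fewer than 10 students, as the
# slide describes. The file and column names are assumptions about the layout.
import pandas as pd

df = pd.read_csv("necap_grade5_students.csv")  # hypothetical file name

by_group = df.groupby("subgroup").agg(
    n=("achievement_level", "size"),
    pct_proficient=("achievement_level", lambda s: s.isin([3, 4]).mean() * 100),
)
# Suppress results for subgroups with fewer than 10 students.
by_group.loc[by_group["n"] < 10, "pct_proficient"] = None
print(by_group)
```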

55 Looking at the School-Level Report
3A: How did the various sub-groups perform?
Because this is a small school, and so many of the sub-groups are smaller than 10, this part of the report is not as useful. But we can still look at district and state disaggregated results.
13% of the students with an IEP in this district scored proficient or above.
20% of the students with an IEP in the state scored proficient or above.
Do these data match what we know about the district’s program?

56 Looking at the Item Analysis Report This part of the report gives specific information about the released items

57 Looking at the Item Analysis Report

58 Looking at the Item Analysis Report This part of the report represents all of the items used to compute student scores

59 Looking at the Item Analysis Report

60 Looking at the Item Analysis Report This part of the report does not represent all of the items used to compute student scores

61 Looking at the Item Analysis Report This school scored 33 percentage points lower than the state on item 3 – that’s probably significant and certainly worth a closer look. 2D: How did our item-level performance compare to the district and state?

62 Looking at the Item Analysis Report
2D: How did our item-level performance compare to the district and state?
Almost 75% of the students who answered item 3 incorrectly chose option A.
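
The two item-level checks described on slides 61 and 62, flagging large school-versus-state gaps and then tallying which wrong option students chose, can be sketched roughly as follows; all file and column names are assumptions about how you might lay out the exported data.

```python
# A rough sketch of the slide-61/62 checks. File and column names are assumptions.
import pandas as pd

items = pd.read_csv("item_analysis.csv")  # hypothetical: one row per item
items["gap_vs_state"] = items["school_pct_correct"] - items["state_pct_correct"]

# Flag items where the school trails the state badly (like item 3's 33-point gap).
print(items.loc[items["gap_vs_state"] <= -20, ["item", "gap_vs_state"]])

# For a flagged multiple-choice item, tally which option the wrong answers chose.
responses = pd.read_csv("item3_responses.csv")  # hypothetical per-student file
wrong = responses[responses["response"] != responses["key"]]
print(wrong["response"].value_counts(normalize=True))  # e.g., option A near 75%
```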

63 Looking at the Item Analysis Report 2D: How did our item-level performance compare to the district and state? What do we know about this item? This information will help us use the Released Items Support Materials

64 Released Items Documents

65 Using the Released Items Support Materials Consider how the school’s curriculum and instructional practices address this GLE. Consider why so many students might have incorrectly selected option A.

66 Looking at the Item Analysis Report 2C: What does the Item Analysis Report tell us about sub-content areas?

67 Looking at the Item Analysis Report We can see that this school performed a little better than the district and about the same as the state on the “Geometry and Measurement” items throughout the test. 2C: What does the Item Analysis Report tell us about sub-content areas?

68 Looking at the Item Analysis Report Items 6, 11, and 13 all focus on the “Geometry and Measurement” content strand. 2C: What does the Item Analysis Report tell us about sub-content areas?

69 Looking at the Item Analysis Report This school did well on these “Geometry and Measurement” items as compared with the district and state. 2C: What does the Item Analysis Report tell us about sub-content areas?
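
Rolling item results up by content strand, as slides 66-69 do for “Geometry and Measurement,” might look like the following sketch; again, the file and column names are assumptions.

```python
# A sketch of averaging item-level results by content strand. Column names are
# assumptions about the exported item-analysis layout.
import pandas as pd

items = pd.read_csv("item_analysis.csv")  # hypothetical: one row per item
by_strand = items.groupby("content_strand")[
    ["school_pct_correct", "district_pct_correct", "state_pct_correct"]
].mean()
print(by_strand.loc["Geometry and Measurement"])
```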

70 Using the Released Items Documents
Consider why the students were more successful in answering questions related to the “Geometry and Measurement” content strand.
What is different about the way “Geometry and Measurement” is taught?
Can this information apply to areas of mathematics where students are not doing as well?
What curriculum and instructional practices might have contributed to this success?

71 Looking at the Item Analysis Report
2D: How did our item-level performance compare to the district and state?
Over a third of the students received partial credit on item 15.

72 Using the Released Items Support Materials

73 Looking at the Item Analysis Report 1D: How did we do compared to what we would have predicted knowing our school’s students?

74 Looking at the Item Analysis Report 1D: How did we do compared to what we would have predicted knowing our school’s students?

75 Small Group Activity
1. Select at least one of the three essential questions
2. Select your target audience
3. Begin to answer the question by examining your data
4. Note key findings or conclusions
5. Begin to discuss strategies for improvement
6. Be prepared to share your findings with the large group
7. What will you do next? How will you share your findings?

76 Supporting Materials and Resources
Guides to Using the 2007 NECAP & NH-Alt Reports
Companion PowerPoint presentation
Three Essential Questions handout
Grade Level and Grade Span Expectations (within the NH Curriculum Frameworks documents)
Accommodations, Guidelines, and Procedures: Administrator Training Guide
Released Items documents
Preparing Students for NECAP: Tips for Teachers to Share with Students
Practice Tests for each subject at every grade level
Performance Tracker

77 Now that we have collected valuable data, we have partnered with Performance Pathways to help districts and schools access the data; access is via the i4see Workbench.
Performance Tracker
Assessment Builder
Tech Paths (curriculum)

78 So how do I get access…
Visit
1. PD centers are providing hands-on training.
2. TIP 16 on the i4see home page will describe how to request a user ID.
3. Under Recent Highlights you will find a link to a timeline identifying monthly i4see training sessions.

79 Performance Pathways Provides Access to Assessment Information…

80 So how should we be using Performance Tracker…
Yes, you should…
Use Performance Tracker to learn more about your students’ performance in relation to the GLEs & GSEs
Look for trends over time rather than one-time snapshots
Look at item-level results and specific test items to better understand test terminology and student thinking
Define student groups to understand the success of specific programs
Compare across student groups and subgroups within your schools and within your district to understand curriculum and instructional strengths, weaknesses & needs
Please be cautious…
Performance Tracker is not meant to recreate AYP results
Not all correlations are statistically significant
Watch out for percentages: keep an eye on the number of students represented by the reported results, and don’t jump to conclusions if 75% only represents three students
Remember, NECAP is only one indicator

81 Conclusion “Not everything that can be counted counts, and not everything that counts can be counted.” ~ Albert Einstein