
1 Michael Luke michael.luke@doe.state.nj.us 609-984-9637

2 Basic Definitions
Test Design 2010
Item Alignment
Calculator Usage
Item Review (Subjective), Field Testing, Rangefinding (Rubric Development), and Item Review (Objective)
Raw Cuts 2008/2009
Results 2008/2009
Questions

3 Basic Definitions
MC: multiple choice item, scored 0/1, 1.5 minutes response time
SCR: short constructed response item, scored 0/1, 2 minutes response time
ECR: extended constructed response item, scored 0/1/2/3, 10 minutes response time
Standards, cluster, strand, cumulative progress indicator (CPI): see sample item #1

4 Test Design 2010 - Grades 6, 7, & 8
Part (of 6) | Number of Items | Timing
Part 1 | 10 SCR | 20 minutes
Part 2 | 8 MC, 1 ECR | 22 minutes
Part 3 | 8 MC, 1 ECR | 22 minutes
Part 4 | 10 MC, 1 ECR | 25 minutes
Part 5 | 10 MC, 1 ECR | 25 minutes
Part 6 | 6 MC, 1 ECR | 19 minutes

5 Test Design Comparison
Item Type | 2009 | 2010
SCR | 8/10 | 10
MC | 45 | 42
ECR | 4 | 5

6 Test Design 2010 - Grades 6, 7 & 8
Test Length: Grades 6 and 7 - day 1: 64 minutes, day 2: 69 minutes; Grade 8 - 133 minutes (+5)
Each assessment contains embedded field test items; 49 operational score points (last year: 50, 52)
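
To make the arithmetic behind slides 4-6 concrete, here is a minimal sketch (not part of the presentation) that tallies items, raw score points, and part timings from the 2010 design; the point values per item type come from slide 3.

```python
# Sketch: tally the 2010 test design from slide 4 and check it against
# the totals on slides 5 and 6. Point values per item type are from slide 3
# (MC and SCR = 1 point each, ECR = up to 3 points).

POINTS = {"MC": 1, "SCR": 1, "ECR": 3}

# (part, {item type: count}, minutes) as listed on slide 4
PARTS = [
    (1, {"SCR": 10}, 20),
    (2, {"MC": 8, "ECR": 1}, 22),
    (3, {"MC": 8, "ECR": 1}, 22),
    (4, {"MC": 10, "ECR": 1}, 25),
    (5, {"MC": 10, "ECR": 1}, 25),
    (6, {"MC": 6, "ECR": 1}, 19),
]

total_minutes = sum(minutes for _, _, minutes in PARTS)
total_items = {t: sum(counts.get(t, 0) for _, counts, _ in PARTS) for t in POINTS}
total_points = sum(POINTS[t] * n for t, n in total_items.items())

print(total_minutes)  # 133 -- matches the grade 8 single-day length on slide 6
print(total_items)    # {'MC': 42, 'SCR': 10, 'ECR': 5} -- matches the 2010 column on slide 5
print(total_points)   # 67 raw points on the form; 49 of them are operational (slide 6)
```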

7 Item Alignment 2010
Previously, items representing 12/13 raw score points were aligned to each cluster. For 2010, raw score points per cluster are proportional to the number of "bold" CPIs in the Areas of Focus documents: http://www.nj.gov/education/aps/cccs/math/njscp.htm
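
The slide does not say how the proportional split is computed. As an illustration only, here is a sketch of one standard way to turn CPI counts into whole-number point targets (largest-remainder apportionment); the CPI counts below are hypothetical placeholders, not values from the Areas of Focus documents.

```python
# Illustration only: apportion a fixed number of raw score points across clusters
# in proportion to each cluster's count of "bold" CPIs, using the largest-remainder
# method. The CPI counts below are made-up placeholders, not NJ DOE data.

def apportion_points(cpi_counts, total_points):
    total_cpis = sum(cpi_counts.values())
    quotas = {c: total_points * n / total_cpis for c, n in cpi_counts.items()}
    points = {c: int(q) for c, q in quotas.items()}  # floor of each quota
    leftover = total_points - sum(points.values())
    # hand the remaining points to the largest fractional remainders
    for c in sorted(quotas, key=lambda c: quotas[c] - points[c], reverse=True)[:leftover]:
        points[c] += 1
    return points

hypothetical_bold_cpis = {"Cluster 1": 8, "Cluster 2": 10, "Cluster 3": 6, "Cluster 4": 6}
print(apportion_points(hypothetical_bold_cpis, 49))
# e.g. {'Cluster 1': 13, 'Cluster 2': 16, 'Cluster 3': 10, 'Cluster 4': 10}
```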

8 Item Alignment 2010 - Points by Cluster
Cluster | Grade 6 | Grade 7 | Grade 8
1 | 13 | 13 | 13
2 | 17 | 13 | 17
3 | 9 | 12 | 11
4 | 10 | 11 | 8
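
As a quick check (a sketch, not part of the presentation), each grade's cluster points in the table above should sum to the 49 operational score points stated on slide 6:

```python
# Sketch: verify that the per-cluster point targets on slide 8 sum to the
# 49 operational score points from slide 6, for every grade.

points_by_cluster = {
    # cluster: (grade 6, grade 7, grade 8)
    1: (13, 13, 13),
    2: (17, 13, 17),
    3: (9, 12, 11),
    4: (10, 11, 8),
}

for i, grade in enumerate((6, 7, 8)):
    total = sum(row[i] for row in points_by_cluster.values())
    print(f"Grade {grade}: {total} points")  # 49 for each grade
```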

9 Calculator Usage 2009 v. 2010
2009: Grade 6, Part 1 (SCR items) only was noncalculator; Grades 7 and 8, entire assessment was calculator active
2010: Grades 6, 7, and 8, Parts 1, 2, and 3 are noncalculator; 23 raw points from noncalculator items (47%), 26 raw points from calculator-active items (53%)
See memos
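
A minimal sketch (not from the presentation) of the 2010 percentage arithmetic on this slide, assuming the 49 operational points from slide 6 as the denominator:

```python
# Sketch of the slide 9 arithmetic: 23 noncalculator points and 26 calculator-active
# points out of 49 operational points (slide 6).

noncalc_points = 23
calc_points = 26
total = noncalc_points + calc_points  # 49 operational points

print(f"noncalculator: {noncalc_points / total:.0%}")   # 47%
print(f"calculator active: {calc_points / total:.0%}")  # 53%
```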

10 Item/Form Development
Item Review (Subjective)
Field Testing
Rangefinding (Scoring)
Item Review (Objective)
Statistics Review

11 Form Construction The form is then constructed using item statistics, making sure the aligned items meet the required cluster representation (see earlier slide) and item type counts (see earlier slide). Items of various difficulties are used; however, an attempt is made to replicate the overall form difficulty of the previous year.
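
The slide describes form construction only at a high level. As an illustration under assumed data structures (an item record with cluster, type, point value, and a difficulty statistic, none of which come from the presentation), here is a sketch of the kind of checks it describes:

```python
# Illustration only: check that a candidate form meets per-cluster point targets
# and item type counts, and that its average difficulty is close to last year's form.
# The Item fields, target values, and tolerance are assumptions for this sketch.

from dataclasses import dataclass

@dataclass
class Item:
    cluster: int     # 1-4
    item_type: str   # "MC", "SCR", or "ECR"
    points: int      # 1 for MC/SCR, 3 for ECR
    p_value: float   # field test proportion correct (difficulty statistic)

def check_form(items, cluster_targets, type_targets, last_year_difficulty, tolerance=0.02):
    cluster_points = {c: 0 for c in cluster_targets}
    type_counts = {t: 0 for t in type_targets}
    for it in items:
        cluster_points[it.cluster] += it.points
        type_counts[it.item_type] += 1
    mean_difficulty = sum(it.p_value for it in items) / len(items)
    return (cluster_points == cluster_targets
            and type_counts == type_targets
            and abs(mean_difficulty - last_year_difficulty) <= tolerance)
```

In practice the targets would be the cluster points on slide 8 and the item type counts from slide 4.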

12 Raw Cuts 2008/2009 Teacher Panel Established Using Bookmark Method (Summer 2008)
Grade | 2008-P (200) | 2009-P | 2008-AP (250) | 2009-AP
6 (out of 50) | 25 | 26 | 41 | 40
7 (out of 52) | 27 | 27 | 42 | 41
8 (out of 52) | 29 | 28 | 43 | 42
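
As an illustration only (not from the presentation), a sketch of how raw cuts like these map a raw score to a performance level, using the 2009 cuts from the table above; the classification rule (a score at or above the cut reaches the level) is an assumption for the sketch.

```python
# Sketch: classify a raw score against Proficient (P) and Advanced Proficient (AP)
# raw cuts. The 2009 cuts are the values from slide 12; the "at or above the cut"
# rule and the below-proficient label are assumptions for illustration.

CUTS_2009 = {
    # grade: (P cut, AP cut, max raw score)
    6: (26, 40, 50),
    7: (27, 41, 52),
    8: (28, 42, 52),
}

def performance_level(grade, raw_score):
    p_cut, ap_cut, max_score = CUTS_2009[grade]
    if not 0 <= raw_score <= max_score:
        raise ValueError("raw score out of range")
    if raw_score >= ap_cut:
        return "Advanced Proficient"
    if raw_score >= p_cut:
        return "Proficient"
    return "Partially Proficient"

print(performance_level(6, 33))  # Proficient
print(performance_level(8, 45))  # Advanced Proficient
```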

13 Results 2008/2009 (Percents of Students in Categories)
Grade | Proficient 2008 | Proficient 2009 | Advanced Proficient 2008 | Advanced Proficient 2009
6 | 51.7 | 45.7 | 20.1 | 26.1
7 | 44.3 | 42.5 | 19.9 | 24.2
8 | 42.4 | 42.6 | 24.8 | 29.9

14 Results 2008/2009 - Sums of Categories (P + AP Percentages)
Grade | 2008 | 2009
6 | 71.8 | 71.8
7 | 64.2 | 66.7
8 | 67.2 | 72.5
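
A quick sketch (not from the presentation) showing that the P + AP totals on this slide follow directly from the percentages on slide 13:

```python
# Sketch: add the Proficient and Advanced Proficient percentages from slide 13
# to reproduce the P + AP totals on slide 14.

results = {
    # grade: (P 2008, P 2009, AP 2008, AP 2009)
    6: (51.7, 45.7, 20.1, 26.1),
    7: (44.3, 42.5, 19.9, 24.2),
    8: (42.4, 42.6, 24.8, 29.9),
}

for grade, (p08, p09, ap08, ap09) in results.items():
    print(f"Grade {grade}: 2008 = {p08 + ap08:.1f}, 2009 = {p09 + ap09:.1f}")
# Grade 6: 2008 = 71.8, 2009 = 71.8
# Grade 7: 2008 = 64.2, 2009 = 66.7
# Grade 8: 2008 = 67.2, 2009 = 72.5
```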

15 Testing Misconceptions On the sixth grade 2008 ASK, four of the five easiest points to earn were the first point in an ECR item (scaffolding), and the two toughest points to earn came from MC items.

16 New Eighth Grade Reference Sheet for 2010 See http://www.nj.gov/education/assessment/ms/5-8/ref/math/G8MathRef09.pdf

17 Questions? michael.luke@doe.state.nj.us 609-984-9637

