Golden Math Nuggets: Digging into Assessment Data to Improve Instruction in Math James McBride, Renaissance Learning R. James Milgram, Stanford University.


1 Golden Math Nuggets: Digging into Assessment Data to Improve Instruction in Math. James McBride, Renaissance Learning; R. James Milgram, Stanford University; Michael Gallagher, North Carolina DPI; Elliot Asp, Cherry Creek Public Schools, CO

2 The Common Core Math Standards – How American Students Measure Up: Three Years of Data. Presented at the CCSSO National Conference on Student Assessment. James R. McBride, Renaissance Learning, June 20, 2013

3 Outline
-- Background and Overview
-- Methods
-- Aggregate Results
-- Interpretation: Professor Milgram

4 Background Since Spring 2011, Renaissance Learning has conducted a program of research to provide advance information about the status of US students relative to the Common Core Math Standards. This paper, and that of Professor Milgram, present an update on the most recent year of the project and a brief summary of findings and interpretations.

5 Background and Overview Since 2008, Renaissance Learning has developed and field-tested thousands of new STAR Math items measuring more than 550 standards-based skills, and has calibrated them using the Rasch model. Thousands of those items have been aligned to the new Common Core Math Standards. Items aligned to the Common Core were selected for use in a research program designed to provide an early appraisal of U.S. students' proficiency on some of the Common Core Math Standards. What follows is a summary of the design of that research, as well as findings from three annual evaluations of mastery of those items.
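The Rasch calibration mentioned above models each response with one ability parameter per student and one difficulty parameter per item; a minimal sketch of the response model (the ability and difficulty values below are illustrative, not estimates from this study):

```python
import math

def rasch_p_correct(theta: float, b: float) -> float:
    """Rasch model: probability that a student with ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative values only: a student of average ability (theta = 0)
# has a 50% chance on an item of average difficulty (b = 0),
# and a better chance on an easier item (lower b).
p_avg = rasch_p_correct(0.0, 0.0)    # 0.5
p_easy = rasch_p_correct(0.0, -1.0)  # about 0.73
```

Calibration fits the item difficulties b so that observed response patterns best match these probabilities, which is what puts all the field-tested items on a common scale.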

6 Methods The study embeds more than 100 Common Core-aligned items as "experimental items" in STAR Math; the items are randomly chosen for administration to the universe of students taking STAR Math on the Renaissance Place RealTime™ platform. Response data from that platform are available to Renaissance Learning for research use.

7 Test Items
2011, Spring and Fall:
-- 52 objectives
-- 105 CCSS-aligned STAR Math items
-- 2 to 15 items per grade, 2 or more per objective
2012, Spring and Fall:
-- 7 standards, 14 items added: 119 items in all
2013, Spring:
-- item set revamped
-- 44 objectives, 110 items in all
-- 72 new or revised; 38 carried over

8 2012 Item Counts by Domain and Grade Level

Domain                        1   2   3   4   5   6   7   8   9  10  Total
Algebra                       2   0   0   0   2   0   2   2   4   2     14
Data Analysis & Statistics    0   2   0   0   0   0   0   0   0   0      2
Geometry & Measurement        2   0   0   2   5   2   0   4   0   0     15
Numbers & Operations         14   8  16  13   7  10  12   6   2   0     88
Total                        18  10  16  15  14  12  14  12   6   2    119

9 2013 Item Counts by Domain and Grade Level

Domain                        1   2   3   4   5   6   7   8  Total
Algebra                       2   0   0   0   2   0   0   0      4
Data Analysis & Statistics    0   0   0   0   0   0   0   0      0
Geometry & Measurement        0   0   0   0   5   0   0   4      9
Numbers & Operations          2  19  10  27  15  10   ?   ?     97
Total                         4  19  10  27  22  10   ?   ?    110

10 Students
All students taking STAR Math on the RP RealTime™ platform took one or more unscored CCSS-aligned test items on grade level:
-- random assignment of items to students
-- items embedded in random positions
Most students took 1 or 2 Common Core-aligned items. Each item was administered to students in its target grade, as well as the next higher grade.
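The random-embedding design described above can be sketched as follows; the function name, item IDs, and test length are hypothetical illustrations, not the actual Renaissance Place implementation:

```python
import random

def embed_experimental_items(test_length, on_grade_pool, n_items=2):
    """Choose n_items unscored CCSS-aligned items from the student's
    on-grade pool and assign each a random position within the test."""
    items = random.sample(on_grade_pool, k=n_items)
    positions = sorted(random.sample(range(test_length), k=n_items))
    return list(zip(positions, items))

# Hypothetical pool of on-grade experimental item IDs.
pool = ["ccss_g4_01", "ccss_g4_02", "ccss_g4_03", "ccss_g4_04"]
plan = embed_experimental_items(test_length=34, on_grade_pool=pool)
```

Because both the item choice and the positions are randomized per student, each experimental item accumulates responses from an effectively random sample of the testing population.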

11 Spring Data Collection
2011: Data from May and June 2011 -- more than 200,000 students
2012: Data from mid-April to mid-May 2012 -- more than 450,000 students
2013: Data from early May 2013 -- more than 200,000 students

12 Outcome Variables Percent correct was calculated for each Common Core-aligned item; on-grade percent correct was the variable of primary interest. Other item statistics, including distractor choice percentages, item-score correlations, and Rasch difficulty parameters, were calculated but are not reported here. Professor Milgram's presentation will address some of those data.
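The two statistics named above can be computed directly from 0/1 item responses; a minimal sketch with hypothetical toy data, taking the item-score correlation to be the usual point-biserial, i.e. the Pearson correlation between item score and total score:

```python
def percent_correct(responses):
    """Percent of students answering an item correctly (responses coded 0/1)."""
    return 100.0 * sum(responses) / len(responses)

def item_score_correlation(item, totals):
    """Pearson correlation between 0/1 item scores and total test scores."""
    n = len(item)
    mi, mt = sum(item) / n, sum(totals) / n
    cov = sum((x - mi) * (y - mt) for x, y in zip(item, totals)) / n
    sd_i = (sum((x - mi) ** 2 for x in item) / n) ** 0.5
    sd_t = (sum((y - mt) ** 2 for y in totals) / n) ** 0.5
    return cov / (sd_i * sd_t)

# Hypothetical toy data: five students' scores on one item and their totals.
item = [1, 0, 1, 1, 0]
totals = [30, 12, 25, 28, 15]
pc = percent_correct(item)                # 60.0
r = item_score_correlation(item, totals)  # positive: stronger students got it right
```

A high item-score correlation is one sign that an item discriminates well between stronger and weaker students, which is why it is tracked alongside percent correct.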

13 Selected 2013 Results

14-20 (results charts; no slide text)

21 Interpretation
-- Overall percent correct increased somewhat each year from 2011 to 2013, both overall and for the 30 common items.
-- Substantial differences grade to grade, possibly due to:
- objectives selected at each grade
- differences in CCSS difficulty by grade

22
-- Steady decline from grade 4 to 8 gives pause. What does it signify?
- Difficulties ahead?
- Differences between 2010-2013 curricula and the CCSS?
- Is it attributable to curriculum? Instruction? Teacher preparation? More than one of these?

23 (chart; no slide text)

24
- Percent correct did increase from Fall 2011 to Spring 2012
-- Largest changes: grades 1 to 3
-- Smaller changes: grades 4 to 6
-- Least change: grades 7 to 10
- What does this suggest?
-- Closer alignment of current curricula to the CCSS in the lower grades?
-- More effective instruction there?
-- Do younger kids just grow faster?

25 Next Steps
-- Technical report on the 2010-13 study
-- Replicate for 2013-14 with expanded scope
- Assess Fall-Spring 2013-14 growth

26 Questions For further information: james.mcbride@renlearn.com


