Presentation on theme: "Join the conversation: todaysmeet.com/SBAC"— Presentation transcript:

1 Join the conversation: todaysmeet.com/SBAC
Assessment Update: June 2015 Join the conversation: todaysmeet.com/SBAC

2

3

4 "Reconceptualizing assessment without at the same time reconceptualizing instruction will have little benefit." (Pullin, 2008)
This quote is a good reminder that the most effective way to support student success is to continue to focus on instruction. We are in the midst of a great transition in public education: new standards and new assessments. As we look at the details of this transition today, let's keep this quote at the forefront of our thinking. Instruction will make the difference.

5 Required State Data for Each of Eight State Priority Areas

6 February 2015 Draft LCAP Evaluation Rubric
Old accountability vs. new:
• Old: focused on assessment
• New: broader meaning of what successful schools and districts look like; focus on college and career readiness
[Speaker notes: API template; graphic of the Alberta model and the recent LCAP template]

7

8 Implementation Timeline for
New State Assessments and Accountability

9

10 English Language Arts and Mathematics; Science and History/Social Studies; English Language Development

11 Processes and tools used by the teacher during instruction
• Frequent and of short duration
• Provides corrective feedback; modifies
• Tasks individualized to students' needs
• Embedded within the learning activity
• Reporting outside classroom may/may not be necessary

15

16

17

18 Agenda
• Principles of Scoring
• Understanding the Reports
• Using the Online Reporting System
• Overview of the Reporting Timeline
• Interpreting, Using, and Communicating Results

19 Principles of Scoring

20 Computer Adaptive Testing: Philosophy
“Computer adaptive testing (CAT) holds the potential for more customized assessment with test questions that are tailored to the students’ ability levels, and identification of students’ skills and weaknesses using fewer questions and requiring less testing time.” Shorr, P. W. (2002, Spring). A look at tools for assessment and accountability. Administrator Magazine.

21 How Does a CAT Work?
• Each student is administered a set of test questions that is appropriately challenging.
• The student's performance on the test questions determines if subsequent questions are harder or easier.
• The test adapts to the student item by item, not in stages.
• Fewer test questions are needed, as compared to a fixed form, to obtain precise estimates of students' ability.
• The test continues until the test content outlined in the blueprint is covered.
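The item-by-item adaptation described above can be sketched in a few lines of Python. This is an illustrative simplification, not the operational Smarter Balanced algorithm: the item pool, the fixed step-size update, and the fixed-length stopping rule are all invented for the example (a real CAT continues until the blueprint is covered and uses a statistical ability estimate).

```python
def run_cat(pool, answer_fn, test_length=10):
    """Sketch of an item-by-item adaptive test.

    pool:      list of (item_id, difficulty) pairs calibrated on a common scale
    answer_fn: answer_fn(item_id) -> True if the student answers correctly
    """
    theta = 0.0      # running ability estimate; start at average
    step = 1.0       # how far the estimate moves after each response
    administered = []
    for _ in range(test_length):
        # Select the unused item whose difficulty is closest to the
        # current estimate, i.e., the most informative next question.
        unused = [it for it in pool if it[0] not in administered]
        item_id, difficulty = min(unused, key=lambda it: abs(it[1] - theta))
        administered.append(item_id)
        if answer_fn(item_id):
            theta += step    # correct answer -> next item harder
        else:
            theta -= step    # wrong answer -> next item easier
        step = max(step * 0.7, 0.2)  # shrink steps as precision grows
    return theta, administered
```

Simulating a student of roughly average ability shows the test homing in on mid-difficulty items, as in the example on the next slide.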

22 How Does a CAT Work? Example: A Student of Average Ability
[Chart: difficulty of each of 10 test questions plotted against ability bands from Expanded Very Low through Low, Med-Low, Medium, Med-High, and High to Expanded Very High. Answers for questions 1–10 (R = right, W = wrong): R R W R W W W W R R]

23 Computer Adaptive Testing: Behind the Scenes
• Requires a large pool of test questions statistically calibrated on a common scale with ability estimates (e.g., from the Field Test)
• Uses an algorithm to select questions based on a student's responses, to score responses, and to iteratively estimate the student's performance
• Final scale scores are based on item pattern scoring

24 Scoring the CAT
• As a student progresses through the test, his or her pattern of responses is tracked, and revised estimates of the student's ability are calculated.
• Successive test questions are selected to increase the precision of the achievement estimate, given the current estimate of his or her ability.
• Resulting scores from the CAT portion of the test are based on the specific test questions selected as a result of the student's responses, NOT on the sum of the number answered correctly.
• The test question pools for a particular grade level are designed to include an enhanced pool of test questions that are more or less difficult for that grade but still match the test blueprint for that grade.
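The phrase "NOT the sum of the number answered correctly" is the heart of item pattern scoring, and a toy item response theory (IRT) estimate makes it concrete. The sketch below uses a simple Rasch (1PL) model fit with Newton's method; the model choice and the item difficulties are my own illustration, not the Smarter Balanced scoring specification. Because different students see different item sets, two students with the same number correct can receive different ability estimates.

```python
import math

def rasch_mle(difficulties, responses):
    """Maximum-likelihood ability estimate under a Rasch (1PL) model.

    difficulties: difficulty of each administered item (common scale)
    responses:    1 = correct, 0 = incorrect, same order as difficulties
    """
    theta = 0.0
    for _ in range(50):
        # P(correct) for each item at the current ability estimate
        p = [1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties]
        grad = sum(r - pi for r, pi in zip(responses, p))
        info = sum(pi * (1.0 - pi) for pi in p)  # test information
        delta = grad / info                      # Newton step
        theta += delta
        if abs(delta) < 1e-8:
            break
    return theta

# Two students each answer 2 of 3 items correctly, but the student who
# was routed to harder items earns the higher ability estimate.
easy_set = rasch_mle([-1.5, -1.0, -0.5], [1, 1, 0])
hard_set = rasch_mle([0.5, 1.0, 1.5], [1, 1, 0])
```

As the Final Scoring slide notes, students who took the same questions and gave the same responses would receive identical estimates.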

25 Human-Scored Items in the CAT
• Some items administered in the Smarter Balanced adaptive test component require human scoring.
• The adaptive algorithm selects these items based on performance on prior items.
• Because these items cannot be scored in real time, performance on them does not affect later item selection.

26 Performance Tasks (PTs)
• In every Smarter Balanced test, a PT with a set of stimuli on a given topic is administered in addition to the CAT.
• PTs are administered at the classroom/group level, so they are not targeted to students' specific ability levels.
• The items associated with the PTs may be scored by machine or by human raters.

27 Final Scoring
• For each student, the responses from the PT and CAT portions are merged for final scoring.
• Resulting ability estimates are based on the specific test questions that a student answered, not the total number of items answered correctly.
• Higher ability estimates are associated with test takers who correctly answer difficult and more discriminating items; lower ability estimates are associated with test takers who correctly answer easier and less discriminating items.
• Two students will have the same ability estimate if they have the same set of test questions with the same responses. It is possible for students to have the same ability estimate through different response patterns.
• This type of scoring is called "item pattern scoring."

28 Final Scoring: Contribution of CAT and PT Sections
Number of items defined by test blueprints:

            ELA/Literacy          Mathematics
Grade       CAT       PT          CAT       PT
3–5         38–41     5–6         31–34     2–6
6–8         37–42                 30–34
11          39–41                 33–36

29 Final Scoring: Contribution of CAT and PT Sections (cont.)
• Based on the test blueprint, the CAT section is emphasized because there are more CAT items/points than PT items/points.
• Claims with more items/points are emphasized:
  Mathematics: Concepts and Procedures > Problem Solving/Modeling and Data Analysis > Communicating Reasoning
  ELA: Reading > Writing > Speaking/Listening > Research
• Because scores are based on pattern scoring, groups of items that are more difficult and discriminating make a larger contribution to final scores. Therefore, there is no specific weight associated with either the PT or CAT section.

30 Final Scoring: Mapping
After the student's overall ability is estimated, it is mapped onto the reporting scale through a linear transformation:
Scaled Score = slope × ability estimate + intercept, with separate constants for mathematics and ELA.
Scores are limited by the grade-level lowest and highest obtainable scaled score.

31 Properties of the Reporting Scale
• Scores are on a vertical scale: expressed on a single continuum for a content area, which allows users to describe student growth over time across grade levels.
• Scale score ranges: ELA/Literacy 2114–2795; Mathematics 2189–2862.
• For each grade level and content area, there is a separate scale score range.

32 Smarter Balanced Scale Score Ranges by Grade Level

            ELA                  Mathematics
Grade       Min       Max        Min       Max
3           2114      2623       2189      2621
4           2131      2663       2204      2659
5           2201      2701       2219      2700
6           2210      2724       2235      2748
7           2258      2745       2250      2778
8           2288      2769       2265      2802
11          2299      2795       2280      2862

Copyright © 2009 Educational Testing Service.

33 Achievement Levels
Achievement level classifications are based on overall scores:
• Level 1—Standard Not Met
• Level 2—Standard Nearly Met
• Level 3—Standard Met
• Level 4—Standard Exceeded

34 Achievement Levels by Grade

35 Smarter Balanced Scale Score Ranges for ELA/Literacy

Grade    Level 1      Level 2      Level 3      Level 4
3        2114–2366    2367–2431    2432–2489    2490–2623
4        2131–2415    2416–2472    2473–2532    2533–2663
5        2201–2441    2442–2501    2502–2581    2582–2701
6        2210–2456    2457–2530    2531–2617    2618–2724
7        2258–2478    2479–2551    2552–2648    2649–2745
8        2288–2486    2487–2566    2567–2667    2668–2769
11       2299–2492    2493–2582    2583–2681    2682–2795
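The ELA/Literacy cut ranges translate directly into a lookup. The numbers below are copied from the table; only the function wrapped around them is my own sketch.

```python
# ELA/Literacy achievement-level cuts by grade, taken from the table:
# grade -> (Level 1 min, Level 2 min, Level 3 min, Level 4 min, max)
ELA_CUTS = {
    3:  (2114, 2367, 2432, 2490, 2623),
    4:  (2131, 2416, 2473, 2533, 2663),
    5:  (2201, 2442, 2502, 2582, 2701),
    6:  (2210, 2457, 2531, 2618, 2724),
    7:  (2258, 2479, 2552, 2649, 2745),
    8:  (2288, 2487, 2567, 2668, 2769),
    11: (2299, 2493, 2583, 2682, 2795),
}

LABELS = ["Standard Not Met", "Standard Nearly Met",
          "Standard Met", "Standard Exceeded"]

def ela_achievement_level(grade, scaled_score):
    """Return the achievement-level label for an ELA/Literacy scaled score."""
    lo1, lo2, lo3, lo4, hi = ELA_CUTS[grade]
    if not lo1 <= scaled_score <= hi:
        raise ValueError("score outside obtainable range for this grade")
    if scaled_score >= lo4:
        return LABELS[3]
    if scaled_score >= lo3:
        return LABELS[2]
    if scaled_score >= lo2:
        return LABELS[1]
    return LABELS[0]
```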

36 Achievement Levels by Grade

37 Smarter Balanced Scale Score Ranges for Mathematics

Grade    Level 1      Level 2      Level 3      Level 4
3        2189–2380    2381–2435    2436–2500    2501–2621
4        2204–2410    2411–2484    2485–2548    2549–2659
5        2219–2454    2455–2527    2528–2578    2579–2700
6        2235–2472    2473–2551    2552–2609    2610–2748
7        2250–2483    2484–2566    2567–2634    2635–2778
8        2265–2503    2504–2585    2586–2652    2653–2802
11       2280–2542    2543–2627    2628–2717    2718–2862

38 Measurement Precision: Error Bands
• Each scale score estimated for a student has measurement error associated with it.
• An error band is a useful tool that describes the measurement error associated with a reported scale score.
• Error bands are used to construct an interval estimate corresponding to a student's true ability/proficiency for a particular content area with a certain level of confidence.
• The error bands used to construct interval estimates are based on one standard error of measurement: if the same test were given to a student multiple times, the student would score within this band about 68 percent of the time.
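A one-standard-error band is simple to compute: the score plus or minus one SEM, kept inside the obtainable score range. The SEM of 25 points in the example is hypothetical; actual SEMs vary by student and test.

```python
def error_band(scaled_score, sem, lowest, highest):
    """One-standard-error band around a reported scaled score, truncated
    to the obtainable score range. On repeated testing, a student's
    score would land inside this band about 68 percent of the time."""
    return (max(lowest, scaled_score - sem),
            min(highest, scaled_score + sem))

# Hypothetical grade 3 ELA examples (SEM of 25 points is illustrative):
mid = error_band(2450, 25, 2114, 2623)   # band well inside the range
top = error_band(2610, 25, 2114, 2623)   # upper end truncated at 2623
```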

39 Achievement Levels for Claims
• Achievement levels for claims are very similar to subscores: they provide supplemental information regarding a student's strengths or weaknesses.
• No achievement level setting occurred for claims.
• Only three achievement levels were developed for claims, since there are fewer items within each claim.
• Achievement levels for claims are based on the distance of a student's performance on the claim from the Level 3 proficiency cut.
• A student must complete all items within a claim to receive an estimate of his or her performance on that claim.

40 Achievement Levels for Claims (2)
A student's ability, along with the corresponding standard error, is estimated for each claim. The student's ability estimate for the claim is compared to the Level 3 proficiency cut. A difference from the cut greater than a set number of the claim's standard errors indicates a strength or weakness.

41 Achievement Levels for Claims (3)
At/Near Standard Below Standard

42 Achievement Levels for Claims (4)
Above Standard
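The three claim levels listed on the slides above reduce to comparing the claim score's distance from the Level 3 cut against a multiple of its standard error. The transcript omits the exact multiplier, so `k` below is a placeholder parameter; the example scores and standard error are hypothetical, while the 2432 cut is the grade 3 ELA Level 3 minimum from the earlier table.

```python
def claim_level(claim_score, level3_cut, se, k=1.5):
    """Classify a claim score relative to the Level 3 proficiency cut.

    k is a placeholder multiple of the claim's standard error; the
    operational threshold is not given in this deck.
    """
    if claim_score - level3_cut > k * se:
        return "Above Standard"
    if level3_cut - claim_score > k * se:
        return "Below Standard"
    return "At/Near Standard"

# Hypothetical claim scores against the grade 3 ELA Level 3 cut of 2432:
strength = claim_level(2520, 2432, se=30)   # far above the cut
typical = claim_level(2440, 2432, se=30)    # within the uncertainty band
weakness = claim_level(2350, 2432, se=30)   # far below the cut
```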

43 Understanding the Reports

44 Available Reports

Secure:
• Preliminary student test results: ORS
• Preliminary and partial aggregate test results: ORS
• Student Score Reports (ISRs): TOMS
• Final student data: TOMS

Public:
• Smarter Balanced ELA and mathematics: CDE CAASPP Web page
• CST/CMA/CAPA for Science and STS for RLA: CDE CAASPP Web page

45 Secure Reporting

Report                                  LEA              School*         Parent
Preliminary Student Data                ORS              ORS
Preliminary Aggregate Data              ORS              ORS
Final Student Score Reports (ISRs),
  PDF/paper                             TOMS†/Paper††    TOMS†/Paper     Paper
Final Student Data File                 TOMS

* Access to ORS will be granted to CAASPP Test Site Coordinators in August.
† PDFs of the Student Score Reports will be available in TOMS.
†† LEAs must forward or mail the copy of the CAASPP Student Score Report to each student's parent/guardian within 20 working days of its delivery to the LEA.

46 Preliminary Test Results: Student and Aggregate
• Through the Online Reporting System (ORS)
• Available approximately three to four weeks after the student completes both parts (CAT and PT) of a content area
• Added daily
Use caution:
• The results are partial and may not be a good representation of the school's or district's final aggregate results.
• The results are preliminary; the processing of appeals may result in score changes.

47 Student Score Reports (ISR): Overview
• One-page report
  • Double-sided: all Smarter Balanced; CAPA for Science
  • Single-sided: CST/CMA for Science (grade 10); STS for RLA
• Student's final CAASPP test results
• Reports progress toward the state's academic content standards
• Indicates areas of focus to help students' achievement and improve educational programs
• LEA distributes to parents/guardians

48 Student Score Reports: Shipments to LEAs
Two copies of each student's Student Score Report: one for the parent/guardian, one for the school site.
Shipment letters: 2015 LEA CAASPP Reports Shipment Letter; 2015 School CAASPP Reports Shipment Letter.
Note: Per California Code of Regulations, Title 5, Section 863, LEAs must forward one copy to the parent/guardian within 20 business days. Schools may file the copy they receive, or they may give it to the student's current teacher or counselor. If the LEA receives the reports after the last day of instruction, the LEA must mail the pupil results to the parent or guardian at their last known address. If the report is non-deliverable, the LEA must make the report available to the parent or guardian during the next school year.

49 Test Results Reported on the Student Score Reports
• For students who took Smarter Balanced ELA and mathematics and CST, CMA, or CAPA for Science: grades 3, 4, 5, 6, 7, 8, 10, and 11
• For students who took STS RLA: grades 2, 3, 4, 5, 6, 7, 8, 9, 10, and 11

50 Elements of the Student Score Report
[Annotated sample report: elements 1–4 called out on the front page; elements 5–8 on the back page]

51 Elements of the Student Score Report
Front Page 1

52 Elements of the Student Score Report
Front Page 2

53 Elements of the Student Score Report
Front Page 3

54 Elements of the Student Score Report
Front Page 4

55 Elements of the Student Score Report
Back Page 5

56 Elements of the Student Score Report
Back Page 6

57 Elements of the Student Score Report
Back Page 7

58 Elements of the Student Score Report: Science Grades 5, 8, & 10 only
Back Page 8

59 Elements of the Student Score Report: Early Assessment Program Grade 11 only
Back Page 8

60 Student Score Reports (cont.)
A guide explaining the elements of student score reports will be available electronically on the caaspp.org reporting Web page.

61 Final Student Data File
• Downloadable file in CSV format
• Data layout to be released soon on caaspp.org
• Includes test results for all students tested in the LEA
• Available in TOMS within four weeks after the end of an LEA's test administration window
• Additional training planned

62 Public Web Reporting Site
• Available on the CDE Web site through DataQuest
• Planned release in mid-August
• Access two testing programs through one Web site: Smarter Balanced ELA and mathematics; CST/CMA/CAPA Science and STS RLA
• Additional training planned

63 Reporting Timeline

64 Timeline for Preliminary Results, Student Score Reports, and Final Student Data File
• Week 0: Student completes a content area.
• Weeks 1–3: Student responses are scored and merged; preliminary results are checked. Appeals, rescores, and additional preliminary results are received during this period.
• Week 4: LEA accesses ORS to view preliminary results.
• 4 weeks after the test administration window closes: LEA accesses and downloads the final student data file from TOMS.
• Beginning early July: LEAs receive paper Student Score Reports with final test results; PDFs of Student Score Reports are available in TOMS.

65 Timeline for Public Reporting on DataQuest
• Early August: LEAs preview the embargoed public reporting site.
• Mid-August: CDE releases public reporting results through DataQuest, based on results through June 30.
• Mid-September: CDE releases updated public reporting results, based on results for 100% of LEAs.

66 Communications Toolkit (cont.)
• Sample parent and guardian letter to accompany the Individual Student Report
• Reading Your Student Report, in multiple languages, to help parents and guardians read and interpret the Individual Student Report
• Documents with released questions that exemplify items in the Smarter Balanced assessments, to help parents/guardians understand the achievement levels
• Short video to help parents/guardians understand the Individual Student Report

67 Questions?

