1
California Assessment of Student Performance and Progress (CAASPP) 2015 Post-Test Workshop: Reporting Summative Assessment Results
2
Agenda
- Purpose
- Principles of Scoring
- Understanding the Reports
- Using the Online Reporting System
- Overview of the Reporting Timeline
- Interpreting, Using, and Communicating Results
3
Purpose
4
Purpose
By the end of this workshop, viewers will be able to:
- Understand the scoring process
- Understand the components of each report
- Use the Online Reporting System (ORS) to view partial and preliminary results
- Become familiar with the reporting timeline
- Interpret, use, and communicate results
5
Focus of this Workshop
Overview of the reporting system for all CAASPP operational summative assessments:
- Smarter Balanced English language arts/literacy (ELA) and mathematics
- California Standards Tests (CSTs), California Modified Assessment (CMA), and California Alternate Performance Assessment (CAPA) for Science
- Standards-based Tests in Spanish (STS) for Reading/Language Arts (RLA)
The focus is on Smarter Balanced ELA and mathematics.
6
Principles of Scoring
7
Goals of the Section
- Provide an overview of computer adaptive testing (CAT) and scoring
- Describe the relative contribution of the performance tasks (PTs) and the CAT to the overall scores
- Describe the score scale, achievement levels, error bands, and claims
8
Computer Adaptive Testing: Philosophy
“Computer adaptive testing (CAT) holds the potential for more customized assessment with test questions that are tailored to the students’ ability levels, and identification of students’ skills and weaknesses using fewer questions and requiring less testing time.” Shorr, P. W. (2002, Spring). A look at tools for assessment and accountability. Administrator Magazine.
9
How Does a CAT Work?
- Each student is administered a set of test questions that is appropriately challenging.
- The student's performance on the test questions determines whether subsequent questions are harder or easier.
- The test adapts to the student item by item, not in stages.
- Fewer test questions are needed than on a fixed form to obtain precise estimates of students' ability.
- The test continues until the test content outlined in the blueprint is covered.
10
How Does a CAT Work? Example: A Student of Average Ability
[Figure: item-by-item difficulty path across ten test questions, plotted on an ability axis running from Expanded Very Low through Low, Med-Low, Medium, Med-High, and High to Expanded Very High. The student's answers (R = right, W = wrong) were: R R W R W W W W R R.]
11
Computer Adaptive Testing: Behind the Scenes
- Requires a large pool of test questions statistically calibrated on a common scale with ability estimates (e.g., from the Field Test)
- Uses an algorithm to select questions based on a student's responses, to score responses, and to iteratively estimate the student's performance (see the sketch below)
- Final scale scores are based on item pattern scoring
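The following is a minimal sketch of one selection round in an adaptive test under a Rasch (one-parameter logistic) model. The item pool, difficulty values, and information-based selection rule are illustrative assumptions for exposition, not the operational Smarter Balanced algorithm.

```python
# A minimal CAT selection sketch under a Rasch model (illustrative only).
import math

def p_correct(theta: float, b: float) -> float:
    """Probability that a student at ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta: float, b: float) -> float:
    """Fisher information of a Rasch item at the current ability estimate."""
    p = p_correct(theta, b)
    return p * (1.0 - p)

def select_next_item(theta_hat, pool, administered):
    """Pick the unadministered item that is most informative at the current estimate."""
    candidates = {name: b for name, b in pool.items() if name not in administered}
    return max(candidates, key=lambda name: item_information(theta_hat, candidates[name]))

# A tiny calibrated pool: item name -> difficulty on the common theta scale.
pool = {"item_a": -1.2, "item_b": 0.0, "item_c": 0.8, "item_d": 1.5}
print(select_next_item(0.0, pool, set()))  # item_b: closest to the estimate, most informative
```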
12
Computer Adaptive Testing: Practical Considerations
- Each student's test is constrained to ensure coverage of the full range of appropriate grade-level content; e.g., an ELA test cannot consist of only Reading Informational items.
- The exposure of test questions to many students is constrained to maintain test security.
- Sets of test questions based on a common passage or stimulus constrain the ability to adapt within the set.
- Responses must be machine-scored to select the next question; human-scored performance task responses are combined later with the CAT results.
13
Scoring the CAT
- As a student progresses through the test, his or her pattern of responses is tracked and revised estimates of the student's ability are calculated.
- Successive test questions are selected to increase the precision of the achievement estimate, given the current estimate of the student's ability.
- Resulting scores from the CAT portion of the test are based on the specific test questions selected as a result of the student's responses, NOT on the sum of the number answered correctly.
- The test question pool for a particular grade level is designed to include an enhanced set of test questions that are more or less difficult for that grade but still match the test blueprint for that grade.
14
Human-Scored Items in the CAT
- Some items administered in the Smarter Balanced adaptive test component require human scoring.
- The adaptive algorithm selects these items based on performance on prior items.
- Because these items cannot be scored by a human in real time, performance on them does not affect later item selection.
15
Performance Tasks (PTs)
- In addition to the CAT, every Smarter Balanced test administers a PT with a set of stimuli on a given topic.
- PTs are administered at the classroom/group level, so they are not targeted to students' specific ability levels.
- The items associated with the PTs may be scored by machine or by human raters.
16
Final Scoring
- For each student, the responses from the PT and CAT portions are merged for final scoring.
- Resulting ability estimates are based on the specific test questions that a student answered, not the total number of items answered correctly.
- Higher ability estimates are associated with test takers who correctly answer more difficult and more discriminating items; lower ability estimates are associated with test takers who correctly answer only easier and less discriminating items.
- Two students will have the same ability estimate if they took the same set of test questions and gave the same responses. It is also possible for students to arrive at the same ability estimate through different response patterns.
- This type of scoring is called "item pattern scoring" (see the sketch below).
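To make item pattern scoring concrete, here is a hedged sketch: the ability estimate maximizes the likelihood of the observed responses given the difficulties of the specific items administered, so two students with the same items and the same responses necessarily receive the same estimate. A Rasch grid-search maximum-likelihood estimate is assumed for simplicity; the operational scorer uses the calibrated parameters of the actual item pool.

```python
# Grid-search MLE for ability under a Rasch model (illustrative pattern scoring).
import math

def log_likelihood(theta, difficulties, responses):
    """Log-likelihood of a response pattern (1 = correct, 0 = incorrect)."""
    ll = 0.0
    for b, correct in zip(difficulties, responses):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        ll += math.log(p if correct else 1.0 - p)
    return ll

def estimate_theta(difficulties, responses):
    """Return the theta in [-4, 4] (0.01 steps) that maximizes the likelihood."""
    grid = [i / 100.0 for i in range(-400, 401)]
    return max(grid, key=lambda t: log_likelihood(t, difficulties, responses))

items = [-0.5, 0.3, 1.1]                 # difficulties of the administered items
print(estimate_theta(items, [1, 1, 0]))  # same items + same responses -> same estimate
```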
17
Final Scoring: Contribution of CAT and PT Sections
Number of items defined by the test blueprints (the PT ranges apply across all grade bands):

Grade   ELA/Literacy CAT   ELA/Literacy PT   Mathematics CAT   Mathematics PT
3–5     38–41              5–6               31–34             2–6
6–8     37–42              5–6               30–34             2–6
11      39–41              5–6               33–36             2–6
18
Final Scoring: Contribution of CAT and PT Sections (cont.)
- Based on the test blueprint, the CAT section is emphasized because there are more CAT items/points than PT items/points.
- Claims with more items/points are emphasized:
  - Mathematics: Concepts and Procedures; Problem Solving/Modeling and Data Analysis; Communicating Reasoning
  - ELA: Reading; Writing; Speaking/Listening; Research
- Because scores are based on pattern scoring, groups of items that are more difficult and discriminating make a larger contribution to final scores. Therefore, there is no specific weight associated with either the PT or the CAT section.
19
Final Scoring: Mapping
After the student's overall ability (θ) is estimated, it is mapped onto the reporting scale through a linear transformation of the form Scaled Score = a × θ + b, with separate constants for mathematics and for ELA, and the result is limited by the grade level's lowest and highest obtainable scaled scores (a numeric sketch follows).
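A numeric sketch of the mapping appears below. The slope and intercept are made-up placeholders (the operational constants do not appear in this deck); the clipping bounds use the grade 3 ELA range from the scale score tables later in this section.

```python
# Theta-to-scale-score mapping: linear transformation clipped to the grade's
# lowest/highest obtainable scaled scores (LOSS/HOSS). Constants are assumed.
def to_scale_score(theta: float, slope: float, intercept: float,
                   loss: int, hoss: int) -> int:
    raw = slope * theta + intercept
    return int(round(min(max(raw, loss), hoss)))

# theta = -5.0 gives 85 * -5 + 2500 = 2075, which is clipped up to the LOSS.
print(to_scale_score(theta=-5.0, slope=85.0, intercept=2500.0,
                     loss=2114, hoss=2623))  # 2114
```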
20
Properties of the Reporting Scale
- Scores are on a vertical scale:
  - Expressed on a single continuum for a content area
  - Allows users to describe student growth over time across grade levels (see the example below)
- Scale score range:
  - ELA/Literacy: 2114–2795
  - Mathematics: 2189–2862
- For each grade level and content area, there is a separate scale score range.
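Because the scale is vertical, growth across years reduces to a simple difference of scale scores; a minimal illustration with invented scores:

```python
# Year-over-year growth on a vertical scale is just a difference (scores invented).
grade3_score, grade4_score = 2450, 2495
print(f"Growth: {grade4_score - grade3_score} scale score points")  # Growth: 45 ...
```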
21
Smarter Balanced Scale Score Ranges by Grade Level
Grade   ELA Min   ELA Max   Mathematics Min   Mathematics Max
3       2114      2623      2189              2621
4       2131      2663      2204              2659
5       2201      2701      2219              2700
6       2210      2724      2235              2748
7       2258      2745      2250              2778
8       2288      2769      2265              2802
11      2299      2795      2280              2862
22
Achievement Levels
Achievement level classifications are based on overall scores:
- Level 1—Standard Not Met
- Level 2—Standard Nearly Met
- Level 3—Standard Met
- Level 4—Standard Exceeded
23
Achievement Levels by Grade
24
Smarter Balanced Scale Score Ranges for ELA/Literacy
Grade   Level 1      Level 2      Level 3      Level 4
3       2114–2366    2367–2431    2432–2489    2490–2623
4       2131–2415    2416–2472    2473–2532    2533–2663
5       2201–2441    2442–2501    2502–2581    2582–2701
6       2210–2456    2457–2530    2531–2617    2618–2724
7       2258–2478    2479–2551    2552–2648    2649–2745
8       2288–2486    2487–2566    2567–2667    2668–2769
11      2299–2492    2493–2582    2583–2681    2682–2795
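Using the grade 3 ELA/literacy cut points from the table above (Levels 2, 3, and 4 begin at 2367, 2432, and 2490), a scale score can be mapped to its achievement level with a simple threshold lookup:

```python
# Map a grade 3 ELA/literacy scale score to its achievement level.
import bisect

CUTS = [2367, 2432, 2490]  # lower bounds of Levels 2, 3, and 4 (from the table above)
LEVELS = ["Level 1 (Standard Not Met)", "Level 2 (Standard Nearly Met)",
          "Level 3 (Standard Met)", "Level 4 (Standard Exceeded)"]

def achievement_level(scale_score: int) -> str:
    return LEVELS[bisect.bisect_right(CUTS, scale_score)]

print(achievement_level(2432))  # Level 3 (Standard Met)
```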
25
Achievement Levels by Grade
26
Smarter Balanced Scale Score Ranges for Mathematics
Grade   Level 1      Level 2      Level 3      Level 4
3       2189–2380    2381–2435    2436–2500    2501–2621
4       2204–2410    2411–2484    2485–2548    2549–2659
5       2219–2454    2455–2527    2528–2578    2579–2700
6       2235–2472    2473–2551    2552–2609    2610–2748
7       2250–2483    2484–2566    2567–2634    2635–2778
8       2265–2503    2504–2585    2586–2652    2653–2802
11      2280–2542    2543–2627    2628–2717    2718–2862
27
Measurement Precision: Error Bands
- Every scale score estimated for a student has measurement error associated with it.
- An error band is a useful tool that describes the measurement error associated with a reported scale score.
- Error bands are used to construct an interval estimate corresponding to a student's true ability/proficiency for a particular content area with a certain level of confidence.
- The error bands used to construct interval estimates are based on one standard error of measurement: if the same test were given to a student multiple times, the student would score within this band about 68 percent of the time (as illustrated below).
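A minimal sketch of the error band described above, assuming an illustrative standard error of measurement (SEM) of 25 points:

```python
# One-SEM error band around a reported scale score (SEM value is illustrative).
def error_band(scale_score: float, sem: float) -> tuple[float, float]:
    return (scale_score - sem, scale_score + sem)

low, high = error_band(2450, 25)
print(f"About 68% of repeated testings would fall in [{low}, {high}]")
```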
28
Achievement Levels for Claims
- Achievement levels for claims are very similar to subscores: they provide supplemental information regarding a student's strengths or weaknesses.
- No achievement level setting occurred for claims.
- Only three achievement levels were developed for claims, since there are fewer items within each claim.
- Achievement levels for claims are based on the distance of a student's performance on the claim from the Level 3 proficiency cut.
- A student must complete all items within a claim to receive an estimate of his or her performance on that claim.
29
Achievement Levels for Claims (2)
- A student's ability, along with the corresponding standard error, is estimated for each claim.
- The student's ability estimate for the claim is compared to the Level 3 proficiency cut.
- Differences greater than a set number of standard errors of the claim indicate a strength or weakness (see the sketch below).
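A sketch of the comparison rule described above. The multiplier k is an assumption here (the deck does not state the operational value); the cut and standard error values are invented for illustration.

```python
# Classify a claim by its distance from the Level 3 cut in standard-error units.
def claim_level(claim_score: float, level3_cut: float,
                claim_se: float, k: float = 1.5) -> str:
    if claim_score >= level3_cut + k * claim_se:
        return "Above Standard"
    if claim_score <= level3_cut - k * claim_se:
        return "Below Standard"
    return "At/Near Standard"

print(claim_level(claim_score=2500, level3_cut=2432, claim_se=30))  # Above Standard
```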
30
Achievement Levels for Claims (3)
[Figure: claim score intervals at or below the Level 3 cut, illustrating the At/Near Standard and Below Standard classifications.]
31
Achievement Levels for Claims (4)
[Figure: a claim score interval entirely above the Level 3 cut, illustrating the Above Standard classification.]
32
Understanding the Reports
33
Available Reports
Secure:
- ORS: preliminary student test results; preliminary and partial aggregate test results
- TOMS: Student Score Reports (ISRs); final student data
Public:
- CDE CAASPP Web page: Smarter Balanced ELA and mathematics; CST/CMA/CAPA for Science and STS for RLA
34
Secure Reporting

Report                                          LEA             School*        Parent
Preliminary Student Data                        ORS             ORS
Preliminary Aggregate Data                      ORS             ORS
Final Student Score Reports (ISRs), PDF/paper   TOMS†/Paper††   TOMS†/Paper    Paper
Final Student Data File                         TOMS

* Access to ORS will be granted to CAASPP Test Site Coordinators in August.
† PDFs of the Student Score Reports will be available in TOMS.
†† LEAs must forward or mail the copy of the CAASPP Student Score Report to each student's parent/guardian within 20 working days of its delivery to the LEA.
35
Preliminary Test Results: Student and Aggregate
- Available through the Online Reporting System (ORS) approximately three to four weeks after a student completes both parts—CAT and PT—of a content area
- New results are added daily
- Use caution:
  - The results are partial and may not be a good representation of the school's or district's final aggregate results.
  - The results are preliminary; the processing of appeals may result in score changes.
36
Student Score Reports (ISR): Overview
- One-page report
  - Double-sided: all Smarter Balanced; CAPA for Science
  - Single-sided: CST/CMA for Science (Grade 10); STS for RLA
- Presents the student's final CAASPP test results
- Reports progress toward the state's academic content standards
- Indicates areas of focus to help improve students' achievement and educational programs
- The LEA distributes the reports to parents/guardians
37
Student Score Reports: Shipments to LEAs
- Two copies of each student's Student Score Report: one for the parent/guardian and one for the school site
- 2015 LEA CAASPP Reports Shipment Letter
- 2015 School CAASPP Reports Shipment Letter
Note: Per California Code of Regulations, Title 5, Section 863:
- LEAs must forward one copy to the parent/guardian within 20 business days.
- Schools may file the copy they receive, or they may give it to the student's current teacher or counselor.
- If the LEA receives the reports after the last day of instruction, the LEA must mail the pupil results to the parent or guardian at their last known address.
- If the report is non-deliverable, the LEA must make the report available to the parent or guardian during the next school year.
38
Test Results Reported on the Student Score Reports
- Smarter Balanced ELA and mathematics: grades 3–8 and 11
- CST, CMA, or CAPA for Science: grades 5, 8, and 10
- STS RLA: grades 2–11
39
Elements of the Student Score Report
[Figure: sample Student Score Report with numbered callouts, 1–4 on the front page and 5–8 on the back page.]
40
Elements of the Student Score Report
Front Page, element 1
41
Elements of the Student Score Report
Front Page, element 2
42
Elements of the Student Score Report
Front Page, element 3
43
Elements of the Student Score Report
Front Page, element 4
44
Elements of the Student Score Report
Back Page, element 5
45
Elements of the Student Score Report
Back Page, element 6
46
Elements of the Student Score Report
Back Page, element 7
47
Elements of the Student Score Report: Science Grades 5, 8, & 10 only
Back Page, element 8
48
Elements of the Student Score Report: Early Assessment Program Grade 11 only
Back Page, element 8
49
Student Score Reports (cont.)
A guide explaining the elements of student score reports will be available electronically on the caaspp.org reporting Web page.
50
Final Student Data File
- Downloadable file in CSV format (see the loading sketch below)
- Data layout to be released soon on caaspp.org
- Includes test results for all students tested in the LEA
- Available in TOMS within four weeks after the end of an LEA's test administration window
- Additional training planned
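Once the layout is published, the CSV can be loaded with standard tools. A hedged sketch: the file name and the column names ("SSID", "ScaleScore", "AchievementLevel") are hypothetical placeholders, not the actual data layout.

```python
# Read the final student data file (file and column names are placeholders).
import csv

with open("caaspp_final_student_data.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(row.get("SSID"), row.get("ScaleScore"), row.get("AchievementLevel"))
```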
51
Public Web Reporting Site
- Available on the CDE Web site through DataQuest
- Planned release in mid-August
- Access two testing programs through one Web site:
  - Smarter Balanced ELA and mathematics
  - CST/CMA/CAPA Science and STS RLA
- Additional training planned
52
Using the Online Reporting System
53
Important Reminder
The data available in the CAASPP ORS represent partial and preliminary results that are not appropriate for public release. Because the ORS is a real-time system, results will change as additional data are received and relevant appeals and rescores are processed. These changes may result in final scores being higher or lower than the preliminary results posted to this system. The California Department of Education (CDE) recommends that data from the ORS be released publicly only following the state-level release of assessment data that occurs in August.
54
ORS Summary
The Online Reporting System (ORS) is a Web-based system that displays score reports and completion data for each student who has taken the following California assessments:
- Smarter Balanced tests in ELA or mathematics
- CSTs for Science
- CMA for Science
- CAPA for Science
- STS for RLA
CAASPP Test Site Coordinators will have access to the ORS in August. They will be able to view test results for students enrolled in their entity.
55
Viewing ORS Reports
1. From the Select drop-down list, select the LEA or school whose reports you want to view. (A list will appear only if you are associated with more than one school or LEA.)
2. Select [Score Reports].
Note: All score report data, except for individual students' score reports, can be disaggregated into subgroups for detailed analysis. For example, an LEA CAASPP Coordinator can view a Grade 5 Mathematics report for an LEA; a CAASPP Test Site Coordinator would be able to view Grade 5 Mathematics for his or her school site only.
56
Available Features in the ORS
- Home Page Dashboard
- Subject Detail
- Claim-Level Detail
- Student Listing
- Student Detail
- Manage Rosters
57
Home Page Dashboard Report
- Overall summary of score data and testing progress for your LEA and/or school
- Starting point for data analysis
- Definition of the students whose aggregated scores you want to view
- Navigation to more detailed score reports
Note: The score data that can be viewed depend on the user role.
58
Home Page Dashboard Report (cont.)
59
Subject Detail Report
The aggregation tables on the Home Page Dashboard display score data for students by grade and subject and provide access to Subject Detail Reports.
60
Subject Detail Report (cont.)
61
Subject Detail Report (cont.)
Each Subject Detail Report consists of the following components:
- Report descriptor
- Test name (subject and grade)
- Administration year
- Entity (e.g., LEA, school)
- The title of the score report table
- Name, number of students, average scale scores, and percent in each achievement level
All data are based on the total number of students who have taken and completed the test and been scored.
62
Claim-Level Detail Report
63
Claim-Level Detail Report (cont.)
64
Claim-Level Detail Report (cont.)
The Claim-Level Detail Report consists of the following components:
- Report name: "[Entity] Performance for Each Claim. What are my [entity's] strengths and weaknesses in [Subject or Course]?"
- Test name (subject and grade)
- Administration year
- Entity (e.g., LEA, school, or roster)
- The title of the score report table
- Name, number of students, average scale score, claims, and percentage in each claim achievement level
65
Student Listing Report
66
Student Listing Report (cont.)
67
Student Listing Report (cont.)
- Students' SSIDs are displayed.
- A different procedure is used for viewing scale score data by demographic subgroup.
- [Print] on the Student Listing Report prints the current page and also generates a PDF file of individual preliminary student results for all the students in the roster.
68
Student Detail Report
69
Student Detail Report (cont.)
70
Student Detail Report (cont.)
- Displays the breakdown of the student's preliminary scale score, the achievement level for the selected subject, and the performance and claim description for each claim
- Includes average scale scores for the LEA for comparison purposes
Note: State-level scale score averages will not be available until formally released by the CDE.
71
Manage Roster Reports
- Rosters are customized groupings of students within a school.
- Example: School-level users can create a report that lists all students within a specific grade or a particular classroom.
- This feature is available now.
72
Manage Roster Reports: Adding Rosters
73
Manage Roster Reports: Adding Rosters (cont.)
74
Manage Roster Reports: Adding Rosters (cont.)
75
Manage Roster Reports: Adding Rosters (cont.)
76
Manage Roster Reports: View Rosters
77
Manage Roster Reports: Print, Modify, and Delete
78
Live Demonstration: Manage Rosters
79
How Are Data Loaded into ORS?
- Data are loaded into the ORS on a nightly basis.
- The ORS will continually update with preliminary test results until testing is completed.
- Factors such as processed appeals are not accounted for in the ORS feed.
80
Why Preliminary?
Preliminary results can change because of appeals, additional preliminary results received, and rescores.
- Week 0: Student completes a content area.
- Weeks 1–3: Student responses are scored and merged; preliminary results are checked.
- Week 4: LEA accesses ORS to view preliminary results.
- Four weeks after the test administration window closes: LEA accesses and downloads the final student data file from TOMS.
81
Resources for the ORS
- ORS module:
- Archive of the ORS Webcast with demo:
- Form to submit feedback on the ORS:
82
Reporting Timeline
83
Timeline for Preliminary Results, Student Score Reports, and Final Student Data File
Preliminary results can change because of appeals, additional preliminary results received, and rescores.
- Week 0: Student completes a content area.
- Weeks 1–3: Student responses are scored and merged; preliminary results are checked.
- Week 4: LEA accesses ORS to view preliminary results.
- Four weeks after the test administration window closes: LEA accesses and downloads the final student data file from TOMS.
- Beginning early July: LEAs receive paper Student Score Reports with final test results; PDFs of the Student Score Reports are available in TOMS.
84
Timeline for Public Reporting on DataQuest
- Early August: LEAs preview the embargoed public reporting site.
- Mid-August: CDE releases public reporting results through DataQuest, based on results through June 30.
- Mid-September: CDE releases updated public reporting results, based on results for 100% of LEAs.
85
Interpreting, Using, and Communicating Results
2015 CAASPP Post-Test Workshop
Gina Koency, Senior Assessment Fellow
86
Topics
- Appropriate use of scores
- Report use scenarios
- Communication plan and tools
87
Appropriate Use of Scores
Follow student progress* using the scale score and achievement level:
- Level 1. Standard not met
- Level 2. Standard nearly met
- Level 3. Standard met
- Level 4. Standard exceeded
*Scores are a baseline in 2015.
88
Appropriate Use of Scores (Cont.)
- Identify students who may need additional help.*
- Identify strengths, weaknesses, and gaps in curriculum and instruction.*
- Claim-level scores: Below standard; At or near standard; Above standard
*Use in conjunction with other evidence of student learning.
89
Appropriate Use of Scores (Cont.)
- Identify areas for professional development.
- Identify areas for resource allocation.
- Communicate student achievement to students, parents/guardians, and the community.
90
Appropriate Use of Scores (Cont.)
- Keep in mind that ORS reports are preliminary and not for public release.
- Consider the number/percentage of students tested.
- Consider the assessment literacy of the intended audience.
- Understand and be prepared to address the principles of scoring.*
*See Section 2 of this workshop.
91
ORS Scenarios
- Scenario 1: An LEA administrator reports preliminary summary results to site administrators. How did our LEA perform overall?
- Scenario 2: A site administrator reports preliminary summary results to staff. How did our school perform overall?
92
ORS Scenarios (Cont.)
Other scenarios might include:
- A grade-level lead reporting preliminary summary results to the grade-level team
- A department head or coach reporting preliminary summary results to department or subject-area teachers
93
Scenario 1: How did our LEA perform overall?
94
Scenario 1: How did our LEA perform overall?
95
Scenario 2: How did our school perform overall?
96
Scenario 2: How did our school perform overall?
97
Scenario 2: How did our school perform overall?
98
Other Report Options
- Use the "Breakdown by" filter to disaggregate by demographic subgroup: race/ethnicity or gender.
- Use the "Manage Rosters" feature to create custom groups that meet locally defined needs.
99
Evidence-Based Inquiry Process
1. Identify a question.
2. Collect multiple sources of evidence.
3. Analyze the evidence.
4. Interpret the findings.
5. Develop a plan.
Remember: This is a baseline year for valid and reliable data about student achievement of California's college and career readiness standards, as measured by the new tests.
100
Digital Library Connection
Let’s assume that students in grade three were below standard on the Reading claim. Use the Digital Library to identify resources such as: Using Fluency Stations as Formative Assessment (RF 3.4 and RF 4.4) fluency-stations-formative-assessment
101
Communications Plan
Consider how results from the summative assessments will be communicated to:
- School boards and LEA administrators
- Site administrators and staff
- Parents and guardians
- Community members
- Media
102
Communications Plan (Cont.)
- Identify the needs of each audience: What are their concerns? What do they need to know? When do they need to hear from you?
- Decide on the message content.
- Identify resources.
- Identify persons responsible.
- Map out the timeline, with deliverables and follow-up.
103
Communications Plan (Cont.)
Focus on key talking points:
- The CCSS and Smarter Balanced represent a comprehensive plan for student success in college and careers.
- This new testing system is designed to help teachers.
- Patience and persistence: adjustments will always be needed to ensure high-quality teaching and learning.
- This is the first year of a new baseline for student achievement.
104
New Test, New Baseline
- This year will establish a baseline for the progress we expect students to make over time toward college and career readiness.
- Many, if not most, students will need to make significant progress to reach the at-or-above-standard level.
- These results will provide us an opportunity to focus on the needs of students and teachers.
105
Smarter Balanced Communication Tools
- Communication Tips (PPT)
- Connecting Learning to Life (DOC)
- New Future, New Test with Talking Points (PPT)
- Principal's Newsletter (DOC)
- April to June 2015 Communication Timeline (PDF)
- Role Play Cards with Instructions (DOC)
106
Communications Toolkit
- Short documents, in English and Spanish, on key topics such as "Creating a Computer Adaptive Test" and "Accessibility and Accommodations: Addressing the Needs of All Students"
- Links to key sites such as the California PTA
- Brief videos, in English and Spanish, on key topics such as "What are the Smarter Balanced Assessments?" and "Ready. Set. Succeed: Preparing Our Kids for College and Career"
107
Communications Toolkit (Cont.)
- Sample parent and guardian letter to accompany the Individual Student Report
- "Reading Your Student Report," in multiple languages, to help parents and guardians read and interpret the Individual Student Report
- Documents with released questions that exemplify items on the Smarter Balanced assessments, to help parents/guardians understand the achievement levels
- Short video to help parents/guardians understand the Individual Student Report
108
Questions?
109
Updates and Announcements
110
Updates and Announcements
- The Online Reporting System User Guide is forthcoming.
- In-person 2015 CAASPP Post-Test Workshops (space to attend is still available; register online at:):
  - Sacramento: May 22
  - Fresno: May 26
  - Los Angeles: May 27
  - Santa Clara: May 27
  - San Diego: May 29
- Upcoming Webcasts (dates to be determined):
  - Final Student Data File
  - Public Web Reporting Site
111
Resources and Support
112
Help Desk Support
The California Technical Assistance Center (CalTAC) is here to support all LEA CAASPP Coordinators!
- Monday–Friday, 7 a.m.–5 p.m. PT
- Phone:
- Web site: