2016 Post-Test Workshop: Smarter Balanced Results


1 2016 Post-Test Workshop: Smarter Balanced Results
Assessment Services June 29, July 11, & September 6

2 Agenda
Principles of Scoring
Understanding the Reports
Moving Beyond the Scores
Assessment Services

3 Principles of Scoring
Computer Adaptive Testing Review
Contributions in the Overall Score
Scores Students Will Receive
Principles of Scoring

4 A Computer Adaptive Test (CAT) is Based on:
Large item bank: covers all areas assessed at varying levels of difficulty (statistically calibrated on a common scale with ability estimates)
Recommended blueprint: focuses the selection of questions from the item bank on appropriate content so the structure of the test is similar for every student
See handout Assessment Services

5 A Computer Adaptive Test (CAT) is Based on (Continued):
Programming language (or algorithm): a step-by-step approach that tells the CAT what to do next based on students' answers
Algorithm rules: a set of rules to ensure each student's test contains the proper types of questions and covers the required content
Balance: enough items for each concept
Question type: selected response, constructed response
Reading length: short, medium, long
Difficulty: appropriate for grade level
Assessment Services

6 How Does a CAT Work? Example: A Student of Average Ability
[Chart: item difficulty, from Very Low to Very High (with expanded ranges at both ends), plotted against test questions 1–10, showing the adaptive PATTERN for a student of average ability. Answers (R = right, W = wrong): R R W R W W W W R R]

7 Scoring the CAT Final scale scores are based on item PATTERN SCORING:
As a student progresses through the test, his or her pattern of responses is tracked and revised estimates of the student's ability are calculated
Successive test questions are selected to increase the precision of the student's level-of-achievement estimate
Scores from the CAT portion of the test are based on the difficulty of the items answered right or wrong, NOT on the total number of correct answers
The test question bank for a particular grade level is designed to include an enhanced pool of test questions that are more or less difficult for that grade but still match the grade's test blueprint
Assessment Services
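The pattern-scoring and item-selection ideas above can be sketched with a toy Rasch model. This is an illustrative assumption, not the operational Smarter Balanced engine: the real system uses calibrated item parameters, blueprint constraints, and more sophisticated estimation.

```python
import math

def rasch_prob(theta, b):
    """Chance of a correct answer under a simple Rasch model:
    theta = student ability, b = item difficulty (same scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def update_ability(theta, responses, lr=0.5, steps=50):
    """Re-estimate ability from the full response pattern by gradient
    ascent on the Rasch log-likelihood (a stand-in for operational
    maximum-likelihood scoring). `responses` is a list of
    (difficulty, 1-if-right-else-0) pairs."""
    for _ in range(steps):
        grad = sum(r - rasch_prob(theta, b) for b, r in responses)
        theta += lr * grad / max(len(responses), 1)
    return theta

def pick_next_item(theta, bank, used):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate -- the most informative choice under this model."""
    candidates = [i for i in range(len(bank)) if i not in used]
    return min(candidates, key=lambda i: abs(bank[i] - theta))
```

Note that two students with the same number of correct answers can receive different estimates if the items they saw differ in difficulty, which is why the CAT score is not a simple percent correct.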

8 Key Principle of Scoring
Assessment Services

9 Final Scoring: Contribution of CAT and PT Sections
For each student, the responses from the PT and CAT portions are merged for final scoring

Number of Items Defined by Test Blueprints

Grade   ELA/Literacy CAT   ELA/Literacy PT   Mathematics CAT   Mathematics PT
3–5     38–41              4–6               31–34             2–6
6–8     37–42              4–6               30–34             2–6
11      39–41              4–6               33–36             2–6

See handout Assessment Services

10 Final Scoring: Contribution of CAT and PT Sections (cont.)
Based on the test blueprint, the CAT section is emphasized because there are more CAT items/points than PT items/points
Claims with more items/points are emphasized
Because scores are based on pattern scoring, groups of items that are more difficult and discriminating will have a larger contribution to final scores
There is no specific weight associated with either the PT or the CAT
Assessment Services

11 Reporting Scale
After estimating the student's overall ability, it is mapped onto the reporting scale
Scores are on a vertical scale:
Expressed on a single continuum for a content area
Measures student growth over time across grade levels
For each grade level and content area, there is a separate scale score range
Assessment Services

12 Smarter Balanced Scale Score Ranges by Grade Level
Grade   ELA Min   ELA Max   Math Min   Math Max
3       2114      2623      2189       2621
4       2131      2663      2204       2659
5       2201      2701      2219       2700
6       2210      2724      2235       2748
7       2258      2745      2250       2778
8       2288      2769      2265       2802
11      2299      2795      2280       2862

Assessment Services Copyright © 2009 Educational Testing Service.

13 Overall Achievement Levels
Achievement level classifications are based on overall scores for English language arts and mathematics. Students will receive one of four levels:
Exceeded the Standard
Met the Standard
Nearly Met the Standard
Has Not Met the Standard
Students scoring in the two top levels are on a path to be college and career ready at high school graduation. Students scoring below the "Met the Standard" level will need further development to get on that path.
Elementary & Jr. High: We are in a transition year, so more students are likely to need further development to meet the new standards. That is OK: as students spend additional time in classrooms using the new curriculum and teaching, results will improve.
High School: We are in a transition period, and results may show more students are likely to need further development to meet the new standards. That is OK: students still have another year to continue to progress, and they are now on a focused path to gaining these skills.
Public colleges and universities in several states have decided to use the results of the Grade 11 ___________ exams to help determine whether admitted students are ready to proceed directly to entry-level, credit-bearing courses and can skip the placement tests that are typically administered. In all of these cases, colleges will continue to use their existing admissions criteria. <insert state specific policy>

14 Smarter Balanced Scale Score Ranges for ELA/Literacy
Grade   Level 1      Level 2      Level 3      Level 4
3       2114–2366    2367–2431    2432–2489    2490–2623
4       2131–2415    2416–2472    2473–2532    2533–2663
5       2201–2441    2442–2501    2502–2581    2582–2701
6       2210–2456    2457–2530    2531–2617    2618–2724
7       2258–2478    2479–2551    2552–2648    2649–2745
8       2288–2486    2487–2566    2567–2667    2668–2769
11      2299–2492    2493–2582    2583–2681    2682–2795

See handout Assessment Services

15 Smarter Balanced Scale Score Ranges for Mathematics
Grade   Level 1      Level 2      Level 3      Level 4
3       2189–2380    2381–2435    2436–2500    2501–2621
4       2204–2410    2411–2484    2485–2548    2549–2659
5       2219–2454    2455–2527    2528–2578    2579–2700
6       2235–2472    2473–2551    2552–2609    2610–2748
7       2250–2483    2484–2566    2567–2634    2635–2778
8       2265–2503    2504–2585    2586–2652    2653–2802
11      2280–2542    2543–2627    2628–2717    2718–2862

See handout Assessment Services
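As an illustration, the published ranges can drive a simple scale-score-to-level lookup. The cut scores below are the grade 3 ELA/literacy lower bounds of Levels 2 through 4 from the table; the function itself is a sketch, not official scoring code.

```python
# Lower-bound cut scores for Levels 2, 3, and 4 (grade 3 ELA/literacy,
# taken from the Smarter Balanced scale score ranges above).
GRADE3_ELA_CUTS = [2367, 2432, 2490]

LEVEL_NAMES = {
    1: "Standard Not Met",
    2: "Standard Nearly Met",
    3: "Standard Met",
    4: "Standard Exceeded",
}

def achievement_level(scale_score, cuts):
    """Return the achievement level (1-4): start at Level 1 and move up
    one level for each cut score the student's score reaches."""
    level = 1
    for cut in cuts:
        if scale_score >= cut:
            level += 1
    return level
```

For example, a grade 3 ELA scale score of 2432 lands exactly on the Level 3 cut and reports as "Standard Met".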

16 Measurement Precision: Error Bands
Tests are imprecise! For each scale score estimated for a student, there is measurement error associated with it
Measurement error occurs due to factors unrelated to learning (e.g., mood, health, testing conditions)
Error bands are used to construct an interval estimate corresponding to a student's true ability/proficiency for a particular content area with a certain level of confidence
The error bands used to construct interval estimates are based on one standard error of measurement (if the same test were given to a student multiple times, the student would score within this band about 68 percent of the time)
Measurement error refers to the degree of imprecision or uncertainty in the assessment
Assessment Services
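The one-standard-error band can be computed directly. The SEM value in the example is invented for illustration; actual SEMs vary by student and by test.

```python
def error_band(scale_score, sem):
    """68% confidence band: one standard error of measurement (SEM)
    on either side of the reported scale score."""
    return (scale_score - sem, scale_score + sem)

# Hypothetical example: a scale score of 2500 with an SEM of 25 points
# corresponds to a band from 2475 to 2525.
```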

17 Achievement Levels for Claims
Achievement levels for claims are very similar to subscores; they provide supplemental information regarding a student's strengths or weaknesses
Only three achievement levels for claims were developed, since there are fewer items within each claim
A student must complete most items within a claim to receive an estimate of his or her performance on that claim
The three levels: Below, Near, Above
Assessment Services

18 Achievement Levels for Claims
The three claim levels (Below Standard, Near Standard, Above Standard) are reported for each claim:
English Language Arts: Reading; Writing; Speaking & Listening; Research/Inquiry
Mathematics: Concepts & Procedures; Problem Solving; Communicating Reasoning; Modeling & Data Analysis

19 Achievement Levels for Claims (Cont.)
Achievement levels for claims are based on the distance between a student's performance on the claim and the Level 3 "standard met" criterion
A student's ability, along with the corresponding standard error, is estimated for each claim
A difference between the performance estimate and Level 3 greater than the claim's standard error indicates a strength or weakness
Assessment Services
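A sketch of that rule: classify a claim by how far its estimate sits from the Level 3 cut, measured in standard errors. The threshold multiplier `n_se` is an assumption for illustration, since the slide does not state the exact criterion used operationally.

```python
def claim_category(claim_estimate, level3_cut, sem, n_se=1.5):
    """Classify a claim as Below / Near / Above Standard by comparing
    the distance from the Level 3 cut to n_se standard errors
    (n_se is a hypothetical multiplier, not the official value)."""
    if claim_estimate >= level3_cut + n_se * sem:
        return "Above Standard"
    if claim_estimate <= level3_cut - n_se * sem:
        return "Below Standard"
    return "Near Standard"
```

An estimate well above the cut relative to its error reads as a strength; one close to the cut is simply "Near Standard", reflecting the imprecision of a score built from few items.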

20 Achievement Levels for Claims (Cont.)
Assessment Services

21 Understanding the reports
Results in ORS
Illuminate & CDE Reports
Individual Student Reports
Understanding the reports

22 Test Results in the Online Reporting System (ORS)
Online Reporting System (ORS): a Web-based system that displays score reports and completion data for each student who has taken the assessments
Results are available three weeks after a student completes both parts (CAT and PT) of a content area
Test results are added nightly
To access ORS, go to the CAASPP Portal at
Assessment Services

23 Online Reporting System (ORS)
After logging in, select [Score Reports] 1 ORS Dashboard 2 Assessment Services

24 ORS Home Page Dashboard: Select Test, Administration, and Enrollment Status
1 Select Administration 2014–15 data are final 2015–16 are partial Enrollment Status View Assessment Services

25 ORS Home Page Dashboard: Select Grade and Subject
2 Test results are available for students who have completed both parts—CAT and PT. Students are in the process of completing tests—results are not yet available. Assessment Services

26 ORS Subject Detail Report
Legend Updated Demographic Subgroups Exploration Menu Assessment Services

27 ORS Claim-Level Report Detail
Assessment Services

28 ORS Assessment Target Reports
NEW! An indicator of strengths and weaknesses relative to the overall test performance of the group you are viewing
Strengths and weaknesses do not imply proficiency or that a particular content standard has been met
Target reports show how a group of students performed on a target compared to their overall performance on the assessment
They should serve as a starting point in overall investigations of students' strengths and weaknesses
Target reports based on fewer than 50 students may be less reliable
Assessment Services

29 ORS Assessment Target Reports
NEW! ORS Assessment Target Reports Example: Mathematics Target Report Assessment Target Report Quick Start Guide available at Assessment Services

30 ORS Student Listing Report
Assessment Services

31 ORS Student Detail Report
Legend for Claim Achievement Category Average LEA and school scale scores for comparison Note: State-level scale score averages will not be available until formally released by the CDE. Performance on Claims Assessment Services

32 Reports in Illuminate Assessment Services

33 Reports in Illuminate Assessment Services

34 Reports on CDE Website Assessment Services

35 Test Results Reported on the Individual Student Reports (ISR)
Student Score Reports Overview
Grades 3–8 and 11: Smarter Balanced ELA and mathematics
Grades 5, 8, and 10: CST, CMA, or CAPA Science
For students who took Smarter Balanced ELA and mathematics, and CST, CMA, or CAPA for Science; CAA results will be coming later
Assessment Services

36 Redesigned Student Score Reports
See handout Assessment Services

37 Test Results from Multiple Years
NEW! Test Results from Multiple Years Assessment Services

38 Low, Medium and High Bands
Each achievement-level scale score range is divided into thirds to create Low, Medium, and High bands:
Grade 3 ELA, Level 2 (divide the difference by three = 21 pts): 2367 to 2387, 2388 to 2408, 2409 to 2431
Grade 3 ELA, Level 3: 2432 to 2450, 2451 to 2469, 2470 to 2489
Grade 4 ELA, Level 2: 2416 to 2433, 2434 to 2452, 2453 to 2472
Grade 4 ELA, Level 3: 2473 to 2491, 2492 to 2511, 2512 to 2532
Assessment Services See handout
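The divide-by-three arithmetic can be sketched as follows. For the grade 3 ELA Level 2 range (2367 to 2431) it reproduces the bands shown on the slide; the remainder-handling convention is an assumption, so edges for other ranges may differ slightly from the published tables.

```python
def third_bands(level_min, level_max):
    """Split an achievement-level scale score range into Low / Medium /
    High bands by dividing the number of points by three; any remainder
    goes to the High band (an assumed rounding convention)."""
    width = (level_max - level_min + 1) // 3
    low = (level_min, level_min + width - 1)
    medium = (low[1] + 1, low[1] + width)
    high = (medium[1] + 1, level_max)
    return {"Low": low, "Medium": medium, "High": high}
```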

39 Low, Medium and High Bands (cont.)
2015 2432 2016 2470 Assessment Services

40 Sample Report: Comparing ELA Scale Scores Year to Year by Grade
[Sample report: threshold scale score ranges for Standard Exceeded, Standard Met (● 2541), Standard Nearly Met, and Standard Not Met, comparing Spring 2015 and Spring 2016 ELA scale scores by grade (grades 3–8 and 11)]

41 Moving beyond the scores
Rightful Place and Rightful Purpose
Evidence-Centered Design
Moving beyond the scores

42 Evidence-Centered Design
Knowing how the assessment items were developed helps in understanding the results and making connections to the classroom
The Smarter Balanced hierarchy of item development and reporting of scores: Overall Claims > Content Claims > Targets > Evidence Statements > Items
Assessment Services

43 Concepts of Evidence-Centered Design
Define the domain: Common Core State Standards (Mathematics; English Language Arts/Literacy (ELA))
Define claims to be made: four ELA and four mathematics claims
Define assessment targets: Content Specifications (knowledge, skills, and abilities)
Define evidence required: evidence to be elicited from the student
Develop items or performance tasks: methods for eliciting evidence
Assessment Services

44 Relationships Between Common Core Standards, California Frameworks, and Assessments
English Language Arts/Literacy Content Specifications (Claims) English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects English Language Arts/English Language Development Framework Content Specifications were developed to ensure that the assessments cover the range of knowledge, skills, and abilities required for each claim Assessment Services

45 ELA/Literacy Content Specifications
(Grade 7 ELA Example) See handout Content Specifications state the claim, specify assessment targets, and link them to the CCSS; targets are the bridge between the content standards and the evidence that supports the claim Assessment Services

46 Relationships Between Common Core Standards, California Frameworks, and Assessments
English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects English Language Arts/Literacy Content Specifications (Claims) English Language Arts/Literacy Item Specifications (Claims and targets per claim) English Language Arts/English Language Development Framework Item Specifications provide guidance on how to translate the Content Specifications into actual assessment items; were developed to ensure that the items measure the assessments’ claims Assessment Services

47 Example of Item Specifications
See handout Evidence Required for Target 1 The student will identify text evidence (explicit details and/or implicit information) to support a GIVEN inference or conclusion based on the text. Item Specifications delineate the types of evidence that should be collected regarding the knowledge, skills, and /or abilities that are articulated in the standards Assessment Services

48 Relationships Between Common Core Standards, California Frameworks, and Assessments
English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects English Language Arts/Literacy Content Specifications (Claims) English Language Arts/English Language Development Framework English Language Arts/Literacy Item Specifications (Claims and targets per claim) English Language Arts/Literacy Assessment Blueprint The Test Blueprint describes the content of the assessments and how that content will be assessed (CAT/PT), including the number of items and DOK (depth of knowledge) levels Assessment Services

49 Relationships Between Common Core Standards, California Frameworks, and Assessments
English Language Arts/Literacy Content Specifications (Claims) English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects English Language Arts/English Language Development Framework English Language Arts/Literacy Item Specifications (Claims and targets per claim) English Language Arts/Literacy Assessment Blueprint English Language Arts/Literacy Summative Assessment Mathematics Content Specifications (Claims) Mathematics Mathematics Framework for California Public Schools Mathematics Item Specifications (Claims and targets per claim) Mathematics Assessment Blueprint Mathematics Summative Assessment Assessment Services

50 Range Achievement Level Descriptors (ALDs)
Levels 1, 2, 3, and 4: "How good is good enough?"
Describe the cognitive and content rigor within each achievement level for each grade, claim, and target
Describe the knowledge, skills, and processes expected of students
Can guide the development of classroom rubrics
Operationalize the expectations of the assessments
Assessment Services

51 Building a Logical Argument
Common Core State Standards > Assessment Claim > Assessment Target > Evidence > Student Response

52 Rightful Place/Purpose: Assessment Frequency and Impact on Instruction
Statewide Summative Classroom Formative Assessment Services

53 ….but they don’t tell the entire story
Statewide summative assessments are like the tip of an iceberg: they deserve attention, but they don't tell the entire story
Assessment Services

54 Rightful Place, Rightful Purpose for Statewide Summative Assessment
Provide a general direction—we must dig deeper to determine cause Focus on groups, programs, and disaggregation Rarely provide definitive answers, but raise many questions, allowing reflection on context and practice Provide an entry point into a collaborative, honest conversation Assessment Services

55 Call to Action – Pay Attention but Move Beyond the Scores
Focus on improving learning, not solely on increasing scores
Reflect on what you can control to move beyond the scores
Use scores with other sources of evidence: test scores should be used in conjunction with multiple pieces of evidence to arrive at a more complete understanding of student learning and progress
This is a collaborative process that requires honesty, the ability to handle ambiguity, and patience
Assessment Services

56 Final Thoughts Assessment Services

57 Resources English Language Arts/English Language Development Frameworks Content Specifications, Item Specifications and Blueprints Smarter Balanced Scale Score Ranges Sample Student Score Report Claim Descriptions for ELA and Mathematics Target Score Report FAQs Achievement Level Descriptors Teacher Guides to Smarter Balanced Understanding CAASPP Results for Parents Assessment Services

58 Assessment Services Webpage
Staff Portal > Departments > Assessment Services > Smarter Balanced Assessment > Results Assessment Services

59 Contact Information
Erin Gordon:
Amal Morcos:
Denise Ormsbee:
Assessment Services main phone:
Assessment Services Smarter Balanced Web page:
Assessment Services

