
1 2017 CAASPP Post-Test Workshop: Using Assessment Results to Inform Teaching and Learning
Assessment Services

2 Learning Goal
Participants will be equipped with the information and tools necessary to:
Understand the CAASPP scores
Access the various reports
Analyze the results
Use the results and available resources to improve teaching and learning

3 Agenda
Principles of Scoring
Accessing the Reports
Analyzing the Results
Moving Beyond the Scores

4 Principles of Scoring
Computer Adaptive Testing Review
Contributions to the Overall Score
Scores Students Will Receive

5 A Computer Adaptive Test (CAT) is Based on:
Large item bank - covering all areas assessed at varying levels of difficulty (statistically calibrated on a common scale with ability estimates)
Recommended blueprint - focuses the selection of questions from the item bank on appropriate content so the structure of the test is similar for every student
See handout

6 A Computer Adaptive Test (CAT) is Based on (Continued):
Programming language (or algorithm) - a step-by-step approach that tells the CAT what to do next based on students' answers
Algorithm rules - a set of rules to ensure each student's test contains the proper types of questions and content coverage:
Balance - enough of each concept
Question type - selected response, constructed response
Reading length - short, medium, long
Difficulty - appropriate for the grade level
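The adaptive loop described above can be sketched in a few lines of code. This is an illustrative toy only: the item bank, the closest-difficulty selection rule, and the fixed-step ability update are all assumptions for demonstration, not the actual Smarter Balanced algorithm, which also enforces blueprint rules for balance, question type, reading length, and difficulty.

```python
def next_item(bank, used, ability):
    """Pick the unseen item whose difficulty is closest to the current estimate."""
    candidates = [i for i in range(len(bank)) if i not in used]
    return min(candidates, key=lambda i: abs(bank[i] - ability))

def run_cat(bank, answer_fn, n_items=5, step=0.5):
    """Administer n_items adaptively, nudging the estimate after each answer."""
    ability, used = 0.0, set()
    for _ in range(n_items):
        i = next_item(bank, used, ability)
        used.add(i)
        correct = answer_fn(bank[i])            # the student's response
        ability += step if correct else -step   # crude provisional update
    return ability

# A simulated student who answers any item easier than difficulty 1.0 correctly:
bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
final = run_cat(bank, lambda difficulty: difficulty < 1.0)
```

A real CAT replaces the fixed-step update with a statistical ability estimate recomputed after each response, which is the pattern scoring the next slides describe.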

7 How Does a CAT Work? Example: A Student of Average Ability
[Chart: the student's ability estimate (Very Low through Very High) plotted across 10 test questions; the answer pattern R R W R W W W W R R (R = right, W = wrong) moves the estimate up or down after each question.]

8 Scoring the CAT
Final scale scores are based on item PATTERN SCORING:
As a student progresses through the test, his or her pattern of responses is tracked and revised estimates of the student's ability are calculated
Successive test questions are selected to increase the precision of the student's level-of-achievement estimate
Scores from the CAT portion of the test are based on the difficulty of the items answered right or wrong, NOT on the total number of correct answers
The test question bank for a particular grade level is designed to include an enhanced pool of test questions that are more or less difficult for that grade, but still match the grade's test blueprint
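The point that scores depend on item difficulty rather than the raw number correct can be illustrated with a toy Rasch (one-parameter IRT) model; this model choice and the gradient-ascent estimator are assumptions for demonstration, not the operational Smarter Balanced scoring model. Two simulated students each answer 2 of 3 items correctly, but on item sets of different difficulty, as a CAT would serve them:

```python
import math

def rasch_prob(theta, b):
    """P(correct) under the Rasch model: sigmoid of (ability - difficulty)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(difficulties, responses, steps=500, lr=0.1):
    """Maximum-likelihood ability estimate via simple gradient ascent.
    responses: 1 = right, 0 = wrong."""
    theta = 0.0
    for _ in range(steps):
        # Gradient of the log-likelihood: sum of (observed - expected).
        grad = sum(r - rasch_prob(theta, b)
                   for b, r in zip(difficulties, responses))
        theta += lr * grad
    return theta

# Both students answer 2 of 3 items correctly, but on different items:
theta_easy = estimate_ability([-1.5, -1.0, -0.5], [1, 1, 0])  # easy item set
theta_hard = estimate_ability([0.5, 1.0, 1.5], [1, 1, 0])     # hard item set
```

The student who succeeded on hard items gets a much higher estimate than the one who succeeded on easy items, even though both answered the same number of questions correctly.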

9 Key Principle of Scoring

10 Performance Tasks (PTs)
Performance task (PT) = a set of complex questions challenging students to apply their knowledge and skills to respond to real-world problems
PTs are not adaptive and, therefore, are not targeted to a student's specific ability level
The items associated with PTs may be scored by a machine or by human raters

11 Final Scoring: Contribution of CAT and PT Sections
For each student, the responses from the PT and CAT portions are merged for final scoring

Number of Items Defined by Test Blueprints

Grade | ELA/Literacy CAT | ELA/Literacy PT | Mathematics CAT | Mathematics PT
3-5   | 38-41            | 4-6             | 31-34           | 2-6
6-8   | 37-42            | 4-6             | 30-34           | 2-6
11    | 39-41            | 4-6             | 33-36           | 2-6

See handout

12 Final Scoring: Contribution of CAT and PT Sections (cont.)
Based on the test blueprint, the CAT section is emphasized because there are more CAT items/points than PT items/points
Claims with more items/points are emphasized
Because scores are based on pattern scoring, groups of items that are more difficult make a larger contribution to final scores
There is no specific weight associated with either the PT or the CAT

13 Final Scoring: Contribution of the CAT and PT
The relative contribution of the PT and CAT sections to the total score will vary across students, since the overall score:
Is based on pattern scoring (for the CAT)
Depends on performance on the CAT section
Depends on performance on the PT section
Depends on the average difficulty of the PT section
Example: A PT might make a significant negative contribution to a student's overall score if the student did extremely well on the CAT but performed poorly on a very easy PT section
Upshot: All parts of the Smarter Balanced Assessments are important

14 Reporting Scale
After the student's overall ability is estimated, it is mapped onto the reporting scale
Scores are on a vertical scale:
Expressed on a single continuum for a content area
Measures student growth over time across grade levels
For each grade level and content area, there is a separate scale score range

15 Overall Achievement Levels
Achievement level classifications are based on overall scores for English language arts and mathematics:
Level 4: Exceeded the Standard
Level 3: Met the Standard
Level 2: Nearly Met the Standard
Level 1: Has Not Met the Standard
Students will receive one of four score levels. Students scoring in the two top levels are on a path to be college and career ready at high school graduation. Students scoring below the "Met the Standard" level will need further development to get on that path.
Elementary & Jr. High: We are in a transition year. More students are likely to need further development to meet the new standards. But it's OK: as students spend additional time in classrooms employing the new curriculum and teaching, results will improve.
High School: We are in a transition period. Results may show more students are likely to need further development to meet the new standards. But it's OK: students still have another year to continue to progress. They are now on a focused path to gaining these skills.
Public colleges and universities in several states have decided to use the results of the Grade 11 ___________ exams to help determine whether admitted students are ready to proceed directly to entry-level, credit-bearing courses and can skip the placement tests that are typically administered. In all of these cases, colleges will continue to use their existing admissions criteria. <insert state specific policy>

16 Smarter Balanced Scale Score Ranges for ELA/Literacy

Grade | Level 1   | Level 2   | Level 3   | Level 4
3     | 2114-2366 | 2367-2431 | 2432-2489 | 2490-2623
4     | 2131-2415 | 2416-2472 | 2473-2532 | 2533-2663
5     | 2201-2441 | 2442-2501 | 2502-2581 | 2582-2701
6     | 2210-2456 | 2457-2530 | 2531-2617 | 2618-2724
7     | 2258-2478 | 2479-2551 | 2552-2648 | 2649-2745
8     | 2288-2486 | 2487-2566 | 2567-2667 | 2668-2769
11    | 2299-2492 | 2493-2582 | 2583-2681 | 2682-2795

See handout

17 Smarter Balanced Scale Score Ranges for Mathematics

Grade | Level 1   | Level 2   | Level 3   | Level 4
3     | 2189-2380 | 2381-2435 | 2436-2500 | 2501-2621
4     | 2204-2410 | 2411-2484 | 2485-2548 | 2549-2659
5     | 2219-2454 | 2455-2527 | 2528-2578 | 2579-2700
6     | 2235-2472 | 2473-2551 | 2552-2609 | 2610-2748
7     | 2250-2483 | 2484-2566 | 2567-2634 | 2635-2778
8     | 2265-2503 | 2504-2585 | 2586-2652 | 2653-2802
11    | 2280-2542 | 2543-2627 | 2628-2717 | 2718-2862

See handout
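The two tables above can be folded into a simple lookup. The cut scores below are the lower bounds of Levels 2-4 copied from the tables (only a few grades shown); the function name and structure are purely illustrative.

```python
# Lower bounds of Levels 2, 3, and 4, taken from the scale score tables.
CUTS = {
    ("ELA", 3): (2367, 2432, 2490),
    ("ELA", 11): (2493, 2583, 2682),
    ("Math", 3): (2381, 2436, 2501),
    ("Math", 11): (2543, 2628, 2718),
    # ...remaining grades follow the same pattern
}

def achievement_level(subject, grade, scale_score):
    """Map a scale score to its achievement level (1-4) for a grade/subject."""
    l2, l3, l4 = CUTS[(subject, grade)]
    if scale_score >= l4:
        return 4
    if scale_score >= l3:
        return 3
    if scale_score >= l2:
        return 2
    return 1

level = achievement_level("ELA", 3, 2460)   # falls in 2432-2489, i.e. Level 3
```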

18 Measurement Precision: Error Bands
Tests are imprecise! For each scale score estimated for a student, there is measurement error associated with it
Measurement error refers to the degree of imprecision or uncertainty in the assessment; it occurs due to factors unrelated to learning (e.g., mood, health, testing conditions)
Error bands are used to construct an interval estimate corresponding to a student's true ability/proficiency for a particular content area, with a certain level of confidence
The error bands used here are based on one standard error of measurement: if the same test were given to a student multiple times, the student would score within this band about 68 percent of the time
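A one-SEM error band is straightforward to compute. The SEM value used below (25 points) is a made-up illustration; actual SEMs vary by student and by test.

```python
def error_band(scale_score, sem, n_sems=1):
    """Return the (low, high) interval around a scale score.

    One SEM gives roughly a 68% band under a normal error model;
    two SEMs give roughly a 95% band.
    """
    return (scale_score - n_sems * sem, scale_score + n_sems * sem)

# Hypothetical student: scale score 2460 with an SEM of 25 points.
low, high = error_band(2460, 25)        # -> (2435, 2485)
wide = error_band(2460, 25, n_sems=2)   # -> (2410, 2510)
```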

19 Achievement Levels for Claims
Achievement levels for claims are very similar to sub-scores; they provide supplemental information about a student's strengths and weaknesses
Only three achievement levels for claims were developed, since there are fewer items within each claim
A student must complete most items within a claim to receive an estimate of his or her performance on that claim
Achievement levels for claims are based on the distance of a student's performance on the claim from the Level 3 "standard met" criterion

20 Achievement Levels for Claims
Levels: Below Standard, Near Standard, Above Standard
English Language Arts claims: Reading; Writing; Speaking & Listening; Research/Inquiry
Mathematics claims: Concepts & Procedures; Problem Solving; Communicating Reasoning; Modeling & Data Analysis

21 Questions about the scores that students receive?

22 Accessing the Various Reports
ORS Reports
Illuminate Reports
CDE Reports
Individual Student Reports

23 Access & Timeline for 2017 CAASPP Reports

Report                            | Access                                                               | Available
Preliminary CAASPP Results in ORS | Site coordinators, principals, and instructional cabinet             | Approximately three weeks after a student tests
Illuminate CAASPP Reports         | SDUSD staff with access to Illuminate (e.g., teachers, principals)   | July (preliminary); late August (final)
CDE CAASPP Reporting Site         | The public, via the CDE website                                      | Late August or early September
Individual CAASPP Student Reports | Mailed home to parents by Assessment Services; school copies sent to sites via truck | September

24 Test Results in the Online Reporting System (ORS)
Online Reporting System (ORS) = a Web-based system that displays score reports and completion data for each student who has taken the assessments
Results are available approximately three weeks after a student completes both parts—CAT and PT—of a content area
Test results are added nightly
See handout

25 Online Reporting System (ORS)
After logging into ORS via TOMS, select [Score Reports] to open the ORS Dashboard

26 ORS Home Page Dashboard: Select Test, Administration, and Enrollment Status
[Screenshot: select a test, an administration, and an enrollment status view]
Note: 2014-15 and 2015-16 data are final; 2016-17 data are partial

27 ORS Dashboard: Select Grade and Subject
Test results are available for students who have completed both parts—CAT and PT. For students still in the process of completing tests, results are not yet available.

28 ORS Subject Detail Report
[Screenshot: report legend, demographic subgroups, and the Exploration Menu]

29 ORS Exploration Menu

30 ORS Claim-Level Report Detail

31 ORS Student Listing Report
[Screenshot: student listing with sortable columns]

32 Reports in Illuminate

33 Reports in Illuminate
SBA Performance Summary = summarizes SBA ELA or Math overall performance levels and claim performance for past or current students
SBA Multi-Year Report = summarizes SBA ELA or Math overall and claim performance levels over two years
SBA Site or Grade Level Comparison = summarizes SBA ELA or Math overall and claim performance levels either by grade level or by school site
SBA Student Roster = identifies the students who are at each level, overall and by claim performance
SBA Student Summary = identifies ELA or Math performance levels and claim performance for a single student

34 Tips for Accessing Reports in Illuminate
Check the enrollment date in the report filter or control panel (upper right) to ensure you are pulling the correct set of students (Do you want to see results for students who tested with you, or for current students?)
Always clear the [Search] box
Do not use too many filters
Navigation for pre-built reports: Reports > List Reports > select "Prebuilts" in filters > type "SBA" in the Search box (type "Preliminary" for preliminary reports)
Navigation for the school's data file: Assessments > List Assessments > select "State & National Publisher Assessment" in filters > type "SBA" in the Search box (type "Preliminary" for preliminary reports)

35 Reports on CDE Website
Compare test results across counties, districts, schools, or the state on a single screen
View results over time in addition to a single year
See handout

36 Reports on CDE Website
The reports on the CDE website display the mean scale score by grade level and overall for each content area
The mean scale score is an important data point to consider when reviewing the results

37 Comparing Median Scale Scores from Year to Year
[Chart: median scale scores by grade (3-8 and 11) for Spring 2015, 2016, and 2017, plotted on a 2300-2800 scale against the threshold ranges for Standard Not Met, Standard Nearly Met, Standard Met, and Standard Exceeded; example data point: 2541]

38 Test Results Reported on the Individual Student Reports (ISRs)

Student Score Report Version                                                              | Assessment Grades
Smarter Balanced for ELA and mathematics                                                  | 3, 4, 6, 7, and 11
Smarter Balanced for ELA, mathematics, and science (California Science Test [CAST] pilot) | 5, 8, and 11
CAAs for ELA and mathematics                                                              | 3, 4, 6, 7, and 11
CAAs for ELA, mathematics, and science (CAA for Science pilot)                            | 5, 8, and 11
CAST or CAA for science (CAST pilot and CAA for Science pilot)                            | 10 and 12

39 Student Score Reports
See handout

40 Test Results from Multiple Years

41 Low, Medium and High Bands
Divide each level's scale score range into thirds (e.g., a 63-point range divided by three gives bands of about 21 points)
Grade 3 ELA, Level 2 (2367-2431): Low 2367-2387, Medium 2388-2408, High 2409-2431
Grade 3 ELA, Level 3 (2432-2489): Low 2432-2450, Medium 2451-2469, High 2470-2489
Grade 4 ELA, Level 2 (2416-2472): Low 2416-2433, Medium 2434-2452, High 2453-2472
Grade 4 ELA, Level 3 (2473-2532): Low 2473-2491, Medium 2492-2511, High 2512-2532
See handout
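The divide-by-three computation can be written out directly; in this sketch the last band absorbs any remainder, which reproduces the band boundaries listed above (the function name is illustrative).

```python
def bands(low, high):
    """Split the inclusive range [low, high] into three integer bands."""
    width = (high - low + 1) // 3
    b1 = (low, low + width - 1)
    b2 = (low + width, low + 2 * width - 1)
    b3 = (low + 2 * width, high)        # last band absorbs the remainder
    return b1, b2, b3

# Grade 3 ELA, Level 3 runs 2432-2489 (58 score points wide):
low_band, mid_band, high_band = bands(2432, 2489)
```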

42 Low, Medium and High Bands (cont.)
[Example: a scale score of 2432 in 2015 compared with 2470 in 2016]

43 Questions about the various reports available?

44 Analyzing the Results
Research, Recall, Reflect, Respond

45 Framing the Conversation
Student Performance (Data Review and Discovery): What is the current state of performance?
Learning Conditions (Recalling Programs, Practices, Policies, etc.): What were the conditions of learning?
Practices, Programs, and Policies Impacting Performance (Making Connections): What may have contributed to the observed performance?
Optimizing Results (Possible Actions Moving Forward): What should/will change?
Together, these four questions inform teaching and learning.

46 From Frame to Process: The Four R's
Research: Data Review and Discovery
Recall: Programs, Practices, Policies
Reflect: Possible Connections
Respond: Possible Actions Moving Forward
RESOURCES: See handout

47 Research
Engage with the data to find facts
Describe what you observe
Look for data trends
Focus on facts, not conjecture
RESEARCH: Using the data sources available, report the facts. Look for trends or areas of concern. Look for areas of success. This is an objective data discovery.

48 Research Guiding Questions: What Is the Current State of Performance?
Are there any trends or patterns that emerge by grade level, student group, content area, and/or claim area?
Which scores look noticeably different from last year's scores?
Where do the scale scores fall within the levels (i.e., high, medium, or low band)?
See handout

49 Recall
Consider school/classroom programs, practices, and policies
Think about last year: What happened? What curriculum was in place? What professional learning occurred?
RECALL: Focusing on the prior school year(s), discuss the realities of the classroom, school, and LEA programs, practices, and policies. What happened? Focus on facts only—no conjecture.

50 Recall Guiding Questions: What Were the Conditions for Learning?
What curriculum/instructional materials did we initiate or continue last year?
What opportunities did we provide our students and staff to interact with the assessment tools (e.g., practice tests, SBAC interims) prior to the summative assessment administration?
What evidence did we collect during the year that is consistent with the evidence required on the assessment?

51 Reflect
Determine possible reasons for the performance
Consider existing programs, practices, and policies that might help explain observed patterns, trends, or gaps in achievement
Keep the conversation honest
REFLECT: Connect performance with prior-year practices, programs, and policies. Consider possible reasons. Be honest about what occurred last year and how that may have impacted the performance you observe.

52 Reflect Guiding Questions: What May Have Contributed to the Observed Performance?
What would you consider the key factors contributing to the apparent successes/needs indicated by the results?
How do these results affirm areas where instruction was focused and increased learning was expected?

53 Respond
Use guiding questions
Think about what you can control
What other data are available to you?
How will you know progress is being made?
RESPOND: What might be some possible ways to move forward? These responses should tie directly to what emerged from the reflection process.

54 Respond Guiding Questions: What Should/Will Change?
How might we restructure our professional learning communities and collegial work to support the needs illustrated by the data?
What evidence do we need during classroom instruction to know that our students are making progress?

55 Reminders: Analyzing Results
Examining assessment data:
Helps promote the effective and appropriate use of data when done correctly
Helps build a common understanding of expectations

56 Questions about Analyzing the Results?

57 Moving Beyond the Results
Rightful Place and Rightful Purpose
Evidence-Centered Design

58 Evidence-Centered Design
Knowing how the assessment items were developed helps in understanding the results and making connections to the classroom
The Smarter Balanced hierarchy of item development and reporting of scores:
Overall Claims → Content Claims → Targets → Evidence Statements → Items

59 Relationships Between CCSS, CA Frameworks, and the Assessment
English Language Arts: Content Specifications (claims) → Item Specifications (claims and targets per claim) → Assessment Blueprint → Summative Assessment, informed by the English Language Arts/English Language Development Framework
Mathematics: Content Specifications (claims) → Item Specifications (claims and targets per claim) → Assessment Blueprint → Summative Assessment, informed by the Mathematics Framework for California Public Schools

60 Example of Item Specifications
Item Specifications delineate the types of evidence that should be collected regarding the knowledge, skills, and/or abilities articulated in the standards
Evidence Required for Target 1: The student will identify text evidence (explicit details and/or implicit information) to support a GIVEN inference or conclusion based on the text
See handout

61 Test Blueprint
The test blueprint describes the content of the assessments and how that content will be assessed (CAT/PT), including the number of items and DOK levels
Key Data Systems (KDS) modified the SBAC test blueprint and aligned it to the standards
See handout

62 Rightful Place/Purpose: Assessment Frequency and Impact on Instruction
[Diagram: the assessment continuum, from infrequent statewide summative assessment to frequent classroom formative assessment]

63 …but They Don't Tell the Entire Story
Statewide summative assessments are like the tip of an iceberg—it pays to pay attention, but they don't tell the entire story

64 Rightful Place, Rightful Purpose for Statewide Summative Assessment
Statewide summative assessments:
Provide a general direction—we must dig deeper to determine cause
Focus on groups, programs, and disaggregation
Rarely provide definitive answers, but raise many questions, allowing reflection on context and practice
Provide an entry point into a collaborative, honest conversation

65 Call to Action: Pay Attention but Move Beyond the Scores
Focus on improving learning, not solely on increasing scores
Reflect on what you can control to move beyond the scores
Use test scores in conjunction with multiple other pieces of evidence to arrive at a more complete understanding of student learning and progress
This is a collaborative process that requires honesty, the ability to handle ambiguity, and patience

66 Final Thoughts

67 Resources
Content Specifications, Item Specifications, and Blueprints
Smarter Balanced Scale Score Ranges
Sample Student Score Report
Claim Descriptions for ELA and Mathematics
Target Score Report FAQs
Achievement Level Descriptors
Teacher Guides to Smarter Balanced
Understanding CAASPP Results for Parents

68 Assessment Services Webpage
Staff Portal > Departments > Assessment Services > CAASPP > Results

69 Contact Information
Erin Gordon:
Amal Morcos:
Assessment Services main phone:
Assessment Services CAASPP Web page:

