1
CII Smarter Balanced Assessments
Friday, November 14, 2014
2
A Balanced Assessment System
Rocklin USD September 21, 2014 A Balanced Assessment System: Summative assessments benchmarked to college and career readiness. Teachers and schools have the information and tools they need to improve teaching and learning. Common Core State Standards specify K-12 expectations for college and career readiness. All students leave high school college and career ready. You have probably seen this diagram before. It represents each of the components of the Smarter Balanced assessment system. We are showing it here to make it clear that the box in the lower left that says “Educator resources for formative assessment practices to improve instruction” represents the Digital Library. The Digital Library’s formative assessment resources and strategies, such as effectively using exit tickets or helping students write and use “I can” statements, are designed to be used in conjunction with the Interim Assessments and the Summative Assessments to help districts build a balanced assessment system. Educator resources for formative assessment practices to improve instruction. Interim assessments: flexible, open, used for actionable feedback. Graphic from California Department of Education Senior Assessment Fellow, Mary Tribbey
3
A Balanced System of Assessment
Large Scale (Assessment of): summative in nature; norm-referenced; aptitude; achievement. Essential question: What have students already learned? Mid-Scale (Assessment for): formative processes with summative information; criterion-referenced; often teacher- or district-made. Essential question: How can we help students learn more? Small-Scale: questioning; digital learning tools
4
Formative vs. Summative Assessment
Formative: FOR learning. It is taken at varying intervals throughout a course to provide information and feedback that will help improve the quality of student learning and the quality of the course itself. Formative assessment provides information on what an individual student needs to practice, what needs to be re-taught, and what to learn next. Summative: OF learning. It is generally taken by students at the end of a unit, semester, or year to demonstrate the “sum” of what they have or have not learned.
5
Assessment Cycles by Purpose
Formative assessment practice Interim/Benchmark assessment Summative assessment (Chapter 8 of CDE ELA/ELD Curriculum Framework, 2014, adapted from Herman & Heritage, 2007)
6
The word ‘assess’ comes from the Latin verb ‘assidere’ meaning ‘to sit with’.
In assessment one is supposed to sit with the learner. This implies it is something we do ‘with’ and ‘for’ students and not ‘to’ students. (Green, 1999)
7
Formative Assessment is a process not a test.
8
Assessment For Learning
Identify Guiding Questions Answer the Questions: What do students need to know, understand and be able to do? Identify Specific Proficiencies Identify Quick, Informal Assessments Use Assessments Effectively Modify Teaching and Learning Revisit, Reflect, Revise
9
5 Key Strategies of Formative Assessment
Clarifying, sharing, and understanding learning intentions and criteria for success Engineering effective classroom discussions, activities and learning tasks that elicit evidence of learning Providing feedback that moves learning forward Activating learners as instructional resources for one another Activating learners as the owners of their own learning -Dylan Wiliam Embedded Formative Assessment
10
Students need to know where they are going in their learning.
1. Clarifying, sharing, and understanding learning intentions and criteria for success Students need to know where they are going in their learning. Students need to know what counts as quality work. Learning intentions: Task-specific versus generic scoring rubrics Product-focused versus process-focused criteria Official versus student-friendly language
11
2. Eliciting Evidence of Learners’ Achievement
Student engagement Hot-seat questioning Wait time All-Student response systems Alternatives to questions Exit passes Evaluative and interpretive listening Discussion questions and diagnostic questions Question shells
12
3. Feedback That Moves Learning Forward
Feedback should cause thinking. It should be focused. It should relate to the learning goals that have been shared with students. It should be more work for the recipient than the donor. It should increase the extent to which students are owners of their own learning.
13
4. Activating Students as Instructional Resources for One Another
Cooperative learning End of topic questions Error classification What did we learn today? Preflight checklist Group based test prep Teach someone else (peer buddies)
14
5. Activating Students as Owners of Their Own Learning
Self-assessment Self-regulated learning Metacognition Motivation Integrating motivational and cognitive perspectives Learning logs and portfolios
15
Smarter Balanced Item Types
[Image: a student or students at the computer, taking an online test] Selected Response; Constructed Response; Technology Enhanced; Performance Tasks
16
Pilot Formative Assessment Project
Center for Learning Analytics (Spring 2014) 10 participating classrooms from 5 participating districts 6 classrooms – K-3 literacy focus 4 classrooms – 8th/9th grade Algebra I focus Project consisted of 3 phases: observation, prototype & implementation
17
Improving on Complexity
More evidence-based instruction More rigorous questions within the instruction Students explain reasoning Student collaboration: working in groups More classroom discussion with emphasis on academic conversations
18
Performance Tasks defined: tasks that measure students’ ability to apply their knowledge and skills in response to complex questions or problems.
19
Performance Tasks Measure a student’s ability to integrate knowledge and skills across multiple standards Will be used to better measure capacities such as depth of understanding, research skills and complex analysis which cannot be adequately assessed with selected or constructed-response items Reflect a real-world task and/or scenario-based problem
20
What is NOT New about Performance Assessments
Teachers have always had students demonstrate or create: Science labs Speeches/dramatic interpretation Pieces of art Balance beam routines Music recitals
21
Performance Task Structure
Stimulus Information Processing Product
22
Grade 8 Classroom Activity
23
Quickwrite: Implications of The Classroom Activity for Students
What do students need to be able to do to be successful during the activity? How can teachers help prepare students and expand on these skills and strategies during classroom instruction?
24
Discussion & Reflection
25
Performance Task: Anatomy Lesson 1 Grade 8 ELA Performance Task
Read the Student Directions and scan the sources Pages 2–13
26
Quickwrite What do students need to be able to do?
How can teachers help them learn how to do this?
27
Cesar Chavez Sample Passage
28
Use Multiple Text Sources
Use several examples of resources that can be used to collect research Biography.com has short video clips Include read alouds of historical figures
29
Mapping Out Facts and Ideas
30
Performance Task: Anatomy Lesson 2
[Image: a student writing on paper or at the computer] Look through the Part 1 items and the Part 2 Student Directions. Pages 14–21
31
Quickwrite What do students need to be able to do?
How can teachers help them learn how to do this?
32
Selecting Information from the Text
What does the evidence suggest about the different theories regarding what happened to Amelia Earhart?
First theory: Amelia died when her plane crashed into the ocean near Howland Island.
Evidence calling the first theory into question (why is this theory unlikely?):
- Sent radio message to waiting crew about running out of gas; quote: “only half hour’s gas left” (page 150)
- No sign of plane wreckage in the area; “a plane that big would not have sunk right away” (page 151)
33
Discussion & Reflection
34
Expectations of the Tasks
Look at the scoring rubrics for this task (pp ). What do the rubrics suggest about important knowledge and skills students must demonstrate?
35
What is DoK? Webb’s Depth of Knowledge: Scales of cognitive demand
Bloom’s Revised Taxonomy: Levels of intellectual ability
36
Examples of DOK 3 & 4 Activities
Level 3 Support ideas with details and examples Use voice appropriate to the purpose and audience Level 4 Analyze multiple sources of evidence Synthesize information across multiple sources or texts
37
Assessment Cycles by Purpose
Formative assessment practice Interim/Benchmark assessment Summative assessment (Chapter 8 of CDE ELA/ELD Curriculum Framework, 2014, adapted from Herman & Heritage, 2007)
38
SBAC Digital Library: The California Department of Education states that the SBAC Digital Library meets the following criteria: Aligns with Common Core State Standards Demonstrates high-quality instruction Addresses learner differences Is engaging/user-friendly Incorporates formative assessment practices
39
State Network of Educators
Composition: 60–150 members per state, comprised of K–12 educators and higher-education faculty. Each network has diverse expertise in: CCSS mathematics and English language arts, science, social science, general education, gifted and talented, English learners, and students with disabilities. Expectations: Participate in 5 trainings. Help populate the Digital Library in advance of the June 2014 preview. Submit and review resources using the Quality Criteria. Use resources and collaboration tools for members’ own professional learning and instruction. Provide feedback on the resources in the library, the review and posting process, the Quality Criteria, and the usability of the software. Once the gatekeeping and quality criteria were established, the next step was to recruit educators to submit resources and to review the resources using the established criteria. These educators are called the State Network of Educators (SNE). In California: 110 educators (we wanted a balance of subjects, with an emphasis on math and English, but also history, reading, science, administrators, and instructional coaches). We have all grade levels, including higher ed. We have a balance of educators from southern, central, and northern California. They participate in 5 trainings, which cover how to submit and review resources, use the software in the Digital Library, and all the functionality of the Digital Library.
40
What the Digital Library Is Not…
a bank of assessment items a learning management system a library for the general public a site to freely post resources First, let’s make clear what the Digital Library is not: It is not a bank of items; you cannot use the Digital Library to create a test, although you might find a test in the library that meets your needs. It is not a learning management system. This is a library of resources that helps teachers learn to use, or improve their use of, assessment as a tool to improve teaching and learning. It is not a library available to the general public. The Digital Library has been licensed by the state of California to support its K-12 teachers in the use of formative assessment and requires that each user have his or her own login and password. It is not a site to freely post resources. Before they are posted, resources must meet strict criteria and be vetted by the State Network of Educators in order to be accepted. Graphic from California Department of Education Senior Assessment Fellow, Mary Tribbey
41
Guided Tour of SBAC Digital Library
Let’s Tour Together Looking for instructional resources for: ELA informational text & ELA Speaking/Writing Grades 4-6 Now, Let’s Look at Range of Resources: “Exit Tickets” as formative assessment tool Questions to consider: How will this resource align with other parts of our grade-level or school curriculum? What is our internal process to consider resources from SBAC Digital Library or other sites? In what ways does this resource represent high quality instruction, content and formative assessment practice?
42
Use Your SBAC Guided Tour Sheet to Explore
Strengths of SBAC Digital Library Limitations of SBAC Digital Library
43
SBAC Digital Library – After the Tour
Questions to consider when rolling out the SBAC Digital Library: How will this resource align with other parts of our grade-level or school curriculum? What is our internal process to vet resources from SBAC Digital Library or other sites? In what ways does this resource represent high quality instruction, content and formative assessment practice?
44
Interim Assessments Welcome and thank you for your interest
This presentation will provide information to help you understand the purpose of the Digital Library and how to navigate the system once you are in it. This presentation will give you background on the Digital Library so you can support its use by educators in your school district.
45
Smarter Balanced Interim Assessments
Outcomes Purpose Differences between: Interim Comprehensive Assessments Interim Assessment Blocks Uses Role in the local assessment system The intended outcome of this presentation is to give you an understanding of the purpose of the Interim Assessments, the differences between the two types of tests, the ways in which the tests might be used, and to ask you to consider what role these tests will play in your local assessment system.
46
Interim Assessments The Smarter Balanced Interim Assessments comprise interim comprehensive assessments (ICAs) and interim assessment blocks (IABs) ICAs and IABs are alike in the following ways: The quality criteria used for the ICA and IAB items are the same as those used for the summative assessment. ICAs and IABs use the same universal tools, designated supports, and accommodations. Available to all California teachers ICA and IAB use is optional. There are two types of Interim Assessments: The Interim Comprehensive Assessments, or ICAs, which are a mirror of the summative assessment, and the Interim Assessment Blocks, or IABs, which test smaller sets of skills. The two tests have several things in common. The items for both were created using the same item quality criteria as is used for the summative assessment. They both offer students the same universal tools, designated supports, and accommodations. They are provided to all K-12 teachers through Smarter Balanced membership fees. They are both optional. Districts may use these tests, but they are not required to do so.
47
Interim Comprehensive Assessments (ICAs)
Mirror the summative assessment: Use the same blueprints as the summative. Assess the same range of standards. Have the same item types and formats. Include performance tasks. Require the same amount of administration time. Provide information regarding student readiness for the end-of-year summative assessment Now let’s look at each of the tests in a little more detail. The Interim Comprehensive Assessment, the ICA, is designed to mirror the summative assessment. It uses the same blueprint, has the same item types and formats, includes similar performance tasks, and requires the same amount of time to administer to students. The reports that come from the ICA provide information designed to demonstrate individual student readiness for the summative assessment.
48
Example Use of ICAs Examples of the use of ICAs include:
Beginning of the year: a student from another state is given the previous year’s ICA. Mid-year: a teacher gives an ICA to gauge how students might perform on the summative assessment. This slide shows a couple of ways in which the ICA might be used; both of these examples take advantage of the test’s purpose of providing information similar to the summative test. (Taken from the Smarter Balanced Interim Assessments Structure and Understandings)
49
Interim Assessment Blocks (IABs)
Assess fewer sets of skills, and: Use the same targets, by grade level, as the summative blueprints. Consist of short, focused sets of items. Provide information about a student’s strengths and needs in relation to the assessment targets. Offer varied blocks by grade level and subject area. The Interim Assessment Blocks, the IABs, are designed to allow teachers to test a subset of the skills included in the summative test. It uses the same assessment targets as the summative blueprints, and teachers can select which sets, or blocks, of skills that they want to test. These blocks vary by grade level and subject area, as we will see in the next few slides.
50
Blocks for ELA, Grades 3 to High School
The blocks for ELA are divided into three groups in this visual. The grouping does not mean that Grades 3-5 for example, will all receive the same test questions. Rather, it means that Grade 3 students will have test questions at the Grade 3 level for each of the blocks shown, and Grade 4 students will see test questions at the Grade 4 level for each of the blocks shown, and so on for Grade 5. The caveat to what I just said is this: When the IABs are rolled out for the first time, the availability of items will be limited. We may only see one performance task, and we may not have all of these blocks available. As additional items are field tested and then added, all of these blocks will eventually become available. Note: if the group cannot see these, quickly read them.
51
Blocks for Mathematics, Grades 3 to 5
Note that the blocks for mathematics are different at each of the grade levels, so this visual shows each grade separately. Note: if the group cannot see these, quickly read them.
52
Blocks for Mathematics, Grades 6 to 8
Note: if the group cannot see these, quickly read them.
53
Blocks for Mathematics for High School
Note: if the group cannot see these, quickly read them.
54
Example Use of IABs Examples of use of the IABs include:
A teacher uses a block focused on argumentative writing to determine the degree of a student’s understanding before or after instruction. A team of teachers uses a block to become informed about how a group of students is performing in geometry. This slide shows a couple of ways in which the IABs might be used. In the first case, the teacher is using the IAB for a specific student. In the next, a team of teachers is using the IABs to learn about and reflect on the performance of a group of students. (From the Smarter Balanced Interim Assessments Structure and Understandings)
55
Rollout of ICAs and IABs
Initial item pool will be limited in depth. Initial ICAs and IABs will be in a fixed format. As the item pool grows, ICAs and IABs will become available as computer adaptive tests (CATs). When the ICAs and IABs are first released, they will be provided in a fixed format, due to the limited item pool, and all students will see the same questions. As the item pool grows, the tests will be provided to students in the Computer Adaptive Test, or CAT, format.
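The difference between a fixed form and a computer adaptive test can be illustrated with a toy sketch. This is not Smarter Balanced's actual engine (real CATs select items using item response theory models); the item difficulties, the simple "closest difficulty" rule, and the crude ability update below are all invented purely for illustration.

```python
# Toy illustration of computer adaptive item selection.
# NOT the Smarter Balanced algorithm; all numbers are hypothetical.

items = [(d, f"item-{i}") for i, d in enumerate([-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0])]

def next_adaptive_item(ability_estimate, remaining):
    """Pick the unused item whose difficulty is closest to the current estimate."""
    return min(remaining, key=lambda item: abs(item[0] - ability_estimate))

ability = 0.0              # start from an average ability estimate
remaining = list(items)
administered = []
for _ in range(3):
    item = next_adaptive_item(ability, remaining)
    remaining.remove(item)
    administered.append(item[1])
    correct = True                          # pretend the student answers correctly
    ability += 0.5 if correct else -0.5     # crude update: correct -> harder items next

print(administered)
```

In a fixed form, every student would see the same item list; here the sequence drifts toward harder items as the (simulated) student keeps answering correctly.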
56
Administration: ICAs and IABs are administered only online.
The same teacher registration process as the summative assessment will be used. ICA and IAB administration will use the same test delivery interface as the summative assessment. Testing intervals are determined locally. There are no restrictions on the number of times ICAs or IABs may be administered. The items are not secure. Both of the tests are administered online. Teachers will register one time and use the same login for the summative assessment and the Interim Assessments. Test administrators and students will experience the same interface as they will during the summative assessment. Unlike the summative assessment, although access to the test is limited to California teachers and students, the items themselves are not secure. When the tests are administered is up to the LEA, but during this initial rollout phase, it’s important to remember that the item pool from which the IABs and ICAs draw is limited. That means if a district uses both the IABs and the ICAs, it is likely that students will see the same items in both tests, and their results might be influenced by having seen the same item multiple times. Districts need to think this through when making agreements with teachers about the purpose and use of the tests locally.
57
Scoring and Results of the Smarter Balanced Interim Assessments
Most items are scored by the Smarter Balanced test delivery engine. Scoring of constructed-response items and performance tasks is a local responsibility. Score reports are generated once the constructed-response item scores and performance task scores are input into the system. The reports that come from the ICAs and IABs will be generated after the constructed response and performance tasks scores are entered into the system. Instructions on how to enter those scores will be provided to LEAs. Both the ICAs and the IABs will provide individual student results that teachers can use for the purpose of adjusting instruction. The ICAs will provide this information at the claim level; the IABs will provide the scores at the block level.
58
Availability: Fixed forms of both ICAs and IABs are estimated to be released in January 2015. At the present time, we expect that the initial versions of both the ICAs and the IABs will be released sometime around mid-January 2015.
59
Unprecedented Opportunity
60
Most recent update on November 4, 2014
Embedded: delivered within the computer program. Non-embedded: separate from it.
61
Embedded vs. Non-Embedded
Embedded: tools provided digitally through the test delivery system. Non-embedded: provided at the local level, outside of the testing system. Based on an individual student’s needs. The decision should reflect the student’s prior use of, and experience with, the tool. If part of an IEP/504 decision, there needs to be documented evidence that indicates a need. The tool needs to be utilized according to the Descriptions/Recommendations for Use found within the Guidelines document.
62
Universal Tools: Universal tools are available to all students based on student preference and selection, with school input as well.
63
Designated Supports: for use by ANY student for whom the need has been indicated by a teacher or school team. Needs must be identified prior to assessment administration, and also determined with parent/guardian and student input as appropriate. A consistent system for determining need is recommended. Must be activated prior to testing by entering information in TOMS (Test Operations Management System).
64
Accommodations: changes in procedures or materials that increase equitable access for students with IEPs or 504 plans. Needs must be identified prior to assessment administration and also entered into TOMS. There must be documentation within the IEP or the 504 plan with evidence that shows a need.
65
Universal Tools Descriptions
What are the implications of this? This is very different from what students have had access to in the past. What support might your teachers need around this?
66
Both Designated Supports and Accommodations give teachers access not only to a Description but also to Recommendations for Use
67
Understanding the Descriptions:
Example from Non-Embedded Accommodations
70
Why do we need guidelines for individualized supports on the Smarter Balanced assessments?
Some features increase engagement and motivation in students. Are they more distracting than useful for a given student? Too many features can be confusing to students. Students need to be familiar with using the accommodations (something they do within their everyday classroom), and consider how that use applies to the computerized testing system.
72
Classroom Application
Divide into Elementary, Middle and High School groups. Divide into groups of 1-5. Within each group: Take 1 set of the crosswalk. Each person takes 1-2 pages. Independently look at the page(s): How can this assessment accommodation transfer into my classroom practice? (5 minutes). Group share-out/discussion (10 minutes): How might this crosswalk be beneficial to your planning as a teacher? What are some of your concerns? What are some implications for your specific grade level?
73
Deborah S. Delisle, U.S. Department of Education
“A field test is not designed to be a valid and reliable measure of student achievement; rather, it is designed to help the test developers evaluate whether the test, individual items, and the technology platform work as intended before the first operational administration.” Deborah S. Delisle, U.S. Department of Education I know that this quote may be repetitive for many of you; however, I believe that it nicely frames the true purpose of the 2014 field test. --READ QUOTE-- Thus the field test was really just a test of the test. At the state and consortium level of the system: System designers were testing the technology platform, and test contractors were testing the validity and reliability of the questions. However, at the local level we were also being provided an opportunity to test the system: Schools and districts had the opportunity to test their internal network capacity and student-to-device ratios. Teachers were able to test the system for logging students onto the test and ensuring that students were provided the appropriate supports and accommodations. And students tested their ability to take a test in a new way, using a keyboard and computer rather than paper and pencil. We wanted to leverage this unique opportunity to test the test in a data- and accountability-free zone and survey our students, teachers and administrators on their experience in order to deepen our knowledge about what steps need to be taken to prepare our districts, schools, teachers, and students for the operational assessment in 2015.
74
Field Test Overview Timeline: March 25th through June 6th
Schools assigned a five- or six-week window Participants: Grades 3-8: All students Grades 9 & 10: Only students selected for the scientific sample Grade 11: Students selected for the scientific sample; all others encouraged Components: Performance Task Non-performance Task Format: Computer-based; not computer adaptive Before delving into the survey results themselves, I wanted to provide you an overview of the field test as it was constructed in the state of California. The field test was conducted across the state between March 25th and June 6th. Schools were assigned a five- or six-week window during which they were required to test all of their eligible students in the designated content areas. All eligible students in grades three through eight were required to participate. In grades nine and ten, only students selected for the scientific sample took the field test, for the purpose of vertically aligning the tests between eighth and eleventh grade. Finally, grade eleven students selected for the scientific sample were required to take the test, but all other eleventh grade students were encouraged to take it. The field test itself was comprised of two primary components: a performance task, which included a classroom activity followed by a set of complex questions centered around a common theme or problem, and non-performance tasks, which are traditional questions that utilize multiple response options such as multiple choice, selected response, drag and drop, and matching tables. Finally, it’s important to note that the field test was computer-based, not computer adaptive, although when the SBAC assessments go operational in 2015 they will be computer adaptive.
75
2014 SBAC Field Test Surveys
Student: 6,400 responses from 16 districts. Teacher/Proctor: 336 responses from 17 districts. Administrator: 56 responses from 13 districts. As previously mentioned, three SBAC surveys were created for use by students, teachers and administrators. The surveys were disseminated to all of our districts through the District Assessment Leader Network’s monthly meetings. Districts were free to determine which surveys to administer and how many individuals would receive them. In the end, at the county level, we collected: 6,400 valid student responses from 16 of our local districts; 336 valid teacher responses representing 17 of the county’s districts; and 56 valid administrator responses representing 13 of our districts.
76
Student Survey Results
Methodology Used: Weighted student results based on % of 2013 STAR testing population. Approximately 28% of students encountered problems with the technology or testing system. Student Preparation: 50% completed a practice test together as a class; 53% individually; 54% had a teacher review the technology needed for testing (i.e., computer, mouse). Before discussing the student survey results themselves, it’s important to note that, because of the optional nature of the surveys, when conducting the analysis we had to weight the student survey results based on each district’s percentage of the 2013 STAR testing population. This ensured that district results were adequately weighted regardless of whether a district surveyed all of its students or just a small sample, such as a single classroom at each grade level. Moving on to the results: based on the 6,400 student survey results, in terms of student preparation prior to taking the SBAC field test, approximately 50% completed a practice test together as a class, while 53% completed a practice test individually. Additionally, 54% were prepared for the test by having their teacher review the technology (such as reviewing how to use the computer keyboard or mouse). Also, approximately 28% of the students surveyed indicated that they encountered a problem with the technology or testing system during their field test experience.
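The weighting described in the methodology note can be sketched as follows. The district names, response rates, and population shares below are hypothetical; only the technique (counting each district's result in proportion to its share of the prior-year STAR testing population, rather than its survey sample size) comes from the text.

```python
# Hypothetical sketch of weighting district survey results by each district's
# share of the 2013 STAR testing population, as described above.
# All district names and numbers are invented for illustration.

districts = {
    # name: (survey "yes" rate, surveys collected, share of 2013 STAR population)
    "District A": (0.60, 4000, 0.25),
    "District B": (0.45, 400, 0.50),
    "District C": (0.50, 2000, 0.25),
}

# Unweighted: districts that surveyed many students dominate the result,
# regardless of how large the district actually is.
total_surveys = sum(n for _, n, _ in districts.values())
unweighted = sum(rate * n for rate, n, _ in districts.values()) / total_surveys

# Weighted: each district counts by its STAR population share, so a district
# that sampled only one classroom per grade is not underrepresented.
weighted = sum(rate * share for rate, _, share in districts.values())

print(f"unweighted: {unweighted:.3f}")
print(f"weighted:   {weighted:.3f}")
```

Here District B is half the county's testing population but submitted few surveys, so the weighted figure shifts toward its lower rate; that is exactly the distortion the weighting corrects for.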
77
Student Survey Results, Continued
Testing Conditions: 33% Computer Lab; 56% Classroom with Laptops; 4% Classroom with Tablets; 7% Other. Content: Approximately 56% of the students polled felt comfortable or very comfortable with their understanding of the questions asked. A few more highlights from the student survey are contained here. With regard to the testing conditions of the field test: 33% took the test in a computer lab, and a combined 60% took the test in their classroom using either a laptop or tablet. Lastly, when asked about the content of the test, 56% of the students indicated that they were either comfortable or very comfortable in terms of their understanding of the questions asked.
78
Teacher/Proctor Survey Results
Teacher Preparation: 93% Overview of the SBAC Field Test; 66% Instruction on the Selection and Administration of Universal Tools, Designated Supports and Accommodations; 78% Demonstration of the Test Administrator Interface; 70% Physical Copies of Required Manuals. Impact on Instruction: 53% indicated that the use of computer devices for testing purposes impacted their ability to use technology for instructional purposes. Professional Development: 85% of teachers ranked “training on preparing students for the content” as important or very important* (*highest ranked item of the PD options). Moving on to the teacher survey results: When surveyed about their preparation, 93% of the teachers indicated that they were provided with an overview of the SBAC field test; 66% indicated that they received instruction on the selection and administration of universal tools, designated supports and accommodations; 78% were provided with a demonstration of the test administrator interface; and 70% were provided physical copies of the required manuals. Additionally, 53% indicated that the use of computer devices for testing purposes impacted their ability to use technology for instructional purposes (this finding may be important as districts determine how to allocate their resources during testing and as teachers plan their lessons during this timeframe). 85% of teachers ranked “training on preparing students for the content” as important or very important. This was the highest ranked of the professional development options and indicated a high need amongst our teachers for preparing students for the type of questions posed in the SBAC assessments.
79
Administrator Survey Results
Technology & Systems: 66% experienced significant glitches in the system (e.g., device failure, missing log-in information); 89% indicated that the network and bandwidth were reliable during testing. Impact on Instruction: 67% indicated that the use of computer devices for testing purposes impacted the school’s ability to use technology for instructional purposes. Lastly, the administrator survey results indicated that, in terms of technology and systems, 66% experienced significant glitches in the system (such as a device failure or missing log-in information); however, 89% indicated that their network and bandwidth were reliable during testing. In regards to impact on instruction, 67% indicated that the use of computer devices for testing purposes did impact the school’s ability to use technology for instructional purposes (again, pointing to the fact that districts must determine how to allocate devices for testing in conjunction with the devices’ use for regular instructional practice).
80
Assessment & Accountability Updates - CAASPP
CAASPP System Components SBAC: ELA & Math, Grades 3-8 & 11 EAP option for grade 11 students; will not require additional testing CST/CMA/CAPA: Science, Grades 5, 8 & 10 Alternate Assessment (Field Test): ELA & Math, Grades 3-8 & 11 STS: Reading/Language Arts Optional for LEAs to purchase Diagnostics: ELA & Math, Grade 2 (by November 2014) ACCOUNTABILITY: Will significantly change Will not be comparable to the old system; index values/baselines may differ significantly Alternate Assessment: CAPA sunset per legislation in July. NCSC would not permit all test takers to participate in the field test; it was decided not to move forward with them. CDE is looking for an alternate assessment to field test at this time; a “work in progress”
81
Updates Continued – Technology & Data Readiness
Requirements are unchanged from the Field Test (e.g. devices, accessories, bandwidth, etc.) Secure browser: updated version scheduled to be available late October 2014 TIDES has changed to TOMS Student demographic information will be sourced from CALPADS Student accommodation information will be sourced from TOMS Student accommodation information cannot be uploaded until student demographic information has been loaded into TOMS through CALPADS
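The load-order constraint above (a student's demographic record must arrive in TOMS from CALPADS before that student's accommodation settings can be uploaded) can be sketched as a simple readiness check. TOMS exposes no such Python API; the SSIDs and data structures here are invented purely to illustrate the dependency.

```python
# Hypothetical sketch of the CALPADS -> TOMS ordering constraint described above.
# All identifiers are made up; this is not a real TOMS interface.

demographics_loaded = {"SSID-001", "SSID-002"}    # records already in TOMS via CALPADS
accommodation_uploads = ["SSID-001", "SSID-003"]  # accommodation settings waiting to upload

ready, blocked = [], []
for ssid in accommodation_uploads:
    # An accommodation upload is only valid once the demographic record exists.
    (ready if ssid in demographics_loaded else blocked).append(ssid)

print(ready)    # uploads that can proceed now
print(blocked)  # uploads that must wait for the CALPADS demographic load
```

The practical point for LEAs is the same as in the text: schedule the CALPADS demographic load first, and hold accommodation uploads for any student whose record has not yet landed.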
82
SBAC Best Practices Collaboration Communication with Stakeholders
Training Student Registration Technology Preparation and Troubleshooting Scheduling
83
SBAC Best Practices (1) Collaboration:
CAASPP and CALPADS and Technology Staff LEA and sites (2) Communication with Stakeholders: Students Teachers Parents Community
84
SBAC Best Practices (3) Training: (4) Student Registration:
LEA and site coordinators Technology staff Test administrators Teachers Students (4) Student Registration: Continuous update of CALPADS, especially SSIDs Student Name Grade Level
85
SBAC Best Practices (5) Technology Preparation and Troubleshooting:
Selecting and preparing devices Testing bandwidth capacity Providing technology support staff to sites during testing Providing troubleshooting guides to site coordinators and test administrators (6) Scheduling: Flexible approach Thoughtful staffing Mindful of student need (end times, absences, use of accommodations) Well-planned use of facilities Aligned with the classroom activity administration requirements
86
SBAC Best Practices – Think/Pair/Share
Collaboration Communication with Stakeholders Training Student Registration Technology Preparation and Troubleshooting Scheduling Which practices did you conduct at your school/district last year? Which practices would you like to adopt at your school/district in the coming year?