
1 SMARTER Balanced Assessment
Overview and Sample Items (Huron ISD, September 2013)

2 SMARTER Balanced Assessment (SBAC) Beginning 2014-2015

3 Pascal (Pat) D. Forgione, Jr., Ph.D.
Center for K-12 Assessment and Performance Management at ETS A Step Change in K–12 Testing, August 2013

4 SBAC Summative Assessment
Performance Tasks: one ELA and one math task per year; integrate knowledge and skills; computer-delivered; scored within two weeks.
Computer-Adaptive: given during the final weeks of the school year; multiple item types scored by computer; re-take option locally determined.
Pascal (Pat) D. Forgione, Jr., Ph.D., Center for K-12 Assessment and Performance Management at ETS

5 Pascal (Pat) D. Forgione, Jr., Ph.D.
Center for K-12 Assessment and Performance Management at ETS

6 Pascal (Pat) D. Forgione, Jr., Ph.D.
Center for K-12 Assessment and Performance Management at ETS

7 Pascal (Pat) D. Forgione, Jr., Ph.D.
Center for K-12 Assessment and Performance Management at ETS

8 Pascal (Pat) D. Forgione, Jr., Ph.D.
Center for K-12 Assessment and Performance Management at ETS

9 Pascal (Pat) D. Forgione, Jr., Ph.D.
Center for K-12 Assessment and Performance Management at ETS

10 Expected Test Construction
Number of items: 19-30 selected response; 3 extended constructed response; 7-18 technology enhanced; one performance task for math and one for ELA.
Administration mode: selected response, extended constructed response, and technology-enhanced items are computer adaptive; the performance event is computer delivered and teacher administered.
Scoring method: computer-adaptive items are scored automatically by computer; the performance event is scored by a combination of artificial intelligence and teacher scoring.
Either hide this slide OR hide the next 4 – this summarizes all the other test subjects/levels.
State Higher Education Executive Officers, Susan Gendron

11 Michigan’s online initiatives
Pilot in 2006. Pilot in 2011 (English Language Proficiency). Pilot in 2012 (Dynamic Learning Maps Alternate Assessment Consortium, for the 1% population). Pilots leading up to operational adoption of SMARTER Balanced Assessment Consortium products in 2014-15. “All challenges will be resolved by ” ~MDE

12 Smarter Balanced Item and Performance Task Development
Welcome to the Smarter Balanced Assessment Consortium’s Item and Task Development Training Module. This module will introduce educators to the Smarter Balanced Assessment Consortium’s training series and procedures for developing items and performance tasks for its next generation assessment system. Huron Intermediate School District

13 Smarter Balanced and Evidence-Centered Design
Items and Performance Tasks Smarter Balanced Item and Task Specifications Smarter Balanced Content Specifications Smarter Balanced is applying Evidence-Centered Design in many ways. As is described in greater detail in the module that focuses on Evidence-Centered Design, {+} Smarter Balanced has employed Evidence-Centered Design to analyze the Common Core State Standards to identify the important skills, knowledge and abilities that students must develop to be college and career ready. This analysis informed the development of the Smarter Balanced Content Specifications which define the claims to be made about students and their learning and define the specific knowledge, skills, and abilities that will be measured by the assessment system. Information contained in the Content Specifications was then used to develop the Item and Task Specifications which provide information about the items and tasks that will be developed to collect evidence about student learning. Already, the Item Specifications have been used to develop a large sample of items and tasks. Item and task writers and reviewers will rely heavily on these documents to guide their work going forward and it is important to become familiar with these documents. Common Core State Standards

14 SBAC Assessment Item Types
Selected Response Constructed Response Technology-Enabled Technology-Enhanced Performance Tasks To collect evidence about the wide range of assessment targets, Smarter Balanced will use a variety of item and task types. These item and task types fall into six broad categories and include Selected Response items, Constructed Response items, Extended Response items, and Performance Tasks. In addition, there are two categories of technology-rich items and tasks known as Technology-Enabled and Technology-Enhanced. Each of these item and task types will be explored in greater detail in the item and task type modules. But let’s take a brief look at each category now.

15 Key Concepts Evidence Universal Design Accessibility Sensitivity Bias
As mentioned already, there are several terms and concepts that are key to understanding the Smarter Balanced assessment system. Among these terms are Evidence, Universal Design, Accessibility, Sensitivity, and Bias.

16 Key Concepts
Evidence: information that students provide through their responses about their knowledge, skills, and abilities. Universal Design: designing items and tasks so that they function as intended for as many students as possible. Accessibility: additional information or presenting items and tasks in a different way in order to meet the specific needs of some students. Sensitivity: content contained in an item that may be distracting or upsetting for some students. Bias: use of names, topics, or contexts that may be unfamiliar to a sub-group of students.
For now, it is important to know that Evidence focuses on the information that students provide about their knowledge, skills, and abilities through the items and tasks educators will help develop and review. Collecting evidence to support claims about student learning is the primary goal of the Smarter Balanced next generation assessment system.

17 Key Concepts
Evidence: information that students provide through their responses about their knowledge, skills, and abilities. Universal Design: designing items and tasks so that they function as intended for as many students as possible. Accessibility: additional information or presenting items and tasks in a different way in order to meet the specific needs of some students. Sensitivity: content contained in an item that may be distracting or upsetting for some students. Bias: use of names, topics, or contexts that may be unfamiliar to a sub-group of students.
Universal Design is a concept that focuses on designing items and tasks so that they function as intended for as many students as possible. Universal Design is a key concept that will guide your development of all items and tasks.

18 Key Concepts
Evidence: information that students provide through their responses about their knowledge, skills, and abilities. Universal Design: designing items and tasks so that they function as intended for as many students as possible. Accessibility: additional information or presenting items and tasks in a different way in order to meet the specific needs of some students. Sensitivity: content contained in an item that may be distracting or upsetting for some students. Bias: use of names, topics, or contexts that may be unfamiliar to a sub-group of students.
Accessibility focuses on including additional information or presenting items and tasks in a different way in order to meet the specific needs of some students. Among these needs are accessing content in braille, audio, or signed form. While item writers will not be responsible for providing this additional information, it is important to think carefully about designing items that do not contain features that make it difficult to add this supplemental accessibility information.

19 Key Concepts
Evidence: information that students provide through their responses about their knowledge, skills, and abilities. Universal Design: designing items and tasks so that they function as intended for as many students as possible. Accessibility: additional information or presenting items and tasks in a different way in order to meet the specific needs of some students. Sensitivity: content contained in an item that may be distracting or upsetting for some students. Bias: use of names, topics, or contexts that may be unfamiliar to a sub-group of students.
Sensitivity focuses on content contained in an item that may be distracting or upsetting for some students. As an example, references to religious practices or political beliefs can shift some students’ focus from the problem at hand and instead place unintended attention toward the religious or political topic. Considering sensitivity is one component of Universal Design.

20 Key Concepts
Evidence: information that students provide through their responses about their knowledge, skills, and abilities. Universal Design: designing items and tasks so that they function as intended for as many students as possible. Accessibility: additional information or presenting items and tasks in a different way in order to meet the specific needs of some students. Sensitivity: content contained in an item that may be distracting or upsetting for some students. Bias: use of names, topics, or contexts that may be unfamiliar to a sub-group of students.
Bias focuses on the use of names, topics, or contexts that may be unfamiliar to a sub-group of students and which may unintentionally increase the difficulty of an item or task. As an example, asking students to write about what they did during a snow day might create an unintended challenge for students who have never experienced a snow day and may alter what is being measured from writing informational text to writing fiction. Bias is another important component of Universal Design.

21 Each item should be written to assess a primary claim
General Guidelines for Developing Mathematics and ELA Selected Response and Constructed Response Items: Aligned to the CCSS, some items may include concepts detailed in the Standards for lower grades. Each item should be written to assess a primary claim. Secondary content claims are also possible. Each item should be written to assess a primary claim from the Content Specifications. Claim 1 items should be written to assess a given content domain or conceptual category. Claim 2, 3, and 4 items should be written to a primary claim and, in some cases, a secondary claim. Secondary claims should be listed in order of prominence when completing the item template. There are no selected response items for Claim 4.

22 Clearly stated to ensure that students understand the task
General Guidelines for Developing ELA and Math Selected Response and Constructed Response Items Clearly stated to ensure that students understand the task Clearly elicit the desired evidence of a student’s knowledge, skills and ability Appropriate grade-level difficulty, cognitive complexity, and reading level Depth of Knowledge considered Additional guidelines for writing selected and constructed response items include: Items should have a central focus and be clearly stated to ensure that students understand the task. Items should be written to clearly elicit the desired evidence of a student’s knowledge, skills, and abilities. Items should be appropriate for students in terms of grade-level difficulty, cognitive complexity, and reading level. Depth of Knowledge level should be considered.

23 Some constructed responses require students to support their reasoning
Essential Requirements of ELA and Mathematics Selected Response and Constructed Response Items: Some constructed responses require students to support their reasoning. Plausible distractors represent common mistakes. ELA – appropriate content and contexts. Mathematics – must be accurate, and more complex items may include scaffolding. For Mathematics Grades 3–5, items do not require a calculator. When developing selected and constructed response items, there are several requirements that are important to keep in mind. Students will often be asked to support their reasoning on constructed response questions. For some ELA items, this means that students must support their response with information from the text or texts. Selected-response items must contain plausible but incorrect distractors. These should not be just any incorrect answers but carefully constructed ones that represent common mistakes in order to elicit information about student misconceptions. ELA items should include appropriate content, contexts, and presentation. The stimulus text for Claim 1 should represent a range of difficulty. Stimulus texts for Claims 2 and 4 should be below the assessed grade level.

24 SBAC Assessment Item Types
Selected Response Extended Constructed Response Technology-Enabled Technology-Enhanced Performance Tasks To collect evidence about the wide range of assessment targets, Smarter Balanced will use a variety of item and task types. These item and task types fall into six broad categories and include Selected Response items, Constructed Response items, Extended Response items, and Performance Tasks. In addition, there are two categories of technology-rich items and tasks known as Technology-Enabled and Technology-Enhanced. Each of these item and task types will be explored in greater detail in the item and task type modules. Let’s start with Selected Response.

25 Selected Response Single Response – Multiple Choice
Many experts will tell you that television is bad for you. Yet this is an exaggeration. Many television programs today are specifically geared towards improving physical fitness, making people smarter, or teaching them important things about the world. The days of limited programming with little interaction are gone. Public television and other stations have shows about science, history, and technical topics. Which sentence should be added to the paragraph to state the author’s main claim? A. Watching television makes a person healthy. B. Watching television can be a sign of intelligence. C. Television can be a positive influence on people. D. Television has more varied programs than ever before. Selected Response items prompt students to select one or more responses from a set of options. As an example, this item asks students to select the single best response. This type of selected response item is referred to as a multiple-choice item.

26 Selected Response Multiple Correct Options
Which of the following statements is a property of a rectangle? Select all that apply. ☐ Contains three sides ☐ Contains four sides ☐ Contains eight sides ☐ Contains two sets of parallel lines ☐ Contains at least one interior angle that is acute ☐ Contains at least one interior angle that is obtuse ☐ All interior angles are right angles ☐ All sides have the same length ☐ All sides are of different length Other selected response items may ask students to select more than one option. As an example, this item asks students to identify all of the properties of a rectangle.

27 Benefits and Limitations of Selected Response Items
Benefits: Answered quickly. Assess a broad range of content in one test. Inexpensive and objectively scored. Results collected quickly. Limitations: Limited ability to reveal a student’s reasoning process. Difficult to assess higher-order thinking skills. Selected response items have many benefits. {+} Selected response items are designed to be answered within 1 or 2 minutes and allow the opportunity to assess a broad range of content in one test. Selected response items are inexpensive to score, are scored objectively, and student results are collected quickly. Despite these benefits, there are two limitations. With selected responses, it is difficult to understand a student’s reasoning process and to assess higher-order thinking skills.

28 Formats and Components of Mathematics Selected Response Items
Traditional Selected Response Item with Key and Distractor Analysis
STEM (statement of the question): Which number is both a factor of 100 and a multiple of 5?
OPTIONS (possible answers the students must select from): A. 4 B. 40 C. 50 D. 500
KEY and DISTRACTOR RATIONALE: A – did not consider the criterion “multiple of 5”; B – did not consider the criterion “factor of 100”; C – correct; D – multiplied 100 and 5.
The Smarter Balanced Assessment will be using both traditional and non-traditional selected response items. First let’s take a look at the format and components of a traditional selected response item. {+} This item is a traditional multiple-choice item with a stem and four options. {+} The stem is the statement of the question to which the student responds. The options are possible answers from which students must select. Options should be arranged according to a logical order, such as numerically or alphabetically. There are four different ways to respond to this item and only one correct answer. Distractors are the incorrect options. A key and distractor analysis accompanies each selected response item. The key identifies the correct response. In addition, a rationale for each incorrect response is provided. Incorrect responses should be based on the likely errors students will make and common misunderstandings. For example, in the case of a two-step problem, a student may solve only the first step, making the solution to the first step an excellent distractor. The distractor analysis explains the rationale a student might use to select each option. It is important that the distractors and the key are balanced. No one option should be obviously different from the others.
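The stem/options/key/rationale structure described above maps naturally onto a small data record. The sketch below is illustrative only; the field names are assumptions made for this module, not the official Smarter Balanced item template, but the final checks confirm that the keyed option really is both a factor of 100 and a multiple of 5 while no distractor is.

```python
# Hypothetical record for the traditional selected-response item above.
# Field names are illustrative; they are not the official item template.
item = {
    "stem": "Which number is both a factor of 100 and a multiple of 5?",
    "options": {"A": 4, "B": 40, "C": 50, "D": 500},  # listed in logical (numeric) order
    "key": "C",
    "distractor_rationale": {
        "A": "Factor of 100, but the 'multiple of 5' criterion was not considered.",
        "B": "Multiple of 5, but the 'factor of 100' criterion was not considered.",
        "D": "Multiplied 100 and 5 instead of applying both criteria.",
    },
}

def meets_criteria(n):
    """True when n is a factor of 100 and a multiple of 5."""
    return 100 % n == 0 and n % 5 == 0

# The key satisfies both criteria; none of the distractors do.
assert meets_criteria(item["options"][item["key"]])
assert not any(meets_criteria(v) for k, v in item["options"].items() if k != item["key"])
```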

29 Non-Traditional Math Selected Response Item
STIMULUS: A multiplication problem is shown below. 17 × 12
STEM: Which model(s) below could represent the solution to this problem? Select all that apply.
OPTIONS A–F: area-model images (the expressions (1×1)+(1×7)+(1×2)+(2×7) and (17×2)+(17×1) accompany two of the models).
This is a more complex selected-response item. This item has a stem, stimulus, and six options. {+} Like the previous item, the stem is the statement of the question to which the student responds, and the options are possible answers from which the students must select. The stimulus is the text, source, and/or graphic about which the item is written. The stimulus provides the context of the item or task to which the student must respond. Unlike the prior item, which had only one correct response, this item contains more than one correct option.

30 Non-Traditional Math Selected Response Item
Key and Distractor Analysis: Does not understand how to model multiplication of two two-digit numbers using area models. Correct Did not account for the values of the digits in the tens places. Did not understand that the 1 represents 10 in the multiplication problem Showed multiplication of 17 and (1 + 2) instead of 17 and 12 Responses to this item will receive 0–2 points, based on the following: 2 points: B, D 1 point: Either B or D 0 points: Any other combination of selections. The scoring rubric describes how points are awarded for an item or task. The number of points a student can earn on a selected response item will vary. {+} This item is worth two points. The student earns two points for selecting options B and D, 1 point for selecting only option B or only option D, and no points for any other combination of selections.
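Because the rule above is fully specified (2 points for exactly B and D, 1 point for B alone or D alone, 0 for anything else), it can be scored automatically. The function below is a minimal sketch of that rule, not the consortium's scoring engine.

```python
CORRECT = frozenset({"B", "D"})

def score_models_item(selected):
    """Apply the stated 0-2 point rule for the 17 x 12 area-model item (sketch only)."""
    chosen = frozenset(selected)
    if chosen == CORRECT:
        return 2                        # both correct models selected, nothing extra
    if len(chosen) == 1 and chosen < CORRECT:
        return 1                        # only B or only D
    return 0                            # any other combination of selections

assert score_models_item(["B", "D"]) == 2
assert score_models_item(["D"]) == 1
assert score_models_item(["B", "C", "D"]) == 0
```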

31 Non-Traditional Math Selected Response Item
For numbers 1a – 1d, state whether or not each figure has ⅖ of its whole shaded. (STEM, MULTIPLE PARTS, OPTIONS; the figures for parts 1a–1d are shown as images.) Let’s take a look at another way to format a non-traditional selected response item. This item has a stem, multiple parts, and two options for each part. {+} The stem directs students to decide whether or not the model in each part answers the question. There are 16 different ways to respond to this item, making guessing the correct answer much less likely than for a traditional selected-response item.

32 Non-Traditional Math Selected Response Item
Scoring Rubric: Responses to this item will receive 0–2 points, based upon the following: 2 points: YNYN. The student has a solid understanding of ⅖ as well as the equivalent form of ⅖. 1 point: YNNN, YYNN, YYYN. The student has only a basic understanding of ⅖. Either the student doesn’t recognize an equivalent fraction for ⅖ or doesn’t understand that all 5 parts must be equal-sized in figure 1b. 0 points: YYYY, YYNY, YNYY, YNNY, NYYY, NYYN, NYNY, NYNN, NNYY, NNYN, NNNY, NNNN. The student demonstrates inconsistent understanding of ⅖ or answers “Y” to figure 1d, clearly showing a misunderstanding of what ⅖ means. Figure 1d is considered a “disqualifier” and an answer of “Y” to this part of the item would cancel out any other correct responses as “guesses” on the part of the student. The scoring rubric for this item indicates it is worth two points and points will be awarded based on the level of understanding a student demonstrates. The scoring rubric describes how points are awarded for an item or task.
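The pattern-based rubric, including the figure 1d disqualifier, is another rule that automated scoring can apply directly. A minimal sketch, assuming the response is recorded as a four-letter Y/N string for parts 1a–1d:

```python
def score_two_fifths_item(pattern):
    """Score the four-part 2/5 item from a Y/N pattern for parts 1a-1d, e.g. 'YNYN' (sketch)."""
    pattern = pattern.upper()
    if len(pattern) != 4 or set(pattern) - {"Y", "N"}:
        raise ValueError("pattern must be four letters, each Y or N")
    if pattern[3] == "Y":               # 1d is a disqualifier: calling it 2/5 cancels all credit
        return 0
    if pattern == "YNYN":
        return 2                        # solid understanding of 2/5 and its equivalent form
    if pattern in {"YNNN", "YYNN", "YYYN"}:
        return 1                        # basic understanding only
    return 0

assert score_two_fifths_item("YNYN") == 2
assert score_two_fifths_item("YYNN") == 1
assert score_two_fifths_item("YNYY") == 0   # otherwise-correct answers are cancelled by 1d
```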

33 Non-Traditional Math Selected Response Item
Scoring Rule: Students who properly match the four shapes to their name will receive two points. Students who make two or three correct matches will receive partial credit of one point. All other connections will receive a score of 0. Match each shape below to its name. Another format for a non-traditional selected response item is to require students to match descriptions of a term or activity to a corresponding option. This is a technology-enhanced item and has a scoring rule instead of a scoring rubric. {+} The scoring rule determines how points are awarded. This item is worth two points, and points will be awarded based on the level of understanding a student demonstrates. The scoring rule describes how points are awarded for an item or task.
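The matching rule can be expressed the same way. In the sketch below the shape names in the answer key are placeholders (the actual figures are images in the item), but the point logic follows the stated rule: four correct matches earn 2 points, two or three earn 1 point, and anything else earns 0.

```python
def score_matching(student_matches, answer_key):
    """Score the shape-matching item per the stated rule (illustrative sketch only)."""
    correct = sum(answer_key[shape] == name
                  for shape, name in student_matches.items()
                  if shape in answer_key)
    if correct == 4:
        return 2
    if correct in (2, 3):
        return 1
    return 0

# Placeholder answer key: the real item's shapes and names may differ.
key = {"shape_1": "trapezoid", "shape_2": "rhombus", "shape_3": "hexagon", "shape_4": "square"}
assert score_matching(key, key) == 2
assert score_matching({**key, "shape_4": "pentagon"}, key) == 1
assert score_matching({shape: "circle" for shape in key}, key) == 0
```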

34 Examples of Poorly Written Items
The table below shows the weights of three vehicles. Which list shows the vehicles in order from lightest to heaviest? ☐ car, motorcycle, truck ☐ motorcycle, car, truck ☐ truck, car, motorcycle ☐ truck, motorcycle, car The table below shows the number of apples three students picked. Which list shows the number of apples picked in order from greatest to least? ☐ 95, 107, 121 ☐ 95, 121, 107 ☐ 121, 107, 95 ☐ 121, 95, 107
Vehicle / Weight (in pounds): Car – 4,050; Motorcycle – 497; Truck – 12,159. Student / Number of Apples: Bobby – 107; Carlos – 95; Jenna – 121.
We have already looked at some exemplar items. Now let’s review some items that fall short of Smarter Balanced item development standards. {+} Let’s pause for thirty seconds so that you can look at this item and think about its flaws. [pause for 30 seconds] OK. Let’s examine some flaws in this item. First, the context of this item can affect the student’s performance. Many students could use prior knowledge unrelated to the concept or skill being measured to answer this item. The item is also problematic because it asks the students to order the vehicles and not the numbers that represent the weight of the vehicles. Also, the key is the only option that begins with the lightest vehicle. Many students will not need to order any number to determine the correct answer. A similar question could be asked using the weight of three different elephants or the length of three different rivers. Let’s look at a similar item that measures the same content. This item measures the same content, but students cannot draw upon prior knowledge to help them answer the question. Students also need to order the numbers and consider the options more closely.

35 Examples of Poorly Written Items
Look at the rectangle below. What is the area, in square feet, of the rectangle? ☐ 3 ☐ 15 ☐ 18 ☐ 63 3 feet 6 feet This is another example of an item that has some flaws. Let’s pause for fifteen seconds so that you can look at this item and think about its flaws. {+} [15 second pause] This item asks students to find the area of the rectangle, which is 18 square feet. The perimeter of this rectangle is 18 feet. Many elementary students confuse area and perimeter. Many students would select the correct answer for the wrong reason. In addition to this shortcoming, the distractors in this item are also unbalanced. Three is the only single digit option and 63 is much greater than the other options. This item would be strengthened by changing the dimensions so that the area and perimeter are not the same number and if the distractors were more similar to each other, ideally with one representing the perimeter, another representing an anticipated arithmetic error, and the third a number that is in close proximity to the other options.
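The area/perimeter coincidence the narration points out is quick to verify, and it suggests the fix: choose dimensions where the two values differ so the perimeter can act as a diagnostic distractor. The revised 7-by-4 dimensions below are an assumption for illustration only, not part of the source item.

```python
def area(length, width):
    return length * width

def perimeter(length, width):
    return 2 * (length + width)

# Flawed item: a 6 ft by 3 ft rectangle has area and perimeter both equal to 18,
# so a student who confuses the two concepts still lands on the keyed answer.
assert area(6, 3) == 18 and perimeter(6, 3) == 18

# Revised dimensions (hypothetical): area 28, perimeter 22, so 22 can serve as a
# distractor that actually detects the area/perimeter confusion.
assert area(7, 4) == 28 and perimeter(7, 4) == 22
```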

36 SBAC Assessment Item Types
Selected Response Constructed Response Technology-Enabled Technology-Enhanced Performance Tasks

37 Constructed Response
The table below shows the number of students in each third-grade class at Lincoln School: Mrs. Roy – 24 students, Mr. Grant – 21, Mr. Harrison – 22, Ms. Mack – 25. There are 105 fourth-grade students at Lincoln School. How many more fourth-grade students than third-grade students are at Lincoln School? Show or explain how you found your answer.
Constructed response items prompt students to produce a text or numerical response in order to collect evidence about their knowledge or understanding of a given assessment target. As an example, this item asks students to produce a response that provides evidence about their ability to add and subtract.
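The expected response is a two-step computation: total the third-grade enrollments, then subtract from 105. A quick check of the arithmetic:

```python
third_grade = {"Mrs. Roy": 24, "Mr. Grant": 21, "Mr. Harrison": 22, "Ms. Mack": 25}
fourth_grade = 105

third_grade_total = sum(third_grade.values())   # 24 + 21 + 22 + 25 = 92
difference = fourth_grade - third_grade_total   # 105 - 92 = 13
print(third_grade_total, difference)            # 92 13
```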

38 Constructed Response Extended Response
Ms. McCrary wants to make a rabbit pen in a section of her lawn. Her plan for the rabbit pen includes the following: It will be in the shape of a rectangle. It will take 24 feet of fence material to make. Each side will be longer than 1 foot. The length and width will measure whole feet. Pen 1: Length: (feet, square feet) Width: (feet, square feet) Area: (feet, square feet) Pen 2: Length: (feet, square feet) Width: (feet, square feet) Area: (feet, square feet) Pen 3: Length: (feet, square feet) Width: (feet, square feet) Area: (feet, square feet) Part A Draw 3 different rectangles that can each represent Ms. McCrary’s rabbit pen. Be sure to use all 24 feet of fence material for each pen. Use the grid below. Click the places where you want the corners of your rectangle to be. Draw one rectangle at a time. If you make a mistake, click on your rectangle to delete it. Continue as many times as necessary. Part B Ms. McCrary wants her rabbit to have more than 60 square feet of ground area inside the pen. She finds that if she uses the side of her house as one of the sides of the rabbit pen, she can make the rabbit pen larger. Draw another rectangular rabbit pen. Use all 24 feet of fencing for 3 sides of the pen. Use one side of the house for the other side of the pen. Make sure the ground area inside the pen is greater than 60 square feet. Use the grid below. Click the places where you want the corners of your rectangle to be. If you make a mistake, click on your rectangle to delete it. In some cases, the evidence required to support a claim about a given assessment target necessitates a more extended response. As an example, this item prompts students to provide evidence about their understanding of perimeter and area by producing an extended response. Use your keyboard to type the length and width of each rabbit pen you draw. Then type the area of each rabbit pen. Be sure to select the correct unit for each answer. [Students will input length, width, and area for each rabbit pen. Students will choose unit from drop down menu.] Use your keyboard to type the length and width of each rabbit pen you draw. Then type the area of each rabbit pen. Be sure to select the correct unit for each answer. Length: (feet, square feet) Width: (feet, square feet) Area: (feet, square feet)
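Both parts of this task reduce to small enumerations, which makes the full answer space easy to see. The sketch below is for orientation only: Part A lists every whole-foot rectangle that uses all 24 feet of fencing with each side longer than 1 foot, and Part B lists the three-sided pens (the house forming the fourth side) whose area exceeds 60 square feet.

```python
# Part A: perimeter 24 ft means length + width = 12; whole feet, each side > 1 ft.
part_a = [(length, 12 - length) for length in range(6, 11)]
print("Part A options:", part_a)   # (6, 6), (7, 5), (8, 4), (9, 3), (10, 2)

# Part B: the house replaces one long side, so the fencing covers length + 2 * width = 24.
part_b = [(24 - 2 * width, width, (24 - 2 * width) * width)
          for width in range(2, 12)
          if (24 - 2 * width) > 1 and (24 - 2 * width) * width > 60]
for length, width, pen_area in part_b:
    print(f"length {length} ft, width {width} ft, area {pen_area} sq ft")
# e.g. 16 x 4 = 64, 14 x 5 = 70, 12 x 6 = 72, 10 x 7 = 70, 8 x 8 = 64 square feet
```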

39 Purpose of Constructed Response Items
Address assessment targets and claims that are of greater complexity Require more analytical thinking and reasoning Constructed response items are brief open-response items that focus on a particular skill or concept and require students to produce a short written response. Constructed response items address assessment targets and claims that are of greater complexity, requiring more analytical thinking and reasoning than a selected response can elicit.

40 Administration of Constructed Response Items
Administered during the computer-adaptive component Scored using artificial intelligence Most constructed response items take between 1 and 5 minutes to complete Some more complex items may take up to 10 minutes to complete Unlike Performance Tasks, which are the topic of another module, {+} constructed response items are designed to be administered during the computer-adaptive component of the assessment. {+} In order to score constructed response items quickly, automated scoring using artificial intelligence will be employed. {+} Most constructed response items should take between 1 and 5 minutes to complete. {+} Some more complex items may take up to 10 minutes to complete.

41 Qualities of a Rubric Focus on the essence of the primary claim and sometimes secondary claim Address the requirements of the specific assessment targets Distinguish between different levels of understanding and/or performance Contain relevant information, details, and numbers that support different levels of competency related to the item or task The language in the rubric should: {+} Focus on the essence of the primary claim and sometimes secondary claim. {+} Address the requirements of the specific assessment targets. {+} Distinguish between different levels of understanding and/or performance. {+} Contain relevant information, details, and numbers that support different levels of competency related to the item or task.

42 Components of an ELA Constructed Response Item
STIMULUS – The Shepherd’s Boy and the Wolf: A Shepherd's Boy was tending his flock near a village, and thought it would be great fun to trick the villagers by pretending that a Wolf was attacking the sheep: so he shouted out, "Wolf! Wolf!" and when the people came running up he laughed at them because they believed him. He did this more than once, and every time the villagers found they had been tricked, for there was no Wolf at all. At last a Wolf really did come, and the Boy cried, "Wolf! Wolf!" as loud as he could: but the people were so used to hearing him call that they took no notice of his cries for help. And so no one came to help the boy, and the Wolf attacked the sheep.
STEM: In a few sentences, explain what lesson the reader can learn from the shepherd’s boy. Use details from the story to support your response.
All constructed response items are worth 2 to 4 points. {+} Let’s take a look at an example of a 2-point constructed response item. Like selected response items, constructed response items have a stimulus and a stem.

43 Components of ELA Constructed Response Item
2-point Scoring Rubric 2 The response: gives evidence of the ability to explain inferences about theme includes specific inferences that make reference to the text supports the inferences with relevant details from the text 1 gives limited evidence of the ability to explain inferences about theme includes inferences but they are not explicit or make only vague references to the text supports the inference with at least one detail but the relevance of that detail to the text must be inferred A response gets no credit if it provides no evidence of the ability to explain inferences about theme and includes no relevant information from the text. Scoring Notes Response may include but is not limited to: The shepherd’s boy learned that he shouldn’t call wolf unless there is really a wolf. The shepherd’s boy learned that he should only ask for help if he needs it or else he wouldn’t get help when he really needs it. “The people were so used to hearing him call that they took no notice of his cries.” The shepherd’s boy learned not to have fun by tricking people because the people learn not to trust you. Score Point 2 Sample: The lesson learned from this story is do not cry for help when nothing is wrong. The shepherd’s boy pretends that a big wolf is attacking his sheep and yells, “Wolf! Wolf!” The people in the village run out to help him because they believe he needs help. After he tricks the villagers more than once, they realize he is just pretending. Score Point 1 Sample: The lesson learned from this story is do not cry for help when nothing is wrong. The shepherd’s boy cries wolf when there is no wolf and the people come to help him. Score Point 0 Sample: Readers learn a good lesson about how to cry wolf. All constructed response items also must include a scoring rubric, scoring notes, and sample responses. The scoring rubric for each item distinguishes between characteristics of responses that provide evidence that the student has partially or fully developed the skill or knowledge defined by the assessment target. The scoring notes detail the information that should be included in a correct response. The sample responses provide concrete examples of what a response for each point value might look like. The top score sample should showcase a complete and thorough response. The language contained in samples should model what is expected from a student at the grade level being assessed.

44 Components of a Mathematics Constructed Response Item
A teacher asked her students to use estimation to decide if the sum of the problem below is closer to 4,000 or 5,000. , , = One student replied that she thinks the sum is closer to 4,000. She used the estimation shown below to support her reasoning. Is the student’s reasoning correct? In the space below, use numbers and words to explain why or why not. If the student’s reasoning is not correct, explain how she should have estimated. STEM STIMULI All constructed response items are worth from 2 to 4 points. Let’s take a look at an example of a 2 point constructed response item. Like selected response items, constructed response items have a stem and stimulus.

45 Components of a Mathematics Constructed Response Item
Sample Top-Score Response: The student’s reasoning is incorrect. She was rounding to the thousands place. She had 2 numbers that were less than 500, and she decided to round these numbers to 0. This is like saying these numbers were not in the problem at all. She needs to account for these two numbers. Together, they have a sum that is very close to 1,000. I think adding 1, , ,000 is a better strategy. This means the sum is closer to 5,000 than to 4,000. TOP-SCORE Scoring Rubric: Responses to this item will receive 0–2 points, based on the following: 2 points: Student has thorough understanding of how to estimate and how improper estimation can lead to flawed reasoning. Student states that the student in the scenario used reasoning that is incorrect and provides reasoning that shows a better estimation strategy. 1 point: Student has partial understanding of how to estimate and how improper estimation can lead to flawed reasoning. Student states that the student in the scenario used reasoning that is incorrect, but alternate estimation strategy is also flawed. 0 points: Student has little or no understanding of how to estimate and how improper estimation can lead to flawed reasoning. Student states that the student in the scenario used reasoning that is correct. All constructed response items also include a scoring rubric and a sample top score. {+} The scoring rubric for each task should reflect the values set out for the claim being assessed, giving substantial weight to the choice of appropriate methods for solving the problem presented by the task, to reliable application of skills to develop a solution, and to explanations of what has been found. {+} The sample top score is an example of a complete and thorough top-score response. The language should model what is expected from a student at the grade level being assessed. SCORING RUBRIC

46 Examples of Poorly Written Items
Mercedes received 32 pieces of candy on Halloween. She ate ¼ of the candy. How many pieces of candy did Mercedes have left? Show or explain how you found your answer. This item also has a few problems. Let’s pause for fifteen seconds so you can think about the flaws in this item. {+} [15 second pause] First let’s look at the name used. Mercedes is a female name, but it is also a luxury car manufacturer. This may create bias issues and confusion for students. Next, the context is centered on a holiday. This again creates bias issues. Avoid writing about any holidays.

47 SBAC Assessment Item Types
Selected Response Constructed Response Technology-Enabled Technology-Enhanced Performance Tasks To collect evidence about the wide range of assessment targets, Smarter Balanced will use a variety of item and task types. These item and task types fall into six broad categories and include {+} Selected Response items, Constructed Response items, Extended Response items, and Performance Tasks. In addition, there are two categories of technology-rich items and tasks known as Technology-Enabled and Technology-Enhanced. Each of these item and task types will be explored in greater detail in the item and task type modules. But let’s take a brief look at each category now.

48 Selected or Constructed Responses that include Multimedia
Technology-Enabled Selected or Constructed Responses that include Multimedia Brianna is running for class president. She needs to give a speech to the 4th grade class. Listen to the draft of her speech and then answer the questions that follow. (Test-takers listen to an audio version of the following speech.) “Hi, My name is Brianna. I am running for class president, and I hope you will vote for me. You know many of my friends said they would. I am involved in many activities, including track and theater. If I am elected, I will hold several fundraisers so that all students in the 4th grade can go on a trip at the end of the year. Also, we can donate a portion of the money to a charity of our choice. If you want a class president who will work hard for you and listen to your needs, please vote for me next week!” This speech needs to be revised before the student presents it. Which sentence should be omitted to improve the speech? A. I am running for class president, and I hope you will vote for me. B. You know many of my friends said they would. C. If I am elected, I will hold several fundraisers so that all students in the 4th grade can go on a trip at the end of the year. D. If you want a class president who will work hard for you and listen to your needs, please vote for me next week! The Smarter Balanced assessment system is designed to be administered on computers. For this reason, the assessment system aims to capitalize on the power of computers by employing technology rich items. Technology rich items fall into two broad categories, Technology-Enabled and Technology-Enhanced. {+} Technology-Enabled items make use of multimedia and interactive elements to stimulate the assessment target measured by an item. Technology-Enabled items either collect responses from students by requiring them to select one or more responses or by producing text or numerals. {+} As an example, this item plays a speech for students {+} and asks them to select an option in response to the prompt. Similarly, other items ask students to experiment with interactive tools, like a random sample generator, and then prompt them to produce text-based responses. In these examples, the technology enables the use of a media rich stimulus, but does not produce a new way of providing a response.

49 Example of Technology-Enabled Item
Gregory is installing tile on a rectangular floor. • He is using congruent square tiles that each have a side length of ½ foot • The area of the floor is 22 square feet. • The width of the floor is 4 feet. Use the grid and the tile below to model the floor. What is the length, in feet, of the floor? As an example, this item allows students to explore an interactive tool that enables students to manipulate tiles before entering an answer to the question.
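Independent of the interactive grid tool, the numeric answer is a single division, and the half-foot tile size determines how many tiles fit across the width. A quick check of the arithmetic (using Fraction to keep the half-foot exact):

```python
from fractions import Fraction

floor_area = Fraction(22)      # square feet
floor_width = Fraction(4)      # feet
tile_side = Fraction(1, 2)     # feet

floor_length = floor_area / floor_width        # 22 / 4 = 5 1/2 feet
tiles_across_width = floor_width / tile_side   # 8 half-foot tiles per row

print(float(floor_length), int(tiles_across_width))   # 5.5  8
```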

50 Technology-Enabled Items
Digital Media Video Animation Sound Interactive Tools Response Types Selected Response Constructed Response Example: Listen to President Kennedy’s 1961 inaugural address and then write an essay analyzing metaphors used regarding foreign policy. Technology-enabled items use {+} digital media as the stimulus, but do not require specialized interactions to produce a response. Possible stimuli for English Language Arts technology-enabled items could include short video clips, audio recordings of lectures or speeches, or dramatic readings of prose or poetry. {+} Despite the use of these media types, a technology-enabled item requires a student to provide either a selected response or a constructed response that consists of text. An English language arts technology-enabled item might require students to listen to President Kennedy’s 1961 inaugural address and answer a constructed response item analyzing the metaphors he uses regarding foreign policy. Another possible item might have students view a brief video about the proper way to brush and floss your teeth and then write a summary explaining the steps. For ELA assessments, most technology-enabled items will be part of performance tasks that use non-text stimuli and Claim 3 items that involve listening to and/or viewing a stimulus. Example: View video and write a summary explaining steps in a process. Example: Tangrams, Calculator

51 SBAC Assessment Item Types
Selected Response Constructed Response Technology-Enabled Technology-Enhanced Performance Tasks To collect evidence about the wide range of assessment targets, Smarter Balanced will use a variety of item and task types. These item and task types fall into six broad categories and include {+} Selected Response items, Constructed Response items, Extended Response items, and Performance Tasks. In addition, there are two categories of technology-rich items and tasks known as Technology-Enabled and Technology-Enhanced. Each of these item and task types will be explored in greater detail in the item and task type modules. But let’s take a brief look at each category now.

52 Collects Evidence through a Non-Traditional Response
Technology-Enhanced Collects Evidence through a Non-Traditional Response Below is a poem, a sonnet, in which the speaker discusses her feelings about a relationship. Read the poem and answer the question that follows. Remember by Christina Rossetti Remember me when I am gone away, Gone far away into the silent land; When you can no more hold me by the hand, Nor I half turn to go yet turning stay. Remember me when no more day by day You tell me of our future that you plann'd: Only remember me; you understand It will be late to counsel then or pray. Yet if you should forget me for a while And afterwards remember, do not grieve: For if the darkness and corruption leave A vestige* of the thoughts that once I had, Better by far you should forget and smile Than that you should remember and be sad. In the sonnet “Remember,” which two lines reveal a change in the speaker’s message to her subject? In contrast, {+} a Technology-Enhanced item capitalizes on technology to collect evidence through a non-traditional response type. As an example, {+} this item presents a sonnet and prompts students to highlight evidence in the poem that reveals a change in the speaker’s message.

53 Collects Evidence through a Non-Traditional Response
Technology-Enhanced Collects Evidence through a Non-Traditional Response The value of y is proportional to the value of x. The constant of proportionality for this relationship is 1. On the grid below, graph this proportional relationship. Similarly, this item asks students to produce a line to collect evidence about their understanding of proportional relationships. In both cases, the response provided by the student is something different than selecting from a limited set of options or producing text or numbers. Smarter Balanced is committed to the use of technology to improve the quality of assessment. However, a Technology-Enabled or -Enhanced item will only be developed when it is the only way to access students’ understanding. More details about Technology-Enhanced items are provided in a separate module.

54 Technology-Enhanced Items
Specialized interaction May have digital media for stimulus Same requirements as selected and constructed response items Students manipulate information Defined responses Technology-enhanced items are computer delivered items that require {+} specialized interactions students must perform to produce a response. Responses produced by a technology-enhanced item require students to do something other than write text or select from among a set of options. These items may also include digital media as the stimulus. Technology-enhanced items should conform to the same essential requirements that have already been discussed for writing quality selected response and constructed response items. The only difference is that they allow students to manipulate information in ways that are not possible with traditional selected response and constructed response items. Like selected-response items, technology-enhanced items have defined responses that can be scored in an automated manner.

55 Technology-Enhanced ELA Example Item
Below is a poem, a sonnet, in which the speaker discusses her feelings about a relationship. Read the poem and answer the question that follows. Remember When you can no more hold me by the hand, Nor I half turn to go yet turning stay. Remember me when no more day by day You tell me of our future that you plann’d: Only remember me; you understand It will be late to counsel then or pray. Yet if you should forget me for a while And afterwards remember, do not grieve. For if the darkness and corruption leave A vestige* of the thoughts that once I had Better by far you should forget and smile Than that you should remember and be sad. This is an example of an ELA technology-enhanced item for grade 8. For this item, students begin by reading a poem. Next, students find and {+} highlight lines in the text that reveal a change in the speaker’s message. Depending on how the item writer designed the item, students are able to select one or more blocks of text in the poem. *vestige: a mark, trace, or visible evidence of something that is no longer present or evident. In the sonnet “Remember,” which two lines reveal a change in the speaker’s message to her subject?

56 Technology-Enhanced Mathematics Items
Draw a line of symmetry through the figure below. The graph on the right shows a triangle. Draw the triangle after it is reflected over the y-axis. Classify each shape below based on whether it contains at least one pair of parallel sides. Reorder the fractions below so that they are ordered from smallest to largest. 3/5 3/4 2/6 1/2 2/3 As a few examples, a technology-enhanced item may require the student to {+} produce a line or a set of lines, {+} to draw a shape like an isosceles triangle or a rectangle with a specific area or perimeter, {+} to rearrange the order of numbers or expressions, {+} or to categorize geometric shapes, numbers, or expressions by dragging and dropping them.
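For the fraction-reordering example, the correct order is easy to confirm with exact arithmetic; a reorder-type item would be scored by comparing the student's arrangement against this sorted sequence. A minimal check:

```python
from fractions import Fraction

given = [Fraction(3, 5), Fraction(3, 4), Fraction(2, 6), Fraction(1, 2), Fraction(2, 3)]
correct_order = sorted(given)   # 2/6 < 1/2 < 3/5 < 2/3 < 3/4

# A reorder item compares the student's sequence to this one (2/6 reduces to 1/3).
assert correct_order == [Fraction(1, 3), Fraction(1, 2), Fraction(3, 5),
                         Fraction(2, 3), Fraction(3, 4)]
print(correct_order)
```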

57 Example of technology-enhanced item
MDE Rollout Files

58 Example of technology-enhanced item
MDE Rollout Files

59 Comparing Technology-Enabled and Technology-Enhanced Items
Gregory is installing tile on a rectangular floor. • He is using congruent square tiles that each have a side length of ½ foot. • The area of the floor is 22 square feet. • The width of the floor is 4 feet. Use the grid and the tile below to model the floor. What is the length, in feet, of the floor? Draw a line of symmetry through the figure below. Technology-enabled and technology-enhanced items may both contain media elements that cannot be included with a paper-based test. {+} As an example, the technology-enabled item seen earlier asks students to use an interactive tool to explore a concept and to then select a response. It is technology-enabled because it uses an interactive tool as part of its stimulus, but it requires the student to produce a traditional response type. {+} This technology-enhanced item also asks the student to use an interactive tool. However, this item also asks students to use the tool to produce their response, namely a line. Both items capitalize on technology by using an interactive tool. {+} The technology-enabled item requires students to produce a traditional text-based response. {+} In contrast, the technology-enhanced item requires students to create a line. The scoring rule that accompanies the item then compares the line created by the student with the correct response. While the difference between the two item types may seem small, the new response type that distinguishes a technology-enhanced item has important implications for item writing. 5.5 feet
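The narration says the scoring rule compares the line a student draws with the correct response. One plausible way to automate that comparison, offered here as an assumption about the mechanics rather than the consortium's actual rule, is to derive the drawn line's slope and intercept from two grid points and check them against the expected relationship (constant of proportionality 1, so y = x):

```python
def score_drawn_line(point_a, point_b, slope=1.0, intercept=0.0, tol=1e-6):
    """Return 1 if the line through the two drawn points matches y = slope*x + intercept,
    otherwise 0. A sketch only; a real item might also check endpoints or grid snapping."""
    (x1, y1), (x2, y2) = point_a, point_b
    if x1 == x2:                                  # a vertical line can never match y = x
        return 0
    drawn_slope = (y2 - y1) / (x2 - x1)
    drawn_intercept = y1 - drawn_slope * x1
    matches = abs(drawn_slope - slope) < tol and abs(drawn_intercept - intercept) < tol
    return 1 if matches else 0

assert score_drawn_line((0, 0), (6, 6)) == 1      # proportional, constant of 1
assert score_drawn_line((0, 1), (6, 7)) == 0      # right slope, but not proportional
```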

60 Technology-Enhanced Item Types
Common English Language Arts Technology-Enhanced item types Dropdowns Classification Reorder text Select and order Select text Below is a poem, a sonnet, in which the speaker discusses her feelings about a relationship. Read the poem and answer the question that follows. Remember When you can no more hold me by the hand, Nor I half turn to go yet turning stay. Remember me when no more day by day You tell me of our future that you plann’d: Only remember me; you understand It will be late to counsel then or pray. Yet if you should forget me for a while And afterwards remember, do not grieve. For if the darkness and corruption leave A vestige* of the thoughts that once I had Better by far you should forget and smile Than that you should remember and be sad. *vestige: a mark, trace, or visible evidence of something that is no longer present or evident. In the sonnet “Remember,” which two lines reveal a change in the speaker’s message to her subject? The notes for a summary need to be arranged correctly in the order in which the events occurred in the passage. Click on each sentence and move it to arrange the sentences into correct chronological order. Summary of Events: Maria laughs with the old women. The guest and family eat dinner. Maria’s mother asks the guests for a story. Maria’s guests arrive. Maria becomes sad. The guests take turns telling stories. Classify each word below based on whether it is a verb or a noun. Verbs Nouns Doll Run Dog Swim Eat There are many different technology-enhanced item types that can be used to write an item. {+} Most ELA items will use one of the following types: dropdowns, classification, reorder text, select and order, and select text. As just three examples, the TEI poem item discussed earlier is an example of a select text item. Students selected a block of text from a poem as their answer. The item that asked students to order events in a story is an example of a reorder text item. And the item asking students to classify words as verbs or nouns is an example of a classification item. Additional examples are available in the Technology-Enhanced Item specifications.

61 SBAC Assessment Item Types
Selected Response Constructed Response Technology-Enabled Technology-Enhanced Performance Tasks To collect evidence about the wide range of assessment targets, Smarter Balanced will use a variety of item and task types. These item and task types fall into six broad categories and include {+} Selected Response items, Constructed Response items, Extended Response items, and Performance Tasks. In addition, there are two categories of technology-rich items and tasks known as Technology-Enabled and Technology-Enhanced. Each of these item and task types will be explored in greater detail in the item and task type modules. But let’s take a brief look at each category now.

62 ELA Design of Performance Tasks
Use 1-2 stimuli for Grade 3. Use up to 5 stimuli for high school. Emphasis on stimuli related to science, history, and social studies. Components of a Performance Task – Stimulus: readings, video clips, audio clips, graphs, charts, other visuals, research topic/issue/problem, etc. Information Processing: research questions, comprehension questions, simulated Internet search, etc. Product/Performance: essay, report, story, script; speech with/without graphics, other media; responses to embedded constructed response questions; etc. For each component of a performance task, a variety of elements may be included. {+} As an example, the stimulus can be presented in a variety of formats including reading passages, video or audio clips, images, and topics that require research or investigation. Information processing may occur by having students research specific questions or by asking them to think about specific aspects of stimuli to which they were exposed. Products and performances can also take many forms including essays, stories, reports, and speeches. A wide variety of performance tasks may be developed by combining various stimuli, information processing tasks, and products. The number of stimuli to be used for a performance task differs across grade levels, from one or two at grade 3 to as many as five at the high school level. While stimulus materials should include a wide range of informational pieces, heavy emphasis should be placed on material involving science, history, or social studies content or themes that are consistent with the Common Core State Standards.

63 Parts of Performance Task
Part 1: Student reads research sources and responds to prompts (Claim 1 or 4) Part 2: Student plans, writes, and revises his or her full essay (Claim 2) or plans and delivers a speech (Claim 3) All ELA performance tasks are composed of two parts. {+} The first part of performance tasks presents stimuli that serve as research sources that the student uses to answer an initial set of questions. These stimuli can be informational text, literary text, and audio or video presentations. The questions developed for part one may provide evidence about Claim 1, but most often will focus on Claim 4. The questions students are presented in the first part of a performance task should relate to the questions asked in the second part of the task. The second part of a performance task requires the student to plan, write, and revise an extended response or to present a speech.

64 Performance Tasks for Mathematics
Incorporate a variety of math skills May draw on knowledge from previous grades Multi-step Based on real-world contexts Often include pre-work or group work to establish understanding of the context Require explanation and supporting of reasoning Taken from on 11/12/2012: This module focuses on writing extended response items and performance tasks. The Smarter Balanced Assessment Consortium will use a variety of types of assessment items and tasks to assess student mathematical proficiency. ***Presentation and video will be available shortly.

65 Test Administration Maximum Time Requirements for Performance Tasks
Grade 3–8: 105 minutes total Part 1: 35 min. Part 2: 70 min. High School: 120 minutes total Part 1: min. Part 2: min. All Performance Tasks will be administered in a controlled classroom setting with a time limit of approximately 105 minutes for grades 3 through 8 and 120 minutes for high school. Each task will have two parts.

66 Experience a Performance Task
Put on your “student hat:” Work on the performance task. Complete as much as you can (there may not be enough time to complete the entire task). Put on your “teacher hat:” Review the rubric or scoring guide Discuss: What skills do students need to complete the task? How should this impact your instruction?

67 Supporting Documents and Websites
CCSS – ELA Appendices A, B, C; Mathematics Appendix A (High School Course Pathways). hisdcommoncoreresources.wikispaces.com. Implementation Plans for math and ELA – useful as a starting point, but may inadvertently lead to checklist thinking, so use it carefully; you can modify the tool to make it work for you: change the columns, re-order standards, group standards, etc. Show wikispace.

68 Work Time Identify a lesson or unit you will teach 2-3 weeks from now.
Select, create, or modify a performance task appropriate to the unit. Identify skills or strategies students need to be successful on the performance task and develop a plan to teach those strategies. Examine instruction and assessment for the unit – does it require appropriate depth of knowledge and higher-order thinking?

