Building a Learning Progression Rubric


1 Building a Learning Progression Rubric
SLO team, ARE July 2016

2 Objectives Review the SLO process and timeline for 2016-17
Understand the purpose of the learning progression rubric in the SLO process Understand the recommended rubric structure Understand the generic learning progression frameworks that should be utilized to build/modify an SLO rubric By the end of this presentation, teachers will have reviewed the SLO Process and Timeline for this first learning year; understand the purpose of the learning progression rubric in the SLO Process; understand the recommended rubric structure; and understand the generic learning progression frameworks that should be utilized to either build or modify an SLO rubric. This presentation is meant to be an addition to other resources found on the SLO Webpage (for example, the SLO Handbook and SLO FAQ) as well as to the support within the building (such as the school leader and/or [SI] TEC).

3 What are student learning objectives?
A collection of instructional best practices addressing four questions: What is important for my students to learn? (i.e., Where are my students going?) Deep understanding and prioritization of the standards, knowledge, and skills to be mastered in the course. (BCC-area presenters could include connections to resources they have developed.) Where are my students starting? Determining initial baseline preparedness for each student in an SLO. Where are my students now? Ongoing progress monitoring, formative assessment, instructional shifts. Where did my students finish? Determining End of Course Expectation Levels and mastery of the SLO learning objective using a learning progression. Purpose: Emphasize how SLOs are simply a collection of instructional best practices.

4 Impacted Instruction Ongoing DDI Standards
Learning Objective Baseline preparedness Impacted Instruction Focused Differentiated Where are my students going? Where are my students starting? Instructional planning What do I teach next? Ongoing DDI *Facilitator note: This slide has animation to introduce each of the four questions on its own, to slow down the introduction of the overall graphic and speak to each instructional best practice. In our SLO/DDI process, please notice that the Baseline Preparedness component of SLOs comes after we've determined our Student Learning Objective. It helps teachers to identify where students are starting in order to begin planning for instruction. SLOs and DDI aren't one set of steps to be followed like a recipe, but rather a collection of practices that work together to inform and impact instruction. All of them are cyclical in nature and inform instruction at different times and in different ways (year-long planning, lesson planning, unit planning, etc.). Where are my students now? Determine growth toward learning objective Assignments, assessments, tasks, observations End of Course Expectation Level Where did my students finish?

5 SLO Timeline August September October November December January
Teachers/SSPs School Leaders August September October November December January February March April May Review SLO components with grade or content peers Create or select 2 SLO Objectives Deadline LTG Phase: 9/30* Determine Performance Criteria and LP Rubric Collect and analyze Baseline Data Determine students' Preparedness Levels Deadline LTG Phase: 10/30* Review SLOs Request revisions if necessary All year-long and S1 SLOs approved by 10/30 Conduct mid-year LEAP conversations Communicate mid-year SLO data entry expectations (not required by district) Communicate EOY deadlines Work with school leader on mid-year SLO data entry expectation Prepare to discuss students' progress on SLO at mid-year conversation Deadline End of Course Phase** Reflect on Student Growth Submit End of Course Expectation Levels by school-determined date Deadline End of Course Phase** Work with SLT on calibration of expectations Review SLOs Request revisions if necessary Over the course of the year, with support and guidance from school leaders, teachers and SSPs conduct ongoing formative assessment, DDI, data teams, and instructional shifts. *BCC teachers and others with limited student contact time may submit by 10/30 in order to allow appropriate time to collect and analyze Baseline Data. School leaders have until the Friday before Thanksgiving break to approve. **End of Course SLO information must be entered by the LEAP EOY Conversation (state law requires these be held no later than two weeks before the last day of school). School leaders should determine a specific SLO deadline; it is recommended that deadlines be as close to the LEAP EOY Conversation as possible.

6 For each student, what growth does s/he make throughout the year?
SLOs in LEAP Baseline Preparedness Level For each student, what growth does s/he make throughout the year? End Of Course Expectation Level Each student should leave the course as, or more, prepared for the next course than s/he entered.

7 SLO Scoring Matrix Did Not Yet Meet Expectations
Partially Met Expectations Approached Expectations Met Expectations Exceeded Expectations Significantly Underprepared Teacher & Evaluator Decision: 0, 1, or 2 3 Additional Evidence Needed 0 or 1 2 Somewhat Prepared 1 Prepared NA* Ahead 2 or 3 In order to determine the point values for each of the cells on this matrix, we began with the movement of a prepared student. A prepared student should leave the course/year at Met Expectations. This amount of movement would signal approximately one year's worth of growth. Then, using this Platonic ideal, point values for the other cells were assigned. So, a somewhat prepared student should leave that course/year at least at Approached Expectations. An underprepared student should leave that course at least at Partially Met Expectations. Anything above this green diagonal would be considered higher growth, and therefore worth an additional point. Anything below this line would be considered less than expected growth, and therefore worth fewer points. Please note the three gray decision boxes on this matrix. These boxes indicate opportunities for a teacher and evaluator team to make a collaborative decision about the point value appropriate for that student's growth. So, for example, a student entering the course significantly underprepared (more than two years behind grade level) could still make more than a year's worth of growth, or even up to two years' worth of growth, and not reach Partially Met Expectations. In a case like this, it would be up to the teacher and school leader team to determine what point value that student's growth is worth (0, 1, or 2), based on strong evidence. A student who entered ahead may be expected to leave the course at distinguished, but we also allow for the fact that distinguished may not capture the true amount of growth of that student. If the evidence supports it, a teacher and school leader team may decide to award 3 points for a student in that situation. 
There is additional guidance on the SLO website for making these decisions (in the document Overview of SLOs and Student Growth), with guiding questions. *Right now, to prepare for making these decisions when the time comes, you, as the evaluator, should be speaking with your teachers about the students they're assigning to the significantly underprepared, underprepared, and ahead preparedness levels. Remember that for the underprepared and significantly underprepared categories, teachers will need to enter additional information in the SLO Application. Also, if teachers wish to place students in the extreme-growth boxes in the upper right, they will first need to request approval to do so through the SLO Application. You return the SLO to the teacher and either grant or do not grant that request. They can then continue with the SLO and submit it. If students have been placed in the extreme-growth categories, teachers are asked to provide additional information for each student. You should review especially thoroughly any student placed in these combinations. Significantly Underprepared to Met Expectations represents more than three years' worth of growth. Research shows that students who start out behind rarely make more than 1.5 years of growth; they may catch up, but not at such an extreme rate. Teacher & Evaluator Decision Cells: Growth can look distinctly different for individual students falling in these cells. For example, a Significantly Underprepared student can demonstrate substantial growth but still not meet the criteria for Partially Met Expectations of the current-year standards. In these cells, teachers and evaluators determine the student's growth based on the individual student's body of evidence. Additional Evidence Needed Cells: Teachers will need to request of their evaluator, through the SLO Application, the ability to place students in these extreme-growth combinations. 
Once access to the extreme-growth combinations is provided, teachers will need to supply additional individualized evidence for students achieving these levels of growth.
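The expected-growth logic described above can be sketched as a simple lookup. This is a hypothetical illustration, not the official DPS tool: the level names, the `score` function, and the encoded cells are assumptions drawn only from the narrative (the "green diagonal" is worth 2 points, a level above it 3, a level below it 1, extreme-growth combinations require additional evidence, and certain gray cells are joint teacher/evaluator decisions). The official matrix designates three gray decision cells; only the two that are clear from the narrative are encoded here.

```python
# Illustrative sketch of the SLO scoring matrix logic: points depend on where
# a student started (Baseline Preparedness) and finished (End of Course level).
PREPAREDNESS = ["Significantly Underprepared", "Somewhat Prepared", "Prepared", "Ahead"]
EXPECTATIONS = ["Did Not Yet Meet", "Partially Met", "Approached", "Met", "Exceeded"]

# The "green diagonal": the end-of-course level representing roughly one
# year's worth of growth for each starting point.
EXPECTED = {
    "Significantly Underprepared": "Partially Met",
    "Somewhat Prepared": "Approached",
    "Prepared": "Met",
    "Ahead": "Exceeded",
}

# Gray cells where the teacher and evaluator decide the points together.
# (Assumption: only the two cells supported by the narrative are listed.)
DECISION_CELLS = {
    ("Significantly Underprepared", "Did Not Yet Meet"),  # 0, 1, or 2
    ("Ahead", "Exceeded"),                                # 2 or 3
}

def score(preparedness, end_level):
    """Return a point value, or a flag for cells needing joint judgment."""
    if (preparedness, end_level) in DECISION_CELLS:
        return "Teacher & Evaluator Decision"
    diff = EXPECTATIONS.index(end_level) - EXPECTATIONS.index(EXPECTED[preparedness])
    if diff >= 2:
        # Extreme-growth combination: evaluator approval and individualized
        # evidence are required through the SLO Application.
        return "Additional Evidence Needed"
    if diff == 1:
        return 3   # above the diagonal: more than a year's growth
    if diff == 0:
        return 2   # on the diagonal: about a year's growth
    return 1 if diff == -1 else 0   # below the diagonal: less than expected
```

For example, a Prepared student finishing at Met Expectations lands on the diagonal and earns 2 points, while a Somewhat Prepared student finishing at Met Expectations is above it and earns 3.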

8 SLO Scoring Matrix
STOP: Take a minute to read and reflect on this matrix. How might you explain it to someone else (or put it into your own words)? There is a notecatcher available on your table for you to use.

9 Review: The First Steps
Select what is most important for students to know by the end of the year. (Objective Statement) Determine what it “looks like” when students reach the Objective. (Performance Criteria) Specify the proficiency levels through which students progress on the path towards mastery of the Objective. (Rubric) During the contract year, all teachers must write two Objectives and complete the five SLO process components for each Objective in collaboration with their evaluator and their data team. Each Objective should address ALL students in a particular class or course section. The Objective Statement is paired with corresponding Performance Criteria and a learning progression rubric in order to help teams come to a consensus on a common definition for student mastery, common expectations for student performance, and a common method for measuring where students are during various points along the instructional period. The focus of this session is the learning progression rubric, which addresses each Performance Criterion and its series of proficiency levels.

10 Review: The First Steps What's important for my students to learn?
i.e., Where are my students going? Which standards should be the focus of an SLO? To help answer this question, consider: Standards included in the SLO should be the most important standards in the course, and should build upon and require mastery of other grade-level standards in order to be proficient, i.e., be the culmination of an entire year of rigorous learning. District model SLOs are already written based on priority/high-impact standards. We strongly recommend first-year teachers choose at least one model SLO. Work collaboratively with your team at your school; we strongly recommend that those teaching the same content use the same SLOs. Teachers and SSPs must complete two SLOs each school year. Data must be entered into the online SLO Application and approved by your evaluator. Talking Points: It should not be possible for a student to master the content, skills, and standards of an SLO in one unit of a multi-unit course. The learning progression should be the culmination of a year's worth of learning. During the contract year, all teachers must write two Objectives and complete all steps in the SLO process. Each Objective should address ALL students in a particular class or course section. The Objective Statement is paired with corresponding Performance Criteria and a learning progression rubric in order to help teams come to a consensus on a common definition for student mastery, common expectations for student performance, and a common method for measuring where students are during various points along the instructional period. The focus of this session is the learning progression rubric, which addresses each Performance Criterion and its series of proficiency levels.

11 The Learning Progression Rubric
“rubric” – same word, different meanings Rubric as a Scoring and Learning Tool Rubric as a Learning Progression* Specific to the task with which they are used (Brookhart, 2013) Contain answers to a problem, explain the reasoning students are supposed to use, or list facts and concepts students are supposed to mention (Brookhart, 2013) Describe a student's level of Expectation or proficiency along the path of developing deeper understanding of the content and the gradual building of fluency in skills and academic language Use criteria and descriptions of student performance that generalize across time and can be used with different tasks (Brookhart, 2013) Describe successively more sophisticated ways of thinking about an idea (Wilson & Bertenthal, 2005) It is important to note that the use of the term “rubric” for the DPS SLO process does not refer to a rubric in the traditional sense, i.e., a scoring and learning tool for a specific task. Rather, “rubric” here means a learning progression specific to a teacher's SLO. These rubrics are used to describe the typical growth process through which students move as they develop mastery of a standards-based Objective, rather than to score a particular assessment item or task. As such, the learning progression rubric helps teacher teams come to a consensus on a common definition for student mastery, common expectations for student performance, and a common method for tracking student growth during the instructional period. It also helps teachers identify and differentiate instructional strategies and can be used to provide students and parents clear and concise feedback. One recommended resource to help you distinguish between these two types of rubrics is Susan Brookhart's How to Create and Use Rubrics for Formative Assessment and Grading (ASCD, 2013). We recommend focusing on Chapter 3: “Writing or Selecting Effective Rubrics”. Additionally, work cited above includes: Wilson, Mark & Bertenthal, Meryl (Eds.). (2005). Systems for State Science Assessment. Board on Testing and Assessment, Center for Education, National Research Council of the National Academies. Washington, D.C.: National Academies Press. *This is the type of rubric used for the DPS SLO process.

12 Columns (Proficiency Levels):
Rubric Structure Columns (Proficiency Levels): Each column identifies milestones students typically reach as they develop mastery of the skills and/or concepts embodied in each Performance Criterion. The proficiency levels describe the developmental levels through which students progress on the path toward mastery. The following four proficiency levels should be used: Partially Met Expectations Approached Expectations Met Expectations Represents the level of the standards – that is, proficiency in the grade-level standards of the Objective Statement For SLOs, this column should use the exact wording of the Performance Criteria, with each criterion appearing in a single box Exceeded Expectations In order to increase consistency across DPS, we ask that all rubrics for the DPS SLO Process use the same type of table, including four columns and as many rows as the number of Performance Criteria. This allows teacher teams to describe the developmental path for students in regard to each Performance Criterion. *Please note: The rubric should use the four above-mentioned proficiency levels, unless four levels do not sufficiently describe the full array of learning expected during the course. Specifically, there are two situations where a four-column rubric may be insufficient: 1) in multi-age/multi-grade classrooms or pull-outs (e.g., Montessori, SPED, ESL/ELD, and interventionists); and 2) in classrooms where more than half of the students are Underprepared or Ahead as identified at baseline.

13 Rubric Structure, cont. Rows:
The number of rows directly corresponds to the number of Performance Criteria. Example: Partially Met Expectations Approached Expectations Met Expectations Exceeded Expectations Performance Criterion 1 Performance Criterion 2 Performance Criterion 3 The process of building a learning progression rubric should begin by simply copying the Performance Criteria and inserting them into the third column, then building out the other levels. We highly encourage you to collaborate with subject/grade-level peers during this process.
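The construction step above — copy each Performance Criterion verbatim into the Met Expectations column, then build out the remaining levels with your team — can be sketched as a small helper. This is a hypothetical illustration; the function and field names are not part of the SLO Application.

```python
# Illustrative sketch: scaffold a learning progression rubric from the
# Performance Criteria. Names here are hypothetical, not the SLO Application's.
LEVELS = ["Partially Met Expectations", "Approached Expectations",
          "Met Expectations", "Exceeded Expectations"]

def scaffold_rubric(performance_criteria):
    """Build one row per criterion. The Met Expectations cell starts as the
    criterion's exact wording; the other cells start empty, for the teacher
    team to fill in with the developmental levels."""
    rubric = []
    for criterion in performance_criteria:
        row = {level: "" for level in LEVELS}          # four columns per row
        row["Met Expectations"] = criterion            # copied verbatim, per the guidance
        rubric.append(row)
    return rubric
```

For example, `scaffold_rubric(["Criterion 1", "Criterion 2"])` yields a two-row table in which only the third column is filled, mirroring the recommended starting point for rubric building.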

14 Suggested Resources for Rubric Building
Including, but not limited to: Performance Level Descriptors from PARCC Standards-based rubrics WIDA Can Do Descriptors TS Gold (ECE and Kindergarten) Curricular resources Student work We highly recommend that you draw ideas and language from high-quality resources that are available to you. In almost all cases, there is no need to create this learning progression rubric completely from scratch. Additionally, we highly recommend using student work to give you a sense of the typical pattern of growth students make on their way towards mastery.

15 General Guidelines Phrase descriptors in the positive on the basis of what students are able to do. Provide qualitative descriptions of what students at each level are able to do rather than distinguishing proficiency levels on the basis of quantities or percentages. Avoid vague evaluative terms (e.g., good, fair, poor) for which there is no consistent understanding of meaning. Let the “Exceeded Expectations” column be open for further interpretation, as students may surprise us by demonstrating new ways to apply their knowledge. Please remember that there is no single correct way to write a learning progression rubric. The definition of each level of Expectation needs to be appropriate for the particular subject, grade level of the students, and the specific criterion. However, the following guidelines should be used as much as possible by all teachers: 1. Phrase descriptors in the positive on the basis of what students are able to do, as opposed to what they are not able to do. In this manner, an asset-based perspective is embraced. Even when students have not yet reached proficiency, students reach milestones that signify noteworthy progress. 2. Provide qualitative descriptions of what students at each level are able to do rather than distinguishing proficiency levels on the basis of quantities (e.g., number of details a student uses in a paragraph) or percentages (e.g., scores on certain assessments). 3. Avoid vague evaluative terms (e.g., good, fair, poor) for which there is no consistent understanding of meaning. 4. Let the “Exceeded Expectations” column be open for further interpretation, as students may surprise us by demonstrating new ways to apply their knowledge. Please note that this column usually does not define the next grade level standard(s), but rather deeper understanding of the current grade level standard(s).

16 Guiding Questions What might we typically see from a student as they first begin to learn the content, skills, and/or academic language in the criterion? What do you teach first? And how? What milestones do students reach on their way towards proficiency and eventually mastery? How do you tailor your instruction during the course to deepen understanding of the content and the gradual building of fluency in skills and academic language? When working together in teams, teachers may find it valuable to use the above questions to help their team construct the learning progression rubric for their SLO. Thinking about these guiding questions in conjunction with the aforementioned resources and student work should allow teachers to create a typical path of student progression, noting that all students may not progress in exactly the same way toward mastery of each Performance Criterion. Think: What might we typically see from a student as they first begin to learn the content, skills, and/or academic language in the criterion? What do you teach first? And how? The answers to these questions should give you insight into how a learning progression rubric may evolve across the row for a certain performance criterion. It may also help you identify supports and scaffolds for students and possible strategies for reengagement of students. Think: What milestones do students reach on their way towards proficiency and eventually mastery? How do you tailor your instruction during the course to deepen understanding of the content and the gradual building of fluency in skills and academic language? The milestones you're looking for as students progress towards proficiency may be apparent in tasks you ask students to do over the duration of the course.

17 Generic Learning Progression Frameworks
There are four generic options for delineating a typical continuum, or learning progression, from Partially Met Expectations to Exceeded Expectations. These options should be combined and modified as needed, and the descriptors of each Expectation level need to be written with subject- and grade-specific language and content. The frameworks are: Development of conceptual understanding; Development of procedural skills; Type or amount of support and scaffolding that a student needs in order to be successful; The extent to which the student can monitor his/her own learning (metacognition). Next, we will look at these four generic options, outlined in the following slides. These frameworks should be a good foundation for teacher teams to use to create a rubric that fits their own Performance Criteria. Frameworks can be combined and modified as needed based on the content and grade level of the SLO to create a learning progression that details how students would move toward mastery of the specific SLO.

18 Framework 1: Conceptual Understanding
Partially Met Expectations Approached Expectations Met Expectations (Performance Criteria) Exceeded Expectations Students demonstrate knowledge of a concept through either recognition or recall. Students demonstrate knowledge of several related concepts through recognition or recall, yet make limited connections among them. Students apply knowledge of several related concepts to solve problems; students make strong connections among concepts. Students apply knowledge to other areas (e.g., to other subjects; in new and unfamiliar contexts). Students demonstrate a nuanced understanding of the concepts (e.g., the limits within which they can be applied to other domains). The first framework examines the development of conceptual understanding. Students are initially able to demonstrate knowledge of a concept through either recognition or recall and then move to being able to demonstrate knowledge of several related concepts through recognition or recall. At the Approached Expectations level, students are beginning to make connections among related concepts. At the Met Expectations level (proficiency), students are applying their knowledge of several related concepts in order to solve problems, and they make strong connections among concepts. Finally, at the Exceeded Expectations level, students can apply knowledge to other areas, such as other subjects or new contexts, and they can demonstrate a nuanced understanding of concepts. Please note the alignment between this framework and Bloom’s Taxonomy. Another way to construct a rubric using this type of framework would be to determine the level of the processes described by a Performance Criterion (i.e., grade level proficiency) using Bloom’s Taxonomy. Then, teachers can use a higher order process to define Exceeded Expectations and lower order processes to define the Approached and Partially Met Expectations columns.

19 Framework 2: Procedural Skill
Partially Met Expectations Approached Expectations Met Expectations (Performance Criteria) Exceeded Expectations Students use skills following a step-by-step, or effortful, process with minimal fluidity. Students demonstrate errors when following the steps, and a pattern of errors may be evident. Students demonstrate fluidity with the skills and apply them to solve problems in familiar contexts. Students demonstrate fluidity with the skills and apply them to solve problems in new contexts. Students' fluidity allows them to demonstrate higher-order thinking while using the skills. Students apply the skills, often in combination with other skills, in novel ways. Students apply the skills to solve new, complex problems. The second generic framework examines the progression of procedural skill. Note how at the Partially Met Expectations level, students may be able to use skills following a step-by-step process, or a process which requires a lot of effort, with little fluidity. Also at this beginning level, students demonstrate errors when following the steps, and the teacher may be able to see a pattern of errors in the students' work. At the Approached Expectations level, students are demonstrating fluidity with the skills, and they are able to apply the skills in order to solve problems of a familiar type. As students progress to the Met Expectations level, they are demonstrating fluidity with the skills and are able to apply them to solve problems in new contexts. In addition, students can demonstrate higher-order thinking while using the skills. Finally, students who fall in the Exceeded Expectations level of this progression apply the skills, often while combining them with other skills in novel ways. In addition, students are able to apply the skills in order to solve new, more complex problems.

20 Framework 3: Support and Scaffolding
Partially Met Expectations: Students are dependent upon support from a teacher, coach, or more advanced peer, and/or students are dependent upon scaffolded work/assessments* to demonstrate success.
Approached Expectations: Students intermittently utilize support/coaching and/or scaffolding (less than that for Partially Met Expectations) to demonstrate basic success.
Met Expectations (Performance Criteria): Students demonstrate independence in applying concepts or skills. Support or coaching is utilized for success in solving more complex problems.
Exceeded Expectations: Students demonstrate independence and reflectiveness in applying concepts and skills. Coaching is used for extending or achieving greater success, yet the coaching is more collaborative between student and teacher.

Our third framework examines the progression of support and scaffolding in student learning. In this framework, students move from being dependent upon support from a teacher, coach, or peer, and/or upon scaffolded work and assessments, in order to demonstrate success, to being able to utilize support intermittently and with less scaffolding. Students in the Approached Expectations level are able to demonstrate basic success with this more limited support and scaffolding. In order to reach Met Expectations (proficiency), students demonstrate their ability to successfully apply concepts or skills independently, although they may utilize support or coaching in order to solve more complex problems. Finally, students are able to demonstrate not only independence but also reflectiveness in successfully applying concepts and skills. When they have access to support or coaching, they are able to extend their skill set and achieve even greater success. Coaching at this level is often more collaborative between student and teacher than it is directive. Teachers are encouraged to examine, as a team, the types of scaffolds they intend to use.
This will ensure consistency across student data, as well as encourage collaboration and sharing of successful practices. In addition, certain scaffolds or methods of support may lend themselves more appropriately to certain content areas. *Examples of scaffolds may include, but are not limited to: word banks, cloze-type paragraphs, use of additional manipulatives/tools, “chunking” activities/assignments into smaller sections, simplified instructions, etc.

21 Framework 4: Metacognition
Partially Met Expectations: Students rely on external feedback to monitor and regulate their learning.
Approached Expectations: Students monitor their learning with prompting. Students begin to use explicit learning strategies.
Met Expectations (Performance Criteria): Students monitor their learning with minimal or no prompting. Students use learning strategies efficiently and in appropriate contexts.
Exceeded Expectations: Students independently monitor and extend their own learning. Students self-regulate their learning, generating and using learning strategies efficiently and in appropriate contexts.

The last generic framework examines the progression of metacognition. This learning progression framework details how students move from relying on external feedback to monitor and regulate their learning, to monitoring their learning with prompting and beginning to utilize explicit learning strategies. In the Met Expectations level, students monitor their own learning with little to no prompting and use learning strategies efficiently and in appropriate contexts. Finally, students in the Exceeded Expectations level push themselves beyond independently monitoring their own learning to also extending it; they self-regulate, and they generate and use learning strategies that are efficient and appropriate.

22 2nd grade math example

Row 1:
Partially Met Expectations: Students rely on the use of concrete objects, models, and drawings to add and subtract within 100.
Approached Expectations: Students begin to show some evidence of using place value strategies to compute sums and differences within 100. Students can add tens and ones separately and use counting-on strategies to count on by tens and ones, or a mixture of such strategies in which they add instead of counting the tens and ones. Students can consistently and accurately subtract within 100 when regrouping is not involved.
Met Expectations (Performance Criteria): Students independently add and subtract within 100, with fluency and accuracy, using strategies based on place value, properties of operations, and/or the relationship between addition and subtraction.
Exceeded Expectations: Students independently add and subtract within 100 with fluency and accuracy using strategies based on place value, properties of operations, and the relationship between addition and subtraction.

Row 2:
Partially Met Expectations: Students rely on support and scaffolding to write and solve number sentences for one-step word problems that require addition and subtraction within 100.
Approached Expectations: Students independently, consistently, and accurately write and solve number sentences for one-step word problems that require addition and subtraction within 100. Students independently, and with some accuracy, write and solve number sentences for two-step word problems that require addition and subtraction within 100.
Met Expectations (Performance Criteria): Students independently and accurately write and solve number sentences for one- and two-step word problems that require addition and subtraction within 100.
Exceeded Expectations: Students independently write and solve word problems when given a number sentence involving addition and subtraction within 100.

Row 3:
Partially Met Expectations: Students rely on the use of concrete objects, models, and drawings to add and subtract within 1000.
Approached Expectations: Students begin to show some evidence of using place value strategies to compute sums and differences within 1000. Students can add hundreds, tens, and ones separately and use counting-on strategies to count on by hundreds, tens, and ones, or a mixture of such strategies in which they add instead of counting the hundreds, tens, and ones. Students can consistently and accurately subtract within 1000 when regrouping is not involved.
Met Expectations (Performance Criteria): Students independently and accurately add and subtract within 1000 using concrete models or drawings and strategies based on place value, properties of operations, and/or the relationship between addition and subtraction.
Exceeded Expectations: Students independently and accurately add and subtract within 1000 using strategies based on place value, properties of operations, and the relationship between addition and subtraction.

Row 4:
Partially Met Expectations: When given the grade-level academic and content language, students correctly identify the addition and subtraction strategies they used and begin to explain the process they followed with those strategies. Responses may be incomplete or illogical.
Approached Expectations: Using academic and content language with scaffolds and supports, students correctly identify and justify their selection of addition and subtraction strategies by explaining the process they followed with those strategies. Responses may be incomplete, but include a logical progression.
Met Expectations (Performance Criteria): Students correctly identify and justify their selection of addition and subtraction strategies and solutions by explaining why the strategies work, using place value, properties of operations, and/or the relationship between addition and subtraction. Justifications include correct use of grade-level academic and content language and symbols. Responses show a logical progression.
Exceeded Expectations: Students critique different strategies using place value, properties of operations, and/or the relationship between addition and subtraction, and justify the efficiency of those strategies. Student critiques include correct use of grade-level academic and content language and symbols.
Using an example, we will illustrate how the generic frameworks may be used to build or modify a learning progression rubric. Here we have a 2nd grade math learning progression rubric that corresponds to the Objective Statement: All students will be able to fluently add and subtract within 100 and will be able to apply addition and subtraction strategies within 100 to solve one- and two-step word problems and justify their solutions, orally and in written form.

23 2nd grade math example, closer look
Row 3:
Partially Met Expectations: Students rely on the use of concrete objects, models, and drawings to add and subtract within 1000.
Approached Expectations: Students begin to show some evidence of using place value strategies to compute sums and differences within 1000. Students can add hundreds, tens, and ones separately and use counting-on strategies to count on by hundreds, tens, and ones, or a mixture of such strategies in which they add instead of counting the hundreds, tens, and ones. Students can consistently and accurately subtract within 1000 when regrouping is not involved.
Met Expectations (Performance Criteria): Students independently and accurately add and subtract within 1000 using concrete models or drawings and strategies based on place value, properties of operations, and/or the relationship between addition and subtraction.
Exceeded Expectations: Students independently and accurately add and subtract within 1000 using strategies based on place value, properties of operations, and the relationship between addition and subtraction.

Row 4:
Partially Met Expectations: When given the grade-level academic and content language, students correctly identify the addition and subtraction strategies they used and begin to explain the process they followed with those strategies. Responses may be incomplete or illogical.
Approached Expectations: Using academic and content language with scaffolds and supports, students correctly identify and justify their selection of addition and subtraction strategies by explaining the process they followed with those strategies. Responses may be incomplete, but include a logical progression.
Met Expectations (Performance Criteria): Students correctly identify and justify their selection of addition and subtraction strategies and solutions by explaining why the strategies work, using place value, properties of operations, and/or the relationship between addition and subtraction. Justifications include correct use of grade-level academic and content language and symbols. Responses show a logical progression.
Exceeded Expectations: Students critique different strategies using place value, properties of operations, and/or the relationship between addition and subtraction, and justify the efficiency of those strategies. Student critiques include correct use of grade-level academic and content language and symbols.

These are the last two rows of the 2nd grade math learning progression rubric shown on the previous slide. By diving deeper into these two rows, we can see more clearly how the generic frameworks were applied to build this learning progression. The red text indicates where supports and scaffolds are used by students to demonstrate their abilities. This idea comes from Framework 3 (Support and Scaffolding). Along these same lines, the blue text indicates when students are able to perform skills independently, thereby referring to Framework 3 as well as Framework 4 (Metacognition). The idea of being able to justify the efficiency of strategies also connects to Framework 4, because students at the Exceeded Expectations level are moving beyond accuracy to also being able to recognize efficient strategies. In purple text, we show a progression of procedural skills as described in Framework 2. Students in the Exceeded Expectations column use all of the enumerated strategies to accurately add and subtract within 1000, while at the lower levels, students use a limited number of strategies. Lastly, highlighted in green text, are examples from Framework 1 (Conceptual Understanding). Students move from simply identifying the strategy they used, to justifying their strategy and explaining why it works, and finally to critiquing different strategies and justifying their efficiency.

24 Using Your Learning Progression Rubric
What does this student work tell you about where your student is on the learning progression? After examining student work that aligns with your SLO and will be included in your Body of Evidence: check student performance, as demonstrated by the student work, against the learning progression rubric; record the appropriate data in your SLO data tracker; and record agreed-upon changes in instruction. The successful application of the learning progression rubric requires the use of informed professional judgment and norming with peers. As this is a learning year, it is highly recommended that teacher teams modify their rubrics during the year to best represent the progression of developmental levels of their students, especially as these levels are observed in specific samples of student work over time. While classroom teachers submit their rubrics using the SLO Application as part of Phase 2 by January 31, they can continue to modify their rubrics, in collaboration with their evaluators, using their SLO planning pages. While teachers are building their Body of Evidence for their SLO, they should consistently return to their learning progression rubrics to determine if students are making progress and if student work is represented on the rubric. Please note that at the end of the year, all teachers will have to determine where each student places on the learning progression rubric in order to determine student growth. It is recommended that teachers use the DPS Data Culture Protocols, which can be found in the Standards Toolkit, to collaboratively examine student work with their team.

25 Available Resources
Dpsare.com
For SLO questions or concerns, please email the ARE SLO team at: slohelp@dpsk12.org. Do you have a strong learning progression rubric? Please share it with us at slohelp@dpsk12.org. For more detailed information, please check out the resources on the SLO website. These resources include an SLO handbook with detailed guidelines for each step in the SLO process as well as content considerations, SLO application user guides, an SLO FAQ, and links to other helpful resources. Resources will be added throughout the course of this learning year, so please check back often for the most up-to-date information. For additional questions and concerns about the SLO process for teachers, please contact the SLO staff.

