1
Student Learning Objectives (SLO)
VALLEY GROVE SD 2014 ~ SLO Workshop Student Learning Objectives (SLO) PPT SOURCE: Dr. Cathleen Cubelic
2
Our Objectives
Understand what an SLO is
Understand the process: Design, Build, & Review
Consider assessment quality and purpose
Examine Webb's DOK in reference to assessment
Collaborate for implementation
Build the SLO on the template
Use online tools
3
SLO & Assessment Literacy
Pre-Test
4
Student Learning Objectives MIND DUMP
What do you know? What have you heard? What have you researched? Why are we doing this? Anything else?
5
Student Learning Objectives YOUR SLO…
…written specific to you and to a specific class/course/content area that you teach.
Every teacher designs one. Collaborative development is encouraged.
Design, Build, Review – repeat next year/cycle
In the interest of all students
To improve the program
Knowing that good teaching matters most
Many factors and decisions to make: time frame, course content, learning needs, goal, assessment, measures, indicators…
Local decisions!!!
8
The Rating Tool PDE 82-1 …PVAAS Rostering
9
The SLO in PA is written in relationship to a specific teacher and a specific class/course/content area for which that teacher provides instruction.
10
“The PSSA test doesn’t completely measure my effectiveness.”
The SLO (elective data) is in response to this statement.
SLO CONCEPTS:
STUDENT ACHIEVEMENT can be measured in ways that reflect authentic learning of content standards.
EDUCATOR EFFECTIVENESS can be measured through the use of student achievement measures.
11
SLO Definition A process to document a measure of educator effectiveness based on student achievement of content standards.
12
SLO Process The SLO process contains three (3) action components:
Design(ing): thinking, conceptualizing, organizing, discussing, researching
Build(ing): selecting, developing, sharing, completing
Review(ing): refining, checking, updating, editing, testing, finalizing

Key Points for Trainers
Explain that all components are done before the school year (initial conversation with the principal) in preparing the SLO; however, the REVIEW component may continue until the final results are available to determine whether or not the performance expectations have been reached.
Clarify that the specific timelines for the SLO process will be determined by local education agencies (LEAs), not by the state; however, a generic timeline for the SLO process should be presented that outlines before, during, and after school year activities. In general:
Teacher develops the SLO, along with applicable performance measures, before school starts.
Principal reviews and discusses with the teacher; adjustments may be required.
Teacher reviews SLO progress at some midpoint in the year.
Principal receives a mid-year update from the teacher; adjustments may be required.
Teacher summarizes performance measure data and evaluates each performance indicator.
Teacher presents final SLO results.
Principal assigns the final rating in Section 5.
DESIGN: This component is the "thinking" step in the process, used to conceptualize the learning objective in terms of content, students, and performance measures.
BUILD: This component is the "action" step in the process that focuses on completing the SLO Process template and creating and/or selecting performance measures.
REVIEW: This component is the "reflection" step used to examine the three "Cs" (i.e., Completeness, Comprehensiveness, and Cohesion) of quality.
IMT Orientation Draft 02Sept11-CS
13
Student Learning Objectives Components
Goal Statement – the "big idea" on which the SLO is based
Endurance – Learning has worth beyond the assessment
Leverage – Content has value across disciplines
Readiness – Provides knowledge/skills necessary for success at future levels of instruction
Performance Measures – Assessments used to measure student achievement
Performance Indicators – Articulated targets for student achievement
Effectiveness Rating – Translation of the number of students meeting Performance Indicators: how many met the target, and what does that mean?
14
Student Learning Objectives Assessment Literacy
Input vs. Output
When we think about how we are changing education today, we are moving from a system that focuses on inputs to one that focuses on outputs. In an input world, what we care about for integrity of curriculum is making sure that all of our teachers are giving children exactly the same thing. This is a Betty Crocker curriculum. Betty Crocker has some fantastic recipes, and we want to make sure that the boxes of cake always produce the same outcome. That's what education has been: you get a publisher, they say here are the resources, follow the instructions to the letter, and that is input integrity.

Assessment changes all that. Assessment is about output integrity. Did the kid learn what he needed to learn? How does that make it different? When we think about outputs, we have to change all those input factors. Betty Crocker doesn't help us; the recipe isn't the guide. The assessment tells us where we need to add a little salt, where we need a little sugar, and where we need to change what we're making altogether.

Formative assessment and summative assessment give us information about how successful we are, information we need to use in a different way to look at curriculum and instruction integrity and to build upon what we have done previously…adapting and changing in the name of improvement.
15
Student Learning Objectives Assessment Literacy
ASSESSMENT – the foundation for measuring success
WEBB'S DOK – a framework for cognitive demand, often compared with Bloom's Taxonomy
Color Laminated Chart
PDF Packet
16
What is RIGOR? Rigor in the classroom
"Rigor is creating an environment in which each student is expected to learn at high levels, each student is supported so that he or she can learn at high levels, and each student demonstrates learning at high levels." – Barbara Blackburn, 2008
Barbara Blackburn – national educational consultant, author of 14 books (the newest on rigor), presenter, and keynote speaker
17
Rigor can be accomplished by:
Increasing the complexity of thinking in…
Course content – learning progressions and appropriately leveled text for challenge
Instruction – activities that promote critical thinking, communication building, applying integrated ideas, application of concepts, and promoting responsibility
Assessment – aligned to instructional targets, engages with academic content, requires extended and elaborated responses
18
Bloom’s Taxonomy Old (1950s) New (1990s)
HANDOUT: The laminated charts show a comparison of BLOOM'S TAXONOMY with WEBB'S DEPTH OF KNOWLEDGE.
19
COMPARISON
BLOOM'S KEY POINTS:
Six levels
Different sources list different verbs
The same verbs appear as examples in more than one cognitive level
This overlap indicates that focusing ONLY on verbs to determine the level of cognitive demand is not fully adequate.
WEBB'S KEY POINTS:
The DOK is NOT determined by the verb (Bloom's) but by the context in which the verb is used and the depth of thinking that is required.
Names four different ways students interact with content.
Each level is dependent upon how deeply students understand the content.
20
DOK is about what follows the verb...
What comes after the verb is more important than the verb itself… “Analyze this sentence to decide if the commas have been used correctly” does not meet the criteria for high cognitive processing. The student who has been taught the rule for using commas is merely using the rule.
21
Same Verb – 3 different DOK levels
DOK 1 – Describe three characteristics of metamorphic rocks. (Requires simple recall)
DOK 2 – Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences between the two rock types)
DOK 3 – Describe a model that you might use to represent the relationships that exist within the rock cycle. (Requires deep understanding of the rock cycle and a determination of how best to represent it)
22
DOK is about intended outcome, …not difficulty
DOK refers to the complexity of mental processing that must occur to answer a question, perform a task, or generate a product.
• Adding is a mental process.
• Knowing the rule for adding is the intended outcome that influences the DOK.
• Once someone learns the "rule" of how to add, adding is DOK 1 and is also easy.
• Adding much larger numbers is still DOK 1 but may be more "difficult."
23
WEBB’S DOK RESOURCES Online Search – tons of resources…
Laminated Charts – Webb's vs. Bloom's
Handout DOK #1 – Levels Described
Handout DOK #2 – Subject Area Info
Handout DOK #3 – Question Stems
Activity: Question Analysis
Math – Trip to the Capital
ELA – Women Poem
24
SLO Process Components DESIGN
Thinking about what content standards to measure
Organizing standards and measures
Discussing collective goals with colleagues
Researching what is needed for a high-quality SLO

Key Points for Trainers
Designing is planning for the SLO: examining what is needed and how performance measures are used to collect information about student achievement. Activities during this stage establish the foundation for developing a student learning objective, including:
Identifying target content standards
Discussing the "Big Idea" in the standards
Thinking about the goal
Collaborating with other teachers
Brainstorming the types of performance measures
25
SLO Process Components BUILD
Selecting the performance measure(s)
Developing targets and expectations
Completing the template
Sharing the draft materials with other colleagues
Developing/documenting performance task(s)

Key Points for Trainers
Ensure the participants understand that building is an iterative process between the original design and the created SLO. Often the original design must be changed once the details of how the standards will be measured are worked out and performance indicator targets are developed. Activities during this stage complete the SLO Process Template 10.0 and include:
Selecting (or creating) the performance measures that are aligned to the targeted content standards
Developing mastery and/or growth metrics associated with the performance measures
Establishing performance indicator targets
Identifying students included in the SLO data
Creating performance expectations
26
SLO Process Components
REVIEW
Checking the drafted SLO (including the performance measures) for quality
Refining measures and targets
Editing text and preparing discussion points/highlights for the principal
Finalizing materials
Updating completed SLOs with performance data

Key Points for Trainers
Ensure the participants understand that the review phase requires an extensive evaluation of the SLO's quality in terms of the three Cs: Completeness, Comprehensiveness, and Coherence.
Quality assurance checklist and rubric
Activities during this stage occur before and after the presentation to the principal, and include:
Finalizing and submitting the proposed SLO
Refining the SLO based upon feedback from the principal
Collecting performance data on student achievement
Adjusting SLOs during the school year
Updating SLOs with data
Evaluating each performance indicator
Determining the Elective Rating
27
Design
28
What is a Goal Statement?
Definition: A narrative articulating the "big idea" upon which the SLO is built and to which content standards are directly aligned.
Characteristics:
ENDURANCE: Encompasses the "enduring understanding" of the standard…beyond the test. Does it have value beyond the test?
LEVERAGE: Central to the content area…but has value in other disciplines. Does it have value across other disciplines?
READINESS: Foundational concepts for later subjects/courses…necessary for the next step. Does it provide knowledge and skills that are necessary for success at the next level?
29
Goal Statement Example
"Students will apply the concepts and the competencies of nutrition, eating habits, and safe food preparation techniques to overall health and wellness throughout the life cycle at individual, family and societal levels." Does this convey an enduring understanding? Is there a central idea? Is there a foundation for later concepts?
30
SLO Goal (Template #1)
Goal Statement: WHAT the "big idea" is in the standards
Standards: HOW the skills and knowledge support future learning
Rationale Statement: WHY the "big idea" is a central, enduring concept
Take the participants onto the SAS portal to the Curriculum Framework. Select a content area to use as the model. As a district team, complete the template with a selected Goal Statement derived from the Big Ideas!
31
More Considerations for Goal Statements
Do you have previous data to help guide your goal? What does your growth and achievement look like? Is there a building/district-wide goal? Are there connections to SPP, PVAAS, Danielson areas of focus? Discussion
32
Activity: Goal Statement (Template #1)
Within your team, choose a discipline in which you’d like to focus. Preferably, choose a discipline that is very familiar to you. Complete “Template #1 Goal Statement” We will post them for the entire group.
33
Build
34
Template Section 1
35
Goal Goal statement should articulate an appropriate “big idea”. Standards should be the appropriate Focus Standards supporting the goal. Rationale statement should be reasons why the Goal statement and the aligned Standards address important concepts for this class/course. Focus on content shifts, PA Core Focus, Important Standards.
36
Template Section 2 Goal statement should articulate an appropriate “big idea” Rationale statement should be reasons why the Goal Statement and the aligned Standards address important learning for this class/course.
37
Performance Indicator
Definition: a description of the expected level of student growth or achievement based on the performance measure.
***Articulates targets for each Performance Measure***
Answers two questions:
Does the indicator define student success?
What is the specific measure linked to the indicator?
38
Examples of Performance Indicator Targets
Students will achieve Advanced or Proficient on all four criteria of the Data Analysis Project rubric.
Students will score an average of 3 or better on five different constructed-response questions regarding linear modeling, according to the general description of scoring guidelines. (…ne%20Scoring%20Guidelines%20-%20Algebra%20I.pdf)
Students will improve a minimum of 10 percentage points from pre- to post-test for material in each semester.
Students will show "significant improvement" in the Domain of Measurement on the Classroom Diagnostic Tools Mathematics Grade 7 assessment from the first to the last administration.
Read each example and have participants decide whether or not the indicators are good. Use the criteria previously established. Discuss each as a group (red or green bucket) and give the reasons for the decision.
39
Performance Indicator – Focus student group
A description of the expected level of achievement for each student in a subset of the SLO population (1F) based on the scoring tools used for each performance measure (4A). Subset populations can be identified through prior student achievement data or through content-specific pretest data.
40
Examples of Performance Indicator Targets: Focused Student Group
Students who scored below the 30th percentile on their benchmark AIMSweb R-CBM probe will score above the 30th percentile by the end of the school year, using the national norms.
Students who scored below a 2 on the pre-test will improve a minimum of one level on the post-test.
What qualifies this as a focused student group: a content-based pretest or prior achievement data?
How will the issues of growth and/or achievement factor into the decision about a focused-student-group indicator?
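The pretest-based selection described above can be sketched in code. This is an illustrative example only (the function name, student names, and score values are mine, not part of the PDE materials): students scoring below a cut level on the pretest form the focused group for the SLO indicator.

```python
# Illustrative sketch: selecting a focused student group from pretest data,
# as in the second example above (students below a level-2 cut score).
# All names and scores here are hypothetical.

def focused_group(pretest_scores, cut=2):
    """Return the students whose pretest score fell below the cut level."""
    return [name for name, score in pretest_scores.items() if score < cut]

scores = {"Ana": 1, "Ben": 3, "Cam": 0, "Dee": 2}  # hypothetical pretest levels
print(focused_group(scores))  # ['Ana', 'Cam']
```

The same students would then be tracked on the post-test to check for the minimum one-level improvement.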
41
SLO Design Coherency – RATING
Goal Statement ~ Focus Standards → Performance Indicator(s) → Performance Measure(s)
All Students / Targeted Students
42
Activity: Growth and Mastery
What assessments may be used for growth, mastery, or both?
Participants complete a Venn diagram (circles: Mastery, Growth) using names of assessments, as well as an example Performance Indicator Target for ALL students.
43
What are the characteristics of a quality assessment?
Write three. Report out the summary from your table.
44
Good assessments have……
A specific and defined purpose
A mixture of question types
Items/tasks with appropriate DOK levels
Items/tasks that are standards-aligned
A quality rubric
A standardized scoring method
Academic rigor
A reasonable time limit for completion
An appropriate readability level
Multiple methods of student demonstration
Validity and reliability
Well-written directions and administration guidelines
Cut scores for performance categories
45
Academic Rigor Standards-Aligned Developmentally Appropriate
Focused on Higher-Order Thinking
46
Weighting, Linking, or Otherwise
1. Standard – You may consider each Performance Indicator equal in importance.
2. Linked – You may link multiple Performance Indicators if you like. Do this for "pass before moving on" assessments.
3. Weighted – You may weight multiple Performance Indicators if you like. Do this when you believe one or more PIs are more complex or more important than others.
47
Standard Scenario
Name | Student Proportion Met Target
PI 1 | Building a Bridge Project | 68/80
PI 2 | Roller Coaster Design | 56/80
PI 3 | Egg Parachute | 40/80
Total = (68 + 56 + 40) / (80 + 80 + 80) = 164/240 ≈ 68.3%
48
Weighting Scenario – Physics class with three PI targets:
Name | Weight | Student Proportion Met Target | Points Acquired
PI 1 | Building a Bridge Project | 50% | 68/80 | 42.5
PI 2 | Roller Coaster Design | 25% | 56/80 | 17.5
PI 3 | Egg Parachute | 25% | 40/80 | 12.5
Total Score = 72.5%
49
Template Section 3
50
Goal-Indicator-Measure
(Big Idea) SLO Goal
Indicator #1 → Assessment #1a, Assessment #1b
Indicator #2 → Assessment #2
Goal–Standards → Indicator → Performance Measures
Indicator #1 might relate to a growth/progress-monitoring PI, whereas Indicator #2 may be for a mastery PI.
51
Goal-Indicator-Measure
(Big Idea) SLO Goal
Indicator #1 → Assessment #1
Indicator #2 → Assessment #2
Goal–Standards → Indicator → Performance Measures
1-to-1 indicator-assessment ratio
52
Performance Measure - Descriptions
State the name of the assessment(s).
List the type of measure.
Explain the purpose; state what the Performance Measure should measure.
Identify the timeline and occurrence(s).
Scoring Tools should indicate the solution key, rubric, checklist, etc. being used to score the PM.
Administration & Scoring Personnel should state who is giving the test and who is scoring it.
Performance Reporting should state how others will know which students met the Performance Indicator(s).
In short: What's the test? Why am I giving it? How will it be scored? When will it be administered?
53
Template Section 4 – There need not be five; this is arbitrary. The suggestion is between two and five PIs.
54
Teacher Expectations
Definition: identifies, for each level (Failing, Needs Improvement, Proficient, Distinguished), the extent to which students are meeting the Performance Indicator Targets. These levels reflect the continuum established by the teacher prior to the evaluation period. Each level is populated with a percentage range so that there is a distribution of performance across levels. Based on the actual performance across all identified Performance Indicators, the evaluator will determine one of the four levels for the SLO.
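The percentage-range-to-level mapping described above can be sketched as follows. The cut ranges here are illustrative placeholders of my own; in practice the teacher and LEA set the actual ranges before the evaluation period.

```python
# Sketch of mapping a final SLO percentage to the four PA rating levels.
# The cut scores (50, 70, 90) are hypothetical placeholders, not official
# values -- the real continuum is established locally by the teacher.

def slo_rating(percent_met, cuts=(50, 70, 90)):
    """Return the rating level for the percentage of targets met.

    cuts: (needs_improvement_floor, proficient_floor, distinguished_floor)
    """
    ni_floor, prof_floor, dist_floor = cuts
    if percent_met >= dist_floor:
        return "Distinguished"
    if percent_met >= prof_floor:
        return "Proficient"
    if percent_met >= ni_floor:
        return "Needs Improvement"
    return "Failing"

print(slo_rating(72.5))  # Proficient (under these placeholder cuts)
```

Under these placeholder cuts, the 72.5% weighted total from the earlier scenario would land in the Proficient band; shifting the ranges shifts the band boundaries accordingly.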
55
Template Section 5
56
Review
57
Tools for Review
SLO Coherency Rubric
School Leader's SLO Checklist
Assessment QA Checklist
58
The Online Tool http://www.pdesas.org/
Use the Homeroom link at bottom right.
Click the RIA Homeroom site link in the top paragraph.
Register and log in.
60
SLO & Assessment Literacy
Post-Test