1
KY Science Assessment System Overview
KDE Science Team 11/2016 – 1/2017
2
Goals: to develop an understanding of…
the intent and purpose of each component of the assessment system: Classroom Embedded Assessment (CEA), Through Course Tasks (TCT), and the State Summative Assessment (SSA); the TCT component as a process for calibrating 3-dimensional science instruction and student outcomes; and to reflect on the processes and structures that are in place (or are needed) to identify ways to support the implementation of the 3 components of the assessment system. An acronym sheet was provided during the session. This resource can be accessed at this link: (under TCT Overview Documents at the bottom of the page).
3
ASSESSMENT vs. ACCOUNTABILITY
We begin with a discussion of assessment and accountability for 2 reasons: Educators tend to agree that various stakeholders confuse the terms for various reasons, one of which is the elevated role of state test scores as a measure of school/program quality. Assessment and accountability clearly do not mean the same thing, but the terms can become conflated. By recognizing that various stakeholders have varying reasons for interpreting the terms differently, we can promote conversation among those stakeholders and help develop common language, and hence a better understanding of issues within education. Today’s meeting (this PPT) is about an ASSESSMENT system, not about accountability. The new science assessment system is being designed to systematically improve science teaching and learning across all grade levels. The Commissioner has stated that if this system is effective, a similar system will be used in other disciplines (math, e/la, social studies…)
4
What do these words mean to …
A student? A teacher? A parent? An administrator? A community member? A legislator? ASSESSMENT ACCOUNTABILITY
5
What do these words mean to you? How are they defined?
assessment: the evaluation of the quality or ability of someone or something. accountability: responsibility; answerability.
6
3 Ideas to consider: Do various stakeholders confuse these terms?
Does it matter? If so, why? What can we do to reduce confusion / develop common understanding?
7
Assessment Systems old thinking / new thinking
8
Curriculum & Instruction
? Assessment Refine Evaluates Attainment We know that instruction begins with standards. The standards are supported by curriculum and instruction. Assessments are developed to evaluate where students are towards attainment of those standards. The assessments, then, help refine curriculum and instruction. What types of assessments are used and how are they used? KAS for Science supports Curriculum & Instruction
9
Old Thinking Assessment System Classroom Assessment
Since the passage of KERA, KY has had a state summative test for science; combined with classroom assessment, an assessment “system” could be said to exist, providing feedback about the effectiveness of curriculum and instruction for different purposes. Many districts added benchmark or common assessments to their “system” to provide additional information for various reasons (identifying struggling students relative to a benchmark, supporting PLC work, etc.). In many cases, however, the benchmark/common assessment was used as a predictor of performance on the state assessment, which in turn made classroom assessments a predictor of the benchmark assessment; thus science assessment at all levels tended to look like the state summative test. One problem is that this type of assessment is not the best way to get evidence of student performance, particularly if the goal of assessment is to improve student learning. Bottom line: in some cases this type of assessment system had the unintended consequence of limiting assessment effectiveness.
10
Assessment System – Old Thinking
Think about the results this type of system might produce... Does this system produce what we want for Kentucky education? Why or why not? Many say the “old system” produced good test takers but had the effect of stifling students’ creativity, curiosity, problem-solving skills, etc.
11
Science Assessment System
New Thinking Classroom Embedded Assessment Through Course Tasks State Summative Assessment evidence evidence evidence Each component of the new science assessment system is intended to provide different types of evidence about the effectiveness of science instruction so that instruction and assessment can improve systematically, over time. The evidence from each component will influence and calibrate the other components. The new science standards demand evidence of the 3 dimensions working together; this assessment system will support developing the capacity of educators and the assessments themselves. KAS for Science supports Curriculum & Instruction
12
Assessment Systems Old vs. New
Reflection Time… Some see 3 components in the new assessment system, the same number as in the “old system,” and also see TCTs as being the same as “benchmark/common assessments.” While some districts may be using common/benchmark assessments to achieve some of the outcomes of the TCT component, some purposes of the TCTs in the new system are clearly different, and it is extremely important that administrators and educators understand how to use this new component and, more importantly, how not to use it (not to evaluate teachers, students, schools, or districts). Some look at these models and interpret that there is not much difference. Why might that occur? What do you see as the biggest differences at this point?
13
Focus is using the 3 Dimensions to make sense of phenomena
Classroom Embedded Assessment Through Course Tasks State Summative Assessment evidence evidence evidence The 3 components of this system work together, providing different evidence to improve instruction and assessment systematically, but the components need commonality for this system to “work”: the evidence must share a common alignment in order to calibrate. We say that each component has a common focus of “using the 3 dimensions to make sense of phenomena.”* It takes experience and time to develop a useful understanding of what this phrase means. We will attempt to unpack this in the next few slides. *From an engineering perspective we would phrase this as “using the 3 dimensions to solve problems.” KAS for Science Focus is using the 3 Dimensions to make sense of phenomena supports Curriculum & Instruction
14
What does it mean to “use the 3 dimensions?”
What do we mean by “phenomena?”
15
What does it mean to “use the 3 dimensions?”
16
crosscutting concepts
KAS for Science 3 dimensions practices crosscutting concepts core ideas This graphic identifies the 3 dimensions of our science standards. The rope is a metaphor to show that these dimensions are intertwined, resulting in strong sense-making of scientific phenomena/problem solving.
17
KAS for Science 3-PS2-2 Make observations and/or measurements of an object’s motion to provide evidence that a pattern can be used to describe future motion. An example of a 3rd grade Performance Expectation, or science standard. The color coding identifies the 3 dimensions of the standard. The code in the beginning indicates 3rd grade, physical science, in particular – force & motion (3-PS2-2).
18
“Tools for accessing, processing, and explaining”
Science and Engineering Practices Asking Questions/Solving Problems Developing and Using Models Analyzing and Interpreting Data Planning and Carrying Out Investigations Using Mathematical and Computational Thinking Constructing Explanations / Designing Solutions Engaging in Argument from Evidence Obtaining, Evaluating, and Communicating Information “Tools for accessing, processing, and explaining” The science and engineering practices (SEP) are TOOLS that students use (the same tools that scientists and engineers use). 2 main points: (1) These tools are used to meaningfully access, process, synthesize, and explain information. (2) While some tools may seem very specific to the discipline of science, all disciplines (e/la, math, SS, etc.) use similar tools, and we should be intentional in our instruction so that students understand this. Consider which of these SEP (or tools) are “hands-on.” Many think of science as “hands-on” learning, which can have the unintended effect of stakeholders assuming that when kids are engaged in a hands-on experience they are learning science in a meaningful way. Only “carrying out investigations” is required to be “hands on.” “Modeling” might be “hands on” but doesn’t have to be, because models are more than physical reproductions. Effective use of these tools results in “minds-on” experiences. Helping students build proficiency with these tools is one key to effective science instruction.
19
Cause and Effect Systems “Sense-making Lenses”
Crosscutting Concepts Patterns Cause and Effect Structure and Function Scale, Proportion and Quantity Energy and Matter Systems and System Models Stability and Change It can be useful to think of the CCC in 3 general categories – patterns, cause and effect, and systems – as “sense-making lenses.” The crosscutting concepts (CCC) are ways that people think in order to make sense of things – lenses for sense-making. While there are 7 CCC defined in our standards, these 7 can be grouped into 3 more general categories in order to understand them more simply: patterns, causality, and systems thinking. These 3 ways of thinking are distinctly different and result in different approaches to problem solving. It can be a good professional learning experience for teachers to reason through placing each of the 7 into the 3 categories because it deepens understanding of the CCC. Clearly, students will use all 7 CCC in their K-12 science experiences and beyond, and the 7 CCC provide more clarity about the 3 general categories. Developing proficiency with the CCC – understanding how they are used to explain phenomena or solve problems – is very challenging but also very critical for developing lasting proficiency in sense-making/problem solving. Finally, while the CCC are called “crosscutting” because they cross all disciplines of science, they also cross all general disciplines (math, e/la, SS, etc.). We should be intentional in our instruction so that students understand this connection to other content areas and use these ways of thinking in all disciplines. This connection should also serve as a vehicle for interdisciplinary conversations among teachers.
20
Quick example. Content: Heat moves from hotter regions to colder regions (more commonly: heat moves from hot to cold). What is most effective for “learning” this content? When does that begin? Why does a frozen steak make the kitchen counter cold? Just knowing the “content” (heat moves from hot to cold) is not adequate to explain this, although you might be able to reason through to a correct answer on a multiple-choice test with just this content information. It is proficiency with the SEP and CCC that develops deep understanding of why the counter feels cold – lasting and transferable understanding.
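The content statement itself fits in a single rate equation. As a minimal sketch (not part of the original session materials; the conductance constant and the temperatures are illustrative assumptions), the steak-and-counter example can be expressed as:

```python
def heat_flow_rate(t_first_region, t_second_region, k=1.0):
    """Rate of heat transfer from the first region to the second, q = k * dT.
    Positive means heat flows from the first region into the second.
    k is a made-up illustrative conductance, not a measured value."""
    return k * (t_first_region - t_second_region)

counter_temp = 20.0   # room-temperature countertop, deg C (assumed)
steak_temp = -18.0    # frozen steak, deg C (assumed)

q = heat_flow_rate(counter_temp, steak_temp)
# q > 0: heat leaves the counter and enters the steak, so the counter
# surface cools. The counter "feels cold" because it is LOSING heat,
# not because the steak is giving off cold.
```

Notice the sense-making step is in the comment, not the arithmetic: the number alone does not explain why the counter feels cold, which is exactly the point of the slide.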
21
Message: It’s the intentional and systematic use of the practices and crosscutting concepts, developed over time, that give students the capacity to “own” this lasting understanding. Must begin in kindergarten and continue each and every year!
22
How do you know what to expect students to be able to do with respect to the SEP and CCC at any age / grade level? Too many educators do not know of this valuable resource. It is also useful for differentiating instruction because a learning progression is defined for each of the 8 SEP and 7 CCC. NGSS Appendices
23
Analyzing & Interpreting Data
This is a sample of one of the two pages for this SEP: analyzing and interpreting data. There are 8 SEP. There is a lot of information for educators to use in both instruction and assessment of the 8 SEP along the K-12 continuum. Appendix F can be found here: Appendix F
24
Analyzing & Interpreting Data
The grade band continuum for the 7 CCC is defined in Appendix G. Appendix G can be found here: Appendix G
25
What does it mean to “use the 3 dimensions?”
What do we mean by “phenomena?” A scientific phenomenon is anything you can observe and explain using scientific principles. An engaging phenomenon tends to make you wonder… But whether or not something is engaging can depend on how it is presented. One way of saying this is that “a phenomenon does not have to be phenomenal.” The SSA pilot and the TCT examples to come will provide more examples of phenomena that could be explored in science class or on assessments.
26
Consider your most important takeaways from what you have heard so far… Be prepared to share.
27
Curriculum & Instruction
Classroom Embedded Assessment Through Course Tasks State Summative Assessment evidence evidence evidence KAS for Science supports Curriculum & Instruction
29
Classroom Embedded Assessment Through Course Tasks
State Summative Assessment: Provide a sampling of a school’s science program’s level of achievement (based on the KAS for Science) and identify the percentage of students meeting expected levels of attainment, particularly as they explain phenomena, use models, and solve problems using practices, core ideas, and crosscutting concepts. This is the purpose statement for the SSA from the Science Assessment System document, found here:
30
Science State Summative Assessment Design Approach
PEs are “Bundled” – gather evidence of sense-making of phenomena or problem solving. A Storyline guides student sense-making. This “design approach” of bundling PEs and using a storyline to explore the phenomenon was used in the 7th grade pilot given in the spring of 2016. This same approach will be used in the 2017 field test at grades 4, 7, and 11.
31
7th Grade Pilot (Spring 2016) – one cluster developed for a 2 PE Bundle
07-PS3-5 Construct, use, and present arguments to support the claim that when the kinetic energy of an object changes, energy is transferred to or from the object. 06-ESS2-4 Develop a model to describe the cycling of water through Earth's systems driven by energy from the sun and the force of gravity. Note the 2 different grade levels in the bundle, and the 2 different science disciplines (physical science and earth science). In the piloted assessment, a phenomenon was explored in order to get evidence of all 3 dimensions of these 2 PEs/standards. The storyline within the assessment presents the phenomenon to be explored. The initial storyline is presented on the next 2 pages.
32
7th grade student Hannelore visits a museum that is located along a wide, flat stretch of river. The museum has a waterwheel in back. Waterwheel
33
Inside the museum, Hannelore learns that the waterwheel has powered the lights of the museum since 2004 – no electricity bills for years! She decides to explore how this could be. The phenomenon explored in this cluster is that this museum does not have to pay for electricity. The question is, where does the energy to power the lights come from? Yes, it comes from the waterwheel, and the waterwheel gets its energy from the river, but where does the river get its energy? Etc…
34
2017 Field Test for Science Assessment
Teacher-developed, phenomenon-based clusters of assessment items. Grades 4, 7 and 11. 6 selected response and 2 constructed response items per cluster (in general). 20 clusters were created at each grade level (4, 7, 11). Each student at grades 4, 7 and 11 will get a test booklet with 2 clusters. The 2 clusters for each booklet will vary, so it is unlikely (although possible) that 2 students in the same classroom will have the same 2 clusters. Each cluster is expected to take about minutes, but a cap of 70 minutes has been set for the field test. Selected response items may have more than four answer choices, and students may be asked to select 1, 2, or 3 answer choices for a selected response question. Students will be told how many to select; they will NOT be told to “select all that apply.” Constructed response means any question that must be hand scored: it could be a written explanation, a model, a flowchart, etc.

What’s different about the State Summative Assessment (SSA) for science?

Phenomenon-based item clusters. The field test of the SSA will utilize a phenomenon-based approach. Rather than being a collection of individual, unrelated items, questions will be grouped into clusters in which every question is related to the others by asking students to use their understanding of science and/or engineering design to consider a phenomenon or solve an engineering design problem. A phenomenon is anything that can be observed and that students can attempt to understand or explain. Each cluster of items will attempt to gather information about two or more standards that have been bundled together by the item writer. The phenomenon chosen by the item writers is one that allows the bundled standards to be assessed in a logical way. The assessment clusters use narrative text to lead students through the progression of questions.
This narrative text is referred to as the storyline; its purpose is to set up the context of the phenomenon and to link the items together in a way that makes sense to the student. For example, the phenomenon of stream erosion could be used to create a cluster of items. The two standards bundled together for measurement by this cluster could be an ESS (earth science) standard and one from ETS (engineering design), creating a cluster that asks students both to explain the reason a stream becomes muddy and to evaluate possible solutions to prevent this from happening. The storyline for this cluster might involve two friends noticing the creek near their house becoming muddy after a rain storm and attempting to explain why, then brainstorming ways to reduce the amount of sediment that washes into the stream.

One immediate difference students will notice when opening the field test booklets is this storyline. Rather than starting with Question 1 on the first page, each cluster begins with varying amounts of text that establish the phenomenon and/or provide the information students need to answer the questions that follow. For the hypothetical example above, the storyline might begin: Two friends, Marcus and Tammy, are walking down their street when they notice the stream that flows beside it is much higher than normal, and the water is dark brown and muddy. “Wow, look at the stream,” said Marcus. “I’ve never seen it that muddy before.” “Yeah,” said Tammy. “It was really clear last week. I wonder what changed to make it so muddy.” “Let’s see if we can find some information to help us figure it out.” This text might then be followed with a table of rainfall data, and then the first question requiring students to use the data to provide an answer.

Multi-dimensional items. The other difference, compared to previous science assessments students are familiar with, is the multi-dimensional approach to item development.
Item writers were asked to develop two- or three-dimensional items, even for multiple choice items. This means the items are designed to measure students’ ability to use the Science and Engineering Practices (SEP) and Crosscutting Concepts (CCC) as well as the Disciplinary Core Ideas (DCI) of the standards. Most science assessments in the past were heavily weighted toward what is traditionally thought of as science “content,” and few items attempted to measure the practice of science. On the field test, students will be asked to identify patterns, construct arguments, link cause and effect, etc., using the DCI as the context for applying those SEP and CCC. For example, compare: “Which of these most likely caused the stream to become muddy? a) Rain washes soil into the stream” with: “Tammy claimed that the rainfall data explains why the stream becomes muddy. What pattern in the rainfall data supports her claim? a) The creek was always muddy 2-3 days after it rained upstream.” In the second instance there is a focus on the student being able to identify a pattern in the data. Patterns are one of the Crosscutting Concepts, so the second question is an example of how a question can assess a CCC as well as the DCI.

Specifics of test construction. In general, each cluster will consist of six (6) multiple choice questions and two (2) open response. This is not absolute, as some clusters have more or fewer of both question types, as determined by the standards that were bundled to create the cluster. Some multiple choice questions require students to select more than one correct answer. Unless the item specifies otherwise, there is a single correct answer. The number of correct responses will be identified in every question as follows: For single-answer questions the quantifier “one” will be included in the question: (7) Which one location would be the best place to build a dam?
For multiple-answer questions the number of correct responses will be specified in a separate line immediately preceding the answer choices: (7) Which locations for a dam meet the design criteria? Select the TWO best answers. Multiple choice questions are not limited to a fixed number of answer choices. Although the large majority contain four (4), some contain more, especially where students are asked to select more than one answer. Constructed response questions on the field test ask for a greater variety of response types than previous assessments did. Students may be asked to draw and annotate a model, create a flow chart, create and explain a graph, or give some other type of response beyond a traditional written answer. In some cases, constructed response questions may incorporate multiple elements, such as several related short answer questions combined within the same answer space. For example: Part A. Where would the water collect if Gate A were placed in the following positions (your answer choices could include upper pool, lower pool, river, reservoir, or unchanged)? 1. Position X ________________________________________ 2. Position Y ________________________________________ 3. Position Z ________________________________________ Part B. Where would the water collect if Gate B were placed in the following positions (your answer choices could include upper pool, lower pool, river, reservoir, or unchanged)? 4. Position M ________________________________________ 5. Position Q ________________________________________ 6. Position R ________________________________________ Part C. Based on these results, what predictions can you make about the effect of Gate C on the system?

Item vocabulary. At all grade levels, the terminology of the questions reflects the language of the standards. This includes the wording of the SEP and CCC as well as any content-specific language contained in the Kentucky Academic Standards (KAS) for Science.
For instance, the first grade standard 1-LS1-1 contains the word “mimicking” and therefore it might be possible for this word to appear in a question. Fourth grade standard 4-PS4-1 contains “amplitude” and “wavelength,” so students might be expected to understand how both terms relate to the properties of a wave. Common terminology used in the SEP and CCC might also be expected to appear in questions, such as asking students to “construct an explanation” or to “generate a solution.” Since the Engineering Design (ETS) standards will be included in the field test, terms such as “criteria for success” and “constraints” may be included in questions. For instance, a student might be asked: Which one of these criteria for success would most likely be met by Joe’s proposed solution?
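The booklet rules above (20 clusters per grade, 2 clusters per booklet, pairs varying across students) can be sketched in a few lines. This is a hypothetical illustration of the combinatorics only, not KDE's actual forms-assembly procedure; the cluster names are placeholders:

```python
import itertools
import random

# 20 hypothetical clusters at one grade level (names are placeholders)
clusters = [f"cluster_{i:02d}" for i in range(1, 21)]

# Each booklet is an unordered pair of distinct clusters,
# so there are C(20, 2) = 190 possible booklets.
possible_booklets = list(itertools.combinations(clusters, 2))

def assign_booklet(rng):
    """Randomly pick 2 of the 20 clusters for one student's booklet."""
    return tuple(sorted(rng.sample(clusters, 2)))

rng = random.Random(0)  # seeded for a reproducible illustration
booklet = assign_booklet(rng)
```

With 190 possible pairs, two students in the same classroom drawing the same booklet is possible but unlikely, which matches the slide's description.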
35
KAS Science Field Test – Summative Item Cluster Field Test for Regular Assessment. Elementary/middle schools: scheduled March 13-17, 2017, for all students at grades 4 and 7. High schools: upon arrival (week of March 6 through week of March 27), administered at grade 11. All schools will be involved. Combined student booklet (no separate answer document). No student performance levels generated. OAA:DAS:js: 11/10/2016
36
State Summative Assessment
Reflection Time… In what ways is the new science summative assessment different from our previous assessments?
37
Let’s experience what we’re asking of students
38
Making sense of repeating motion
Work in small groups; 2 “pendulums” of different lengths Part A: working together, describe how one pendulum “swings” relative to the other. Make sure there is agreement within your team. Share your description with another group. Do you agree?
39
Part B: in your same small group,
quantify how each of the 2 pendulums “swings” so that you can compare with another group. Create a numerical description of the swinging characteristic of each pendulum. Share your numerical description for each pendulum with another group. Do you agree? Was your approach to “quantifying” the same, so that you could compare? Do you need to test further?
40
Reflection Based on this experience, can you predict how an even longer pendulum will swing? Can you make a general statement about the relationship between pendulum length and “swing speed?” Based on this experience, are there other things about pendulums that you wonder about?
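For facilitators who want a check on participants' quantified descriptions, the small-angle physics can be sketched as follows. The two lengths are assumed for illustration; the activity itself never requires the formula, which is the point of the contrast with a "typical" lab:

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Two hypothetical pendulum lengths, in metres
t_short = pendulum_period(0.25)
t_long = pendulum_period(1.00)

# Period grows with the square root of length: quadrupling the
# length doubles the period, so the longer pendulum swings slower.
ratio = t_long / t_short
```

This is the general statement the reflection asks for: swing speed decreases as length increases, following a square-root pattern (a Patterns and Cause-and-Effect lens on the same data the groups collected by counting swings).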
41
Compare this experience with “typical” science lab about Pendulums
What do you notice? What skills and competencies does each experience develop?
42
Instructional Shifts in Science
Reflection Time… Instructional Shifts in Science
43
Curriculum & Instruction
Classroom Embedded Assessment Through Course Tasks State Summative Assessment evidence evidence evidence KAS for Science supports Curriculum & Instruction
44
Classroom Embedded Assessment
Use organizer to collect your thoughts Based on your understanding of the words, what might classroom embedded assessment mean? If you went into a classroom where CEA is happening, what “things” might you see? What “type of interactions” might you see?
45
Classroom Embedded Assessment
Discuss with your neighbor; refine notes. Study the description of the CEA purpose from the SAS overview document and compare it with your notes. Reconcile your current understanding with the SAS description of CEA – critical to the system working. The following is the “purpose statement” from the SAS document. The entire document can be found here: Purpose of Classroom Embedded Assessment within the Science Assessment System: An ongoing process to provide opportunities for seeking and interpreting evidence of 3-dimensional science learning, for use by learners and their teachers to decide where the learners are in their learning, where they need to go, and how best to get there. CEA is the ongoing formative assessment process that KY educators have been working to understand and build capacity in, because research is clear that strong formative assessment practices have a significant impact on student learning/success. Outstanding formative assessment practice in the context of our science standards poses an additional challenge because of the focus on 3-dimensional sense-making or problem solving. Formative assessment around the SEP and CCC is new for many teachers and students, as is formative assessment around the nexus of any 2 or all 3 dimensions.
46
Classroom Embedded Assessment
Reflection Time… Classroom Embedded Assessment
47
Curriculum & Instruction
Classroom Embedded Assessment Through Course Tasks State Summative Assessment evidence evidence evidence KAS for Science supports Curriculum & Instruction
48
Curriculum & Instruction
Classroom Embedded Assessment Through Course Tasks State Summative Assessment Calibration Instruction Student Outcomes evidence evidence evidence The Through Course Tasks and the collaborative process that teachers should use to implement a TCT serve to calibrate understanding of the intent of the standards and expectations of student performance. So, not only will facilitation of the task tend to be calibrated appropriately due to the collaborative process that the teacher teams engage in, but this will also help teachers calibrate and refine their classroom embedded assessment. The calibration achieved through the TCT component will also potentially influence expectations for the SSA, and thereby serve as a calibration mechanism for the system. Obtaining evidence of a range of student performance at each grade level with common tasks will have a significant impact on the assessment system and on science teaching and learning. KAS for Science supports Curriculum & Instruction
49
The Through Course Task Process
Through Course Tasks vs. The Through Course Task Process: The intentional design of the TCT (the task itself), and how the task is intended to elicit the desired evidence of student performance, is important to the success of this component of the assessment system. Equally important is the collaborative process that teacher teams should use to effectively implement a TCT with their students.
50
Characteristics of TCTs – the task
Students learn through engagement with the task – the task is a sense-making experience. TCTs are 3-dimensional tasks specifically designed to get evidence of student competency in 2 dimensions – practices (SEP) and crosscutting concepts (CCC) – untethered from PEs/standards. Tasks are to be used formatively – the goal is for both student and teacher to understand strengths and improvement needs for the SEP and CCC evaluated.
51
Characteristics of the TCT Process
Implementing TCTs is a collaborative process for teacher teams to calibrate and refine strategies and expectations for student performance.
52
Planning for Task Facilitation
TCT Process: a collaborative process for calibrating and refining teaching and learning around rich tasks at every grade level. Planning for Task Facilitation; Facilitating the Task; Post Task Analysis. The TCT Process is a 3-step process intentionally designed to achieve specific outcomes at each step. Page 1 of the TCT Facilitation Process Planning document identifies the “desired outcomes” for each stage in the process. Pages 2 and 3 identify “actions” and “possible strategies” for achieving the desired outcomes of each stage. This document is intended to be a tool for teacher teams to use to help them with TCT implementation, something that provides guidance for what they are trying to accomplish. Depending on the experience levels of the teacher teams (or individual members of a team), the team may choose particular aspects of each stage to focus on during their initial TCT experience, in order to systematically develop their capacity and not feel overwhelmed. The TCT Process document can be found here:
53
Intentionally Designed
Each step aligns with one of the Framework for Teaching domains: Planning for Task Facilitation with Domain 1 (Planning – PLC team planning), Facilitating the Task with Domain 3 (Instruction), and Post Task Analysis with Domain 4 (Professional Responsibility – data teams).
54
Planning for Task Facilitation
Planning for Task Facilitation Facilitating the Task Post Task Analysis Desired Outcomes Teachers are prepared to optimally facilitate to each learner’s needs in order to collect accurate evidence of each student’s measure of proficiency in 3-dimensional sense-making within the context of the TCT Teachers deepen their understanding of 3- dimensional sense-making of scientific phenomena and engineering solutions Teachers collect defensible evidence of each student’s competencies in 3 dimensional sense-making for the TCT in order to support each student’s growth. Teachers increase their competencies with task facilitation – meeting each student in their “zone” with appropriate feedback questions, documentation of supports given, etc. Kentucky teachers define a grade-level continuum of student performance for the task in 3- dimensions Teachers have specific information to refine curriculum and reflect upon the impact of instructional decisions on each student for the dimensions identified in the TCT Teachers expand their understanding of the connection between student performance and facilitation strategies The next several slides attempt to explain the layout of this document in a way that is less overwhelming than the document all at once.
57
Planning for TCT Facilitation
Desired Outcomes:
- Teachers are prepared to optimally facilitate to each learner's needs in order to collect accurate evidence of each student's proficiency in 3-dimensional sense-making within the context of the TCT.
- Teachers deepen their understanding of 3-dimensional sense-making of scientific phenomena and engineering solutions.
Actions (what) – Collaboratively, teacher teams:
- Develop deep understanding of the "science behind the task" and grade-appropriate DCI expectations.
- Develop understanding of how students effectively use the SEPs and CCCs for sense-making in this task, as grade appropriate.
- Develop appropriate feedback questions and/or other strategies to be used when facilitating the task.
- Develop strategies to document supports used during task facilitation.
Possible Strategies (how):
- Complete the TCT as a learner – compare understanding of the task through the lens of the success criteria in order to understand expectations.
- Identify the phenomenon within the task and consult resources to assure deep understanding of the associated science concepts.
- Collaborate to generate, review, and refine feedback questions and other strategies.
- Identify potential "trouble spots" and plan for possible misconceptions.
- Create/obtain a collection device to document supports given during facilitation.
- Create/obtain a facilitation checklist to keep task facilitation focused.
58
Example Task: This is a sample 1st-grade task. All tasks and corresponding materials can be found on the SharePoint site:
59
Example Task:
60
Example Task:
61
Another Example Task: A family is considering moving from Lexington to San Francisco. Mom wants to know how the climate will be different. Study temperature data for both cities and make a display for Mom so she can understand how the weather will differ, by season. Now that you've studied the data, develop a question whose answer will help you understand why these two cities are different. Justify your question. This is a sample middle school task. All tasks can be found on the SharePoint site:
63
As you evaluate this task, please do the following:
1. Identify the phenomenon within the task – what makes the task scientifically interesting or something to figure out?
2. Identify the SEPs and CCCs that students will use to complete the task successfully.
3. Identify the evidence of student knowledge and abilities the task will elicit. Is the evidence likely to allow discrimination among students (a learning progression)?
4. Consider how students might be supported to be successful without compromising the evidence collected.
64
Planning for Task Facilitation
TCT Process: a collaborative process for calibrating and refining teaching and learning around rich tasks at every grade level.
1. Planning for Task Facilitation
2. Facilitating the Task
3. Post Task Analysis
65
Planning for Task Facilitation
Planning Strategies:
- Which parts are best done independently? Done before collaboration?
- Which must be done collaboratively for the best outcome?
66
Success criteria and progression
What is the task asking the students to do? What would an appropriate student response look like for the practices being used in the task? What tools are available to guide this work? Planning for Task Facilitation
67
Planning for Task Facilitation
TCT Process: a collaborative process for calibrating and refining teaching and learning around rich tasks at every grade level.
1. Planning for Task Facilitation
2. Facilitating the Task
3. Post Task Analysis
68
Consider: Why is it important for teachers to…
- use questions to support thinking?
- scaffold to meet learners' needs?
- collect desired evidence?
Facilitating the Task
69
Facilitating the Task
Emphasize that the TCT is facilitated with students in order to get accurate evidence of each student's capacities with respect to the assessment intent – you don't simply print it and hand it out. Please see the TCT Facilitation Guidance document for additional clarification. It can be found here: Facilitating the Task
70
Planning for Task Facilitation
TCT Process: a collaborative process for calibrating and refining teaching and learning around rich tasks at every grade level.
1. Planning for Task Facilitation
2. Facilitating the Task
3. Post Task Analysis
71
Analyzing Student Work
Teachers collaborate to:
- Evaluate student work relative to the success criteria and define a continuum within the team
- Refine instructional practices based on their shared experiences with the process
- Expand their understanding of instructional decisions
Post Task Analysis
72
Consider these questions regarding TCT support:
- What structures and processes do you currently have in your district/school that would support effective TCT Process implementation?
- What challenges do you anticipate for teachers in your district/school in engaging with the TCT Process?
- What changes do you need to make to support teachers in your district/school?
73
TCT Field Test – 1) Teacher teams at each grade level have developed tasks that will be used in a field test through March.
- NOT exemplar tasks
- teacher developed (not KDE), used with students
- access through SharePoint
- tasks can be modified provided the intent of the assessment is not compromised (see the facilitation guidance document on SharePoint)
74
Must be a Kentucky educator to have access to SharePoint. A district address and password are required for access. Note: Important information is shared in the TCT Facilitation Guidance document; you will find answers to the most frequently asked questions there. It is recommended that you view this site using "Classic SharePoint" when accessing TCT materials.
75
TCT Field Test – 2) All Kentucky teachers using the Science KAS are expected to:
- use a task with their students
- use the 3-step collaborative process
- K through high school (high school typically by discipline)
- students experience a 3-dimensional task / teachers experience the collaborative process
Further information and resources are available on the KDE Science Assessment page. More specific guidance will be provided by the Office of Assessment and Accountability.
76
TCT Field Test – 3) Each district will retain one sample of student work at each grade level for submission to KDE:
- any representative sample
- evidence of student sense-making
- not to evaluate a district / school / teacher
- to define student capacities and a learning continuum
- to inform future professional learning
Many have had questions about what is wanted for the student work submission. The most important factor to consider is: does the work give evidence of student thinking? Ideally, the work submitted would show typical misconceptions or areas of struggle for students at that grade level.
77
The TCT component of the science assessment system is…
- NOT a common task that teachers drop on students' desks to complete without interaction
- NOT a mechanism for administrators to use to evaluate teachers
- NOT just about the TASK – it's about the PROCESS that uses the task in an optimal formative way
- NOT for the "A" in Accountability (it is for the "a")
78
Reflection Time… Through Course Tasks
79
Quiz Time… Other questions?