Comprehensive Assessment and Data System in GCISD
- Summative Assessment: once per year (STAAR, End of Unit, Semester Exams, ACT/SAT)
- Data About the Learner: 2-4 times per year, purchased assessments (MAP, TELPAS, CogAT, DRA2, progress monitoring)
- District Common Assessments: end of unit and/or grading period (district created)
- Curriculum-Based Assessments: 1-4 times per month (teacher/team created)
- Formative Classroom Assessments (AFL strategies): daily benchmarks and cornerstone tasks; Assessments for Learning
Break
Redefining education because YOUR future matters today!
Using Heat Maps: for learning | for learners
How did we do?
Performance standards for EVERY Grade 3-8 STAAR assessment administered in 2011-12 or 2012-13: Level II, Phase 1.
[Chart: 2012 vs. 2013 passing rates]
p. 5: So many numbers! How can we use data like these to guide decision-making and development? (≈75%, ≈55%, ≈85%, ≈62%)
Why is the Final Recommended Standard so high?
English III | Algebra II | Grade 2 | Grade 5
Rethinking Scores
- 85: Well Prepared
- 75: Sufficiently Prepared (Gr. 3-8 Final)
- 62: Getting There (Sufficiently Prepared, Phase 1)
- 55: Sufficiently Prepared (Gr. 3-8 Phase 2 | EOC Final)
Preview
How does this inform our work?
- Level II, Phase 1: kids and community
- Level II, Final: planning and professional development
Start with what we are helping kids learn: the TEKS (Readiness, Supporting, Process)
KNOW so they can GROW
Know the Standards…
1. Name a hard-to-teach readiness standard.
2. What concept(s) are being taught?
3. What do students learn in the previous two grades to prepare them?
4. How will students use the learning next year?
Readiness Standards
- ≈30% of the assessed TEKS, but ≈65% of STAAR test items
- Require IN-DEPTH instruction of BROAD and DEEP ideas
- 23 concepts
- More complex to teach; consider pacing
Validating the System
Which Readiness Standards are hardest to teach? Ask teachers!
General consensus? Would you expect this in all subjects? What does it mean if the lowest average score is a 3.8?
What are the 4 hardest to teach?
Teacher Perception (hardest): 5.2A, 5.2C, 5.3B, 5.3C
What are the 3 easiest to teach?
Teacher Perception
- Easiest: 5.3A, 5.8A, 5.13B, 5.5A, 5.10C, 5.12B
- Hardest: 5.2A, 5.2C, 5.3B, 5.3C
Validating the System
Does teacher perception match student performance?
What do you expect to find?
Readiness
Student Performance Data
- Which SEs are the hardest to learn?
- Which SEs are the easiest to learn?
Does Perception Match Performance?
- Perceived easiest: 5.3A, 5.8A, 5.13B, 5.5A, 5.10C, 5.12B
- Perceived hardest: 5.2A, 5.2C, 5.3B, 5.3C
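The perception-versus-performance match-up can be sketched as a simple set comparison. The SE codes below are the perceived-hardest list from these slides; the performance list is hypothetical data for illustration only:

```python
# Sketch: compare teacher-perceived hardest SEs with the SEs students actually
# scored lowest on. Overlap validates perception; symmetric difference flags
# SEs that only one source identified.
perceived_hardest = {"5.2A", "5.2C", "5.3B", "5.3C"}       # from the deck
performance_hardest = {"5.2A", "5.2C", "5.3B", "5.3C"}     # hypothetical data

matches = perceived_hardest & performance_hardest          # agreement
mismatches = perceived_hardest ^ performance_hardest       # disagreement
print(sorted(matches))
print(sorted(mismatches))
```

With real campus data, a non-empty `mismatches` set is the interesting result: it points to standards where instruction felt hard but students performed well, or vice versa.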
5.5A 5.13B ✔
What if your data looked like this? 5.2A 5.2C 5.5A 5.12B 5.3A 5.8A 5.13B 5.3B 5.3C 5.10C 5.2A 5.3B 5.3C 5.10C 5.2C 5.8A 5.13B 5.3A 5.5A 5.12B
5.3B | 5.3C | 5.10C | 5.13B ✔ 5.2C | 5.5A | 5.12B
Heat Maps (www.dmac-solutions.net)
1. Log in
2. State Assessment > Instructional Reports > Reporting Category SE Performance
3. Year: 2012 and/or 2013
4. Subject: Mathematics
5. Grade: 5, 8, Algebra I
6. Test: March 2012 or April 2013 (English)
7. Generate
Heat Map Analysis
- Reporting categories: areas of concern?
- Reporting categories: areas of strength?
- What specific readiness standards do we need to continue working on?
- What specific supporting standards do we need to continue working on?
- Which red readiness standard comes first in our curriculum documents?
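DMAC renders this SE-performance report as a color-coded heat map. As a rough illustration only (the SE codes come from this deck, but the percent-correct values and the red/yellow/green cutoffs are invented, not DMAC's), the same triage can be sketched like this:

```python
# Sketch: color-code SE percent-correct scores the way a heat map report does.
# Thresholds (red < 60 <= yellow < 80 <= green) are illustrative assumptions.
scores = {"5.2A": 54, "5.2C": 58, "5.3B": 71, "5.3C": 62, "5.8A": 88}

def heat(pct: int) -> str:
    """Map a percent-correct value to a heat-map color band."""
    if pct < 60:
        return "red"
    if pct < 80:
        return "yellow"
    return "green"

for se, pct in sorted(scores.items()):
    print(f"{se}: {pct}% -> {heat(pct)}")
```

Sorting red standards by their position in the curriculum documents then answers the last analysis question above: which red readiness standard comes first.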
Unwrapping the Standards: a necessary component of the IDP process
1. Write the standard; underline the nouns (concepts) and circle the verbs (skills).
2. Answer the Readiness Criteria questions.
3. Identify academic vocabulary.
4. Identify the cognitive level of difficulty using Bloom's Taxonomy.
5. Content builder: What content do students need to know to connect to this new learning? What do they come with? How will they use it in the future?
6. Distractor factor: common errors.
7. What is the BIG IDEA of the readiness standard?
Lunch
Essential Components of a CBA: Know Your Purpose
- To find out what students know and are able to do
- To determine where students are on the learning continuum and how to support them in moving forward
- To gather the evidence needed to make inferences about student learning and teaching
Three Types of Item Formats
Selected Response:
- Multiple choice
- True/false
- Matching
- Short answer or fill in the blank
Constructed Response
- Includes short-response and extended-response items
- Requires students to organize and use knowledge and skills to answer the question or complete a task
- More likely to reveal whether students have gained an integrated understanding of the readiness standards
- Requires a scoring guide or rubric to evaluate the degree of student proficiency
Assessing Essential Understanding
- We have to understand the BIG IDEA
- Essential understanding questions help teachers determine whether students have grasped conceptual knowledge
- Often begin with "how" or "why"
- Open-ended
- Require a scoring guide or rubric
Selected Response
Reasons for:
- Better content-domain sampling
- Higher reliability
- Greater efficiency
- Objectivity
- Mechanical scoring
Reasons against:
- Emphasis on learning isolated facts
- Inappropriate for some purposes
- Lack of student writing
How to edit questions in DMAC: use a question stem to turn the question into a true/false, matching, or fill-in-the-blank item.
Example: 5.03B/5.14B, Dual Readiness, DOK 2
Q: Mr. Cantu will put 1 flag on each table in the cafeteria for a school event. The cafeteria has 15 rows with 5 tables in each row and 12 rows with 4 tables in each row. Mr. Cantu already has 94 flags. How many more flags does he need to buy?
A. 48  B. 19  C. 27  D. 29
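The item's arithmetic can be verified directly (and the wrong choices appear to mirror common errors, the "distractor factor": 48 is one array alone, 27 is 15 + 12):

```python
# Verify the key: total tables minus flags already on hand.
tables = 15 * 5 + 12 * 4    # 75 + 48 = 123 tables in the cafeteria
flags_needed = tables - 94  # Mr. Cantu already has 94 flags
print(flags_needed)         # 29, choice D
```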
Number of Items: Guidelines
- Remember the purpose of your assessment.
- Ask, "How many total items do I need in order to make an accurate inference about what students know and can do?"
- Limit the total number of items so that student papers can be scored quickly and the results used right away.
Assessment Blueprint
Resources for CBA Items
- Textbook questions (that meet the criteria for well-written items)
- Teacher created
- Internet educational resources and organizations
- DMAC
Tools for Checking Item Quality
- Checklist of guidelines for evaluating assessment items
- Common Formative Assessment Scoring Guide
- Design Team Reflection
Data Teams Process (part of the PLC work): 5 Steps
1. Collect and chart data and results
2. Analyze strengths and obstacles
3. Set a S.M.A.R.T. goal for student improvement
4. Select effective teaching strategies
5. Determine results indicators
Model of a CBA
- Based on one readiness standard and no more than two supporting standards
- One or more selected-response types (4-7 questions)
- 1-2 extended-response items
- 1-2 essential questions
- Scoring guide/rubric embedded in the assessment for students to refer to
- Students can (on average) complete it in 20 minutes or less
* Using a multiple-measure assessment enables educators to make more accurate inferences.
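The counts in this model lend themselves to a simple blueprint check when drafting a CBA. A minimal sketch, with hypothetical item lists and a helper name (`check_cba`) of my own:

```python
# Sketch: check a draft CBA against the model above -
# 4-7 selected-response items, 1-2 extended-response items, 1-2 essential questions.
def check_cba(selected, extended, essential):
    """Return a list of blueprint violations for a draft CBA."""
    issues = []
    if not 4 <= len(selected) <= 7:
        issues.append("selected-response count should be 4-7")
    if not 1 <= len(extended) <= 2:
        issues.append("extended-response count should be 1-2")
    if not 1 <= len(essential) <= 2:
        issues.append("essential-question count should be 1-2")
    return issues

draft = check_cba(selected=["q1", "q2", "q3", "q4", "q5"],
                  extended=["task1"],
                  essential=["eq1"])
print(draft)  # [] -> the draft matches the model
```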
Year-Long Plan
Fall: 3 standards
- Process
- Selected-response questions
- Data Teams process in PLCs
Spring: 3 standards
- Constructed-response items
- Essential questions
- Scoring guides
Reflections Questions? Comments?
Part 2
Constructed Response
Reasons for:
- Provides more valid inferences about student understanding than selected-response items
Reasons against:
- Takes longer to score
- Dependent on student writing proficiency
- Can be a challenge to score accurately
Example (revisited): 5.03B/5.14B, Dual Readiness, DOK 2
Q: Mr. Cantu will put 1 flag on each table in the cafeteria for a school event. The cafeteria has 15 rows with 5 tables in each row and 12 rows with 4 tables in each row. Mr. Cantu already has 94 flags. How many more flags does he need to buy?
A. 48  B. 19  C. 27  D. 29
Session 5: Scoring Guides for Constructed-Response Items
- "Scoring guide" is synonymous with "rubric"
- A set of general and/or specific criteria used to evaluate student performance
- Describes "proficiency" as having met the standard
- Identifies the degree or level of proficiency a student achieves at the time of scoring
Scoring guides help ALL students succeed!
- Performance criteria are shared before students begin their work
- Specific language that is understood by all
- Referred to frequently during completion of the task
- Used to assess the completed task
- Expedite the evaluation of student work to help provide timely feedback
Scoring Guide Strategies
- Specificity is critical
- Reliability comes from consistency in wording and format
- Clearly link criteria to standards and assessment items/tasks
- The scoring guide and task requirements should fit hand-to-glove
Criteria
Quantitative:
- Proficient = 3 supporting details
- Exemplary = 4 supporting details
Qualitative:
- Proficient = identifies the main character
- Exemplary = relates the main character to another character in the story, noting similarities and differences
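The quantitative criterion maps directly to code. A minimal sketch using the slide's counts (proficient = 3 supporting details, exemplary = 4) and the deck's Progressing/Beginning levels, with the cutoffs for those lower levels assumed for illustration:

```python
# Sketch: map a count of supporting details to a four-level scoring guide.
# Proficient/Exemplary counts are from the slide; lower cutoffs are assumptions.
def quantitative_level(supporting_details: int) -> str:
    """Return the scoring-guide level for a supporting-detail count."""
    if supporting_details >= 4:
        return "Exemplary"
    if supporting_details == 3:
        return "Proficient"
    if supporting_details >= 1:
        return "Progressing"  # assumed cutoff, not from the deck
    return "Beginning"

print(quantitative_level(4))  # Exemplary
```

Qualitative criteria resist this kind of automation, which is exactly why the deck stresses specific, shared language in the rubric itself.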
Avoid Subjective Language
- Some
- Few
- Good
- Many
- Most
- Little
- Creative
Begin with Proficiency
- Decide the criteria for this level
- Review the task requirements and list the criteria
- Scoring guide criteria should mirror the task requirements
Exemplary Level
- Great for differentiation
- Invites students who need a challenge to delve deeper into the task
- Enables students to show "all they know" relative to the task
- Should begin the first line with "All proficient criteria met, PLUS:"
- Consider how each proficient criterion could be enhanced, quantitatively and qualitatively, so students understand how to go above and beyond the proficient level
Progressing and Beginning Since the goal is proficiency, design the criteria for the remaining two levels in relation to proficiency. This keeps student attention focused on the proficient criteria.