Assessment “We assess students not merely to evaluate them, but to improve the entire process of teaching and learning.” - Douglas B. Reeves, Making Standards Work

Learning Objective Participants will be able to create more rigorous assessments by analyzing the content and cognitive level of state released test questions.

Why Assessment Training? Teachers and campuses are overwhelmed by the number of assessments they have to give. How do we target high-priority TEKS? How is a good unit assessment constructed? Individual teacher assessments should follow the guidelines of quality assessment.

Agenda: Six Steps to Creating Quality Assessments
1. Data
2. Content
3. Processes
4. Review Examples
5. Item & Test Leveling
6. Item Design

Look at the Data Review the District Learning Profile. Which SEs are the lowest? Which SEs had the most questions?

Analyzing the content, concepts, processes, and skills that will be assessed (Cognitive and Procedural Knowledge)

TEKS and Assessment: Things to Remember The wording of the standard tells us WHAT CONTENT will be assessed on STAAR and AT WHAT LEVEL it will be assessed.

Cognitive and Content Expectations
Content: the content items for which students must demonstrate understanding at the appropriate cognitive level in order to adequately meet the standard.
Cognitive: the level at which students are expected to perform in order to adequately meet the standard. Determined by the verbs used in BOTH the Knowledge and Skills statements and the Student Expectations.

What Should Students Be Able to DO? Look at the cognitive level of the verb. If the cognitive level of the SE is UNDERSTAND, what does that mean students have to be able to do? Understanding: constructing meaning from different types of messages, be they written or graphic, through activities like interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining.

ELAR – Grade 8

Bundling and Dual Coding

Brain Research and Dual Coding: Application of Knowledge and Building Schema. The brain learns new knowledge (content) by attaching that knowledge to existing schema. The brain builds schema by applying (do) conceptual and content knowledge in a variety of novel ways. You can most effectively test conceptual knowledge through application questions.

STAAR and Dual Coding
Math = 75% of items
Science = 40% of items
Social Studies = 30%
English = 60%??? Figure 19
As we review the following examples, think about the implications of test and item design.

ELAR: Dual Coding with Figure 19

Your Turn: ACTIVITY
Look at the STAAR released items in your handout.
Identify what content is being assessed.
Identify the skill being assessed.
Identify how the student has to apply content with the selected skill.
Discuss with your group.

STAAR Overview: Generalizations
- Verbiage in the questions is complex and reflective of the TEKS
- Supporting Standards REALLY are being tested!
- More questions at a higher level
- Inference from stimulus; no direct answers (no clues or cues)
- Complex distractors: 4 viable answer choices
- Dual coding is prevalent
- Greater depth of content knowledge required to answer each question

Creating an Assessment Blueprint

Step 5: Determining Cognitive and Procedural Difficulty Levels Aligning Item and Test Difficulty to STAAR

Determining Difficulty Level: Cognitive Difficulty (verbs)
EASY: Remember, Understand
MEDIUM: Apply, Analyze
HARD: Evaluate, Create
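
To make the leveling concrete, here is a minimal sketch in Python of the verb-to-band lookup described above; the dictionary and function names are ours, not from the presentation.

```python
# Map each Bloom's-style verb to the difficulty band given on the slide:
# Remember/Understand = EASY, Apply/Analyze = MEDIUM, Evaluate/Create = HARD.
COGNITIVE_BANDS = {
    "remember": "EASY", "understand": "EASY",
    "apply": "MEDIUM", "analyze": "MEDIUM",
    "evaluate": "HARD", "create": "HARD",
}

def cognitive_difficulty(verb: str) -> str:
    """Return the cognitive difficulty band for a standard's verb."""
    return COGNITIVE_BANDS.get(verb.lower(), "UNKNOWN")

print(cognitive_difficulty("Analyze"))  # MEDIUM
```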

Determining Difficulty Level: Procedural Difficulty. This is basically: how many mental processing steps does the student have to go through to answer the question? The greater the number of processing steps, the higher the difficulty level.

Determining Difficulty Level: Procedural Difficulty
EASY: The item includes only the stem and the answer choices.
MEDIUM: The item includes a stimulus piece (a graphic, short reading selection, map, etc.). The student only has to interpret the stimulus or pull information from it to select the correct answer.
HARD: The item includes a stimulus piece (a graphic, short reading selection, map, etc.). The student has to infer, analyze, summarize, etc., and apply that to the stem or answer choices to select the correct answer.
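
The same rubric can be read as a small decision rule. Here is an illustrative sketch, assuming two yes/no judgments about the item (the flags are hypothetical simplifications, not part of the original material):

```python
def procedural_difficulty(has_stimulus: bool, requires_inference: bool) -> str:
    """Classify procedural difficulty using the rubric above.

    EASY:   the item has only a stem and answer choices (no stimulus).
    MEDIUM: a stimulus is present; the student only interprets it or
            pulls information from it.
    HARD:   a stimulus is present; the student must infer, analyze, or
            summarize, then apply that to the stem or answer choices.
    """
    if not has_stimulus:
        return "EASY"
    return "HARD" if requires_inference else "MEDIUM"

print(procedural_difficulty(True, False))  # MEDIUM
```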

Your Turn: Determining Item Difficulty
1. Look at the following items.
2. Identify the cognitive difficulty of the item.
3. Identify the procedural difficulty of the item.

Determining Item Difficulty

Item Writing Checklist

Demonstration of Learning After determining the cognitive and procedural difficulty of a test item, learners will rewrite the question so that it is either up a level or down a level.

Creating an Assessment Blueprint Identify the process/skills SE that you are pairing with the content SE. Write these on the Blueprint. Use a variety of processes and skills on the test and throughout the year.

Design Your Test So… How many items per SE? Cognitive level: make 70% of the test easy to medium and 30% difficult. Procedural level: make 60% easy and 40% difficult. Use the same vocabulary throughout.
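
For planning purposes, those splits can be turned into concrete item counts. A quick sketch (the function name and rounding choice are ours):

```python
def blueprint_mix(total_items: int) -> dict:
    """Turn the slide's 70/30 cognitive and 60/40 procedural splits
    into target item counts for a test of the given length."""
    return {
        "cognitive_easy_medium": round(total_items * 0.70),
        "cognitive_difficult": round(total_items * 0.30),
        "procedural_easy": round(total_items * 0.60),
        "procedural_difficult": round(total_items * 0.40),
    }

print(blueprint_mix(20))
# {'cognitive_easy_medium': 14, 'cognitive_difficult': 6,
#  'procedural_easy': 12, 'procedural_difficult': 8}
```

Note that each item carries both a cognitive and a procedural level, so the two splits describe the same set of items along two different dimensions.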

Next Steps: What are the implications of today’s learning for your campus? What are things you will change on your assessments? What are things you are still pondering? What is something you want to know more about?

Bonus Slides: Examples of Higher-Level Stems
- Which is an example of…
- Who would most likely have written (asked, said…), followed by a quote
- Analogies
- Fill in the missing part of this graphic organizer
- Which of the following does not belong?
- What is the best category for the following?
- Give a series of clues to the name of a person, place, etc. (riddle format)
- Cloze passage
- Hypothetical situation – analysis
- Sequencing by Roman numeral
- Who would have been helped by (law, invention, organization – hypothetical)

Use a Variety of Stimuli
- Textbox of primary source material
- Graphs and charts
- Pictures or illustrations
- Spoke diagrams
- Multiple visuals: comparison maps, charts
- Political cartoons
- Timelines
- Flow charts
- Graphic designs
- Headlines
- Speaker questions

Qualities of a Good Multiple-Choice Item
- Effectively written stem: focused on one idea, clear and concise, phrased as a direct question, phrased positively
- Has only one best and correct answer
- Contains options that are plausible, but incorrect
- Has four choices that are homogeneous, parallel in structure, and logical

Qualities of a BAD Multiple-Choice Item
- Tests trivial information
- Contains unnecessary/confusing information in the stem or options
- Is tricky or cute
- Gives cues or clues to the correct answer
- Does not have plausible answers
- Poses a question for which many defensible answers are possible
- Contains a bias toward or against a group of individuals

Develop Plausible Distractors
Plausible distractors include…
- common misconceptions
- statements that sound logical but that are not based on the material given in the stimulus and question
- common or familiar phrases
- statements that are true but that do not answer the question
Be sure there is not more than one plausible answer.

Organization of Answer Choices
- Alphabetical
- Ascending or descending order
- Shortest to longest
- Same order as presented in stimulus
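
As a small illustration, the first two content-independent conventions could be applied mechanically; this sketch is ours, and the scheme names are hypothetical labels for the bullets above.

```python
def order_choices(choices: list[str], scheme: str = "alphabetical") -> list[str]:
    """Order answer choices by one of the conventions listed above."""
    if scheme == "alphabetical":
        return sorted(choices, key=str.lower)
    if scheme == "shortest_to_longest":
        return sorted(choices, key=len)
    # Ascending/descending and stimulus order depend on the item's
    # content, so they are left to the item writer.
    return choices

print(order_choices(["delta", "Alpha", "charlie", "bravo"]))
# ['Alpha', 'bravo', 'charlie', 'delta']
```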