Designing a Pool of Items

Workshop Flow
The construct of MKT
– Gain familiarity with the construct of MKT
– Examine available MKT instruments in the field
Assessment Design
– Gain familiarity with the Evidence-Centered Design approach
– Begin to design a framework for your own assessment
Assessment Development
– Begin to create your own assessment items in line with your framework
Assessment Validation
– Learn basic tools for how to refine and validate an assessment
Plan next steps for using assessments

Assessment Development Process
Domain Analysis → Domain Modeling (Design Pattern) → Define Test Specs → Define Item Template → Define Item Specs → Develop Pool of Items → Collect/Analyze Validity Data → Refine Items → Assemble Test → Document Technical Info
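Teams tracking where each item stands in this process can encode the stages as a plain ordered list. This is a hypothetical sketch (the stage names follow the process above; the helper function is our own, not part of the workshop materials):

```python
from typing import Optional

# Stages of the assessment development process, in order.
STAGES = [
    "Domain Analysis",
    "Domain Modeling (Design Pattern)",
    "Define Test Specs",
    "Define Item Template",
    "Define Item Specs",
    "Develop Pool of Items",
    "Collect/Analyze Validity Data",
    "Refine Items",
    "Assemble Test",
    "Document Technical Info",
]

def next_stage(current: str) -> Optional[str]:
    """Return the stage that follows `current`, or None at the end."""
    i = STAGES.index(current)
    return STAGES[i + 1] if i + 1 < len(STAGES) else None
```

For example, `next_stage("Develop Pool of Items")` points a team to collecting and analyzing validity data next.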

From Design to Development
Use information from the Design Pattern to generate items (and use the process of generating items to refine the Design Pattern).
– What the test-taker should know: Focal Knowledge, Skills, and Abilities
– Evidence we can collect to show they know it: potential observations, potential work products, potential rubrics
– Kinds of situations that can evoke this evidence: characteristic features, variable features

Item Development Steps
Based on the foundational design work…
1. Content experts develop draft items
2. Refine draft items – “hygiene”
3. Collect validity data on draft items
4. Use validity data to refine items
5. Assemble and document the assessment

Assessment Item Anatomy
Multiple choice (stem, choices, distracters):
Essential parts of a butterfly are…
A. Wings, legs, teeth
B. Eyes, wings, legs
C. Nose, knees, ears
D. Fingers, frontal lobes, hair
Constructed response, closed-ended (prompt):
Butterflies are members of the _______ phylum.
Constructed response, open-ended (prompt):
What are the closest relatives of the butterfly, and why?
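This anatomy maps naturally onto simple data structures, which is useful when a pool of items is stored or exchanged electronically. The sketch below is illustrative only; the class and field names are our own, not part of the workshop materials:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class MultipleChoiceItem:
    """A multiple-choice item: a stem plus labeled choices.

    Every choice other than the keyed (correct) one is a distracter.
    """
    stem: str
    choices: Dict[str, str]  # label -> choice text, e.g. {"A": "Wings, ..."}
    key: str                 # label of the correct choice

    def distracters(self) -> Dict[str, str]:
        return {k: v for k, v in self.choices.items() if k != self.key}

@dataclass
class ConstructedResponseItem:
    """A constructed-response item: a prompt, closed- or open-ended.

    Closed-ended items carry a short expected answer; open-ended items
    are scored with a rubric instead.
    """
    prompt: str
    expected: Optional[str] = None  # closed-ended answer, if any
    rubric: Optional[str] = None    # scoring guide for open-ended items

# The butterfly example, keyed to B for illustration.
item = MultipleChoiceItem(
    stem="Essential parts of a butterfly are…",
    choices={"A": "Wings, legs, teeth", "B": "Eyes, wings, legs",
             "C": "Nose, knees, ears", "D": "Fingers, frontal lobes, hair"},
    key="B",
)
```

Separating the key from the choices makes the distracters derivable rather than duplicated, so later hygiene checks can operate on them directly.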

Conducting Hygiene
Review is essential:
– Clarity and precision
– Accuracy
– Grammar
– No ambiguity
– No unintended cues
– Distracters believable, appropriate, same length
– Placement of correct response varied
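Two of the mechanical checks above, distracter length and placement of the correct response, lend themselves to simple automation; the rest need human review. A hypothetical sketch (the function names and thresholds are our own):

```python
import statistics

def flag_length_outliers(choices, tolerance=0.5):
    """Flag choice labels whose text length differs sharply from the
    median length. A much longer option is an unintended cue:
    test-takers tend to pick the most qualified-looking answer."""
    median = statistics.median(len(text) for text in choices.values())
    return [label for label, text in choices.items()
            if abs(len(text) - median) > tolerance * median]

def key_placement_skewed(keys, max_share=0.5):
    """Return True if one answer position holds the correct response on
    more than `max_share` of the items, i.e. placement is not varied."""
    counts = {}
    for k in keys:
        counts[k] = counts.get(k, 0) + 1
    return max(counts.values()) / len(keys) > max_share
```

For instance, `flag_length_outliers({"A": "cat", "B": "dog", "C": "a very long and detailed option"})` flags "C", and a test form whose keys are mostly "A" would trip `key_placement_skewed`.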

Not a Trivial Process! SimCalc

Other Resources for Hunters and Gatherers
Standardized tests
– International: PISA, TIMSS
– National: NAEP
– State: TAKS released items

Other Resources for Hunters and Gatherers
Other resources
– Established curricular materials: approved textbooks, TPD materials, supplemental teacher resources
– SRI’s online resources: Performance Assessment Links in Math (PALM), Integrated Performance Assessments in Technology, Online Evaluation Resource Library (OERL), GLOBE assessment tools
– Research literature (e.g., theoretical papers that provide example items)
– Other research-based assessments
– Mental Measurements Yearbook from the Buros Institute of Mental Measurements

Conducting an “Item Camp”
Content experts are trained in how to write items
– Illustrative examples provided for each
– Trainer demonstrates how to use the templates
Content experts develop items
– Use a form to help structure refinement
– Iterative creation, sharing, feedback
– Helps to work in groups
– Give them a goal and a timeline
– Remind them of their charge
– Make them aware of their tools
– Don’t throw away work
Aim for 2-3x more items than you expect to use on the test form

Welcome to (mini) Item Camp!

Goals
– Develop 1 to 3 draft items for your (or your neighbor’s) assessment
– Have an experience of participating in an item camp

Starting with Item Templates
What are templates?
– Templates are industry standard in assessment development
– Templates outline the structure of an item; you fill in the content
– Guidelines to systematically produce high-quality items
What are these templates?
– Created based on the SimCalc design pattern to help seed thinking about the SimCalc items
– Templates might be quite different for your design pattern
– They are based on our observations about the structure of items that fit into types and “worked” in field testing
– A heuristic tool, not a theoretical framework
In this context, recommended but not required

Template Types
Based on Domain Analysis of MKT:
– Unconventional forms or representations
– Choosing problems and examples that can illustrate key curricular ideas
– Differentiating between colloquial and mathematical uses of language
– Linking precise aspects of representations
– Understanding implications of models and representations
– Evaluating mathematical statements

Structure of the Templates
All multiple choice, with the following elements:
– Context: sets up a teaching task and presents information necessary to answer the question
– Question/Prompt
– Distracters
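Once its slots are filled, a template of this shape can be rendered into draft item text mechanically. A minimal sketch of the idea (the function, the slot names, and the sample math content are ours, not the actual SimCalc templates):

```python
def fill_template(context: str, prompt: str, choices: dict) -> str:
    """Assemble the text of a draft multiple-choice item from the three
    template slots: context, question/prompt, and labeled choices."""
    lines = [context, "", prompt, ""]
    for label in sorted(choices):
        lines.append(f"{label}. {choices[label]}")
    return "\n".join(lines)

# Hypothetical example of filling the slots.
draft = fill_template(
    context="A teacher displays a number line from 0 to 2 marked in fifths.",
    prompt="Which fraction does the third tick mark represent?",
    choices={"A": "1/5", "B": "2/5", "C": "3/5", "D": "3/10"},
)
```

Keeping the slots separate until rendering also lets an item camp vary one slot (say, the context) while holding the others fixed.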

Review Templates

Guidelines for Distracters
– Usually 4 or 5 distracters (may be more)
– May use “Choose one” or “Choose all that apply”
– Responses may include “All of the above” and “None of the above”
– Whenever possible, distracters will reflect common errors or misunderstandings, naïve pre-conceptions, or other misconceptions
– Teachers should not be able to rule out a distracter or identify the answer simply because of superficial or trivial characteristics, syntactic complexity, or concept complexity
– Distracters will not be partially correct responses, nor will they be designed to “trick” teachers into responding incorrectly
– Be judicious!

Other Item Types
– True/false
– Matching
– Open-ended: calculations, short responses, generating representations, generating explanations, stories, etc.
Open-ended items require a rubric for scoring

Filling Out the Item Form
– Name
– Connection to standard or curriculum
– KSA assessed
– Item
– Correct answer and/or rubric
– Notes
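When item forms are collected digitally, their completeness can be checked automatically before the hygiene review. A minimal sketch, assuming each form arrives as a dictionary (the field keys are our rendering of the list above):

```python
REQUIRED_FIELDS = [
    "name",                    # item name
    "standard_or_curriculum",  # connection to standard or curriculum
    "ksa_assessed",            # KSA the item targets
    "item",                    # the item text itself
    "answer_or_rubric",        # correct answer and/or rubric
    "notes",
]

def missing_fields(form: dict) -> list:
    """Return the required item-form fields that are absent or blank."""
    return [f for f in REQUIRED_FIELDS
            if not str(form.get(f, "")).strip()]
```

A form that only has a name and item text would come back with the remaining four fields flagged, prompting the author to complete it before review.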

“Item Hygiene”
Consider this a brainstorm, so do not be overly concerned about item hygiene (IH). However, keep in mind that IH is the next part of the pipeline.
IH includes:
– Clarity, precision, correctness
– Grammar
– Non-ambiguity
– No unintended cues
– Distracters believable, appropriate, same length
– Placement of correct response varied

Draw on Resources Available
– Your Design Pattern
– Your curriculum materials
– MKT resources (e.g., prior assessments)
– NCTM and Texas standards

Activity #3: Mini-Item Camp
– Find Activity #3 in Tab 5
– Work on your own, with a partner, or with a small group
– Develop some MKT items (1 to 3)
– Prepare 1 or 2 items for the Item Hygiene Review (see page 7)
On the item form, include:
– Name
– Connection to standard or curriculum
– KSA assessed
– Item
– Correct answer and/or rubric
– Notes

Time for Activity: 9:30-10:15
On the item form, include:
– Name
– Connection to standard or curriculum
– KSA assessed
– Item
– Correct answer and/or rubric
– Notes

Activity #4
– Find Activity #4 in Tab 5
– Sit in groups of 3 for the Hygiene Review
– Use the Item Hygiene Guide to edit your teammates’ items
– Fix your 1 or 2 items and prepare a clean copy for this afternoon’s activity
Discussion to follow
– Show-and-tell of 2 or 3 items
– Your insights, questions, challenges

Activity #4: Conduct Item Hygiene Review
Conduct item hygiene on your set of items, using the Item Hygiene Guide. We suggest writing corrections directly on the item.
Afterwards, we will discuss:
– Insights about development of assessment items
– Questions and challenges

Some Useful Resources
Anderson, J. R. (2000). Cognitive psychology and its implications (3rd ed.). New York: W. H. Freeman and Company.
Baxter, G. P., & Glaser, R. (1998). The cognitive complexity of science performance assessments. Educational Measurement: Issues and Practice, 17(3).
Pellegrino, J., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.
Quellmalz, E., Hinojosa, T., & Rosenquist, A. (2001). Design of student assessment tools for the Global Learning and Observations to Benefit the Environment (GLOBE) program. Presentation at the annual GLOBE International Conference, Blaine, WA.
Snow, R. E., Federico, P. A., & Montague, W. E. (Eds.). (1980). Aptitude, learning, and instruction: Volume 2. Cognitive process analyses of learning and problem solving. Hillsdale, NJ: Erlbaum.