DataDirector: Creating Exams Winter/Spring 2011 Stan Masters Lenawee ISD.

Presentation transcript:

DataDirector: Creating Exams Winter/Spring 2011 Stan Masters Lenawee ISD

POP
Purpose
– Train DataDirector users to plan, create, and administer local standards-based assessment instruments
Objectives
– I can develop an exam blueprint
– I can choose items to add to my exam
– I can create an online key for my students
– I can identify exam reports I will use with others
Procedures
– PowerPoint presentation
– Use of LISD DataDirector site
– Use of LISD Data Warehouse webpage

School Improvement Process

Talking Points for the Purpose of Implementing a Data Warehouse in Lenawee Schools
– We must utilize an inquiry approach to data analysis
– We must use multiple sources of data
– We need a data warehouse for our 21st century schools
– We must focus on data to increase student achievement

Source: Presentation by Dr. Victoria Bernhardt, April 2007

FERPA/HIPAA Pre-Test
You are in charge of a staff meeting to study student achievement on school improvement goals. As part of your meeting, you are showing the entire staff a report of student scores on a common local assessment. The report shows the student names, and you have also given the staff a paper copy of the report. It is a violation of FERPA to display the results of the assessment to the entire staff. The exception would be a group of teachers working on specific student strategies, as they are a specific population that has a "legitimate educational interest" in the information.

Implementing Exams with DataDirector (adapted from St. Clair RESA)
1. Existing summative and formative classroom tests not aligned with expectations
2. Classroom summative and formative tests aligned to expectations
3. Common classroom summative and formative assessments aligned to expectations
4. Common formative and summative assessments aligned to expectations and delivered online through DataDirector

Purposes of Assessments
Assessment for learning
– formative (monitors student progress during instruction)
– placement (given before instruction to gather information on where to start)
– diagnostic (helps find the underlying causes for learning problems)
– interim (monitors student proficiency on learning targets)
Assessment of learning
– summative (the final task at the end of a unit, a course, or a semester)
Sources: Stiggins, Richard J., Arter, Judith A., Chappuis, Jan, and Chappuis, Stephen. Classroom Assessment for Student Learning. Assessment Training Institute, Inc., Portland, Oregon. Bravmann, S. L., "P-I Focus: One test doesn't fit all", Seattle Post-Intelligencer, May 2, 2004. Marshall, K. (2006). "Interim Assessments: Keys to Successful Implementation". New York: New Leaders for New Schools.

What Do We Ask Ourselves Before Building an Exam Instrument?
– What do we want to measure?
– What information do we want to gather?
– How long do we want the assessment to take?
– How will we determine which standards we want to assess?
– How will we use the results?

Guiding Principles for Building Exams
– Include a minimum of three items for every standard you want to measure.
– Spread the items measuring an individual standard across the assessment (see the sketch below).
– Include a range of difficulty across the overall assessment and for each standard measured.
– Begin the assessment with easier items, build to more difficult items, and conclude with easier items.
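The first two principles can be checked automatically once items are tagged with the standard they measure. Below is a minimal Python sketch, not part of the original presentation, that assumes each item is a dictionary with hypothetical "standard" and "position" keys:

```python
from collections import defaultdict

MIN_ITEMS_PER_STANDARD = 3  # guiding principle: at least three items per standard

def check_blueprint(items):
    """Flag standards with too few items or with items clustered together."""
    by_standard = defaultdict(list)
    for item in items:
        by_standard[item["standard"]].append(item["position"])

    warnings = []
    for standard, positions in by_standard.items():
        if len(positions) < MIN_ITEMS_PER_STANDARD:
            warnings.append(f"{standard}: only {len(positions)} item(s); add more.")
        elif max(positions) - min(positions) == len(positions) - 1:
            # items occupy one consecutive block instead of being spread out
            warnings.append(f"{standard}: items are clustered; spread them across the exam.")
    return warnings

# Example: one standard has back-to-back items, another has too few (invented data)
exam = [
    {"standard": "4.OA.1", "position": 1},
    {"standard": "4.OA.1", "position": 2},
    {"standard": "4.OA.1", "position": 3},
    {"standard": "4.NBT.2", "position": 4},
]
for warning in check_blueprint(exam):
    print(warning)
```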

Cognitive Difficulty Levels
Level 1 - Basic Skills
– requires students to recall information such as facts, definitions, terms, or simple one-step procedures
Level 2 - Conceptual Understanding
– requires students to make some decisions about how to approach the problem or activity and may imply more than a single step
Level 3 - Extended Reasoning
– requires students to develop a strategy to connect and relate ideas in order to solve the problem, using multiple steps and drawing upon a variety of skills

Level 1: Basic Skills
– Support ideas by reference to details in text
– Use a dictionary to find meaning
– Identify figurative language in a passage
– Solve a one-step word problem
– Perform a specified procedure

Level 2: Conceptual Understanding
– Predict a logical outcome
– Identify and summarize main points
– Represent a situation mathematically in more than one way
– Interpret a visual representation

Level 3: Extended Reasoning
– Determine the effect of the author's purpose on text elements
– Summarize information from multiple sources
– Provide a mathematical justification
– Describe, compare, and contrast solution methods

Psychometrician's Vocabulary
Validity
– What kind of concrete evidence can we collect that a test measures what we say it measures?
– Do the results of using the test give the consequences we expect?
Reliability
– What is the measure of the amount of error in a test?
– What is the percent of the differences (variance) in the scores that is attributable to the trait being assessed?
Source: William H. Trochim, Research Methods Knowledge Base, accessed 3/23/06
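The second reliability question, the share of score variance attributable to the trait, is commonly estimated with an internal-consistency coefficient such as Cronbach's alpha. The presentation does not specify an estimator; the sketch below is one illustrative Python calculation from a small, invented score matrix:

```python
def cronbach_alpha(scores):
    """Estimate internal-consistency reliability from student-by-item scores.

    `scores` is a list of rows, one per student; each row holds that
    student's score on every item, in the same item order.
    """
    n_items = len(scores[0])

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_variances = [variance([row[i] for row in scores]) for i in range(n_items)]
    total_variance = variance([sum(row) for row in scores])

    return (n_items / (n_items - 1)) * (1 - sum(item_variances) / total_variance)

# Four students, three dichotomously scored items (invented data)
scores = [
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 1],
    [0, 0, 0],
]
print(round(cronbach_alpha(scores), 2))  # higher values mean less error variance
```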

Reliability
– Longer tests are more reliable.
– Tests with a narrow range of content are more reliable.
– Appropriately difficult tests (neither too hard nor too easy) are more reliable.
– Clearly worded items make more reliable tests.
– Reliability can be trumped by validity.
Source: Ernie Bauer and Ed Roeber, "Technical Standards for Locally-Developed Assessments", April 8, 2010

Validity
The evidence we collect depends on the inference we wish to make:
– Content validity - is there evidence to show the test measures what it says it measures?
– Predictive validity - is there evidence that shows that the test predicts what it is we want it to predict?
– Concurrent validity - does this test measure the same thing as another measure, only easier to use?
– Construct validity - does this test measure the psychological construct we are trying to measure?
Source: Ernie Bauer and Ed Roeber, "Technical Standards for Locally-Developed Assessments", April 8, 2010

Use a blueprint to build validity
– A grid summarizes the content and format of the test.
– The rows are the learning objectives.
– The columns are the levels of cognitive complexity.
– The cells list the types and numbers of items.
– A margin can be added to sum the total points (see the sketch below).
Source: Ernie Bauer and Ed Roeber, "Technical Standards for Locally-Developed Assessments", April 8, 2010
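A minimal sketch of such a grid in Python, with invented objective codes and item counts, showing the row, column, and grand-total margins:

```python
# Rows are learning objectives, columns are cognitive-complexity levels,
# and each cell holds the number of items planned for that combination.
blueprint = {
    "Objective A": {"DOK 1": 3, "DOK 2": 2, "DOK 3": 1},
    "Objective B": {"DOK 1": 2, "DOK 2": 3, "DOK 3": 1},
}
levels = ["DOK 1", "DOK 2", "DOK 3"]

# Row margins: total items measuring each objective
for objective, cells in blueprint.items():
    print(objective, sum(cells.values()))

# Column margins: total items at each complexity level
for level in levels:
    print(level, sum(cells[level] for cells in blueprint.values()))

# Grand total of items on the exam
print("Total", sum(sum(cells.values()) for cells in blueprint.values()))
```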

Reverse Design
– Start with your items.
– Group the items by content (GLCE/HSCE); this gives you the rows for the blueprint.
– Split the items in each content pile by DOK; this gives you the columns for the blueprint.
– Create a blueprint document to check whether this provides acceptable evidence for you (see the sketch below).
Source: Ernie Bauer and Jim Gullen, "Locally Developed Assessment: What Do the Results Mean?", April 2010
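A minimal sketch of the reverse-design grouping step in Python, assuming a hypothetical list of items already tagged with a content expectation code and a DOK level:

```python
from collections import Counter

# Hypothetical items that have already been written, each tagged with a
# content expectation code and a DOK level.
items = [
    {"content": "USHG 6.1.1", "dok": 1},
    {"content": "USHG 6.1.1", "dok": 2},
    {"content": "USHG 6.1.4", "dok": 2},
    {"content": "USHG 6.1.4", "dok": 3},
]

# Group by content (blueprint rows), then split each content pile by DOK (columns).
blueprint = Counter((item["content"], item["dok"]) for item in items)

for (content, dok), count in sorted(blueprint.items()):
    print(f"{content}  DOK {dok}: {count} item(s)")
```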

Naming Conventions
Assessments and Exams
– School Year (e.g., )
– Name of Course/Grade (e.g., US History and Geography or 4th Grade)
– Name of Assessment or Exam (e.g., World War II, EXPLORE, or Dolch Words)
You may also identify the timing of the assessment (e.g., Beginning/Middle/End, Fall/Spring, or Pre/Post); see the sketch below.
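A minimal sketch of a helper that assembles an exam name following this convention; the component values shown are placeholders, not actual LISD assessments:

```python
def assessment_name(school_year, course_or_grade, assessment, timing=None):
    """Compose an exam name from the naming-convention components."""
    parts = [school_year, course_or_grade, assessment]
    if timing:  # optional: Beginning/Middle/End, Fall/Spring, or Pre/Post
        parts.append(timing)
    return " - ".join(parts)

# Placeholder values, not actual LISD assessments
print(assessment_name("2010-2011", "US History and Geography", "World War II"))
print(assessment_name("2010-2011", "4th Grade", "Dolch Words", timing="Fall"))
```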

Questions?
Stan Masters
Coordinator of Instructional Data Services
Lenawee Intermediate School District
2946 Sutton Road
Adrian, Michigan
(phone)
(fax)
Data Warehouse webpage: