District-Determined Measures: Basics of Assessment.

Presentation transcript:

District-Determined Measures: Basics of Assessment

Agenda: District-Determined Measures • Assessment Overview • Types of Assessments • Alignment • Assessment Components • Assessment Quality

District-Determined Measures
Measures of student learning, growth, and achievement related to the Massachusetts Curriculum Frameworks, Massachusetts Vocational Technical Education Frameworks, or other relevant frameworks, that are comparable across grade or subject level district-wide.

District-Determined Measures: Implementation Rollout
By September 2013, all districts will identify and report to ESE a district-wide set of student performance measures for:
• Early grade literacy (K-3)
• Early grade math (K-3)
• Middle grade math (5-8)
• Writing to text (9-12)
• Non-traditionally assessed subjects (K-12)

District-Determined Measures: Implementation Rollout
By June 2014, all districts will identify and report to ESE a district-wide set of student performance measures for all grades and subjects.

District-Determined Measures
• DDMs should measure growth, not just achievement.
• Assessments should be administered across all schools in the district where the same grade or subject is taught (common assessments).
• Districts must use measures of growth from state assessments where they are available; these are applicable to fewer than 20% of educators.

District-Determined Measures: Borrowing
Borrowing intact measures (e.g., a scoring rubric for a performance task) that:
• are available to the public at no cost,
• do not require permission to use, and
• closely align to the local curriculum.

District-Determined Measures: Building
The district creates a customized test for a particular grade or course using:
• released items
• item pools
• sample items

District-Determined Measures: Buying
Commercial assessments closely aligned with local curricula, purchased from a vendor.

District-Determined Measures
• Pre- and post-assessments
• Approved commercial assessments
• Portfolios
• Capstone projects

The Opportunity
• Identifying DDMs can be the impetus for broadening and strengthening the district's assessment practices.
• DDMs will provide educators with useful data that will help them improve both student outcomes and their instructional practices.
• DDMs will yield data educators can use throughout the 5-step evaluation cycle.

Agenda: District-Determined Measures • Assessment Overview • Types of Assessments • Alignment and Rigor • Assessment Components • Assessment Quality

Assessment Overview
• Assessment is a general term that refers to the process of gaining information about student learning. The process includes administration procedures, scoring, reporting, etc. A DDM is an assessment.
• Instrument refers to a specific type of data collection tool or mechanism used in an assessment process. There are many types of instruments; a test is one example.

The Value of Good Assessment
Better assessment → better teaching → better learning and greater confidence → better student outcomes → better opportunities in life.

Assessment Approaches
• Indirect: gather information from sources other than actual samples of student work.
• Direct: gather information from actual samples of student work.

Agenda: District-Determined Measures • Assessment Overview • Types of Assessments • Alignment and Rigor • Assessment Components • Assessment Quality

Types of Assessments
• On-demand
• Performance/project
• Portfolio
• Hybrid
Summative assessments include end-of-course (EOC), end-of-year (EOY), interim, and capstone assessments.

On-Demand Assessments
An assessment that takes place at a predetermined time and place, usually under standard conditions for all students being assessed. Examples: the SAT, MCAS, unit tests, and final exams.

Performance/Project Assessments
Assessments based on observations of behaviors or on work performed on a complex activity.
• Natural vs. artificial
• Unstructured vs. structured
• Participant vs. external observer
• Self-rated vs. other-rated (teacher, peer, observer)

Portfolio
A purposeful and systematic collection of student work. A portfolio should include:
• student participation in the selection of portfolio content,
• selection criteria aligned to standards and grade-level expectations through a rubric or other scoring device,
• criteria for judging merit, and
• evidence of student self-reflection.
A portfolio may include both finished work (product) and work in progress (process), and may focus on one or more curricular areas.

Hybrid Assessments
An on-demand assessment that combines two or more types of assessments, usually a paper-and-pencil or online test with a performance, portfolio, or project assessment.

Agenda: District-Determined Measures • Assessment Overview • Types of Assessments • Alignment and Rigor • Assessment Components • Assessment Quality

Alignment and Rigor
• Alignment: the extent to which the assessment aligns with the curriculum as expressed in the curriculum map.
• Rigor: the level of cognitive complexity of an item or of a set of items (e.g., as described by Bloom's revised taxonomy).

Alignment
DDMs reflect key learning objectives by grade and content area in the district's curricular maps.

Alignment
Identify the key content you want to assess.
• Standard (Mathematics.G.SRT.3.08): Use trigonometric ratios and the Pythagorean Theorem to solve right triangles in applied problems.
• Learning objectives: Students will correctly apply the Pythagorean Theorem when prompted. Students will determine when to correctly apply trigonometric ratio models.

Rigor – Revised Bloom's Taxonomy
From higher-order to lower-order thinking: Creating, Evaluating, Analyzing, Applying, Understanding, Remembering.

Rigor – Revised Bloom's Taxonomy: Understanding level (example shown as an image on the original slide).

Rigor – Revised Bloom's Taxonomy: Applying level
True/False: Will a set of skis that are 6' (six feet) high fit into an empty closet with the following dimensions? 3 feet wide, 3 feet deep, 5 feet high.
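A quick worked check (not stated on the slide): the longest straight line inside the closet is its space diagonal, √(3² + 3² + 5²) = √43 ≈ 6.56 feet, which is longer than 6 feet, so the skis fit and the intended answer is presumably True.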

Agenda: District-Determined Measures • Assessment Overview • Types of Assessments • Alignment and Rigor • Assessment Components • Assessment Quality

Assessment Components
• Table of test specifications
• Administration protocol
• Instrument (items)
• Scoring method
• Documentation (reporting)

Table of Test Specifications (example shown as an image on the original slide).
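As an illustration only (a hypothetical blueprint, not the table from the original slide), a table of test specifications typically crosses the content to be assessed with cognitive levels and records how many items, or points, fall in each cell:

Content area              Remembering  Applying  Analyzing  Total
Pythagorean Theorem             2          3         1         6
Trigonometric ratios            2          4         2         8
Applied problems                1          3         2         6
Total                           5         10         5        20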

Administration Protocols
Often found in administration manuals, these are needed to ensure all students have a fair opportunity to demonstrate what they know and can do on the assessment. They typically cover:
• proctoring directions,
• security provisions (so that material is not overly familiar to examinees and proctors), and
• student accommodations.

Items
• Selected response: true–false, multiple choice, matching
• Constructed response: short answer, restricted constructed response, extended constructed response (includes essay)
• Portfolio item
• Performance item

Selected Response Item
• The stem and stimulus: succinctly describe the problem for examinees.
• The options: provide a correct answer and incorrect answers that will "distract" examinees who do not know the material.

Constructed Response Item (example: an MCAS test question, question 10, shown as an image on the original slide).

Scoring Items
• Scoring objective items: a scoring key or short guide; based on a clearly defined scoring key and set of scoring rules; limits error variance.
• Scoring subjective items: a longer scoring guide with rubrics or calibrated scoring papers; based on personal judgment; increases the potential for error.
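As a minimal Python sketch of objective scoring (the answer key and student responses below are hypothetical, not part of the original presentation), a selected-response section can be scored mechanically against a key, which is what keeps error variance low:

# Minimal sketch: scoring selected-response items against an answer key.
# The key and responses are hypothetical examples.
ANSWER_KEY = {"1": "C", "2": "A", "3": "D", "4": "B"}

def score_objective_items(responses, key=ANSWER_KEY):
    """Return the raw score: one point per response that matches the key."""
    return sum(1 for item, correct in key.items()
               if responses.get(item) == correct)

student_responses = {"1": "C", "2": "B", "3": "D", "4": "B"}
print(score_objective_items(student_responses))  # prints 3: three of four items correct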

Sample Holistic Rubric
Prompt: In 200 words or less, describe how you would explain to a homeowner the concept of eminent domain and how it is related to the Fifth Amendment. (The rubric itself was shown as an image on the original slide.)

Sample Analytic Rubric (example: an analytic rubric for an MCAS test question, question 10, shown as an image on the original slide).

Calibrated Scoring Paper (example: a calibrated scoring paper for an MCAS test question, question 10, shown as an image on the original slide).

Simple Score Report (example shown as an image on the original slide).

Agenda: District-Determined Measures • Assessment Overview • Types of Assessments • Alignment and Rigor • Assessment Components • Assessment Quality

Assessment Quality
• Reliability
• Validity
• Fairness and non-bias
• Item quality
• Feasibility

Reliability
• The degree of consistency in measurement.
• We want to have confidence that scores are stable.
• Example: weighing yourself repeatedly on the same scale.

Reliability
• Four typical approaches: internal consistency, test-retest, alternate forms or split-half, and inter-rater agreement.
• Reliability coefficients are estimated using statistical formulas; we cannot "see" reliability directly.
• A reliability coefficient ranges from 0 (no reliability) to 1 (perfect reliability).
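As a minimal Python sketch of one of these approaches, internal consistency, here is Cronbach's alpha computed with numpy on a made-up score matrix (the function and data are illustrative assumptions, not part of the original presentation):

import numpy as np

def cronbach_alpha(item_scores):
    """Internal-consistency reliability for a students-by-items score matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                          # number of items
    item_var = x.var(axis=0, ddof=1)        # variance of each item
    total_var = x.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical scores: 5 students x 4 items (1 = correct, 0 = incorrect)
scores = [[1, 1, 1, 0],
          [1, 0, 1, 1],
          [0, 0, 1, 0],
          [1, 1, 1, 1],
          [0, 0, 0, 0]]
print(round(cronbach_alpha(scores), 2))  # values closer to 1 indicate more consistent measurement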

Validity
• Validity refers to the soundness of the inferences made about or from assessment data, not to the instrument itself.
• It gives you confidence that what you say about student assessment scores, and therefore about students, is justified.
• Example: weighing yourself on two different kinds of scales.

Validity Based on Content
For existing measures, districts review the content of the instrument and judge whether it matches the curriculum (a review of alignment and rigor). Example item: True/False: Will a set of skis that are 6' (six feet) high fit into an empty closet 3 feet wide, 3 feet deep, and 5 feet high?

Validity Based on Relationships
An assessment should show:
• moderate to strong, positive correlations with similar instruments/outcomes, and
• low positive or even negative correlations with dissimilar instruments/outcomes.
Correlation is a statistical technique used to measure and describe the strength and direction of the relationship between two variables; it ranges from -1 to +1.
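As a minimal Python sketch (the score lists are hypothetical, not data from the presentation), the correlation between a proposed DDM and an existing, similar measure can be estimated with numpy:

import numpy as np

# Hypothetical scores for the same ten students on two measures.
ddm_scores = [12, 15, 9, 20, 17, 11, 14, 18, 10, 16]
similar_measure = [58, 66, 45, 88, 74, 50, 63, 80, 48, 70]

# Pearson correlation coefficient: ranges from -1 to +1.
r = np.corrcoef(ddm_scores, similar_measure)[0, 1]
print(round(r, 2))  # a value near +1 suggests the two measures rank students similarly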

Consequential Validity
• Realization of benefits: student learning, teacher improvement.
• Minimization of negative consequences: poor student or teacher attitudes toward the assessment (or assessments generally), limiting instruction only to the content covered in the instrument, improper use of scores.

Fairness and Non-Bias
• Fairness: all examinees have an equal opportunity to demonstrate knowledge on the assessment.
• Non-bias: students with similar ability receive similar scores, regardless of group membership.

Item Quality
Item quality is a key to assessment quality. We typically look at three things:
• Difficulty: ensure a range of difficulty (e.g., easy, medium, hard) across items, with average difficulty aligned to the assessment purpose and target population.
• Discrimination: ensure that examinees who score higher on the instrument overall tend to score higher on each item.
• Guessing: reduce guessing by writing good response options for selected-response items (e.g., multiple-choice items).
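As a minimal Python sketch (the right/wrong matrix is hypothetical, not data from the presentation), item difficulty can be estimated as the proportion of examinees answering an item correctly, and discrimination as the correlation between each item score and the total score on the remaining items:

import numpy as np

# Hypothetical right/wrong matrix: rows are students, columns are items.
x = np.array([[1, 1, 0, 1],
              [1, 0, 0, 1],
              [1, 1, 1, 1],
              [0, 0, 0, 1],
              [1, 1, 0, 0],
              [0, 1, 1, 1]], dtype=float)

difficulty = x.mean(axis=0)  # proportion correct per item (higher = easier)

# Discrimination: correlation of each item with the total of the remaining items.
discrimination = []
for j in range(x.shape[1]):
    rest_total = x.sum(axis=1) - x[:, j]
    discrimination.append(np.corrcoef(x[:, j], rest_total)[0, 1])

print(np.round(difficulty, 2), np.round(discrimination, 2))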

Feasibility
• Cost
• Technology (e.g., paper and pencil, online, adaptive)
• Assessment length
• Reports (e.g., access to them, ability to interpret them)
• Accommodations and accessibility

Your Work Today
Begin assessing your department's level of readiness to implement DDMs.
• Do you have assessments you believe meet the criteria to be a DDM?
• Do you have assessments that require some adjustment to meet the criteria to be a DDM?
• Do you have to create new assessments?
• Do your assessments measure a year of growth?
• Do the assessments have rigor?
• Are your assessments valid and reliable?

Discussion Guide #1
Based on the presentation, identify some basic "do's and don'ts" about assessment that you need to consider when selecting or building a DDM.

Discussion Guide #2
Take a minute to jot down sources of existing assessments that might be used for DDMs.

Discussion Guide #3
Think about the potential DDMs you just listed.
• Which one shows the best alignment to the curriculum?
• Does it also have the appropriate degree of rigor?
• How could it be improved?

Discussion Guide #4
Assessments are composed of: a table of test specifications, an administration protocol, an instrument, a scoring method, and documentation. Reflect for a moment on one of the assessments you're considering for use as a DDM. Identify the components you have in place and those you'd want to develop.