
Basics of Assessment Webinar Series Part 2

Logistics  Q&A  Type your questions into the Chat box on the lower right-hand corner of the screen  Recording  Each part of the webinar series will be recorded and archived.  Supplemental materials  All materials needed to participate in each session will be posted on the sign-in page. 2

Webinar Series
| # | Title | Date | Length | Time |
| 1 | Introduction: District-Determined Measures and Assessment Literacy | 3/14 | 60 minutes | 4-5pm ET |
| 2 | Basics of Assessment | 4/4 | 90 minutes | 4-5:30pm ET |
| 3 | Assessment Options | 4/25 | 60 minutes | 4-5pm ET |
|   | TA and Networking Session I | 5/23 | | |
| 4 | Determining the Best Approach to District-Determined Measures | 7/18 | 60 minutes | 4-5pm ET |
| 5 | Integrating Assessments into Educator Evaluation: Reporting Student Growth | 8/15 | 60 minutes | 4-5pm ET |
| 6 | Integrating Assessments into Educator Evaluation: Developing Business Rules and Engaging Staff | 8/29 | 60 minutes | 4-5pm ET |
|   | TA and Networking Session II | 9/19 | | |
| 7 | Communicating results | 10/24 | 60 minutes | 4-5pm ET |
| 8 | Sustainability | 12/5 | 60 minutes | 4-5pm ET |
|   | TA and Networking Session III | 12/12 | | |

Audience & Purpose  Target audience  District teams that will be engaged in the work of identifying and selecting District-Determined Measures  Purpose  Participants will:  develop a working knowledge of the key concepts of assessment relevant for using assessments to measure student growth  learn about different types of assessments, how to determine assessment alignment, and elements of assessment quality 4

Agenda  District-Determined Measures  Assessment Overview  Types of Assessments  Alignment  Assessment Components  Assessment Quality 5

 Measures of student learning, growth, and achievement related to the Massachusetts Curriculum Frameworks, Massachusetts Vocational Technical Education Frameworks, or other relevant frameworks, that are comparable across grade or subject level district-wide.  Pre- and post-unit and course assessments  Approved commercial assessments  Portfolios  Capstone projects 6 District-Determined Measures

Student Impact Rating (Part VII: Rating Educator Impact on Student Learning Using District-Determined Measures of Student Learning)  Student Impact Rating must be based on at least 2 years of data (trends) across multiple measures (patterns):  State-wide growth measures  MCAS student growth percentiles  District-Determined Measures
| Year | Measure |
| Year 1 | MCAS SGP, grade 5 mathematics; unit assessment on multiplication and division of fractions |
| Year 2 | MCAS SGP, grade 5 mathematics; unit assessment on multiplication and division of fractions |

The Opportunity  Identifying DDMs can be the impetus for broadening and strengthening the district’s assessment practices.  DDMs will provide educators with useful data that will help them improve both student outcomes and their instructional practices.  DDMs will yield data educators can use throughout the 5-step evaluation cycle. 8

Agenda  District-Determined Measures  Assessment Overview  Types of Assessments  Alignment and Rigor  Assessment Components  Assessment Quality 9

Assessment Overview  Assessment is a general term that refers to the process of gaining information about student learning  Process includes administration procedures, scoring, reporting, etc.  A DDM is an assessment  Instrument refers to a specific type of data collection tool or mechanism used in an assessment process  There are many types of instruments  A test is one example 10

Value of Good Assessment: Better assessment → Better teaching → Better learning and greater confidence → Better student outcomes → Better opportunities in life

 Indirect  Gather information from sources other than actual samples of student work  Direct  Gather information from actual samples of student work 12 Assessment Approaches

Agenda  District-Determined Measures  Assessment Overview  Types of Assessments  Alignment and Rigor  Assessment Components  Assessment Quality 13

Types of Assessments  On-Demand  Performance, Project  Portfolio  Hybrid  Summative Assessments: EOC, EOY, Interim, Capstone

On-Demand Assessments  An assessment that takes place at a predetermined time and place, usually under standard conditions for all students being assessed  E.g., SAT, district and state tests, and most in-class unit tests and final exams 15

 Assessments based on observations of behaviors or based on work performed on a complex activity  Natural vs. Artificial  Unstructured vs. Structured  Participant vs. External Observer  Self-rated vs. Other-rated (teacher, peer, observer) 16 Performance/Project

Portfolio  A purposeful and systematic collection of student work  Should include:  student participation in the selection of portfolio content,  selection criteria aligned to standards and grade-level expectations through a rubric or other scoring device,  criteria for judging merit, and  evidence of student self-reflection  May include both finished work (Product) and work in progress (Process)  When using portfolios for DDMs, include the student's finished products whenever possible  May focus on one or more curricular areas 17

 An on-demand assessment that combines two or more types of assessments  Usually a paper-and-pencil or online test with a performance, portfolio, or project assessment 18 Hybrid Assessment

 Take a minute to jot down sources of existing assessments that might be used for DDMs. 19 Reflection #1

Agenda  District-Determined Measures  Assessment Overview  Types of Assessments  Alignment and Rigor  Assessment Components  Assessment Quality 20

 Alignment refers to the extent to which the assessment aligns with curriculum as expressed in the curriculum map  Rigor is the level of cognitive complexity of the item or of a set of items  Bloom’s revised taxonomy or other taxonomy  Understanding alignment and rigor is critical for selecting or developing an assessment  Documentation is the key!  E.g., Table of Test Specifications 21 Alignment and Rigor

 DDMs reflect key learning objectives by grade and content area  Information on key objectives is found in the district’s curricular maps and other curricular planning tools 22 Alignment

 Identify the key content you want to assess  Standards  E.g., Use trigonometric ratios and the Pythagorean Theorem to solve right triangles in applied problems (Mathematics.G.SRT.3.08)  Learning objectives  E.g., Students will correctly apply the Pythagorean Theorem when prompted.  E.g., Students determine when to correctly apply trigonometric ratio models. 23 Alignment

Table of Test Specifications 24
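A table of test specifications documents alignment and rigor by crossing the content to be assessed with levels of cognitive complexity and recording how many items (or score points) target each cell. The example below is hypothetical: it reuses the two learning objectives from the previous slide and broad Bloom's levels, but the actual categories and item counts would come from the district's own curriculum map and assessment blueprint.
| Learning objective | Remember/Understand | Apply | Analyze/Evaluate | Total items |
| Apply the Pythagorean Theorem in applied problems | 2 | 3 | 1 | 6 |
| Select and apply trigonometric ratios to solve right triangles | 2 | 3 | 1 | 6 |
| Total | 4 | 6 | 2 | 12 |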

25 Rigor – Revised Bloom’s Taxonomy

26 Rigor – Revised Bloom’s Taxonomy

27 Rigor – Revised Bloom’s Taxonomy

 Think about the potential DDMs you wrote down in Reflection #1.  Of these, consider which one shows the best alignment to the curriculum and the degree of rigor.  Identify one that is promising but that does not show perfect alignment. Consider how it could be improved. 28 Reflection #2

Agenda  District-Determined Measures  Assessment Overview  Types of Assessments  Alignment and Rigor  Assessment Components  Assessment Quality 29

Assessment Components  Table of Test Specifications  Administration Protocol  Instrument  Scoring Method  Documentation 30

 Often found in Administration Manuals  Needed to ensure all students have a fair opportunity to demonstrate what they know and can do in the assessment  Proctoring directions  All examinees receive the same set of directions and conditions for taking the instrument (including resources – e.g., calculator – breaks, pacing, etc.)  Security provisions  Ensure specific items are not overly familiar to examinees and proctors (unless portfolio assessment is used)  Student accommodations  Ensure the majority of students can participate in the program, with appropriate supports 31 Administration Protocols

 Selected Response  True–False  Multiple Choice  Matching  Constructed Response  Short answer  Restricted constructed response  Extended constructed response (includes essay)  Portfolio item  Performance item 32 Items

33 Selected Response Item

Constructed Response Item (image: MCAS test question, Question 10)

 Scoring objective items  Scoring key or short guide  Based on clearly defined scoring key and set of scoring rules  Limits error variance  Scoring subjective items  Longer scoring guide with rubrics or calibrated scoring papers  Based on personal judgment  Increases potential for error 35 Scoring Items
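Because objective items are scored against a clearly defined key and set of scoring rules, this step can be automated. The sketch below is a minimal illustration (the item IDs, answer key, and student responses are invented); subjective items would instead be scored by trained raters using a rubric or calibrated papers, as shown on the following slides.

```python
# Minimal sketch: scoring selected-response items against a fixed answer key.
# The item IDs, key, and responses below are hypothetical examples.
ANSWER_KEY = {"item1": "B", "item2": "D", "item3": "A", "item4": "C"}

def score_objective_items(responses):
    """Return the number of items answered correctly."""
    return sum(
        1
        for item, correct in ANSWER_KEY.items()
        if responses.get(item, "").strip().upper() == correct
    )

student_responses = {"item1": "B", "item2": "A", "item3": "A", "item4": "C"}
print(f"Raw score: {score_objective_items(student_responses)} / {len(ANSWER_KEY)}")
```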

Sample Holistic Rubric In 200 words or less, describe how you would explain to a home owner the concept of eminent domain and how it is related to the Fifth Amendment.
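A holistic rubric assigns a single overall score to the response rather than separate scores for individual criteria. A hypothetical four-level rubric for this prompt (the descriptors below are illustrative only, not the rubric shown on the original slide) might look like:
| Score | Description |
| 4 | Accurately explains eminent domain and clearly connects it to the Fifth Amendment; explanation is well organized and within the word limit |
| 3 | Explains eminent domain and its connection to the Fifth Amendment with only minor inaccuracies or omissions |
| 2 | Explains eminent domain or the Fifth Amendment connection, but not both, or includes significant inaccuracies |
| 1 | Explanation is minimal, inaccurate, or off topic |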

Sample Analytic Rubric (image: MCAS test question, Question 10)

Calibrated Scoring Paper (image: MCAS test question, Question 10)

 Reports provide information and conclusions drawn from the assessments  May provide information in the form of text, graphs, images, etc.  A goal of any assessment is to yield information that will help educators make appropriate decisions or drive meaningful change  Assessments are most useful when reports are tied to objectives and are easy to understand and use 39 Reporting, Interpretation and Use

Simple Score Report 40

 Technical Manual  Administration Manual 41 Documentation

 In summary, assessments are composed of:  Table of Test Specifications  Administration Protocol  Instrument  Scoring Method  Documentation  Reflect for a moment on one of the assessments you’re considering for use as a DDM. Identify the components you have in place and those you’d want to develop. 42 Reflection #3

Agenda  District-Determined Measures  Assessment Overview  Types of Assessments  Alignment and Rigor  Assessment Components  Assessment Quality 43

Assessment Quality  Reliability  Validity  Fairness and Non-Bias  Item Quality  Feasibility 44

Reliability  Degree of consistency in measurement  We want to have confidence that scores are stable  Example: Weighing yourself on a scale 45

Reliability  Four typical approaches  Internal consistency  Test-retest  Alternate forms or split-half  Inter-rater agreement  Reliability coefficients are estimated using statistical formulas  We cannot “see” reliability  Ranges from 0 (no reliability) to 1 (perfect reliability) 46
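For example, internal consistency is often estimated with Cronbach's alpha, which also ranges from 0 to 1. The sketch below (using a small, made-up matrix of item scores) shows how such a coefficient can be computed; in practice districts would use far more students and items.

```python
# Minimal sketch: estimating internal consistency with Cronbach's alpha.
# The 5-student x 4-item score matrix below is a made-up example.
import numpy as np

def cronbach_alpha(scores):
    """scores: rows = students, columns = items (numeric item scores)."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scores = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
print(round(cronbach_alpha(scores), 2))  # values closer to 1 indicate more consistent scores
```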

 Validity refers to the soundness of the inferences made from assessment data  Gives you confidence that what you say about student assessment scores, and therefore about students, is justified  Example: Weighing yourself on two different kinds of scales 47 Validity

 For existing measures, districts review content in instrument and judge whether it matches curriculum (review of alignment and rigor) 48 Validity Based on Content

Validity Based on Relationships  Assessment should show:  Moderate to strong and positive correlations with similar instruments/outcomes  Low positive or even negative correlations with dissimilar instruments/outcomes  Correlation = A statistical technique that is used to measure and describe the strength and direction of the relationship between two variables  Ranges from -1 to +1
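As a concrete illustration (all scores below are invented), the Pearson correlation between a candidate DDM and a similar, established measure can be computed directly; a moderate-to-strong positive value would count as evidence of validity based on relationships.

```python
# Minimal sketch: correlating a candidate DDM with a similar, established measure.
# All scores are invented for illustration.
import numpy as np

ddm_scores      = np.array([72, 65, 88, 54, 91, 60, 77, 83])
similar_measure = np.array([70, 60, 85, 58, 94, 55, 75, 80])

r = np.corrcoef(ddm_scores, similar_measure)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # expect a moderate-to-strong positive value here
```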

 Realization of benefits  Student learning  Teacher improvement  Minimization of negative consequences  Poor student or teacher attitudes toward the assessment or assessments generally  Limiting instruction only to the content covered in the instrument  Improper use of scores 50 Consequential Validity

 Fairness  All examinees have equal opportunity to demonstrate knowledge on assessment  Non-Bias  Students with similar ability receive similar scores, regardless of group membership 51 Fairness and Non-Bias

 Item quality is a key to assessment quality  We typically look at three things:  Difficulty  Ensure a range of difficulty (e.g., easy, medium, hard) in items  Average difficulty aligns to assessment purpose and target population  Discrimination  Ensure that people who got higher scores on the instrument overall tend to get higher scores on that item  Guessing  Reduce guessing by writing good response options for selected response items (e.g., multiple-choice items) 52 Item Quality
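Difficulty and discrimination can be estimated from the response data itself. The sketch below (using an invented 0/1 response matrix) computes a simple version of each: the proportion of students answering the item correctly, and the correlation between the item and the rest of the test.

```python
# Minimal sketch: classical item difficulty and discrimination statistics.
# The 0/1 response matrix is a made-up example (rows = students, columns = items).
import numpy as np

responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
])

for i in range(responses.shape[1]):
    item = responses[:, i]
    rest = responses.sum(axis=1) - item            # total score excluding this item
    difficulty = item.mean()                       # proportion correct (higher = easier)
    discrimination = np.corrcoef(item, rest)[0, 1] # item-rest correlation
    print(f"Item {i + 1}: difficulty = {difficulty:.2f}, discrimination = {discrimination:.2f}")
```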

Feasibility  Cost  Technology  E.g., Paper and pencil, online, adaptive  Assessment length  Reports  E.g., access to, ability to interpret  Accommodations and accessibility 53

 Based on all that we discussed today, identify some basic “do’s and don’ts” about assessment that you need to consider when selecting or building a DDM. 54 Reflection #4

Resources  Association of Test Publishers  Buros Mental Measurements Yearbook and Tests in Print  APA, NCME, and AERA Standards for Educational and Psychological Testing  National Council on Measurement in Education

Register for Webinar 3: Assessment Options  April 25th from 4:00 PM – 5:00 PM  Identify one or more potential DDMs, create a map of assessment gaps appropriate for accountability purposes, develop a preliminary action plan to bring to the Technical Assistance and Networking Session  Click here to register: event500.webex.com/air-event500/onstage/g.php?d= &t=a

Feedback  Questions: Contact Ron Noble at  Tell us how we did: rict-Determined-Measures-amp-Assessment-Literacy-Webinar-2-Feedback