DDMs – From Conception to Impact Rating. Easthampton High School – Team Leader Meeting, March 17, 2014. Facilitated by Shirley Gilfether.


Performance vs. Growth
- Most current assessments measure performance, not growth.
- Growth takes into account students' different starting levels of achievement.
- Measures of growth should give all students an equal opportunity to demonstrate growth.

What is Growth?
- Growth is about improvement and learning.

Does change represent growth?
- Are the assessments similar enough to support meaningful inferences about student growth during the year?
- Do early assessments provide meaningful information about what students do not yet understand (baseline data)?
- Do later assessments provide meaningful information about what students have learned?
- Do students have the opportunity to demonstrate different levels of growth?

DDMs in a Backwards Design Model*
- DDMs require us to define clearly what we want students to be able to do following instruction.
- Once that goal is clearly defined, we plan the curriculum and lessons that will get us there.

Integration
- Massachusetts Frameworks: What should students learn?
- DDMs: How do students demonstrate that learning?
- Curriculum Mapping: How do we get there?

* Understanding by Design 2.0 © 2011 Grant Wiggins and Jay McTighe

Steps in the Process
1. Develop your DDM based on the core learning objectives for your course (backward planning).
2. Develop a method for collecting baseline data on the same set of objectives (this might be a pre-assessment).
3. Develop assessment procedures: when the assessments will be given, the common directions that will be given to students (for consistency), and any tools students may use (e.g., calculators).

Scoring Guides Are Important
4. Develop clear directions for scoring individual student work (a scoring guide):
   a. explicitly state the aspects of student work that are essential;
   b. define the scoring tool (answer sheet; rubric, whether holistic, analytic, or growth; checklist);
   c. describe who will score and how validity will be ensured (double-blind scoring, teacher exchange, spot checks, or an objectively scored assessment).

Scoring Guides Must Be Clear

Scoring guide example:
- 2 points for a correct answer with student work shown correctly
- 1 point for an incorrect answer with student work shown correctly

Issues:
- It is not clear how to score a correct answer with no work shown, or with work shown incorrectly.
- It is not clear what "shown correctly" means.

Scoring Guide Example - Improved
- 2 points: correct answer with a chart or table showing how the student set up the problem.
- 1 point: incorrect answer, but the work demonstrates setting up the problem with a table or picture (the supporting work may include incorrect numbers or other mistakes).
- 1 point: correct answer, but there is no supporting work, or the work is not organized in a table or chart.
- 0 points: incorrect answer, and the work is not organized in a table or chart.

The scoring guide could be further improved by incorporating anchor examples.
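The improved guide reduces to two yes/no judgments per response, which can be sketched as a small scoring function. This is a minimal sketch, not part of the district's materials; the function name and boolean inputs (answer correct? work organized in a table or chart?) are illustrative.

```python
def score_response(answer_correct: bool, work_organized: bool) -> int:
    """Score one response under the improved guide.

    work_organized means the student set up the problem in a table or
    chart; for the 1-point incorrect-answer case, the supporting work may
    still contain wrong numbers or other mistakes.
    """
    if answer_correct and work_organized:
        return 2            # correct answer with organized supporting work
    if answer_correct or work_organized:
        return 1            # either a correct answer alone, or organized work alone
    return 0                # incorrect answer and no organized work
```

Writing the guide this way makes the earlier ambiguity visible: every combination of the two judgments now maps to exactly one score.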

Using Rubrics

Rubric styles:
- Analytic: student work is assessed against clearly defined criteria along multiple dimensions (each row of the rubric assesses a different criterion).
- Holistic: student work is assessed as a whole product, based on an overall impression.

Growth Rubrics

Analytic growth rubric (each criterion is counted and mapped to a growth level):
- Number of writing-mechanics mistakes (such as punctuation, capitalization, or misspelled words) where the student has corrected the mistake in later writing: 0 = Low Growth, 1 = Moderate Growth, 2 or more = High Growth.
- Number of examples of improved language usage and sentence formation (such as word order, subject-verb agreement, or run-on sentences) where the student has corrected the mistake in later writing: 0 = Low Growth, 1 = Moderate Growth, 2 or more = High Growth.

Holistic growth rubric:
- Low Growth: little to no improvement in following writing conventions.
- Moderate Growth: average improvement in following writing conventions.
- High Growth: high improvement in following writing conventions.
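An analytic growth-rubric row of this kind is just a count mapped to a level. The sketch below assumes illustrative cutoffs of 0, 1, and 2-or-more corrected mistakes; the function name and cutoffs are assumptions for illustration, not district-set values.

```python
def mechanics_growth_level(corrected_mistakes: int) -> str:
    """Map a count of writing-mechanics mistakes the student has
    corrected in later writing to a growth level (assumed cutoffs:
    0 = Low, 1 = Moderate, 2 or more = High)."""
    if corrected_mistakes >= 2:
        return "High"
    if corrected_mistakes == 1:
        return "Moderate"
    return "Low"
```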

Step 5 – Clear Directions for Determining a Student's Growth

Common growth models:
- Pre-test/post-test
- Repeated measures
- Holistic evaluation
- Post-test only

Learn more: Webinar 5; Technical Guide B AppxB.pdf
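Two of these models can be sketched numerically, under assumptions of my own: a pre/post model that takes the raw-score difference, and a repeated-measures model that fits a least-squares slope across equally spaced administrations. The function names and the choice of raw-score difference and slope are illustrative, not prescribed by the slides.

```python
def pre_post_growth(pre: float, post: float) -> float:
    """Pre-test/post-test model: growth as the raw-score difference."""
    return post - pre

def repeated_measures_growth(scores: list[float]) -> float:
    """Repeated-measures model: growth as the least-squares slope of the
    scores across equally spaced administrations (higher slope = faster
    improvement)."""
    n = len(scores)
    mean_x = (n - 1) / 2                      # administrations 0..n-1
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den
```

A post-test-only model would instead compare final scores across students, and a holistic evaluation would use a growth rubric like the one above.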

Step 6 – Setting Parameters for High, Moderate, and Low Student Growth
- First decide what comparison data constitutes growth:
  - difference in raw score
  - difference in percentage score
  - percent increase in score
  - other
- Then build your moderate range:
  - What constitutes normal growth? (in some cases, a year's worth of growth)
  - This should be the largest range.

Finishing the Range
- After the moderate range has been determined, build the low and high ranges of growth:
  - the low range represents less-than-expected growth;
  - the high range represents significantly higher-than-expected growth.
- The ranges may need adjustment after you get data next year; there will be an opportunity for that.
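Once the moderate band is set, classifying a student is a simple range check. The sketch below assumes a hypothetical moderate band given as two cutoffs; the function name and cutoff values in the usage are illustrative and would be replaced by the district's own parameters.

```python
def growth_category(growth: float, moderate_low: float, moderate_high: float) -> str:
    """Place one student's growth score into the district's ranges.

    moderate_low..moderate_high (inclusive) is the moderate band of
    expected growth; below it is Low, above it is High.  The cutoffs are
    hypothetical and should be revisited once a year of real data exists.
    """
    if growth < moderate_low:
        return "Low"
    if growth > moderate_high:
        return "High"
    return "Moderate"
```

For example, with a hypothetical moderate band of 5 to 15 raw-score points, a growth of 3 is Low, 10 is Moderate, and 20 is High.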

Final Step: Establishing the Teacher's Impact Rating
- Using the class roster, the teacher identifies each student's growth as High, Moderate, or Low.
- The teacher then finds the MEDIAN (middle) growth rating for that class (not the average).
- If the teacher has multiple classes, the median is found across all students' growth data.
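Finding the median of categorical ratings can be sketched by mapping the ratings to ordinals, sorting, and taking the middle value. This is my sketch, not the district's procedure: for a roster with an even number of students the slides do not say how to break a tie, so the code uses the lower of the two middle values (`median_low`) as one defensible choice.

```python
from statistics import median_low

LEVELS = {"Low": 0, "Moderate": 1, "High": 2}
NAMES = {v: k for k, v in LEVELS.items()}

def class_impact(growth_ratings: list[str]) -> str:
    """Median (not mean) growth rating for a roster of students.

    Ratings are ordered Low < Moderate < High; with an even roster,
    median_low returns the lower of the two middle ratings (an assumed
    tie-breaking rule, not specified in the source).
    """
    ordinals = sorted(LEVELS[r] for r in growth_ratings)
    return NAMES[median_low(ordinals)]
```

For a teacher with multiple classes, the same function would simply be run on the combined list of all students' ratings.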

Questions and Answers

Reminder: the second DDM Drop-In session will be held on Thursday, March 20th, from 2 to 5 pm.