Northborough-Southborough DDM Development, October 9, 2014, Dr. Deborah Brady



Northborough-Southborough

DDM 1 - MCAS (SGP)
For teachers who receive an SGP from MCAS (grades 4-8, ELA and Math only):
- The District is only required to use the median Student Growth Percentile (SGP) from one MCAS area per teacher.
- In the first year, the K-5 DDM will focus only on MCAS ELA.
- In grades 6-12, the MCAS focus may be either math or ELA.
- The DDM rating is based on the SGP (student growth), not the scaled scores (student achievement).

DDM 1 - Common Assessment
For teachers who do not receive an SGP from MCAS:
- Teachers will develop grade-level/course common assessments using a pre- and post-assessment model.

DDM 2 - Common Assessment
For all teachers:
- Teachers will develop grade-level/course common assessments using a pre- and post-assessment model.

Goal (DDMs must be negotiated with our Associations): Content Student Learning DDMs

Core Content Areas (math, English, science, and social studies)
Year 1:
- Identify the first two (of four) unique DDM data elements
- Align DDMs with the Massachusetts Curriculum Frameworks
- Identify/develop DDMs by common grades (K-12) and content
- Create rubrics
- Collect the first year of data
Year 2:
- Identify the second two (of four) unique DDMs, or reuse existing DDMs (same assessment, different students)

Note: Consumer science, applied arts, health & physical education, business education, world language, and SISPs received a one-year waiver. Planning: identify/develop DDMs for implementation; collect the first year of data.

Core DDMs (CA = common assessment; MKEA = Massachusetts Kindergarten Entry Assessment)

Grade   ELA            Math    Science   Social Studies
12      CA/CA
11      CA/CA
10      CA/CA
9
8       MCAS SGP/CA    CA/CA
7       MCAS SGP/CA    CA/CA
6       MCAS SGP/CA    CA/CA
5       MCAS SGP/CA
4
3       CA/CA
2
1
K       MKEA           CA

Quality Assessments
- Substantive
- Aligned with the standards of the Frameworks, Vocational standards, and/or local standards
- Rigorous
- Consistent in substance, alignment, and rigor
- Consistent with the District's values, initiatives, and expectations
- Measures growth (as contrasted with achievement) and shifts the focus of teaching

Scoring Student Work
- Districts will need to determine fair, efficient, and accurate methods for scoring students' work.
- DDMs can be scored by the educators themselves, groups of teachers within the district, external raters, or commercial vendors.
- For districts concerned about the quality of scoring when educators score their own students' work, processes such as randomly re-scoring a selection of student work to ensure proper calibration, or using teams of educators to score together, can improve the quality of the results.
- When an educator plays a large role in scoring his/her own students' work, a supervisor may also choose to consider the scoring process when making a determination of Student Impact.

Some Possible Common Exam Examples
- A Valued Process, PORTFOLIO: a 9-12 ELA portfolio measured by a locally developed rubric that assesses progress throughout the four years of high school
- K-12 Writing or Writing to Text: a district that required at least one DDM to be "writing to text" based on CCSS-appropriate text complexity
- Focus on Data that is Important: a HS science department assessment of lab-report growth for each course (focus on conclusions)
- "New CCSS" Concern: a HS science department assessment of data or of diagram or video analysis

More  CCSS Math Practices: A HS math department’s use of PARCC examples that require writing asking students to “justify your answer”  SS Focus on DBQs and/or PARCC-like writing to Text: A social studies created PARCC exam using as the primary sources. Another social stuies department used “mini-DBQs” in freshman and sophomore courses  Music: Writing about a concert  Common Criteria Rubrics for Grade Spans: Art (color, design, mastery of medium), Speech (developmental levels)

More  Measure the True Goal of the Course: Autistic and behavioral or alternative programs and classrooms, Social- emotional development of independence (whole collaborative—each educator is measuring)  SPED “Directed Study” Model—now has Study Skills explicitly recorded by the week for each student and by quarter on manila folder: Note taking skills, text comprehension, reading, writing, preparing for an exam, time management, and differentiated by student  A Vocational School’s use of Jobs USA assessments for one DDM and the local safety protocols for each shop

Assessing Math Practices: Communicating Mathematical Ideas
Clearly constructs and communicates a complete response based on:
- a response to a given equation or system of equations
- a chain of reasoning to justify or refute algebraic, function, or number system propositions or conjectures
- a response based on data
How can you assess these standards?

Demonstrating Growth

Billy Bob's work is shown below. He has made a mistake. In the space to the right, solve the problem on your own. Then find Billy Bob's mistake, circle it, and explain how to fix it.

Billy Bob's work:               Your work:
  ½x - 10 = 2.5
     +10    +10
  ½x + 0  = 12.5
  (2/1)(½)x = 12.5(2)
  x = 25

Explain the changes that should be made in Billy Bob's work.

- Finding the mistake provides students with a model.
- Requires understanding.
- Requires writing in math.

A resource for DDMs. A small step? A giant step? The district decides.
- "Which of the three conjectures are true? Justify your answer."
- "Determine if each of Michelle's three conjectures is true. Justify each answer."

Rubrics and grading: are numbers good, or a problem?

Objectivity versus Subjectivity: Calibration
- Human judgment and assessment
- What is objective about a multiple choice test?
- Calibrating standards in using rubrics
- Common understanding of descriptors: what do "insightful," "in-depth," and "general" look like?
- Use exemplars to keep people calibrated
- Assess collaboratively with a uniform protocol

Consistency in Directions for Administering Assessments
- Directions to teachers need to define rules for giving support, dictionary use, etc.
- What can be done? What cannot? ("Are you sure you are finished?")
- How much time?
- Accommodations and modifications?

Qualitative Methods of Determining an Assessment's VALIDITY
- Looking at the "body of the work": validating an assessment based upon the students' work
- Floor and ceiling effects
- If you piled the gain scores (not achievement) into high, moderate, and low gain: is there a mix of at-risk, average, and high achievers throughout each pile, or can you see one group mainly represented?

Low, Moderate, High Growth Validation
Did your assessment accurately pinpoint differences in growth?
1. Look at the LOW pile. If you think about their work during this unit, were they struggling?
2. Look at the MODERATE pile. Are these the average learners who learn about what you'd expect of your school's students in your class?
3. Look at the HIGH pile. Did you see them learning more than most of the others in your class?
Based on your answers to 1, 2, and 3:
- Do you need to add questions (for the very high or the very low)?
- Do you need to modify any questions (because everyone missed them or because everyone got them correct)?
A scripted version of this pile check appears below.
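If the gain scores live in a spreadsheet export rather than in physical piles, the same check can be scripted. Below is a minimal sketch, assuming each record carries a pre score, a post score, and a local prior-achievement label; the 20/60/20 pile cuts and all field names are illustrative assumptions, not anything prescribed by DESE.

```python
# Pile check: sort students into low/moderate/high GAIN piles, then see
# whether at-risk, average, and high achievers are mixed in every pile.
from collections import Counter

# Hypothetical records; "group" is a local prior-achievement label.
students = [
    {"name": "A", "pre": 40, "post": 55, "group": "at-risk"},
    {"name": "B", "pre": 70, "post": 78, "group": "average"},
    {"name": "C", "pre": 85, "post": 96, "group": "high"},
    {"name": "D", "pre": 55, "post": 58, "group": "average"},
    {"name": "E", "pre": 30, "post": 52, "group": "at-risk"},
    {"name": "F", "pre": 62, "post": 64, "group": "high"},
]

ranked = sorted(students, key=lambda s: s["post"] - s["pre"])  # low to high gain
n = len(ranked)
k = max(n // 5, 1)  # illustrative 20/60/20 split
piles = {
    "LOW": ranked[:k],
    "MODERATE": ranked[k:-k],
    "HIGH": ranked[-k:],
}

# If one group dominates a pile, the DDM may be measuring prior
# achievement (or hitting a floor/ceiling) rather than growth.
for pile, members in piles.items():
    print(pile, Counter(m["group"] for m in members))
```

Reading the names in each pile against the three questions above is the actual validation; the script only builds the piles.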

Look at Specific Students' Work (a psychometric process called Body of the Work validation)
- Tracey is a student who was rated as having high growth.
- James had moderate growth.
- Linda had low growth.
Investigate each student's work:
- Effort
- Teachers' perception of growth
- Other evidence of growth
Do the scores assure you that the assessment is assessing what it says it is?

Objectivity versus Subjectivity: Multiple Choice Questions
- Human judgment and assessment
- What is objective about a multiple choice test? What is subjective about it?
- Make sure the question's complexity did not cause a student to make a mistake.
- Make sure the multiple-choice options are all about the same length, similarly phrased, and clearly different.

Rubrics and Inter-Rater Reliability: getting words to mean the same to all raters

Category (scores 4 / 3 / 2 / 1):
- Resources: Effective use / Adequate use / Limited use / Inadequate use
- Development: Highly focused / Focused response / Inconsistent response / Lacks focus
- Organization: Related ideas support the writer's purpose / Has an organizational structure / Ideas may be repetitive or rambling / No evidence of purposeful organization
- Language conventions: Well-developed command / Command; errors don't interfere / Limited or inconsistent command / Weak command

Protocol for Developing Inter-Rater Reliability
- Before scoring a whole set of papers, develop inter-rater reliability.
- Bring high, average, and low samples (1 or 2 of each) (HML Protocol).
- Use your rubric or scoring guide to assess these samples.
- Discuss differences until a clear definition is established.
- Use these first papers as your exemplars.
- When there's a question, select one person as the second reader.
A quick agreement count for the calibration round is sketched below.
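After the calibration round, a simple agreement count shows whether the rubric's words mean the same thing to both raters. A minimal sketch, assuming two raters have scored the same set of papers on the 4-point rubric above; the scores are made up.

```python
# Exact and adjacent agreement between two raters on a calibration set.
rater_a = [4, 3, 3, 2, 1, 4, 2, 3]  # hypothetical rubric scores
rater_b = [4, 3, 2, 2, 1, 4, 3, 3]

pairs = list(zip(rater_a, rater_b))
exact = sum(a == b for a, b in pairs) / len(pairs)
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)  # within 1 point

print(f"Exact agreement: {exact:.0%}; adjacent agreement: {adjacent:.0%}")
for i, (a, b) in enumerate(pairs, start=1):
    if a != b:
        print(f"Paper {i}: rater A gave {a}, rater B gave {b}; discuss and recalibrate")
```

Papers where the raters disagree are exactly the ones worth discussing until a clear definition is established.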

Annotated Exemplar

Prompt: How does the author create the mood in the poem?
- Answer and explanation in the student's words: "The speaker's mood is greatly influenced by the weather."
- Specific substantiation from the text: "The author uses dismal words such as 'ghostly,' 'dark,' 'gloom,' and 'tortured.'"

"Growth Rubrics" May Need to Be Developed

Pre-conventional Writing (Ages 3-5):
- Relies primarily on pictures to convey meaning.
- Begins to label and add "words" to pictures.
- Writes first name.
- Demonstrates awareness that print conveys meaning.
- Makes marks other than drawing on paper (scribbles).
- Writes random recognizable letters to represent words.
- Tells about own pictures and writing.

Emerging (Ages 4-6):
- Uses pictures and print to convey meaning.
- Writes words to describe or support pictures.
- Copies signs, labels, names, and words (environmental print).
- Demonstrates understanding of letter/sound relationship.
- Prints with upper case letters.
- Matches letters to sounds.
- Uses beginning consonants to make words.
- Uses beginning and ending consonants to make words.
- Pretends to read own writing.
- Sees self as writer.
- Takes risks with writing.

Developing:
- Writes 1-2 sentences about a topic.
- Writes names and familiar words.
- Generates own ideas for writing.
- Writes from top to bottom, left to right, and front to back.
- Intermixes upper and lower case letters.
- Experiments with capitals.
- Experiments with punctuation.
- Begins to use spacing between words.
- Uses growing awareness of sound segments (e.g., phonemes, syllables, rhymes) to write words.
- Spells words on the basis of sounds without regard for conventional spelling patterns.
- Uses beginning, middle, and ending sounds to make words.
- Begins to read own writing.
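One way a continuum like this can yield a growth score is to number the stages and subtract the fall stage from the spring stage. The sketch below is an illustrative assumption, not part of the continuum itself; a district would set its own mapping and cut-offs.

```python
# Illustrative: scoring growth on a developmental writing continuum by
# numbering the stages. Stage names follow the slide; the numeric
# mapping is a local assumption, not part of the published continuum.
STAGES = {"Pre-conventional": 1, "Emerging": 2, "Developing": 3}

def growth(fall_stage: str, spring_stage: str) -> int:
    """Number of stages moved between the fall and spring writing samples."""
    return STAGES[spring_stage] - STAGES[fall_stage]

print(growth("Pre-conventional", "Developing"))  # 2 stages of growth
print(growth("Emerging", "Emerging"))            # 0 stages of growth
```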

Protocols to Use with Implemented Assessments
- Floor and Ceiling Effects
- Validating the Quality of Multiple Choice Questions
- Inter-Rater Reliability with Rubrics and Scoring Guides
- Low-Medium-High Looking at Student Work Protocol (calibration, developing exemplars, developing an action plan)

FAQ from DESE

Q: Do the same numbers of students have to be identified as having high, moderate, and low growth?
A: There is no set percentage of students who need to be included in each category. Districts should set parameters for high, moderate, and low growth using a variety of approaches.

Q: How do I know what low growth looks like?
A: Districts should be guided by the professional judgment of educators. The guiding definition of low growth is less than a year's worth of growth relative to academic peers, while high growth is more than a year's worth of growth. If the course meets for less than a year, districts should make inferences about a year's worth of growth based on the growth expected during the time of the course.

Q: Can I change scoring decisions when we use a DDM in the second year?
A: It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time.

Q: Will parameters of growth be comparable from one district to another?
A: Different assessments serve different purposes. While statewide SGPs will provide a consistent metric across the Commonwealth and allow for district-to-district comparisons, DDMs are selected…

Calculating Scores: what you need to understand as you are creating assessments

Examples of MCAS scaled-score changes and the SGPs attached to them:
- … to 244 / 25 SGP
- 230 to 230 / 35 SGP
- 214 to 225 / 92 SGP

A flat scaled score (230 to 230) can still earn a moderate SGP, and a modest 11-point gain (214 to 225) can represent more growth than 92 percent of academic peers: SGP measures growth relative to academic peers, not achievement.

Median Student Growth Percentile

Imagine that the students listed below are all the students in your 6th grade class, sorted from lowest to highest SGP. The point where 50% of students have a higher SGP and 50% have a lower SGP is the median.

Last name   SGP
Lennon        6
McCartney    12
Starr        21
Harrison     32
Jagger       34
Richards     47
Crosby       55   <- median SGP for the 6th grade class
Stills       61
Nash         63
Young        74
Joplin       81
Hendrix      88
Jones        95
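To reproduce that calculation for any roster, Python's statistics module does the work; a minimal sketch using the SGPs from the table above.

```python
# Median SGP: sort the class's SGPs and take the middle value
# (statistics.median averages the two middle values for an even count).
from statistics import median

sgps = [6, 12, 21, 32, 34, 47, 55, 61, 63, 74, 81, 88, 95]
print(median(sgps))  # 55, the educator's median SGP for this class
```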

Sample Cut Score Determination (for local assessments)
- For each student, compute the difference between the pre-test and post-test scores.
- Sort the students' difference scores from low to high.
- Set a cut score for LOW growth: the lowest ___% of difference scores.
- Set a cut score for HIGH growth: the highest ___% (e.g., the top 20%).
- The teacher's score is based on the MEDIAN score of her class for each DDM.
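A sketch of that procedure under stated assumptions: the pre/post scores are invented, and the 20% cut at each end merely fills in the blanks above; districts set their own percentages.

```python
# Cut-score sketch for a local pre/post DDM. The 20% cuts and the
# scores are illustrative assumptions; districts set their own parameters.
from statistics import median

pre =  [42, 55, 61, 48, 70, 66, 52, 58]
post = [50, 59, 80, 51, 74, 85, 60, 69]

gains = sorted(p2 - p1 for p1, p2 in zip(pre, post))  # low to high
n = len(gains)
low_cut = gains[int(n * 0.2)]    # lowest ~20% fall below this: LOW growth
high_cut = gains[int(n * 0.8)]   # highest ~20% reach this: HIGH growth

def category(gain: int) -> str:
    if gain < low_cut:
        return "LOW"
    if gain >= high_cut:
        return "HIGH"
    return "MODERATE"

for g in gains:
    print(g, category(g))

# The educator's score for this DDM is based on the class MEDIAN gain.
print("Class median gain:", median(gains))
```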

Important Perspective It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time. We are all learners in this initiative.

Next Steps Today
- Begin to develop common assessments
- Consider rigor and validity (Handout Rubrics)
- Develop rubrics (consider scoring concerns)
- Develop common expectations for directions (to teachers)
Other Important Considerations:
- Consider when assessments will be given
- The amount of time they will take
- The impact on the school

Handout Rubrics
- Bibliography: sample exams; sample texts
- Rubrics
- Types of questions (multiple choice, essay, performance)
- Reliability: will you design two exams, pre- and post-?
- Ultimate validity: does it assess what it says it does? How does it relate to other data?
- Step-by-step, precise considerations (DESE)
- Quality Rubric (all areas)
- Protocol for determining growth scores