District Determined Measures

Acton-Boxborough Day 2: District Determined Measures. December 15, 2014. Dr. Deborah Brady, dbrady3702@msn.com

Do Now: Getting Online
You may want to download the following from http://ddmsbrady.wikispaces.com:
- The Excel file (for calculating local DDMs)
- The Grade 4 file (for looking at student work)
- The Agenda, Mock Assessment Protocol, and Checklist

Agenda
I. Collecting DDMs, Assuring High Quality
- Coversheet and Checklist examples
- The good, the bad, the ugly
- "Mock" evaluation of sample DDMs
II. Scoring DDMs: Calibration and Calculations
- Group work: calibration protocols (calibrating with rubrics)
- Group work: Excel calculation of pre-post, rubrics, and MCAS SGP
Lunch
III. Time to work on your district's plan
- Communication
- Designation and documentation of DDMs
- Assessment of DDMs
- Analysis of pre- and post-tests
- Calculating individual teacher growth scores
1:00 Optional: Union negotiations (30 minutes, table talk)
1:30 Optional: Indirect measures (administrator, guidance counselor, nurse, school psychologist, for example)
2:00 Optional: Your choice

Why Flunking Exams Is Actually a Good Thing
The excitement around pre-finals is rooted in the fact that the tests appear to improve subsequent performance in topics that are not already familiar, whether geography, sociology, or psychology; at least they do so in experiments in controlled laboratory conditions. A just-completed study, the first of its kind, carried out by the UCLA psychologist Elizabeth Ligon Bjork, found that in a live classroom of Bjork's own students, pretesting raised performance on final-exam questions by an average of 10 percent compared with a control group. "That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward."
The full article, "Why Flunking Exams Is Actually a Good Thing" (New York Times), is on the wiki: http://www.nytimes.com/2014/09/07/magazine/why-flunking-exams-is-actually-a-good-thing.html

Consistency in Directions for Administering Assessments
Directions to teachers need to define rules for giving support, dictionary use, and so on. What can be done? What cannot? ("Are you sure you are finished?") How much time? Accommodations and modifications?

Examples: The Good, the Bad, the Ugly
Scores:
- Thumbs up: all is good
- Thumbs horizontal: some questions
- Thumbs down: needs significant work

Quick Reminder: Assessment Quality Requirements and Definitions from DESE (see Checklist)
- Alignment to the Frameworks and to district curriculum content and/or district standards
- Rigor
- Comparability across all classes and in all disciplines
- "Substantial" assessment of the course: core content and skills
- Modifications are allowed, as with MCAS
Table Vote: Thumbs UP? Halfway? DOWN?

Learning Skills Criteria (Special Education)
Individual goals; measured weekly; permanent folder
- Notes
- Planner
- Work/action plan
- Flexible when necessary
- Prepared for class (materials, work)
- Revises work
- Follows instructions
- Uses time well
- Gets to work
- Asks for help when needed
- Advocates for self
- Moving toward independence
- Works collaboratively
Table Vote: Thumbs UP? Halfway? DOWN?

Essay Prompt from Text
Read a primary source about Mohammed, based on Mohammed's wife's memories of her husband. Essay: Identify and describe Mohammed's most admirable quality based on this excerpt. Then select someone from your life who has this quality. Identify who they are and describe how they demonstrate this trait.
What's wrong with this prompt, given a primary source and a district-required text-based question?
Table Vote: Thumbs UP? Halfway? DOWN?

Scoring Guides from Text
A scoring guide from a textbook for building a Lou Vee Air Car. Is it good enough to ensure inter-rater reliability?
- Lou Vee Air Car built to specs: 50 points
- Propeller spins freely: 60 points
- Distance car travels: 1 m = 70, 2 m = 80, 3 m = 90, 4 m = 100
- Best distance: 10, 8, or 5 points
- Best car: 10, 8, or 5 points
- Best all-time distance, all classes: +5
235 points total
Table Vote: Thumbs UP? Halfway? DOWN?

PE Rubric in Progress: Grade 2, overhand throw and catching.
Table Vote: Thumbs UP? Halfway? DOWN?

Table Vote: Thumbs UP? Halfway? DOWN?

Music: Teacher and Student Instructions
Table Vote: Thumbs UP? Halfway? DOWN?

Scoring
- Validity: does it test what it says it tests?
- Are the assessors' ratings calibrated?
- Floor and ceiling effects
- Rubric concerns
- Validity assessment after the test is given
What happens to these scores and assessments?
- Stored as an L, M, or H for the district
- Used as a discussion topic with the evaluator

Beware Rubrics! A Holistic Rubric Shows Progress across a Scale (Continuum, Descriptors)
Details:
1. No improvement in the level of detail. One is true: no new details across versions; new details are added but not included in future versions; a few new details are added that are not relevant, accurate, or meaningful.
2. Modest improvement in the level of detail: there are a few details included across all versions; many added details are included, but not consistently, or none are improved or elaborated upon; there are many added details, but several are not relevant, accurate, or meaningful.
3. Considerable improvement in the level of detail. All are true: there are many examples of added details across all versions; at least one example of a detail that is improved or elaborated in future versions; details are consistently included in future versions; the added details reflect relevant and meaningful additions.
4. Outstanding improvement in the level of detail: on average there are multiple details added across every version; there are multiple examples of details that build and elaborate on previous versions; the added details reflect the most relevant and meaningful additions.
Example taken from Austin, a first grader at Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts

Criterion-Referenced Rubric and Raw Scores (or % of 100)
With four criteria, each worth up to 25 points, a uniform level across all four converts as follows: 4(25) = 100, 4(22) = 88, 4(18) = 72, 4(15) = 60.
A student scoring 25, 18, 22, and 15 across the four criteria earns 25 + 18 + 22 + 15 = 80%.
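A minimal arithmetic sketch of this conversion, assuming four equally weighted criteria worth up to 25 points each (the weighting the slide's 4(25) = 100 row implies):

```python
# Sketch: converting criterion-referenced rubric scores to a percent of 100.
# Assumes four equally weighted criteria, each worth up to 25 points,
# matching the 4 x 25 = 100 weighting shown on the slide.

def rubric_to_percent(criterion_scores, max_per_criterion=25):
    """Sum raw criterion scores and express them as a percent of the maximum."""
    total = sum(criterion_scores)
    maximum = max_per_criterion * len(criterion_scores)
    return 100 * total / maximum

# The slide's example: 25 + 18 + 22 + 15 = 80 out of 100.
print(rubric_to_percent([25, 18, 22, 15]))  # 80.0
```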

Rubric "Cut Scores"
Create a "growth" rubric and describe a typical year's growth; translate scores into a percentage of 100 (see www.roobrix.com).

Calibration Protocol

Considerations for Scoring Student Work
Districts will need to determine fair, efficient, and accurate methods for scoring students' work. (Use consistent directions for teachers.) DDMs can be scored by the educators themselves, groups of teachers within the district, external raters, or commercial vendors. For districts concerned about the quality of scoring when educators score their own students' work, processes such as randomly re-scoring a selection of student work to ensure proper calibration, or using teams of educators to score together, can improve the quality of the results. When an educator plays a large role in scoring his or her own students' work, a supervisor may also choose to take the scoring process into account when making a determination of a Student Impact Rating.
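The random re-scoring check mentioned above can be scripted. A minimal sketch, assuming a flat list of student IDs; the 10% re-score fraction is an illustrative assumption, not a DESE requirement:

```python
import random

# Sketch: selecting a random subset of student work for re-scoring,
# one of the quality-control processes described above. The 10%
# fraction is an assumption, not a fixed rule.

def pick_rescore_sample(student_ids, fraction=0.10, seed=None):
    """Return a random sample of student IDs whose work will be re-scored."""
    rng = random.Random(seed)
    k = max(1, round(fraction * len(student_ids)))
    return rng.sample(student_ids, k)

students = [f"S{n:03d}" for n in range(1, 101)]
print(pick_rescore_sample(students, fraction=0.10, seed=42))  # 10 of 100 IDs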

Mock Calibration
1. All of the readers come together and are provided student compositions for calibration. These compositions can be pre-selected by the facilitator from all of the writing submitted, or can be provided by the teachers. To assure fairness in assessment, teachers do not evaluate their own students' compositions, but are given their students' work back once the compositions are graded and the scores are entered for the entire class. The purpose of the calibration meeting is to make sure that all evaluators are assessing student work on the same scale. In addition, the papers used for calibration become the exemplars that all teachers will use during scoring; rubrics alone are not sufficient for precise assessment. Encourage each scorer to make notations on these compositions. Differences are expected and are discussed until it is clear that all compositions are assessed with the same standards.
2. Time is given for each scorer to read the first composition, score it using the rubric provided (local, textbook, MCAS, PARCC, or 6-Trait rubrics are all appropriate), and enter the score on a chart like the one illustrated below. After each scorer has entered his or her score, the facilitator discusses the reasons for the scores. Scorers may need to point to examples within the compositions that support their scores.
3. After there is consensus on the first paper, the scorers go on to a second and third, until there is a composition that represents each level of the rubric. Generally, after the first one or two compositions are calibrated, the process goes quickly.

Mock Calibration
Ask teachers to select (or pre-select yourself) two low, two average, and two high compositions, six altogether; try to select a clear range. Photocopy all of the compositions and ask teachers to evaluate the top one using both the 6-level rubric and the 4-level rubric, entering their scores under the first composition. Suggest that they comment on their copies. Continue with the calibration until you have an exemplar for every level. Then discuss the specific reasoning behind each score. Assume each person has a good reason; the purpose is to work toward consensus.
Sample chart (six scorers' entries for the first, exemplar composition; columns continue for the 2nd through 6th compositions):
Content: 2, 2, 3, 2, 4, 2
Conventions: 1, 1, 1, 1, 2, 1
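A minimal sketch of keeping this calibration chart in code rather than on paper, using the six scorers' example entries above; the spread-greater-than-one flag is an illustrative threshold, not part of the protocol:

```python
# Sketch: recording calibration scores and flagging compositions that
# need discussion. The scores below are the slide's example entries for
# the first (exemplar) composition; a spread of more than one rubric
# level suggests the scorers are not yet calibrated.

calibration_chart = {
    "Composition 1 (exemplar)": {
        "Content":     [2, 2, 3, 2, 4, 2],
        "Conventions": [1, 1, 1, 1, 2, 1],
    },
}

for composition, dimensions in calibration_chart.items():
    for dimension, scores in dimensions.items():
        spread = max(scores) - min(scores)
        status = "discuss to consensus" if spread > 1 else "acceptable agreement"
        print(f"{composition} / {dimension}: spread {spread} -> {status}")
```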

MCAS (Two Holistic) Rubrics
Topic/Idea Development (score points 1-6):
1. Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2. Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3. Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4. Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5. Full topic/idea development; logical organization; strong details; appropriate use of language
6. Rich topic/idea development; careful and/or subtle organization; effective/rich use of language
Standard English Conventions (score points 1-4):
1. Errors seriously interfere with communication AND little control of sentence structure, grammar and usage, and mechanics
2. Errors interfere somewhat with communication, and/or too many errors relative to the length of the essay or complexity of sentence structure, grammar and usage, and mechanics
3. Errors do not interfere with communication, and/or few errors relative to length of essay or complexity of sentence structure, grammar and usage, and mechanics
4. Control of sentence structure, grammar and usage, and mechanics (length and complexity of essay provide opportunity for student to show control of standard English conventions)

4th Grade Prompt (http://www.doe.mass.edu/mcas/student/2014/question)
You are finally old enough to baby-sit, and your first job is this afternoon! You will be spending the entire afternoon with a one-year-old. When you open the door you realize that instead of watching a one-year-old child, you will be watching a one-year-old elephant! Write a story about spending your afternoon with a baby elephant. Give enough details to show readers what your afternoon is like baby-sitting the elephant.

2014 MCAS Grade 4 English Language Arts Composition: Topic/Idea Development, Score Point 3
This composition is rudimentary in topic development and organization. The straightforward introduction moves immediately to the surprise of discovering that the "baby" is a baby elephant. From here, though, only basic supporting details are demonstrated as this composition of five paragraphs unfolds. There is an interesting "snoring scenario" which briefly captures the babysitter's personality in his or her impatience with the elephant: "It was as if there were 100 bells surrounding the house and all ringing at the same time. I covered my ears with pillows, it didn't work. I put ear muffs on, it still didn't work. Finally I just woke him up. He was pretty upset." The job ends as the mom comes home and there is a brief exchange of mildly humorous dialogue. The conclusion is simplistic, reiterating that the experience was not enjoyable.

Calculating Growth Scores: MCAS and Local. What you need to understand as you are creating assessments.

Growth Score FAQs from DESE
Q: Do the same numbers of students have to be identified as having high, moderate, and low growth?
A: There is no set percentage of students who need to be included in each category. Districts should set parameters for high, moderate, and low growth using a variety of approaches.
Q: How do I know what low growth looks like?
A: Districts should be guided by the professional judgment of educators. The guiding definition of low growth is that it is less than a year's worth of growth relative to academic peers, while high growth is more than a year's worth of growth. If the course meets for less than a year, districts should make inferences about a year's worth of growth based on the growth expected during the time of the course.
Q: Can I change scoring decisions when we use a DDM in the second year?
A: It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time.
Q: Will parameters of growth be comparable from one district to another?
A: Different assessments serve different purposes. While statewide SGPs will provide a consistent metric across the Commonwealth and allow for district-to-district comparisons, DDMs are selected locally and are not designed to support such comparisons.

MCAS SGP: Local Manipulation of Scores (grades 4-8; ELA or Math; not grade 10)

Excel File Tour

Sample Cut Score Determination (for local assessments)
Enter each student's pre-test, post-test, and difference scores (for example: pre-test 20, post-test 35, difference 15), and sort the differences low to high. A cut score near the bottom of the ranked list (lowest ___%) marks LOW growth; a cut near the top (for example, the top 20%) marks HIGH growth. The teacher's score is based on the MEDIAN score of her class for each DDM; in the slide's example, the class median falls between the two cuts.
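A minimal sketch of this sort-and-cut step, assuming raw pre/post score lists; the 15% cut fraction stands in for the blank "___%" above and is an assumption, not a fixed value:

```python
import statistics

# Sketch: sort gain scores (post minus pre) low to high, take the class
# median as the teacher's DDM score, and mark the lowest and highest
# slices as LOW / HIGH growth.

def classify_gains(pre, post, cut_fraction=0.15):
    gains = sorted(b - a for a, b in zip(pre, post))
    k = max(1, round(cut_fraction * len(gains)))
    return {
        "sorted_gains": gains,
        "median_teacher_score": statistics.median(gains),
        "low_growth_cut": gains[k - 1],   # at or below -> LOW growth
        "high_growth_cut": gains[-k],     # at or above -> HIGH growth
    }

# Illustrative data (not from the slide).
pre  = [20, 25, 30, 40, 50, 60, 65]
post = [35, 50, 60, 70, 75, 80, 85]
print(classify_gains(pre, post))
```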

Measuring Growth Example: Fountas and Pinnell P to Q; N to P; D to K

Whole-Grade Example (Grade 3 DDM: 103 third graders, Teachers A-F)
All 103 gain scores for the grade are pooled and ranked (the slide lists the full ranked distribution, from 1 to 26). From that pooled list: the median for the whole Grade 3 DDM is 12.0; the cut score for the lowest 15% is 6.5; the cut score for the highest 15% is 16. Each teacher's class median is then compared to those cuts: below 6.5 = LOW growth; between 6.5 and 16 = Moderate; above 16 = High. The teacher medians shown on the slide are 6.5, 9, 12, 10, and 16.

Fountas and Pinnell: Growth for Each Student Is Based on 10 Months of Growth (Second Grade; 10 months = a year)
- Student ending at level Q (Above Benchmark), moving P to Q: 7 months of growth = LOW growth
- Student ending at level P (At Benchmark), moving N-O-P: 10 months of growth = MODERATE growth
- Student ending at level K (Below Benchmark), moving D-E-F-G-H-I-J-K: 17 months of growth = HIGH growth
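A minimal sketch of the classification logic in this table, using the slide's months-of-growth values; how level movement (e.g., P to Q) converts into months is assumed to come from the district's F&P expectation tables and is not recomputed here:

```python
# Sketch: classifying F&P growth against the slide's "10 months = a
# year's growth" convention. A district would likely define a moderate
# band rather than the single exact-match point used here; this simply
# reproduces the slide's three examples.

def classify_growth(months, year=10):
    if months < year:
        return "LOW growth"
    if months == year:
        return "MODERATE growth"
    return "HIGH growth"

slide_examples = {"P -> Q": 7, "N -> P": 10, "D -> K": 17}
for move, months in slide_examples.items():
    print(f"{move}: {months} months -> {classify_growth(months)}")
```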

Summary: DDM Process for Determining L, M, H for Every Teacher
Whole grade level or course:
1. Score the entire grade level or course, or take the MCAS growth scores for all students.
2. Rank the scores from highest to lowest (post minus pre, or MCAS SGP).
3. Identify the median score for the entire group.
4. Determine the "cut" scores for local assessments (for MCAS, 35 and 65 for classrooms).
Individual teacher:
1. Select the students for each teacher.
2. Rank the scores from highest to lowest.
3. Identify the median score.
4. Is the median below or above the "cut" score? Is it in the middle?
5. Don't forget: roster verification might change the specific scores and, therefore, change the median.
6. Distribute scores to teachers for each DDM.
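A minimal end-to-end sketch of this summary procedure, assuming gain scores are already computed per student; the rosters and the 15% tails are illustrative assumptions standing in for a district's choices:

```python
import statistics

# Sketch of the summary above: pool all gain scores for the grade or
# course, derive LOW/HIGH cut scores from the pooled distribution, then
# classify each teacher by the median of her own students' gains.

def grade_cut_scores(all_gains, tail=0.15):
    ranked = sorted(all_gains)
    k = max(1, round(tail * len(ranked)))
    return ranked[k - 1], ranked[-k]          # (low cut, high cut)

def teacher_rating(teacher_gains, low_cut, high_cut):
    m = statistics.median(teacher_gains)
    if m < low_cut:
        return "LOW"
    if m > high_cut:
        return "HIGH"
    return "MODERATE"

# Hypothetical rosters of per-student gains.
rosters = {
    "Teacher A": [5, 6, 7, 13.5],
    "Teacher B": [7.5, 10, 12.5, 13.6],
    "Teacher C": [11.5, 12, 12.5, 16],
}
pooled = [g for gains in rosters.values() for g in gains]
low_cut, high_cut = grade_cut_scores(pooled)
for teacher, gains in rosters.items():
    print(teacher, teacher_rating(gains, low_cut, high_cut))
```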

Mock Scoring, Storing, Determining Cut Scores Protocol
1. Assess all students for the course or grade level.
2. Enter student names, teacher names, pre-test scores, post-test scores, and the gain from pre-test to post-test.
3. Using the SORT function, rank all assessments from highest to lowest, then determine the local cut scores. DESE recommends 1.5 standard deviations from 50; local districts have selected 10%, 15%, and 20% as their "cut scores."
4. Look at samples of student work just above and just below the cut scores. Professional judgment about the appropriateness of this number is then used to set the local cut scores that determine Low, Moderate, and High growth. This number may vary from assessment to assessment and from year to year, based upon the district's determination and the professional judgment of the district.
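The two cut-score approaches in step 3 can both be expressed directly. A minimal sketch, assuming SGP-style scores centered on 50 for the standard-deviation approach; the sample scores and the 15% fraction are illustrative:

```python
import statistics

# Sketch of step 3's two cut-score approaches: DESE's recommendation
# (as quoted above) of 1.5 standard deviations from 50, and the simpler
# percentage slice of the ranked list that some districts use instead.

def sd_cut_scores(scores, center=50, sds=1.5):
    sd = statistics.stdev(scores)
    return center - sds * sd, center + sds * sd

def percent_cut_scores(scores, fraction=0.15):
    ranked = sorted(scores)
    k = max(1, round(fraction * len(ranked)))
    return ranked[k - 1], ranked[-k]

sgps = [12, 25, 31, 40, 44, 50, 52, 58, 63, 71, 80, 92]
print("1.5 SD cuts:", sd_cut_scores(sgps))
print("15% cuts:   ", percent_cut_scores(sgps))
```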

Using Excel
Excel (really simple method):
1. Enter data.
2. Simple pre-post formula.
3. Cut and paste values.
4. Sort highest to lowest.
Tabs include (quick tour):
- Fountas and Pinnell sample for 6 teachers, 103 students, with the median for the full assessment and for each teacher, and with High, Moderate, and Low determinations
- Pre/post-test calculations (for three teachers)
- Rubric pre/post (for three teachers)
- MCAS SGP (Student Growth Percentile) calculations for three teachers
- A "Test" sheet that calculates the "gain" but does not determine cut scores
Use the "Test" tab to enter your local data, or use the data provided.

Time to Work with Your Team
Online materials; Excel templates.