Acton-Boxborough Day 2 District Determined Measures December 15, 2014 Dr. Deborah Brady


Do Now: Getting Online
You may want to download from Boxborough:
- The Excel file (for calculating local DDMs)
- The Grade 4 file (for looking at student work)
- The Agenda, Mock Assessment Protocol, Checklist

Agenda
I. Collecting DDMs, Assuring High Quality
- Coversheet and Checklist examples
- The good, the bad, the ugly
- "Mock" evaluation of sample DDMs
- Speed Sharing
II. Scoring DDMs: Calibration and Calculations
- Group work: Calibration protocols (calibrating with rubrics)
- Group work: Excel calculation of pre-post, rubrics, and MCAS SGP
Lunch
III. Time to work on your district's plan for:
- Communication
- Designation and documentation of DDMs
- Assessment of DDMs
- Analysis of pre- and post-tests
- Calculating individual teacher growth scores
1:30 Optional: Indirect measures (administrator, guidance counselor, nurse, school psychologist, for example)
2:00 Optional: Your choice

Speed Sharing
Sharers go to "your own" table. Travelers move in groups of 3, rotating in sequence by adding one to the present table's number.
The Protocol:
1. (1 minute) Sharers describe your DDM: grade, content area or SISP, rigor, rubrics
2. (2 minutes) Travelers ask questions and discuss
3. (1 minute) Sharers summarize the discussion

Example: Generic SISP Rubric, Direct (for Students)

Criterion: Independence
- At Risk: Needs frequent prompting, encouragement, and support to begin, continue, and finish
- NI (Needs Improvement): Needs occasional prompts, encouragement, and support to begin, continue, and finish
- Proficient: Generally starts and finishes on own, but sometimes needs prompting at one point in the process
- Advanced: Completely responsible for all aspects of the task, from starting to addressing details to checking accuracy to finishing on time

Example: Generic SISP Rubric for Team, Indirect (for process improvement)

Criterion: Improving attendance for an at-risk subgroup (may include students who are physically fragile, emotionally fragile, or who are absent frequently for reasons that aren't clear)
Interventions: Nurse: counseling. School Psych/Guidance: counseling. Principal/Office: follow-ups with families, with support from this team.

- Present: Students who have illnesses are often absent or out of class because they don't understand their disease, etc. Students who have school anxiety are absent frequently or retreat to the office and do not participate in class. Frequent absentees often lose interest and connection with school and friends.
- Moderate Improvement (what is hoped for): Students with illnesses will attend and participate more than last year. Students with social/emotional problems will attend and participate more than last year. Frequent absentees will attend more frequently.
- High Improvement (more improvement than expected): Rare absences because of physical concerns, social-emotional concerns, or all other factors.

Example: Generic SISP Rubric for Team, Indirect (for quality/process improvement)

Criterion: Improving the assessment process for pK-2 referrals for Special Education

- Present: Many students are referred to be tested for SPED without first being provided classroom-based interventions. The process varies among all of the primary schools.
- Moderate Improvement (what is hoped for): All Student Support Teams will use a consistent process. The specialists and SPED staff will provide consulting or workshops to support the classroom RTI process.
- High Improvement (more improvement than expected): All referrals result only after Level I interventions have been tried and assessed for at least 6 weeks.

Why Flunking Exams Is Actually a Good Thing
The excitement around pre-finals is rooted in the fact that the tests appear to improve subsequent performance in topics that are not already familiar, whether geography, sociology, or psychology. At least they do so in experiments under controlled laboratory conditions. A just-completed study, the first of its kind, carried out by the UCLA psychologist Elizabeth Ligon Bjork, found that in a live classroom of Bjork's own students, pretesting raised performance on final-exam questions by an average of 10 percent compared with a control group. "That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward."
The full article, "Why Flunking Exams Is Actually a Good Thing" (New York Times, Sept. 7, 2014), is on the wiki.

Consistency in Directions for Administering Assessments
- Directions to teachers need to define rules for giving support, dictionary use, etc.
- What can be done? What cannot?
- "Are you sure you are finished?"
- How much time?
- Accommodations and modifications?

Examples: The Good, the Bad, the Ugly
Scoring:
- Thumbs up: all is good
- Thumbs horizontal: some questions
- Thumbs down: needs significant work

Quick Reminder: Assessment Quality Requirements and Definitions from DESE (see Checklist)
- Alignment to Frameworks and district curriculum content and/or district standards
- Rigor
- Comparability across all classes and in all disciplines
- "Substantial" assessment of the course: core content and skills
- Modifications are allowed, as with MCAS
Table Vote: Thumbs up? Halfway? Down?

Learning Skills Criteria (Special Education)
Individual goals; measured weekly; kept in a permanent folder
- Notes
- Planner
- Work/Action Plan
- Flexible when necessary
- Prepared for class (materials, work)
- Revises work
- Follows instructions
- Uses time well
- Gets to work
- Asks for help when needed
- Advocates for self
- Moving toward independence
- Works collaboratively
Table Vote: Thumbs up? Halfway? Down?

Essay Prompt from Text
Read a primary source about Mohammed, based on Mohammed's wife's memories of her husband.
Essay: Identify and describe Mohammed's most admirable quality based on this excerpt. Then select someone from your life who has this quality. Identify who they are and describe how they demonstrate this trait.
What's wrong with this prompt, given the requirement for a primary source and a district-required text-based question?
Table Vote: Thumbs up? Halfway? Down?

Scoring Guides from Text
A scoring guide from a textbook for building a Lou Vee Air Car:
- Lou Vee Air Car built to specs (50 points)
- Propeller spins freely (60 points)
- Distance car travels: 1 m = 70, 2 m = 80, 3 m = 90, 4 m = 100
- Best distance (10, 8, 5)
- Best car (10, 8, 5)
- Best all-time distance, all classes (+5)
- 235 points total
Is it good enough to ensure inter-rater reliability?
Table Vote: Thumbs up? Halfway? Down?

PE Rubric in Progress. Grade 2 for overhand throw and catching. Table Vote Thumbs UP? Halfway? DOWN?

Table Vote Thumbs UP? Halfway? DOWN?

Music: Teacher and Student Instructions Table Vote Thumbs UP? Halfway? DOWN?

Scoring
- Validity: does it test what it says it tests?
- Are the assessors' ratings calibrated?
- Floor and ceiling effects
- Rubric concerns
- Validity assessment after the test is given
- What happens to these scores and assessments?
  - Stored as an L, M, or H for the district
  - Used as a discussion topic with the evaluator

Holistic Rubrics Show Progress Across a Scale: Continuum and Descriptors (criterion: Details)

No improvement in the level of detail (one is true):
- No new details across versions
- New details are added, but not included in future versions
- A few new details are added that are not relevant, accurate, or meaningful

Modest improvement in the level of detail (one is true):
- There are a few details included across all versions
- Many added details are included, but not consistently, or none are improved or elaborated upon
- There are many added details, but several are not relevant, accurate, or meaningful

Considerable improvement in the level of detail (all are true):
- There are many examples of added details across all versions
- There is at least one example of a detail that is improved or elaborated in future versions
- Details are consistently included in future versions
- The added details reflect relevant and meaningful additions

Outstanding improvement in the level of detail (all are true):
- On average, there are multiple details added across every version
- There are multiple examples of details that build and elaborate on previous versions
- The added details reflect the most relevant and meaningful additions

Example based on Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning.

Criterion-Referenced Rubric and Raw Scores, or % of 100
Multiply a raw rubric score (out of 25) by 4 to convert it to a percent of 100:
4(25) = 100   4(22) = 88   4(18) = 72   4(15) = 60
The average of these four converted scores is 80%.
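A minimal sketch of this conversion; the helper name and the sample raw scores are illustrative, not part of the workshop materials:

```python
# Convert raw rubric scores (out of 25 points) to a percent of 100
# by multiplying by 4, as in 4(25)=100, 4(22)=88, 4(18)=72, 4(15)=60.

def to_percent(raw_score, max_points=25):
    """Scale a raw rubric score to a percentage of 100."""
    return raw_score * (100 / max_points)

# Hypothetical raw scores for four criteria of one student
raw_scores = [25, 22, 18, 15]
percents = [to_percent(s) for s in raw_scores]
print(percents)                       # [100.0, 88.0, 72.0, 60.0]
print(sum(percents) / len(percents))  # average: 80.0
```

The same arithmetic is what the Excel file does with a one-cell formula per criterion.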

Rubric "Cut Scores"
- Create a "growth" rubric and describe a typical year's growth
- Translate it into a percentage of 100

Reading and Writing Rubrics: One Test, Two DDMs

Reading Rubric
- Understanding and Analysis: from basic understanding with little or no analysis, to sustained and convincing analysis
- Author's Craft, Understanding and Analysis: from virtually no reference to author's craft and how it works, to clear analysis of the author's use of literary devices, language, etc.

Writing Rubric
- Organization and Coherence: from little organization and no coherence, to ideas clearly and thoughtfully organized with a clear, coherent plan for the entire essay and each paragraph
- Varied, Clear, Accurate Vocabulary and Sentences: from little awareness of register, style, or audience, with many grammatical errors that mar the ideas, to clear, effective, sometimes nuanced choice of words and sentences, appropriate register and awareness of audience, and few, if any, mechanical errors

Calibration Protocol

Considerations for Scoring Student Work
- Districts will need to determine fair, efficient, and accurate methods for scoring students' work. (Use consistent directions for teachers.)
- DDMs can be scored by the educators themselves, groups of teachers within the district, external raters, or commercial vendors.
- For districts concerned about the quality of scoring when educators score their own students' work, processes such as randomly re-scoring a selection of student work to ensure proper calibration, or using teams of educators to score together, can improve the quality of the results.
- When an educator plays a large role in scoring his or her own students' work, a supervisor may also choose to take the scoring process into account when making a determination of Student Impact.

Mock Calibration

1. All of the readers come together and are provided student compositions for calibration. These compositions can be pre-selected by the facilitator from all of the writing submitted, or can be provided by the teachers. To assure fairness in assessment, teachers will not evaluate their own students' compositions, but will be given their students' work once the compositions are graded and the scores are entered for the entire class. The purpose of the calibration meeting is to make sure that all evaluators are assessing student work on the same scale. In addition, the papers used for calibration become the exemplars that all teachers will use during scoring; rubrics alone are not sufficient for precise assessment. Encourage each scorer to make notations on these compositions. Differences are expected and will be discussed until it is clear that all compositions are assessed with the same standards.

2. Time is given for each scorer to read the first composition, use the rubric provided (local, textbook, MCAS, PARCC, or 6-Trait rubrics are all appropriate) to score it, and enter the score on a chart like the one illustrated below. After each scorer has entered his or her score, the facilitator discusses the reasons for the scores. Scorers may need to find examples for their scores within the compositions.

3. After there is consensus on the first paper, the scorers go on to a second and third, until there is a composition that represents each level of the rubric. Generally, after the first one or two compositions are calibrated, the process goes quickly.

Mock Calibration
- Ask teachers to select (or you can pre-select) 2 low compositions, 2 average compositions, and 2 high compositions (6 altogether). Try to select a clear range.
- Photocopy all of the compositions and ask teachers to evaluate the top one using the 6-level rubric and the 4-level rubric, entering their scores under the first composition. Suggest that they comment on their copies. Continue with the calibration until you have an exemplar for all levels.
- Then begin the discussion of the specific reasoning for each score.
- Assume each person has a good reason; the purpose is to work toward consensus.

Sample calibration chart:
Composition:  1st (Exemplar)  2nd  3rd  4th  5th  6th
Content:      2, 2, 3, 2, 4, 2
Conventions:  1, 1, 1, 1, 2, 1
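During calibration, one quick informal check is to flag any paper whose ratings spread more than a point apart, since those are the papers that need discussion before consensus; the tolerance and the sample scores below are illustrative:

```python
# Flag rating sets whose scores spread by more than 1 point, signaling
# that more discussion is needed before the scorers reach consensus.

def needs_discussion(scores, tolerance=1):
    """True if the ratings differ by more than `tolerance` points."""
    return max(scores) - min(scores) > tolerance

content     = [2, 2, 3, 2, 4, 2]  # hypothetical: six scorers' Content ratings for one paper
conventions = [1, 1, 1, 1, 2, 1]  # hypothetical: six scorers' Conventions ratings

print(needs_discussion(content))      # True: ratings range from 2 to 4
print(needs_discussion(conventions))  # False: ratings range from 1 to 2
```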

MCAS (2 Holistic) Rubrics

Topic/Idea Development (Content):
1. Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2. Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3. Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4. Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5. Full topic/idea development; logical organization; strong details; appropriate use of language
6. Rich topic/idea development; careful and/or subtle organization; effective/rich use of language

Standard English Conventions:
1. Errors seriously interfere with communication AND little control of sentence structure, grammar and usage, and mechanics
2. Errors interfere somewhat with communication and/or too many errors relative to the length of the essay or complexity of sentence structure, grammar and usage, and mechanics
3. Errors do not interfere with communication and/or few errors relative to length of essay or complexity of sentence structure, grammar and usage, and mechanics
4. Control of sentence structure, grammar and usage, and mechanics (length and complexity of essay provide opportunity for student to show control of standard English conventions)

4th Grade Prompt
You are finally old enough to baby-sit, and your first job is this afternoon! You will be spending the entire afternoon with a one-year-old. When you open the door you realize that instead of watching a one-year-old child, you will be watching a one-year-old elephant! Write a story about spending your afternoon with a baby elephant. Give enough details to show readers what your afternoon is like baby-sitting the elephant.

2014 MCAS Grade 4 English Language Arts Composition: Topic/Idea Development, Score Point 3
This composition is rudimentary in topic development and organization. The straightforward introduction moves immediately to the surprise of discovering that the "baby" is a baby elephant. From here, though, only basic supporting details are demonstrated as this composition of five paragraphs unfolds. There is an interesting "snoring scenario" which briefly captures the babysitter's personality in his or her impatience with the elephant: "It was as if there were 100 bells surrounding the house and all ringing at the same time. I covered my ears with pillows, it didn't work. I put ear muffs on, it still didn't work. Finally I just woke him up. He was pretty upset." The job ends as the mom comes home and there is a brief exchange of mildly humorous dialogue. The conclusion is simplistic, reiterating that the experience was not enjoyable.

Calculating Growth Scores: MCAS and Local
What you need to understand as you are creating assessments

Growth Score FAQs from DESE

Q: Do the same numbers of students have to be identified as having high, moderate, and low growth?
A: There is no set percentage of students who need to be included in each category. Districts should set parameters for high, moderate, and low growth using a variety of approaches.

Q: How do I know what low growth looks like?
A: Districts should be guided by the professional judgment of educators. The guiding definition of low growth is less than a year's worth of growth relative to academic peers, while high growth is more than a year's worth of growth. If the course meets for less than a year, districts should make inferences about a year's worth of growth based on the growth expected during the time of the course.

Q: Can I change scoring decisions when we use a DDM in the second year?
A: It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time.

Q: Will parameters of growth be comparable from one district to another?
A: Different assessments serve different purposes. While statewide SGPs provide a consistent metric across the Commonwealth and allow for district-to-district comparisons, DDMs are selected locally.

MCAS SGP: Local Manipulation of Scores (grades 4-8; ELA or Math; not grade 10)

Excel File Tour

Sample Cut Score Determination (for local assessments)
- Compute each student's difference: post-test minus pre-test.
- Sort the student scores from low to high.
- A teacher's score is based on the MEDIAN score of his or her class for each DDM.
- Cut score for LOW growth: the lowest ___% of scores.
- Cut score for HIGH growth: the highest ___% of scores (e.g., the top 20%).
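One minimal way to compute the gains and percentile-based cut scores described above; the function name, the student scores, and the 20% threshold are illustrative (the percentage is the blank each district fills in):

```python
# Compute pre/post gains, sort them, and derive LOW/HIGH growth cut scores
# from a chosen percentile (here 20%, a local decision).

def cut_scores(pre, post, pct=0.20):
    """Return (low_cut, high_cut) gain values bounding the lowest/highest pct of students."""
    gains = sorted(p2 - p1 for p1, p2 in zip(pre, post))
    k = max(1, int(len(gains) * pct))
    low_cut = gains[k - 1]   # largest gain still in the bottom pct
    high_cut = gains[-k]     # smallest gain still in the top pct
    return low_cut, high_cut

# Hypothetical pre- and post-test scores for ten students
pre  = [40, 55, 60, 42, 70, 65, 50, 58, 45, 62]
post = [50, 70, 72, 45, 88, 80, 62, 73, 49, 75]
low, high = cut_scores(pre, post)
print(low, high)  # prints: 4 15
```

This mirrors the Excel workflow in the workshop file: a gain column, a sort, and a visual choice of where the bottom and top bands begin.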

Measuring Growth Example: Fountas and Pinnell P to Q; N to P; D to K

Example: 103 third graders, six teachers
Class medians: Teacher A = 6.5, Teacher B = 9, Teacher C = 12, Teacher D = 10, Teacher E = 16, Teacher F = 12
Whole-grade DDM: median = 12.0; cut score for the lowest 15% = 6.5; cut score for the highest 15% = 16
Ratings: a class median below 6.5 is LOW growth; between 6.5 and 16 is Moderate; above 16 is High

Fountas and Pinnell: Growth for Each Student Is Based on 10 Months of Growth (10 months = one year)

Second-grade examples:
- Student ending at level Q (above benchmark): moved P to Q, 7 months of growth = LOW growth
- Student ending at level P (at benchmark): moved N, O, P, 10 months of growth = MODERATE growth
- Student ending at level K (below benchmark): moved D, E, F, G, H, I, J, K, 17 months of growth = HIGH growth
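The LOW/MODERATE/HIGH bands above can be sketched as a simple classifier. Note that translating F&P level moves into months is a district calibration question (higher levels take longer than one month each, as the examples show), so this sketch takes months of growth as its input; treating exactly 10 months as MODERATE is a simplification, since in practice MODERATE is usually a band around a year's growth:

```python
# Classify growth given months of F&P growth, where 10 months equals one
# year's expected growth (per the slide). Deriving months from level moves
# requires a district-calibrated months-per-level table, not reproduced here.

def classify_growth(months, expected=10):
    """Return LOW / MODERATE / HIGH relative to a 10-month expected year."""
    if months < expected:
        return "LOW"
    if months == expected:  # simplification: districts usually use a band
        return "MODERATE"
    return "HIGH"

# The slide's three second-grade examples
print(classify_growth(7))   # LOW      (P -> Q, 7 months)
print(classify_growth(10))  # MODERATE (N -> P, 10 months)
print(classify_growth(17))  # HIGH     (D -> K, 17 months)
```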

Summary: DDM Process for Determining L, M, H for Every Teacher

Whole Grade Level or Course
- Score the entire grade level or course, or take the MCAS growth scores for all students
- Rank the scores from highest to lowest (post minus pre, or MCAS SGP)
- Identify the median score for the entire group
- Determine the "cut" scores for local assessments (for MCAS, use SGPs of 35 and 65 for classrooms)

Individual Teacher
- Select the students for each teacher
- Rank the scores from highest to lowest
- Identify the median score
- Is the median below or above the "cut" score? Is it in the middle?
- Don't forget: roster verification might change the specific scores and, therefore, the median
- Distribute scores to teachers for each DDM
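The whole-group and individual-teacher steps above can be sketched end to end for a local assessment; the teacher rosters, gain values, and the 30% cut percentage are hypothetical (chosen so this small demo produces all three ratings; districts have used 10-20%):

```python
# End-to-end sketch: pool all gains, find the grade-level cut scores, then
# rate each teacher L/M/H by comparing the class MEDIAN gain to the cuts.
from statistics import median

def rate_teachers(rosters, pct=0.15):
    """rosters: {teacher: [gain, ...]}. Returns {teacher: 'L'|'M'|'H'}."""
    all_gains = sorted(g for gains in rosters.values() for g in gains)
    k = max(1, int(len(all_gains) * pct))
    low_cut, high_cut = all_gains[k - 1], all_gains[-k]
    ratings = {}
    for teacher, gains in rosters.items():
        m = median(gains)
        if m < low_cut:
            ratings[teacher] = "L"
        elif m > high_cut:
            ratings[teacher] = "H"
        else:
            ratings[teacher] = "M"
    return ratings

# Hypothetical post-minus-pre gains for three teachers' classes
rosters = {
    "A": [2, 3, 4, 5, 6],
    "B": [8, 9, 10, 11, 12],
    "C": [14, 15, 16, 17, 18],
}
print(rate_teachers(rosters, pct=0.30))  # prints: {'A': 'L', 'B': 'M', 'C': 'H'}
```

Roster verification would change the lists passed in for each teacher, which can move a class median across a cut score.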

Mock Scoring, Storing, and Determining Cut Scores: Protocol
1. Assess all students for the course or grade level.
2. Enter student names, teacher names, pre-test scores, post-test scores, and the gain from pre-test to post-test.
3. Using the SORT function, rank all assessments from highest to lowest. Determine the local cut scores. DESE recommends 1.5 standard deviations from 50; local districts have selected 10%, 15%, and 20% as their "cut scores."
4. Look at samples of student work just above and just below the cut scores. Professional judgment about the appropriateness of this number is then used to set the local cut scores for Low, Moderate, and High growth. This number may vary from assessment to assessment and from year to year, based upon the district's determination and the professional judgment of the district.

Using Excel (a really simple method)
1. Enter data
2. Simple pre-post formula
3. Cut and paste values
4. Sort highest to lowest

Tabs include (quick tour):
- Fountas and Pinnell sample for 6 teachers and 103 students, with the median for the full assessment and for each teacher, and with High, Moderate, and Low determinations
- Pre-/post-test calculations (for three teachers)
- Rubric pre-post (for three teachers)
- MCAS SGP (Student Growth Percentile) calculations for three teachers
- A "Test" tab that calculates the "gain" but does not determine cut scores

Use the "Test" tab to enter your local data, or use the data provided.

Time to Work with Your Team
- Online materials
- Excel
- Templates