Samples of DDMs
Dr. Deborah Brady, dbrady3702@msn.com
DDMs: The GOOD, the BAD, and the Not-so-good
Quality Assessments, Developed Locally, Adapted, or Adopted
Dr. Deborah Brady, dbrady3702@msn.com
The GOOD
- Substantive
- Aligned with the standards of the Frameworks, vocational standards, and/or local standards
- Rigorous
- Consistent with K-12 DDMs in substance, alignment, and rigor
- Consistent with the district's values, initiatives, and expectations
- Measures growth (to be contrasted with achievement) and shifts the focus of teaching: from achievement to growth for all students, from teaching to learning, from the teacher to the learner
As a Result of the GOOD
In districts, schools, and departments:
- Educators have collaborated thoughtfully
- Initiatives are one step more unified
- The district, school, department, or specific course moves forward (a baby step or a giant step) and gains a collaborative understanding of the purpose of a course, discipline, or year's work
Some GOOD examples
- A 9-12 ELA portfolio measured by a locally developed rubric that assesses progress throughout the four years of high school
- A district that required at least one DDM to be "writing to text" based on CCSS-appropriate text complexity
- A HS science department assessment of lab report growth for each course (focus on conclusions)
- A HS science department assessment of data analysis or of diagram or video analysis
More
- A HS math department's use of PARCC examples that require writing, asking students to "justify your answer"
- A self-created, PARCC-style social studies exam using as its primary sources (anchor texts) Wilson's speech to Congress advocating going to war for high-minded purposes and a 1980 text that describes Wilson's true motives as financial gain; students must summarize the excerpts and then write an essay with a thesis that asserts the US motives for going to war
More
- SPED: social-emotional development of independence (a whole-collaborative DDM: each educator is measuring)
- SPED: the "directed study" model now has study skills explicitly recorded by week for each student and by quarter on a manila folder: note-taking skills, text comprehension, reading, writing, preparing for an exam, time management
- A vocational school's use of Jobs USA assessments for one DDM and the local safety protocols for each shop
- High school SST team example (frequent absentees)
- Child Study Team example (universal process)
- School psychologists example (did not follow procedure for referral)
- School psychologists example (subgroup of students studied)
- High school guidance example (PSAT, SAT, college applications)
- IEP goals can be used as long as they are measuring growth (academic or social-emotional)
Possibly GOOD
- Fountas and Pinnell individual assessment of reading comprehension
- Galileo growth determination using the Galileo question bank aligned to standards
- DIBELS
- Text-based, locally created assessments
- MCAS-like ORQ plus multiple-choice assessments
- Mid-terms, benchmarks, final exams
The possibility of goodness depends upon…
District Capacity and Time to Collaborate
- Data teams
- PLCs
- Leaders/coaches to provide context and meaning to student work
- Protocols for looking at student work
- Diagnosing student needs and developing action plans
Without time and capacity, it's all just…
Low, Moderate, and High in Human Terms
A story of two teachers
Effective teaching:
- All levels of learners
- Curriculum
- Goals/agenda
- Notebook
- Group work
- Routines
Specific Examples
Math Practices: Communicating Mathematical Ideas
Clearly constructs and communicates a complete response based on:
- a response to a given equation or system of equations
- a chain of reasoning to justify or refute algebraic, function, or number system propositions or conjectures
- a response based on data
How can you assess these standards?
Demonstrating Growth
Billy Bob's work is shown below. He has made a mistake. In the space to the right, solve the problem on your own. Then find Billy Bob's mistake, circle it, and explain how to fix it.
Billy Bob's work:
  (1/2)x - 10 = -2.5
       +10      +10
  (1/2)x + 0 = 12.5
  (2/1)(1/2)x = 12.5(2)
  x = 25
Your work: (space for the student's own solution)
Explain the changes that should be made in Billy Bob's work.
"Find the mistake" provides students with a model, requires understanding, and requires writing in math.
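For reference, a worked correction, assuming the intended equation is (1/2)x - 10 = -2.5 as shown; Billy Bob's error is adding 10 to -2.5 and getting 12.5 instead of 7.5:

```latex
\begin{aligned}
\tfrac{1}{2}x - 10 &= -2.5 \\
\tfrac{1}{2}x &= -2.5 + 10 = 7.5 && \text{(Billy Bob wrote } 12.5\text{)} \\
x &= 2 \cdot 7.5 = 15 && \text{(not } 25\text{)}
\end{aligned}
```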
A resource for DDMs. A small step? A giant step? The district decides.
Which of the three conjectures are true? Justify your answer.
Essay Prompt from Text
Read a primary source about Muhammad based on Muhammad's wife's memories of her husband.
Essay: Identify and describe Muhammad's most admirable quality based on this excerpt. Select someone from your life who has this quality. Identify who they are and describe how they demonstrate this trait.
What's wrong with this prompt? Is it a text-based question? Where are the CLAIMS and EVIDENCE? (See PARCConline.org.)
Science Open Response from Text
Again, from a textbook. Is this acceptable? Is this recall?
Scoring Guides from Text
A scoring guide from a textbook for building a Lou Vee Air Car:
- Built to specs: 50 points
- Propeller spins freely: 60 points
- Distance car travels: 1 m = 70, 2 m = 80, 3 m = 90, 4 m = 100 points
- Best distance: 10, 8, or 5 points
- Best car: 10, 8, or 5 points
- Best all-time distance across all classes: +5 points
- Total possible: 235 points
Is it good enough to ensure inter-rater reliability?
Technology/Media Rubric A multi-criteria rubric for technology. What is good, bad, problematical? Don’t try to read it!
PE Rubric in Progress. Grade 2 for overhand throw and catching. Look good?
Music: Teacher and Student Instructions
Are numbers good or a problem?
The UGLY
- Comply with regulations
- Bring about no change or understanding
The Best?
- Comply (sigh)
- Build on what is already in the district, school, or department
- Take a small step or a larger step in cognitive complexity
- Use the results to learn about students' needs and how to address those needs
- Use time to look at student work and to collaboratively plan to improve
Validity and Reliability of Local DDMs
A psychometrician's view
How Do We Determine Cut Scores? Growth Scores?
- Both are new areas for learning
- Growth is not achievement; Moderate = a year's growth
- What if a student is below benchmark? Again, setting these specific parameters is district-determined "common sense"
- Psychometricians are still figuring out what a good/fair assessment is
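As a minimal sketch only (the scores and cut points below are hypothetical, not DESE or district values), bucketing gain scores rather than raw achievement might look like this:

```python
# Minimal sketch (hypothetical scores and cut points): a growth score is the
# gain from pre-test to post-test, and the Low/Moderate/High label depends on
# district-determined cuts, not on where the student started.

def growth_category(pre, post, low_cut=5.0, high_cut=15.0):
    """Classify growth, not achievement: the same gain earns the same label
    whether the student began below or above benchmark."""
    gain = post - pre
    if gain < low_cut:
        return "Low"
    if gain <= high_cut:
        return "Moderate"  # roughly "a year's growth" under these illustrative cuts
    return "High"

# A below-benchmark student (pre=32) and an above-benchmark student (pre=78)
# can both show Moderate growth.
for name, pre, post in [("Student A", 32, 42), ("Student B", 78, 88), ("Student C", 55, 58)]:
    print(name, growth_category(pre, post))
# Student A Moderate, Student B Moderate, Student C Low
```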
Objectivity versus Subjectivity: Calibration
- Human judgment and assessment: what is objective about a multiple-choice test?
- Calibrating standards when using rubrics: a common understanding of descriptors
- What do "insightful," "in-depth," and "general" look like?
- Use exemplars to keep people calibrated
- Assess collaboratively with a uniform protocol
Assessment Drift
- Spot checking; recording; blind assessment
- Develop EXEMPLARS (simple protocol)
- For the F&P comprehension "conversation," the grade-level team begins calibration with one sample below benchmark, one at benchmark, and one above benchmark, and discusses differences
- Then the team samples recorded F&P assessments
Protocols for Administration of Assessments
- Directions to teachers need to define rules for giving support, dictionary use, etc.
- What can be done? What cannot? ("Are you sure you are finished?") How much time? Accommodations and modifications?
- Feedback from teachers indicated some confusion about procedures
- Update instructions (common format)
Qualitative Methods of Determining an Assessment's VALIDITY
- Looking at the "body of the work": validating an assessment based upon the students' work
- Floor and ceiling effects
- If you piled the gain scores (not achievement) into High, Moderate, and Low gain, is there a mix of at-risk, average, and high achievers throughout each pile, or can you see one group mainly represented?
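A minimal sketch of that check, using made-up gain scores, achievement groups, and cut points (all hypothetical): pile the gains into Low/Moderate/High and see whether any pile is dominated by one group.

```python
# Minimal sketch (invented data, not a DESE procedure): pile gain scores into
# Low / Moderate / High and check whether at-risk, average, and high achievers
# are mixed throughout each pile or whether one group dominates
# (a possible floor or ceiling effect).
from collections import Counter, defaultdict

# (gain score, prior-achievement group) per student -- illustrative values only
students = [
    (3, "at-risk"), (12, "average"), (22, "high"), (4, "at-risk"),
    (11, "high"), (19, "average"), (2, "at-risk"), (14, "average"),
]

def pile(gain, low_cut=5, high_cut=15):  # hypothetical cut scores
    return "Low" if gain < low_cut else "Moderate" if gain <= high_cut else "High"

piles = defaultdict(Counter)
for gain, group in students:
    piles[pile(gain)][group] += 1

for name, counts in piles.items():
    total = sum(counts.values())
    flag = " <- one group dominates?" if max(counts.values()) / total > 0.8 else ""
    print(name, dict(counts), flag)
```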
Low, Moderate, High Growth Validation
Did your assessment accurately pinpoint differences in growth?
1. Look at the LOW pile. If you think about their work during this unit, were they struggling?
2. Look at the MODERATE pile. Are these the average learners who learn about what you'd expect of your school's students in your class?
3. Look at the HIGH pile. Did you see them learning more than most of the others did in your class?
Based on your answers to 1, 2, and 3:
- Do you need to add questions (for the very high or the very low)?
- Do you need to modify any questions (because everyone missed them or because everyone got them correct)?
Tracey is a student who was rated as having high growth. James had moderate growth. Linda had low growth.
Investigate each student's work: effort, teachers' perception of growth, other evidence of growth.
Do the scores assure you that the assessment is assessing what it says it is?
Looking at specific students' work is a psychometric process called "body of the work" validation.
Objectivity versus Subjectivity: Multiple-Choice Questions
- Human judgment and assessment: what is objective about a multiple-choice test? What is subjective about a multiple-choice test?
- Make sure the question's complexity did not cause a student to make a mistake.
- Make sure the multiple-choice options are all about the same length, in similar phrasing, and clearly different.
Rubrics and Inter-Rater Reliability: getting words to mean the same to all raters
Category: 4 / 3 / 2 / 1
Resources: Effective use / Adequate use / Limited use / Inadequate use
Development: Highly focused / Focused response / Inconsistent response / Lacks focus
Organization: Related ideas support the writer's purpose / Has an organizational structure / Ideas may be repetitive or rambling / No evidence of purposeful organization
Language conventions: Well-developed command / Command; errors don't interfere / Limited or inconsistent command / Weak command
Protocol for Developing Inter-Rater Reliability
Before scoring a whole set of papers, develop inter-rater reliability:
- Bring high, average, and low samples (1 or 2 each)
- Use your rubric or scoring guide to assess these samples
- Discuss differences until a clear definition is established
- Use these first papers as your exemplars
- When there's a question, select one person as the second reader
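Once calibration papers have been double-scored, a quick sanity check on agreement can be run. This is a minimal sketch with invented rubric scores; percent agreement and Cohen's kappa are standard statistics, but the numbers here are illustrative only.

```python
# Minimal sketch (illustrative scores, not part of the protocol above): after two
# raters independently score the same calibration papers on a 4-3-2-1 rubric,
# check simple percent agreement and Cohen's kappa.

def percent_agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    labels = sorted(set(a) | set(b))
    n = len(a)
    p_o = percent_agreement(a, b)
    # expected agreement if the raters scored independently at their own rates
    p_e = sum((a.count(k) / n) * (b.count(k) / n) for k in labels)
    return (p_o - p_e) / (1 - p_e)

rater_1 = [4, 3, 3, 2, 1, 4, 2, 3]  # hypothetical rubric scores
rater_2 = [4, 3, 2, 2, 1, 3, 2, 3]

print(f"agreement = {percent_agreement(rater_1, rater_2):.2f}")
print(f"kappa     = {cohens_kappa(rater_1, rater_2):.2f}")
```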
Annotated Exemplar: How does the author create the mood in the poem?
- Answer and explanation in the student's own words
- Specific substantiation from the text
Student exemplar: "The speaker's mood is greatly influenced by the weather. The author uses dismal words such as 'ghostly,' 'dark,' 'gloom,' and 'tortured.'"
“Growth Rubrics” Can Be Developed

Pre-conventional Writing (Ages 3-5)
- Relies primarily on pictures to convey meaning.
- Begins to label and add "words" to pictures.
- Writes first name.
- Demonstrates awareness that print conveys meaning.
- Makes marks other than drawing on paper (scribbles).
- Writes random recognizable letters to represent words.
- Tells about own pictures and writing.

Emerging (Ages 4-6)
- Uses pictures and print to convey meaning.
- Writes words to describe or support pictures.
- Copies signs, labels, names, and words (environmental print).
- Demonstrates understanding of letter/sound relationship.
- Prints with upper case letters.
- Matches letters to sounds.
- Uses beginning consonants to make words.
- Uses beginning and ending consonants to make words.
- Pretends to read own writing.
- Sees self as writer.
- Takes risks with writing.

Developing (Ages 5-7)
- Writes 1-2 sentences about a topic.
- Writes names and familiar words.
- Generates own ideas for writing.
- Writes from top to bottom, left to right, and front to back.
- Intermixes upper and lower case letters.
- Experiments with capitals.
- Experiments with punctuation.
- Begins to use spacing between words.
- Uses growing awareness of sound segments (e.g., phonemes, syllables, rhymes) to write words.
- Spells words on the basis of sounds without regard for conventional spelling patterns.
- Uses beginning, middle, and ending sounds to make words.
- Begins to read own writing.
DESE Quote
"It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time. We are all learners in this initiative."