1
District Determined Measures
Diman Regional Vocational School
Dr. Deborah Brady
2
Goals for Today
By the end of this session:
You will understand what needs to be done and be able to explain it to your colleagues.
You will have tools to begin that work in your district.
Please email questions or concerns at any time during this process: dbrady3702@msn.com
Materials from this presentation: http://tinyurl.com/lumqld7
3
The Steps Necessary to Get Ready for the June Report and After
1. Developing and Piloting Assessments: adapting present assessments; creating new assessments; writing to text for HS
2. Assessing Quality and Rigor: alignment of content; rigorous and appropriate expectations; approval of assessments
3. Piloting 2 DDMs per Educator: security; calibration of standards and of assessors; rubric quality; analysis of results (high-moderate-low growth)
4. JUNE REPORT
5. Organizing for the Actual Assessments: directions for teachers; directions for students; storing and tracking the information
6. 2015 Full Implementation: data storage; data analysis (L-M-H growth); roster verification; data team time; interpreting the results; student impact
4
Living Likert
Take a magic marker.
Review all 6 stages.
After considering each one, go to your "stage."
Consider: What are your school's barriers? What are your district's strengths?
5
The DESE Requirements Purpose, timeline, requirements, direct and indirect assessments
6
District Determined Measures
DEFINITION: DDMs are defined as "measures of student learning, growth, and achievement related to the Curriculum Frameworks, that are comparable across grade or subject level district-wide."
TYPES OF MEASURES:
Portfolio assessments
Approved commercial assessments
District-developed pre and post unit and course assessments
Capstone projects
7
Timeline for Piloting and Full Implementation
2013-2014: District-wide training, development of assessments, and piloting. June 2014 report: all educators in the district have 2 DDMs to be implemented fully in SY2015.
2014-2015: All DDMs are implemented; scores are divided into high, moderate, and low growth and stored locally.
2015-2016: Second-year data is collected, and all educators receive an impact rating, sent to DESE, based on 2 years of data for two DDMs.
8
District Determined Measures Regulations
Every educator will need data from at least 2 different measures.
Trends must be measured over a course of at least 2 years.
One measure must be taken from state-wide testing data such as MCAS, if available (grades 4-8 ELA and math SGP for classroom educators).
One measure must not be MCAS; it must be a District Determined Measure, which can include local assessments, Galileo, or normed assessments (DRA, MAP, SAT).
9
NEW! Determining Educator Impact on Each DDM
Evaluator and educator meet.
Evaluator determines whether students demonstrated high, moderate, or low growth on each DDM.
Evaluator shares the resulting designations of student growth with the educator.
Educators confirm rosters. Students must be on the roster by 10/1 and remain on the roster through the last day of testing, and must be present for 90% of instructional time. A minimal sketch of that roster check appears below.
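The roster rules above are mechanical enough to check in a spreadsheet or a short script. The sketch below is one way to do it, assuming a simple student record with an enrollment date, an exit date, and an attendance fraction; the field names and the last_test_day parameter are illustrative, not part of any DESE specification.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class StudentRecord:
    name: str
    enrolled_on: date            # date the student joined the roster
    exited_on: Optional[date]    # None if still enrolled
    attendance_fraction: float   # share of instructional time present, 0.0-1.0

def counts_toward_ddm(student: StudentRecord, cutoff: date, last_test_day: date) -> bool:
    """Apply the roster rules from the slide: on the roster by the cutoff date,
    still on it through the last day of testing, and present for at least 90%
    of instructional time."""
    on_time = student.enrolled_on <= cutoff
    stayed = student.exited_on is None or student.exited_on >= last_test_day
    present_enough = student.attendance_fraction >= 0.90
    return on_time and stayed and present_enough

# A student who enrolled in early September, never exited, and attended 94% of the time
s = StudentRecord("Sample Student", date(2014, 9, 2), None, 0.94)
print(counts_toward_ddm(s, cutoff=date(2014, 10, 1), last_test_day=date(2015, 6, 1)))  # True
```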
10
Performance & Impact Ratings
Performance Rating: ratings are obtained through data collected from observations, walk-throughs, and artifacts. Levels: Exemplary, Proficient, Needs Improvement, Unsatisfactory.
Impact Rating: ratings are based on trends and patterns in student learning, growth, and achievement over a period of at least 2 years, using data gathered from DDMs and state-wide testing. Levels: High, Moderate, Low.
11
NEW! Determining a Student Impact Rating
Introduces the application of professional judgment to determine the Student Impact Rating.
The evaluator assigns the rating using professional judgment, considering designations of high, moderate, or low student growth from at least two measures in each of at least two years.
If the rating is low, the evaluator meets with the educator to discuss it.
If the rating is moderate or high, the evaluator and educator decide whether a meeting is necessary.
12
Summative Rating and Rating of Impact on Student Learning (Massachusetts Department of Elementary and Secondary Education)

Summative Rating  | Low Impact                     | Moderate Impact                | High Impact
Exemplary         | 1-yr Self-Directed Growth Plan | 2-yr Self-Directed Growth Plan | 2-yr Self-Directed Growth Plan
Proficient        | 1-yr Self-Directed Growth Plan | 2-yr Self-Directed Growth Plan | 2-yr Self-Directed Growth Plan
Needs Improvement | Directed Growth Plan           | Directed Growth Plan           | Directed Growth Plan
Unsatisfactory    | Improvement Plan               | Improvement Plan               | Improvement Plan
13
NEW! Intersection of Ratings
Reinforces the independent nature of the two ratings. A simple lookup of these combinations is sketched below.
Exemplary or Proficient matched with Moderate or High = 2-Year Self-Directed Growth Plan.
  Exemplary/Moderate and Exemplary/High = recognition and rewards, including leadership roles, promotions, additional compensation, public commendation, and other acknowledgements.
  Proficient/Moderate and Proficient/High = eligible for additional roles, responsibilities, and compensation.
Exemplary or Proficient matched with Low = 1-Year Self-Directed Growth Plan.
  Evaluator's supervisor confirms the rating.
  Educator and evaluator analyze the discrepancy, which may affect Educator Plan goals.
The Student Impact Rating informs the self-assessment and goal-setting processes.
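Because the plan assignments above reduce to a fixed lookup on the two ratings, they can be captured in a small table. The sketch below encodes the combinations named on these slides, assuming (as the matrix slide suggests) that the Needs Improvement and Unsatisfactory rows apply across all impact ratings; the function name and data structure are illustrative.

```python
# Map (performance rating, impact rating) to the educator plan named on the slides.
PLAN_BY_RATINGS = {
    ("Exemplary", "High"): "2-Year Self-Directed Growth Plan",
    ("Exemplary", "Moderate"): "2-Year Self-Directed Growth Plan",
    ("Exemplary", "Low"): "1-Year Self-Directed Growth Plan",
    ("Proficient", "High"): "2-Year Self-Directed Growth Plan",
    ("Proficient", "Moderate"): "2-Year Self-Directed Growth Plan",
    ("Proficient", "Low"): "1-Year Self-Directed Growth Plan",
    ("Needs Improvement", "High"): "Directed Growth Plan",
    ("Needs Improvement", "Moderate"): "Directed Growth Plan",
    ("Needs Improvement", "Low"): "Directed Growth Plan",
    ("Unsatisfactory", "High"): "Improvement Plan",
    ("Unsatisfactory", "Moderate"): "Improvement Plan",
    ("Unsatisfactory", "Low"): "Improvement Plan",
}

def educator_plan(performance: str, impact: str) -> str:
    """Return the plan for a (performance, impact) pair, e.g. ('Proficient', 'Low')."""
    return PLAN_BY_RATINGS[(performance, impact)]

print(educator_plan("Proficient", "Low"))  # 1-Year Self-Directed Growth Plan
```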
14
Indirect measures of student learning, growth, or achievement provide information about students from means other than student work. These measures may include student record information (e.g., grades, attendance or tardiness records, or other data related to student growth or achievement such as high school graduation or college enrollment rates). To be considered for use as DDMs, a link (relationship) between indirect measures and student growth or achievement must be established. ESE recommends that at least one of the measures used to determine each educator’s student impact rating be a direct measure.
15
Indirect Measure Examples
Consider the SST process for a team: high school SST team example; RTI team example; high school guidance example.
Subgroups of students can be studied (school psychologist group example).
Social-emotional growth is appropriate (autistic/behavioral program example).
Number of times each student says hello to a non-classroom adult on his or her way to gym or class (direct).
Number of days (or classes) a student with school anxiety participates.
Level of participation in a class (direct).
Improve applications to college.
IEP goals can be used as long as they measure growth (academic or social-emotional).
16
Turn and Talk Time to de-brief and review the “rules”
17
Using the 6-phase overview, what are your priorities?
1. Developing and Piloting Assessments: adapting present assessments; creating new assessments; writing to text for HS
2. Assessing Quality and Rigor: alignment of content; rigorous and appropriate expectations
3. Piloting 2 DDMs per Educator: security; calibration of standards and of assessors; rubric quality; analysis of results (high-moderate-low growth)
4. JUNE REPORT
5. Organizing for the Actual Assessments: directions for teachers; directions for students; storing and tracking the information
6. 2015 Full Implementation: data storage; data analysis (L-M-H growth); interpreting the results; student impact
18
Assessment Quality Requirements and Definitions from DESE Alignment, Rigor, Comparability, “Substantial,” Modifications
19
What are the requirements?
1. Is the measure aligned to content? Does it assess what is most important for students to learn and be able to do? Does it assess what the educators intend to teach?
Bottom line: the assessment must cover "substantial" content of the course, at least 2 standards (ELA: reading/writing; math: a unit exam); it is not necessarily a "final" exam (unless it is a high-quality exam).
20
2. Is the measure informative? Do the results of the measure inform educators about curriculum, instruction, and practice? Does it provide valuable information to educators about their students? Does it provide valuable information to schools and districts about their educators?
Bottom line: time to analyze the results is essential.
21
Two Considerations for Local DDMs
1. Comparable across schools
Example: teachers with the same job (e.g., all 5th grade teachers).
Where possible, measures are identical; identical measures are easier to compare. Do identical measures provide meaningful information about all students?
Exceptions: when might assessments not be identical?
  Different content (different sections of Algebra I)
  Differences in untested skills (reading and writing on a math test for ELL students)
  Other accommodations (fewer questions for students who need more time)
NOTE: Roster verification and group size will be considerations for DESE.
22
Five Considerations (DESE)
1. Measure growth.
2. Employ a common administration procedure.
3. Use a common scoring process.
4. Translate these assessments to an Impact Rating (High-Moderate-Low).
5. Assure comparability of assessments within the school (rigor, validity).
23
Two Forms to Adapt to Your School's Standards
Handout: DDM Proposal form
Simple Excel list for the June report
24
June Report Form (Not Yet Released): Educators Linked with DDMs

MEPID | Last Name | First Name | Grade/Dept.  | DDM1                                     | DDM2                                                        | DDM3 (optional)
      | Jones     | Brigit     | ELA 9        | Writing for College                      | ELA common assessment                                       | ELA writing DDM
      | Smith     | Marion     | 9-12 library | Library Search Tools DDM                 | Indirect: increase teachers who do research in the library  |
      | Watson    | Elspeth    | Physics      | Physics midterm or final                 | Physics: interpreting data                                  |
      | Holmes    | Bill       | Carpentry    | Skills USA test                          | Departmental Safety DDM                                     |
      |           |            | Guidance     | Indirect: increase college applications  |                                                             |

A sketch of producing this list as a spreadsheet-ready file follows.
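Districts will ultimately need this educator-to-DDM list in a spreadsheet. Here is a minimal sketch of generating it as a CSV file, using the illustrative rows from the sample slide; the filename and column order are assumptions, since the official form had not been released.

```python
import csv

# Illustrative rows taken from the sample slide; MEPIDs are left blank.
COLUMNS = ["MEPID", "Last Name", "First Name", "Grade/Dept.", "DDM1", "DDM2", "DDM3 (optional)"]
ROWS = [
    ["", "Jones", "Brigit", "ELA 9", "Writing for College", "ELA common assessment", "ELA writing DDM"],
    ["", "Smith", "Marion", "9-12 library", "Library Search Tools DDM",
     "Indirect: increase teachers who do research in the library", ""],
    ["", "Watson", "Elspeth", "Physics", "Physics midterm or final", "Physics: interpreting data", ""],
    ["", "Holmes", "Bill", "Carpentry", "Skills USA test", "Departmental Safety DDM", ""],
    ["", "", "", "Guidance", "Indirect: increase college applications", "", ""],
]

# Write a CSV that opens directly in Excel for the June report.
with open("june_report_ddms.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    writer.writerows(ROWS)
```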
25
Handout Sample: DDM Proposal Form
(Columns: item to check off when completed; definition; your answers.)

Source of DDM (locally developed or standardized test). Definition: Are you developing the assessment as a department or team, or is your school/district purchasing an assessment? The first four categories (educator grade/department, DDM name, source of DDM, and course) can be used for this year's June report.

Course. Definition: What is the title of the course in which this DDM will be given?

Possible educators who will use this DDM. Definition: Courses and teachers may change, but who at this time will probably teach this course?

Grade(s) of DDM. Definition: Grade level(s) that this assessment will cover.

Alignment to State and/or District Standards. Definition: At least 2 standards must be assessed to make this a "substantial" assessment. For indirect measures, 1) what are the substantial, important, essential areas that you are assessing? and 2) how does this indirect measure connect with student growth? Answer: please list the two (or more) standards using standards language.

Rigor (check the levels of Bloom's taxonomy that are assessed). Definition: The original Bloom's term is the first word in each pair; the revised Bloom's term (all verbs) is the second. Note that in the revised taxonomy, Creating is the highest level, above Evaluating. More than one level can be assessed. Levels: Knowledge/Remembering; Comprehension/Understanding; Application/Applying; Analysis/Analyzing; Synthesis/Creating; Evaluation/Evaluating.
26
Type(s) of questions. Options: multiple choice, fill-in, short answer (recall items from the content area); multiple choice, fill-in, short answer (text-dependent questions); open response (short answer); essay (long response): narrative, informational text, or argument with claims and proof; one text is read; two texts are read; performance assessment (CEPA); other (fill in at right). Definition: indicate the percentage of the assessment for each question type, for example, multiple choice = 50%; 2 open responses = 50% (25% each). Multiple Choice _____%, Open Response _____%, Essay _____%.

Duration of assessment. Definition: assessments can take place in a class period or over a period of days. Needed for next year's scheduling and implementation.

When assessment(s) will take place. Definition: provide the approximate month or window for the assessment(s), for example, end of first trimester, or September. Provide multiple dates if the assessment is a pre-post or is administered more than once.

Components of the assessment that are completed so far: directions to the teacher for administering; directions to students; graphic organizers (optional); the assessment; scoring guide; rubric; security; calibration protocol if this assessment has a rubric.

Rubric (Not yet / Does not apply). Definition: How was the rubric created? For example, adapted from DESE's CEPA rubric, or developed by the middle school science department. Please include the rubric (even in draft form). Begin with CEPA, PARCC, or MCAS.
27
Turn and Talk If you took an inventory of assessments, what are your next steps?
28
Calculating Growth Scores Defining growth, measuring growth, calculating growth for a classroom, for a district
29
Sample Student Growth Scores from the MCAS
[Figure: sample student MCAS results shown as scaled score / SGP pairs, for example 244 / 25 SGP, 230 / 35 SGP, 225 / 92 SGP.]
Teacher growth scores are developed from student growth scores.
30
Approaches to Measuring Student Growth
Pre-Test/Post-Test
Repeated Measures
Holistic Evaluation
Post-Test Only
31
Pre/Post Test
Description: the same or similar assessments administered at the beginning and at the end of the course or year.
Example: a grade 10 ELA writing assessment aligned to the College and Career Readiness Standards, given at the beginning and end of the year with the passages changed.
Measuring growth: the difference between pre- and post-test scores.
Considerations: do all students have an equal chance of demonstrating growth?
32
Pre-Post Analysis: Cut Scores for Low-Moderate-High Growth (one "mock" classroom)

Pre-test | Post-test | Difference | % growth (diff/pre) | % growth, sorted low to high | Difference, sorted low to high | Notes
20       | 35        | 15         | 75%                 | 20%                          |  5                             | Cut score: LOW growth
25       | 30        |  5         | 20%                 | 42%                          | 15                             | bottom 20%
30       | 50        | 20         | 67%                 | 42%                          | 20                             |
35       | 60        | 25         | 42%                 | 50%                          | 25                             | Moderate growth
35       | 60        | 25         | 42%                 | 60%                          | 25                             | median (teacher score)
40       | 70        | 35         | 87%                 | 62%                          | 25                             | median (teacher score)
40       | 65        | 25         | 62%                 | 67%                          | 25                             |
50       | 75        | 25         | 50%                 | 70%                          | 30                             |
50       | 80        | 30         | 60%                 | 75%                          | 35                             | Cut score: top 20%?
50       | 85        | 35         | 70%                 | 87%                          | 35                             | HIGH growth
33
Determining Growth with Pre- and Post-Assessments
Cut scores need to be locally determined for local assessments.
Standardized assessments use the "Body of the Work" protocol, which translates easily to local assessments.
First, determine the difference between pre- and post-scores for all students in a grade or course.
Then determine what low, moderate, and high growth are (local cut scores); the top and bottom 10% can serve as a starting test case, checked against the Body of the Work.
Then all scores are reapportioned to each teacher.
The MEDIAN score for each teacher's students determines that teacher's growth score.
A minimal worked example of this procedure appears below.
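To make the arithmetic on the previous slides concrete, here is a minimal sketch of the procedure in Python, using the pre/post pairs from the mock-classroom cut-score table. The 20% cut points and the use of gain scores (post minus pre) follow that example; treat the percentile choice and the data structure as local decisions, not a DESE prescription.

```python
from statistics import median, quantiles

# (pre, post) pairs from the mock classroom on the cut-score slide
scores = [(20, 35), (25, 30), (30, 50), (35, 60), (35, 60),
          (40, 70), (40, 65), (50, 75), (50, 80), (50, 85)]

gains = [post - pre for pre, post in scores]

# Local cut scores: here, bottom 20% of gains = low growth, top 20% = high growth.
cut_points = quantiles(gains, n=5)          # 20th, 40th, 60th, 80th percentiles
low_cut, high_cut = cut_points[0], cut_points[-1]

def growth_band(gain: float) -> str:
    """Translate a single student's gain into a local growth band."""
    if gain <= low_cut:
        return "Low"
    if gain >= high_cut:
        return "High"
    return "Moderate"

def teacher_growth(roster_gains: list[float]) -> str:
    """A teacher's growth score is the band of the median gain on that teacher's roster."""
    return growth_band(median(roster_gains))

print([growth_band(g) for g in sorted(gains)])
print(teacher_growth(gains))  # one teacher who taught the whole mock classroom: Moderate
```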
34
Further Measures beyond Pre- and Post-Tests
Repeated measures, holistic rubrics, post-test only
35
Repeated Measures
Description: multiple assessments given throughout the year.
Examples: running records, attendance, the mile run.
Measuring growth: graphically, with approaches ranging from the sophisticated to the simple.
Considerations: less pressure on each administration; authentic tasks.
36
Repeated Measures Example
[Figure: a running record charted over time; x-axis: date of administration, y-axis: number of errors.]
A sketch of summarizing such a chart as a growth number appears below.
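One simple way to turn a repeated-measures chart like the running record into a growth number is to fit a trend line across administrations and look at its slope (here, errors should trend downward). The sketch below uses an ordinary least-squares slope; the sample error counts are illustrative only, and how a district maps a slope to low, moderate, or high growth remains a local decision.

```python
def trend_slope(values: list[float]) -> float:
    """Least-squares slope of the values against administration number (0, 1, 2, ...)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Illustrative running-record error counts across six administrations
errors = [14, 12, 11, 9, 8, 6]
slope = trend_slope(errors)
print(f"errors change by {slope:.2f} per administration")  # negative slope: fewer errors, i.e., growth
```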
37
Holistic
Description: assess growth across student work collected throughout the year.
Example: Tennessee Arts Growth Measure System.
Measuring growth: a growth rubric (see example).
Considerations: an option for multifaceted performance assessments; rating can be challenging and time consuming.
38
Holistic Example: Growth Rubric for "Details"
1. No improvement in the level of detail (one is true): no new details across versions; new details are added but not included in future versions; a few new details are added that are not relevant, accurate, or meaningful.
2. Modest improvement in the level of detail (one is true): there are a few details included across all versions; many added details are included, but they are not included consistently, or none are improved or elaborated upon; there are many added details, but several are not relevant, accurate, or meaningful.
3. Considerable improvement in the level of detail (all are true): there are many examples of added details across all versions; at least one detail is improved or elaborated in future versions; details are consistently included in future versions; the added details reflect relevant and meaningful additions.
4. Outstanding improvement in the level of detail (all are true): on average there are multiple details added across every version; there are multiple examples of details that build and elaborate on previous versions; the added details reflect the most relevant and meaningful additions.
Example taken from Austin, a first grader at Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts
39
Post-Test Only
Description: a single assessment, or data that is paired with other information.
Example: an AP exam.
Measuring growth, where possible: use a baseline, assume an equal beginning, or use your own historical data.
Considerations: may be the only option for some indirect measures; what is the quality of the baseline information?
40
Post-Test Only: A Challenge to Tabulate Growth
Portfolios, capstone projects, AP scores.
If you have local histories, you can use them. For example, for a C student in pre-calculus, an AP score of 1 might be low growth, a 2 or 3 might be moderate, and a 4 or 5 might be high. This is a local determination of growth. A sketch of such a local lookup appears below.
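The local determination described above is essentially a lookup keyed on a student's prior performance. Below is a minimal sketch of that idea, using the pre-calculus example from the slide; only the "C" row comes from the slide, and the other rows are hypothetical placeholders that a district would replace with its own historical data.

```python
# Map (prior course grade, AP score) to a locally determined growth band.
# Only the "C" row reflects the slide's example; the others are hypothetical placeholders.
LOCAL_GROWTH_TABLE = {
    "C": {1: "Low", 2: "Moderate", 3: "Moderate", 4: "High", 5: "High"},
    "B": {1: "Low", 2: "Low", 3: "Moderate", 4: "Moderate", 5: "High"},   # placeholder
    "A": {1: "Low", 2: "Low", 3: "Low", 4: "Moderate", 5: "High"},        # placeholder
}

def ap_growth(prior_grade: str, ap_score: int) -> str:
    """Look up a growth band for a post-test-only measure using local history."""
    return LOCAL_GROWTH_TABLE[prior_grade][ap_score]

print(ap_growth("C", 3))  # Moderate
```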
41
Turn and Talk Discuss the calculations, security, storage, fairness of determining local cut scores.
42
Tools That May be Helpful What is important? What does a good assessment look like?
43
Core Curriculum Objectives (http://www.doe.mass.edu/edeval/ddm/example/)
44
ELA/Literacy, Grade 9 (English 9-12)
Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies
Publisher website/sample: https://wested.app.box.com/s/pt3e203fcjfg9z8r02si
Designed to be a measure of student growth over time in high school ELA and social science courses. The student selects work samples to include and uploads them to an electronic site. Includes guiding questions for students and scoring criteria. The scoring rubric for the portfolio can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio; could be aligned to a number of CCOs, depending on the specification of assignments.
http://www.doe.mass.edu/edeval/ddm/example/
45
Click this link to see the Blueprint of each assessment http://www.workforcereadysystem.org/index.shtml
46
Take a Sample Test
You can take a ten-question sample test once you find the BLUEPRINT for your exam. To take a sample test: select the blueprint, then select "EXPERIENCE IT." You will have to register.
1) Each assessment provides instructions for giving the assessment.
2) The online assessments may have videos and may require students to put things in sequence.
3) The questions are not just multiple choice.
47
Customer Service
Demonstrate professional development skills in a simulated customer service or employment situation. Examples may include: a job interview; a customer service scenario; communications; decision making, problem solving, and/or critical thinking.
48
Safety Could be a common assessment for areas without a specific assessment
49
Additional Testing Examples
Massachusetts Model Curriculum Units and Curriculum Embedded Performance Assessment examples: www.doe.mass.edu (you must sign up for them).
Other juried sites that may be helpful:
Most core areas: Engage New York, www.engageny.org
ELA: Achieve the Core, www.achievethecore.org
Math, ELA, and literacy: PARCC Online, www.parcconline.org
Rubrics for writing by grade level and writing type: http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml
50
MA Model Curricula and Rubrics: CEPA Rubric (levels 1-6)

Topic development (the writing and artwork identify the habitat and provide details):
1. Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task.
2. Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task.
3. Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language.
4. Moderate topic/idea development and organization; adequate, relevant details; some variety in language.
5. Full topic/idea development; logical organization; strong details; appropriate use of language.
6. Rich topic/idea development; careful and/or subtle organization; effective/rich use of language.

Evidence and content accuracy (writing includes academic vocabulary and characteristics of the animal or habitat, with details):
1. Little or no evidence is included and/or content is inaccurate.
2. Use of evidence and content is limited or weak.
3. Use of evidence and content is included but is basic and simplistic.
4. Use of evidence and accurate content is relevant and adequate.
5. Use of evidence and accurate content is logical and appropriate.
6. A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission.

Artwork (identifies special characteristics of the animal or habitat, to an appropriate level of detail):
1. Artwork does not contribute to the content of the exhibit.
2. Artwork demonstrates a limited connection to the content (describing a habitat).
3. Artwork is basically connected to the content and contributes to the overall understanding.
4. Artwork is connected to the content of the exhibit and contributes to its quality.
5. Artwork contributes to the overall content of the exhibit and provides details.
6. Artwork adds greatly to the content of the exhibit, providing new insights or understandings.
51
ELA, math, social studies/history, science, the arts, technology.
DOK = Depth of Knowledge: http://www.stancoe.org/SCOE/iss/common_core/overview/overview_depth_of_knowledge.htm
52
Protocols to Use Locally for Inter-Rater Reliability and Looking at Student Work
The Rhode Island Calibration Protocol for Scoring Student Work is on the wiki.
Another brief one-hour training handout for assessing student work and developing local rubrics is also posted (HML Protocol). A simple agreement check that can accompany either protocol is sketched below.
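Neither protocol prescribes a statistic, but a quick way to check calibration after a double-scoring session is to compute exact and adjacent agreement between two raters on the same set of papers. The sketch below is a generic illustration; the sample scores and the within-one-point definition of "adjacent" are assumptions, not part of either protocol.

```python
def agreement_rates(rater_a: list[int], rater_b: list[int]) -> tuple[float, float]:
    """Return (exact agreement, within-one-point agreement) as fractions of papers."""
    pairs = list(zip(rater_a, rater_b))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

# Illustrative rubric scores (1-6) for ten papers scored by two raters
scores_a = [4, 3, 5, 2, 4, 6, 3, 4, 5, 2]
scores_b = [4, 3, 4, 2, 5, 6, 3, 3, 5, 2]
exact, adjacent = agreement_rates(scores_a, scores_b)
print(f"exact: {exact:.0%}, within one point: {adjacent:.0%}")  # exact: 70%, within one point: 100%
```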
53
Next Steps
Develop pilot assessments for SY2014.
Assess results; use the results to help plan for full implementation in 2015.
Develop a plan for all educators to have two DDMs: MCAS growth, purchased, or local.
Develop a district process for assessing the quality of assessments (DESE Quality Tool or the attachment on the last two pages).
Develop an internal process for cut scores and for determining low, moderate, and high growth of students.
Track and organize information for the June report (educators/DDMs).
Plan for 2015 administration for all educators: tracking, scheduling, storing year 1 scores, storing year 2 scores.
54
“Don’t let perfection get in the way of good.”
55
Potential as a Transformative Process
When curriculum, instruction, or assessment (C, I, or A) is changed... Elmore, Instructional Rounds, and the principle that "the task predicts performance."
56
Team Planning Time
1. Developing and Piloting Assessments: adapting present assessments; creating new assessments; writing to text for HS
2. Assessing Quality and Rigor: alignment of content; rigorous and appropriate expectations; approval of assessments
3. Piloting 2 DDMs per Educator: security; calibration of standards and of assessors; rubric quality; analysis of results (high-moderate-low growth)
4. JUNE REPORT
5. Organizing for the Actual Assessments: directions for teachers; directions for students; storing and tracking the information
6. 2015 Full Implementation: data storage; data analysis (L-M-H growth); roster verification; data team time; interpreting the results; student impact
57
Sample DDMs Good, Not-so-good, and Problematical
58
Demonstrating Growth (when accuracy of computation may be a concern)
59
Essay Prompt from a Textbook
Read a primary source about Mohammed based on Muhammad's wife's memories of her husband.
Essay: Identify and describe Mohammed's most admirable quality based on this excerpt. Select someone from your life who has this quality. Identify who they are and describe how they demonstrate this trait.
What's wrong with this prompt?
60
Science Open Response from a Textbook
Again, this comes from a textbook. Is it acceptable?
61
Scoring Guide from a Textbook: Lou Vee Air Car
Built to specs: 50 points
Propeller spins freely: 60 points
Distance car travels: 1 m = 70, 2 m = 80, 3 m = 90, 4 m = 100 points
Best distance: 10, 8, or 5 points
Best car: 10, 8, or 5 points
Best all-time distance across all classes: +5 points
Total: 235 points
A scoring guide from a textbook for building a Lou Vee Air Car. Is it good enough to ensure inter-rater reliability?
62
Technology/Media Rubric A multi-criteria rubric for technology. What is good, bad, problematical?
63
PE Rubric in Progress. Grade 2 Overhand throw. Looks good?
64
Music: Teacher and Student Instructions
69
World Language Scoring Guide and Rubric
70
World Language Middle School