DDM Part II: Analyzing the Results
Dr. Deborah Brady

Agenda
• Overview of how to measure growth in four "common sense" ways
• A quick look at "standardization"
• Not all analyses are statistical or new; we'll use familiar ways of looking at student work
• Excel can help when you have a whole grade's scores, but it is not essential
• Time for your questions; exit slips

Two Considerations for Local DDMs

1. Comparable across schools
• Example: teachers with the same job (e.g., all 5th grade teachers)
• Where possible, measures are identical; identical measures are easier to compare
• Do identical measures provide meaningful information about all students?
• Exceptions: when might assessments not be identical?
  • Different content (different sections of Algebra I)
  • Differences in untested skills (reading and writing on a math test for ELL students)
  • Other accommodations (fewer questions for students who need more time)
• NOTE: Roster verification and group size will be considerations for DESE

2. Comparable across the district
• Aligned to your curriculum (comparable content), K-12, in all disciplines
• Appropriate for your students
• Aligned to your district's content
• Informative and useful to teachers and administrators
• "Substantial" assessments (comparable rigor): "substantial" units with at least two standards and/or concepts assessed (DESE has recently begun suggesting finals and midterms as preferable). See the Core Curriculum Objectives (CCOs) on the DESE website if you are concerned.
• Quarterly assessments, benchmarks, midterms, and common end-of-year exams
• NOTE: All of this data stays in your district. Only the High/Moderate/Low (HML) rating goes to DESE, with a MEPID for each educator.

Examples of 4 + 1 Methods for Calculating Growth (each is in the handout)
• Pre/post test
• Repeated measures
• Holistic rubric (analytic rubric)
• Post test only
• A look at "standardization" with percentiles

Typical Gradebook and Distribution (page 1 of the handout)
• Alphabetical order (random)
• Sorted low to high
• Determine "cut scores" (validate them in the student work)
• Use the "stoplight method" to help see cut scores
• Graph the distribution of all scores
• Graph the distribution of High, Moderate, and Low scores

Counts: High 6, Moderate 12, Low 5
"Cut" scores and "common sense": validate them with the performances. What work is not moving at an average rate? What work shows accelerated growth? Some benchmarks have predetermined rates of growth over time. (A sketch of this banding step follows.)
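
For readers who want to automate the banding, here is a minimal sketch in Python. The scores and the two cut points are invented for illustration; real cut scores should be validated against actual student work, as the slide says.

```python
from collections import Counter

# Hypothetical gradebook scores (invented for illustration).
scores = [48, 55, 61, 64, 66, 70, 71, 73, 75, 77, 78, 80,
          81, 83, 85, 86, 88, 90, 92, 95, 97, 98, 99]

# Hypothetical cut scores; in practice, validate them against student work.
LOW_CUT, HIGH_CUT = 65, 88

def growth_band(score):
    """Label a score Low, Moderate, or High relative to the cut scores."""
    if score < LOW_CUT:
        return "Low"
    if score >= HIGH_CUT:
        return "High"
    return "Moderate"

# Sorted low to high: the "stoplight" view of the distribution.
for score in sorted(scores):
    print(score, growth_band(score))

# Counts per band, for the High/Moderate/Low summary.
print(Counter(growth_band(s) for s in scores))
```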

Pre/Post Test
• Description: the same or similar assessment administered at the beginning and at the end of the course or year
• Example: a Grade 10 ELA writing assessment aligned to the College and Career Readiness Standards, given at the beginning and end of the year
• Measuring growth: the difference between the pre-test and the post-test
• Check that all students have an equal chance of demonstrating growth

Pre/Post Tests: Worksheet Layout
Columns: Pre-test (sorted lowest to highest) | Post-test | Difference (growth) | Analysis
Analysis prompts: What is the range of growth? Where is the cut score? Look at the work. Look at the distribution. (A sketch of this calculation follows.)
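
A minimal sketch of the same worksheet in Python; the student names and scores are invented for illustration.

```python
# Matched pre- and post-test scores for the same students (invented data).
pre  = {"Student A": 40, "Student B": 55, "Student C": 62, "Student D": 70}
post = {"Student A": 58, "Student B": 60, "Student C": 80, "Student D": 74}

# Difference column: post-test minus pre-test for each student.
growth = {name: post[name] - pre[name] for name in pre}

# Sort by growth, lowest to highest, so cut scores can be eyeballed.
for name, gain in sorted(growth.items(), key=lambda item: item[1]):
    print(f"{name}: pre={pre[name]}, post={post[name]}, growth={gain}")
```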

Holistic
• Description: assess growth across student work collected throughout the year
• Example: the Tennessee Arts Growth Measure System
• Measuring growth: a growth rubric (see example)
• Considerations: an option for multifaceted performance assessments; rating can be challenging and time-consuming

Holistic Example (an unusual rubric): the "Details" criterion

• No improvement in the level of detail (one is true): no new details across versions; new details are added but not included in future versions; a few new details are added that are not relevant, accurate, or meaningful.
• Modest improvement in the level of detail (one is true): a few details are included across all versions; many details are added but not included consistently, and none are improved or elaborated upon; many details are added, but several are not relevant, accurate, or meaningful.
• Considerable improvement in the level of detail (all are true): there are many examples of added details across all versions; at least one detail is improved or elaborated on in future versions; details are consistently included in future versions; the added details reflect relevant and meaningful additions.
• Outstanding improvement in the level of detail (all are true): on average, multiple details are added in every version; there are multiple examples of details that build on and elaborate previous versions; the added details reflect the most relevant and meaningful additions.

Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at

Holistic Rubrics: Easier for Large-Scale Assessments like MCAS
A holistic rubric (for Topic or for Conventions) is useful when categories overlap: all criteria sit in one cell per performance level.

Criteria for Writing: (1) claims/evidence, (2) counterclaims, (3) organization, (4) language/style

• Advanced: (1) Insightful, accurate, carefully developed claims and evidence. (2) Counterclaims are thoughtfully, accurately, and completely discussed and argued. (3) The whole essay and each paragraph are carefully organized and show interrelationships among ideas. (4) Sentence structure, vocabulary, and mechanics show control over language use.
• Proficient: adequate, effective; "gets it."
• Needs Improvement (NI): misconceptions; some errors.
• At Risk: serious errors.

MCAS Has Two Holistic Rubrics

Topic/Development (six score points, highest to lowest):
• Rich topic/idea development; careful, subtle organization; effective, rich use of language
• Full topic/idea development; logical organization; strong details; appropriate use of language
• Moderate topic/idea development and organization; adequate, relevant details; some variety in language
• Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
• Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
• Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task

Conventions (four score points, highest to lowest):
• Control of sentence structure, grammar, usage, and mechanics (the length and complexity of the essay provide opportunity for the student to show control of standard English conventions)
• Errors do not interfere with communication, and/or there are few errors relative to the length of the essay or the complexity of sentence structure, grammar and usage, and mechanics
• Errors interfere somewhat with communication, and/or there are too many errors relative to the length of the essay or the complexity of sentence structure, grammar and usage, and mechanics
• Errors seriously interfere with communication, AND there is little control of sentence structure, grammar and usage, and mechanics

Pre and Post Rubric (2 Criteria): Growth
Add the scores: the gains on each criterion are added together as a raw score, then listed in rank order.

Pretest (Topic/Conventions) | Post-test (Topic/Conventions) | Growth | Analysis
• 1/1 → 1/1: growth 0
• 1/2 → 2/2: growth 1
• 1/2 → 2/3: growth 2
• 2/3 → 3/3: growth 1

Rubric scores do not represent percentages. A student who received a 1 would probably receive a 50. An F?
• 1 = 50: F; seriously at risk
• 2 = range 60-72 (75?): D to C-; at risk
• 3 = 76-88 (89?): C+ to B+; average
• 4 = A to A+: above most

Holistic Rubric or Holistic Descriptor (keeping the 1-4 scale)
Columns: Pre | Post | Difference | Rank order | Cut

Converting Rubrics to Percentages
Not recommended for classroom use because it distorts the meaning of the descriptors, but it may facilitate large-scale use. This is a district decision. (A sketch of the conversion follows.)

Columns: Pre | Converted | Post | Converted | Difference | Ranked

"Common sense" analysis: Was the assessment too difficult? Three students had zeros on the pretest; four showed zero growth (plus some negative growth); only one student improved. Change the assessment scale? Look at all of the grade-level assessments.
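
A minimal sketch of the conversion itself, assuming a hypothetical district mapping roughly consistent with the scale on the previous slide (1 = 50, 2 ≈ 66, 3 ≈ 82, 4 ≈ 95). Both the mapping and the score pairs are invented for illustration.

```python
# Hypothetical district mapping from 1-4 rubric scores to percentages
# (roughly the midpoints of the grade ranges given earlier).
RUBRIC_TO_PERCENT = {1: 50, 2: 66, 3: 82, 4: 95}

# Invented pre/post rubric scores for five students.
pairs = [(1, 1), (1, 2), (2, 2), (2, 3), (3, 3)]

for pre, post in pairs:
    diff = RUBRIC_TO_PERCENT[post] - RUBRIC_TO_PERCENT[pre]
    print(f"pre {pre} -> {RUBRIC_TO_PERCENT[pre]}%, "
          f"post {post} -> {RUBRIC_TO_PERCENT[post]}%, difference {diff:+d}")
```

Note how the conversion distorts the descriptors: a one-level rubric gain always becomes the same fixed percentage jump, regardless of what the levels actually describe.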

Repeated Measures
• Description: multiple assessments given throughout the year
• Examples: running records, attendance, the mile run
• Measuring growth: graphically, with approaches ranging from the sophisticated to the simple
• Considerations: less pressure on each administration; authentic tasks (reading aloud, running)


Repeated Measures Example: Running Record Errors in Reading
[Graph: average errors across administrations for the high-, moderate-, and low-error groups.]
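
A minimal sketch of the repeated-measures view in Python; the administration dates and error counts below are invented for illustration.

```python
# Mean running-record errors per group across administrations (invented data).
administrations = ["Sept", "Nov", "Jan", "Mar", "May"]
errors = {
    "high":     [18, 16, 13, 11, 9],
    "moderate": [12, 11, 9, 8, 7],
    "low":      [6, 5, 5, 4, 3],
}

for group, series in errors.items():
    drop = series[0] - series[-1]  # total reduction in errors over the year
    print(f"{group}: {series} (errors down {drop} over the year)")
```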

Post Test Only
• Example: an AP exam. Use it as a baseline to show growth at each level, or, for the classroom:
• This assessment does not have a "normal curve."
• An alternative for a post-test-only measure that still shows student growth in the classroom is to give a mock AP exam as both a pre-test and a post-test.

Looking for Variability
• The second graph is problematic: it tells us nothing about the difference between average and high growth, because so many students fall into the "high" growth category.
• NOTE: Look at the work and make "common sense" decisions.
• Consider the whole grade level; a single class's variation may be caused by the teacher's effectiveness.
• Critical question: Do all students have an equal possibility for success?

"Standardizing" Local Norms: Percentages versus Percentiles
A percentage shows standing within a class/course; a percentile shows standing across all courses in the district. (A sketch of the percentile calculation follows.)

Many assessments with different scales; Student A's raw scores:
• English: 15/20
• Math: 22/25
• Art: 116/150
• Social Studies: 6/10
• Science: 70/150
• Music: 35/35

"Standardized" on the normal curve, as percentiles:
• English: 62nd %ile
• Math: 72nd %ile
• Art: 59th %ile
• Social Studies: 71st %ile
• Science: 70th %ile
• Music: 61st %ile

As percentages of 100%:
• English: 75%
• Math: 88%
• Art: 77%
• Social Studies: 60%
• Science: 46%
• Music: 100%
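
A minimal sketch of a percentile-rank calculation. The definition used here (percent of scores at or below a given score) is one common choice among several, and the class scores are invented for illustration.

```python
def percentile_rank(score, all_scores):
    """Percent of scores at or below this score (one common definition)."""
    at_or_below = sum(1 for s in all_scores if s <= score)
    return round(100 * at_or_below / len(all_scores))

# Invented English scores (out of 20) for a class of ten students.
english_scores = [9, 11, 12, 13, 14, 15, 15, 16, 17, 19]

# Student A's 15/20 sits at roughly the 70th percentile of this class.
print(percentile_rank(15, english_scores))  # 70
```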

Standardization in Everyday Terms
• Standardization is a process of putting different measures on the same scale.
• For example: most cars cost $25,000, give or take $5,000; most apples cost $1.50, give or take $0.50. A $5,000 discount on a car is one "give or take" unit, so it is about equal to a $0.50 discount on an apple.
• Technical terms: "most are" = the mean; "give or take" = the standard deviation.
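
The same arithmetic as a short sketch, using the values from the slide above.

```python
car_mean, car_sd = 25_000, 5_000    # "most cars cost $25,000 give or take $5,000"
apple_mean, apple_sd = 1.50, 0.50   # "most apples cost $1.50 give or take $0.50"

discount_in_sds = 5_000 / car_sd    # a $5,000 car discount = 1.0 standard deviation
apple_discount = discount_in_sds * apple_sd

print(apple_discount)               # 0.5 -> about a $0.50 discount on an apple
```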

Percentile/Standard Deviation

Excel Functions
Sort (high to low or low to high), the graphing function, and statistical functions, including percentiles and standard deviation. (A pandas sketch of the same workflow follows.)
• Student grades can be sorted from highest to lowest score with one command
• A table of student scores can be graphed with one command
• Excel will easily calculate percentages, but this is probably not necessary
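
For those working outside Excel, a minimal pandas sketch of the same steps. The student names and scores are invented, and the chart line assumes matplotlib is installed.

```python
import pandas as pd

# Invented scores for one class.
df = pd.DataFrame({"student": ["A", "B", "C", "D", "E"],
                   "score":   [62, 88, 75, 91, 70]})

df = df.sort_values("score")                     # one-command sort, like Excel
print(df["score"].quantile([0.25, 0.50, 0.75]))  # percentile cut points
print(df["score"].std())                         # standard deviation
df.plot(x="student", y="score", kind="bar")      # one-command chart, like Excel
```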

"Common Sense"
• The purpose of DDMs is to assess teacher impact.
• The student scores and the Low, Moderate, and High growth rankings are entirely internal.
• DESE (in two years) will see only MEPIDs and an L, M, or H next to each MEPID.
• The important part of this process needs to be the focus: your discussions about student learning with colleagues, your discussions about student learning with your evaluator, and an ongoing process.