1
Cut Scores, Student Growth and College/Career Readiness: 2011-2012 Data Dialogues Stan Masters Lenawee ISD January 17, 2012
2
Data Dialogues around “hot topics” for 2011-2012 are occurring in Lenawee County. Using data stored in DataDirector provides ways to create reports that can be used in these dialogues. Participants will learn about using data to have conversations that deepen understanding of these "hot topics." Participants should bring their username and password for DataDirector.
3
Goals For Today Describe how the new cut scores will impact the MEAP/MME – Refine Data from 10-11 MEAP Describe how a growth model operates – Create an Assessment with Calculation for Growth Describe how multiple measures help define “College and Career Readiness” – Open an EXPLORE, PLAN, or Elementary Report in Excel and color code College Readiness
4
Norms for Our Work Participate actively Actively listen Seek application Press for clarification Honor time agreements and confidentiality Keep ‘side bars’ to a minimum and on topic Take care of adult learning needs
6
Talking Points for the Purpose of Implementing a Data Warehouse in Lenawee Schools: We must focus on data to increase student achievement. We must use multiple sources of data. We must utilize an inquiry approach to data analysis. We need a data warehouse for our 21st-century schools.
7
Key Characteristics of the Objectives of the Data Warehouse (taken from the Michigan School Improvement Framework): a data-driven culture (dialogue about meaning); a systematic process (analysis of multiple data sources); and data-driven decision-making.
8
School Improvement Process
9
Who uses DataDirector?
10
Source: Presentation by Dr. Victoria Bernhardt, April 2007
11
LISD Data Director Stan Masters Coordinator of Instructional Data Services “Data out” Creation and analysis of data-driven reports Mike Husband Data Entry Specialist “Data in” Organization of data to add to the warehouse
12
FERPA/HIPAA Pre-Test To be considered an “education record,” information must be maintained in the student’s cumulative or permanent folder. False: any record that contains a student’s name is an education record.
13
FERPA/HIPAA Pre-Test FERPA grants parents the right to have a copy of any education record. True
14
FERPA/HIPAA Pre-Test You are in charge of a staff meeting to study student achievement on school improvement goals. As part of your meeting, you are showing the entire staff a report of student scores on a common local assessment. The report shows the student names. In addition, you have given them a paper copy of the report. It is a violation of FERPA to display the results of the assessment to the entire staff. The exception would be a group of teachers working on specific student strategies, as they are a specific population that has a “legitimate educational interest” in the information.
15
Login https://www.achievedata.com/lisd Username: – 3-letter school abbreviation + first 4 letters of last name + first letter of first name Password: – For new users: 123456dd (you will create a new password after logging in). – For existing users: use your existing password. – I can reset passwords to 123456dd.
16
Data Driven Dialogue di·a·logue or di·a·log n. Abbr. dial. 1. A conversation between two or more people. 2. An exchange of ideas or opinions: achieving constructive dialogue with all parties present. – di·a·logue v. Deb Clancy, Washtenaw ISD, 2008
17
Ways of Talking: grounded in a culture of collaboration and the norms of collaboration, conversation branches into dialogue (outcome: deep understanding) and into discussion and deliberation (outcome: decisions that stick). The Center for Adaptive Schools, www.adaptiveschools.com
18
Listening Respectfully: the ear of the attentive listener, the rectitude of the heart, the eye that is unswerving
19
What do you need to monitor? Progress on school improvement goals Tracking individual student progress Prioritize key indicators Support teaching and learning Source: “Developing a Monitoring Plan”. Maryland Department of Education. Accessed May 25, 2010 from http://mdk12.org/data/progress/developing.html
20
How do you develop a monitoring plan? Identify specific learning indicators Create data collection templates Schedule assessment calendar – collaborative collection and analysis Source: “Developing a Monitoring Plan”. Maryland Department of Education. Accessed May 25, 2010 from http://mdk12.org/data/progress/developing.html Video Source: Reeves, D. (2009). “Planning for the New Year”. Accessed May 25, 2010 from http://www.leadandlearn.com/webinars
21
Cut Scores
22
Old Cut Scores Scaled scores represent the stable score on the assessment that is reported for each student. The floor of the “Meets” performance level was calculated as (grade level × 100): 3rd grade “Meets” = 3 × 100 = 300
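The old rule is simple enough to express directly — a minimal Python sketch of the formula above, not anything from DataDirector:

```python
# Sketch of the "old" cut-score rule: the scaled-score floor of the
# "Meets" performance level was the grade level times 100.
def meets_floor(grade_level: int) -> int:
    """Scaled-score floor for 'Meets' under the old MEAP cut scores."""
    return grade_level * 100

# 3rd grade "Meets" floor
print(meets_floor(3))  # 300
```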
23
Fall 2008 Passing Cut Scores, Grades 3rd–5th

Grade | Subject | Correct | Total | Percent
3rd | Reading | 13 | 30 | 43%
3rd | Math | 21 | 37 | 57%
4th | Reading | 13 | 30 | 43%
4th | Math | 20 | 48 | 42%
5th | Reading | 14 | 30 | 47%
5th | Math | 21 | 49 | 43%
5th | Science | 24 | 51 | 47%

Source: Presentation by Terri Portice, June 29, 2009
24
Fall 2008 Passing Cut Scores, Grades 6th–9th

Grade | Subject | Correct | Total | Percent
6th | Reading | 14 | 30 | 47%
6th | Math | 20 | 47 | 43%
6th | Social Studies | 24 | 46 | 52%
7th | Reading | 15 | 30 | 50%
7th | Math | 22 | 50 | 44%
8th | Reading | 17 | 30 | 57%
8th | Math | 13 | 38 | 34%
8th | Science | 24 | 58 | 41%
9th | Social Studies | 23 | 46 | 50%

Source: Presentation by Terri Portice, June 29, 2009
25
2009 MME Percent Correct Needed to “Pass”

Content | Percent
Reading | 63%
Writing | 66%
Math | 56%
Science | 54%
Social Studies | 44%

Source: Personal e-mail correspondence with Ernie Bauer, Oakland Schools, July 15, 2009
27
Change in Cut Scores for MEAP and MME Comparing “Old” vs “New” Using Fall 2010 and Spring 2011 data Michigan and Lenawee
28
Open MEAP or MME Percent Proficient Report Click on Percentage of Students Proficient Make This a Report Refine the data to reflect the new cut scores Predictions: “I predict...” Observations: “I can count...” Inferences: “I believe that the data suggests... because...”
31
Approximate Percent Correct Scores Required to Pass - Math
32
Comparison of Impact of Old and New Cut Scores – Reading
33
Approximate Percent Correct Scores Required to Pass - Science
34
Approximate Percent Correct Scores Required to Pass – Social Studies
36
Growth Models
37
Michigan School Reform Law Conduct annual educator evaluations Include measures of student growth as a significant factor Locally determine the details of the educator evaluations, the consequences, and the timeline for implementation.
38
Key Characteristics of Growth Models Data must align with agreed-upon content standards Data must measure a broad range of skills Data must document year-to-year growth on a single scale Sources: Measuring Student Growth: A Guide to Informed Decision Making. (2007). Center for Public Education. Using Student Progress to Evaluate Teachers: A Primer on Value-Added Models. (2005). Educational Testing Service.
39
Growth Models Improvement Model Performance Index Simple Growth Growth to Proficiency Value-Added
40
Improvement Model Compares one cohort of students with another cohort in same grade/course Benefits – Easy to implement – Simple to communicate Disadvantages – Does not track individual student progress – Does not take into account other factors that may have promoted/inhibited growth 4th Graders from 2010-11 to 2011-12
41
Performance Index Combines multiple data sets into a single scale Benefits – Recognizes changes in all achievement levels – Uses multiple measures – Can lead to improvement for all students, not just “bubble” students Disadvantages – Does not track individual student progress – Does not capture change within each achievement level – May be desirable to use more achievement levels Grade-Point Average
42
Simple Growth Follows same cohort of students Benefits – Uses scaled scores from one year to the next – Documents changes in individual students Disadvantages – Includes only the students present for both years – Need to determine how much growth is enough MEAP Scale Scores
43
Growth to Proficiency Designed to show if students are “on-track” to meet standards Benefits – Provides more data points toward goal – Recognizes gains even if students are not proficient – Focus on all students, not just “bubble” students Disadvantages – Targets must be determined by outside agencies – Benchmark points must be agreed upon Cut Scores for Proficiency
44
Value-Added Past performance used to predict future scores Benefits – Measures student performance over time – Documents the impact of instructional resource, program, or school process on the change Disadvantages – Complex statistics – Isolates student demographics that may impact performance EXPLORE to PLAN to ACT
45
Examples of Growth Assessments (“data from multiple sources”): Local – classroom tests, performance assessments, IEP goals, portfolio exhibits. State – MEAP, MME (ACT), MI-Access. National – DIBELS, STAR, NWEA, EXPLORE, PLAN. Source: Britton-Deerfield Teacher-Evaluation Committee, 2011
46
Key Characteristics of Growth Models Data must align with agreed-upon content standards – Identify significant standards for growth – Align assessment and instructional plans Data must measure a broad range of skills – Develop assessment instruments (test blueprints, performance rubrics, and scoring guides) – Construct assessment calendar (beginning to end of year) Data must document year-to-year growth on a single scale – Determine initial threshold scores for determining growth
47
Pre-Test and Post-Test If you are using your pre-test for comparison to post-tests, you will want very similar questions on both. The best pre-tests cover exactly the same objectives as the post-test. Make sure the pre-test conditions are as similar as possible to those used for the post-test. Example of a teacher using a pre-test/post-test model
48
[Table: by student name — Life Skills pre-test, post-test, and growth scores, and Life Skills Writing pre-test, post-test, and growth scores; the individual values are not legible in the source.]
49
Procedures Administered before and after instruction Look at the scores of individual students to determine how many had higher post-test scores (Simple Growth Model) Compare the percentage to the threshold agreed upon by school/district Calculate the mean pre-test score and compare that with the mean post-test score (Simplified Value-Added Growth Model) Source: Measurement Issues Inherent in Educator Evaluation, Presentation by the Michigan Assessment Consortium to the OEAA Educator Evaluation Best Practices Conference, April 15, 2011.
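The two procedures above can be sketched in a few lines of Python — a minimal illustration with hypothetical (pre-test, post-test) score pairs; the function names and data are not from the source:

```python
# Simple Growth and Simplified Value-Added, as described in the procedures.
from statistics import mean

def simple_growth_rate(pairs):
    """Fraction of students whose post-test exceeded their pre-test
    (Simple Growth Model)."""
    grew = sum(1 for pre, post in pairs if post > pre)
    return grew / len(pairs)

def mean_gain(pairs):
    """Mean post-test score minus mean pre-test score
    (Simplified Value-Added Growth Model)."""
    return mean(post for _, post in pairs) - mean(pre for pre, _ in pairs)

# Hypothetical (pre, post) score pairs for five students
pairs = [(9, 5), (8, 9), (7, 7), (6, 7), (6, 6)]
threshold = 0.50  # threshold agreed upon by school/district

print(simple_growth_rate(pairs) >= threshold)
print(mean_gain(pairs))
```

In practice the pairs would come from the pre-test/post-test columns of a DataDirector report rather than being typed in by hand.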
50
Examples. Individual: A teacher gives 4 pre-post tests during the year. For each sequence, the teacher calculates an average change from pre-test to post-test. The teacher compares the number of students whose scores changed in various amounts. Common: A district develops 2 tests to be given pre-post during the year for specific content/grade levels. For each sequence, the district calculates an average change from pre-test to post-test. The district compares the number of students whose scores changed in various amounts. Source: Measurement Issues Inherent in Educator Evaluation, Presentation by the Michigan Assessment Consortium to the OEAA Educator Evaluation Best Practices Conference, April 15, 2011.
51
Which of our 2010-2011 freshmen met or exceeded the NWEA Common Core target growth rate, or met or exceeded the 60th percentile, in Reading or Math? [Table: change in Reading score and change in Math score from Spring 2010 to Spring 2011, with NWEA Spring 2011 Reading and Math Test percentiles; student-level values are only partially legible.]
52
[Table: by student name, teacher name, and class period — 10-11 NWEA Fall and Spring Reading Test RIT scores, RIT score growth, Fall and Spring Reading Test percentiles, and percentile growth; student-level values are only partially legible.]
53
Create an Assessment with Calculation for Growth – Pretest/Posttest – Calculation Field
55
College and Career Readiness: “On Target” vs. “Off Target”
56
College Readiness Benchmark Scores – Early Indicators of College Readiness

ACT Subject Area Test | College Course(s) | EXPLORE | PLAN | ACT
English | English Composition | 13 | 15 | 18
Reading | Social Sciences | 15 | 17 | 21
Math | Algebra | 17 | 19 | 22
Science | Biology | 20 | 21 | 24
57
Using Multiple Measures for Educational Decisions Conjunctive Approach (All measures count) Measures of different constructs College Readiness based upon student achievement meeting identified benchmark targets in English, Math, Reading, and Science
58
Lenawee County College Readiness Data
59
Common Core State Standards
60
College and Career Readiness WORKSHOPS How important are the Michigan Merit Curriculum courses?
61
Rigor Issues – National HS Grad Class 2011 Profile Summary Report (CRB = 22). Source: College and Career Readiness WORKSHOPS, Fall 2011
63
Using PLAN to Predict ACT Uses 10th grade PLAN scores from 10-11 Use predicted ACT scores in each subject area Use color-coding to indicate probability – Dark Green – Light Green – Yellow – Orange – Red
64
Using EXPLORE to Predict PLAN Uses EXPLORE scores from 2010-2011 Use predicted PLAN scores in each subject area Use color-coding to indicate probability – Dark Green – Light Green – Yellow – Orange – Red
65
[Tables: for each student (last name, first name), 10-11 PLAN scores with expected ACT score ranges in English, Reading, Mathematics, and Science (e.g., PLAN 16 → expected ACT 16-20), and 10-11 EXPLORE scores with expected PLAN score ranges (e.g., EXPLORE 9 → expected PLAN 10-13); student-level values are only partially legible.]
66
Reasonable Growth: “On Target” (met or exceeded the CRB); “Nearly On Target” (less than 2 points below the CRB); “Off Target” (more than 2 points below the CRB)
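A minimal Python sketch of this classification, using the ACT benchmark scores from the College Readiness Benchmark slide. Treating a score exactly 2 points below the benchmark as “Nearly On Target” is an assumption here, since the slide's “<2” and “>2” wording does not cover that boundary:

```python
# ACT College Readiness Benchmark scores by subject (from the earlier slide).
ACT_CRB = {"English": 18, "Reading": 21, "Math": 22, "Science": 24}

def target_status(subject: str, score: int) -> str:
    """Classify a score against the CRB: On Target (met or exceeded),
    Nearly On Target (within 2 points below), else Off Target."""
    crb = ACT_CRB[subject]
    if score >= crb:
        return "On Target"
    if crb - score <= 2:
        return "Nearly On Target"
    return "Off Target"

print(target_status("Math", 23))  # On Target
print(target_status("Math", 20))  # Nearly On Target
print(target_status("Math", 18))  # Off Target
```

The same three labels could drive the dark-green-to-red color coding used in the PLAN/EXPLORE prediction reports.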
67
Average Growth Points Between Tests (EXPLORE to PLAN and PLAN to ACT) by target status relative to the CRB: “On Target” (met or exceeded CRB), “Nearly On Target” (<2 points from CRB), “Off Target” (>2 points from CRB). [Table values only partially legible; e.g., Reading: 1-2 (EXPLORE to PLAN) and 4-5 (PLAN to ACT) for “On Target,” 3-4 and 2-3 for “Nearly On Target,” 3-4 and 1-2 for “Off Target.”]
69
Analysis of EXPLORE to PLAN English and Reading – Stronger relationship if low in English, then low in Reading – language usage vs. reading comprehension? Math achievement is low – 8 th grade math vs. Algebra I? Low achievement in Science – 8 th grade Science course?
70
Analysis of PLAN to ACT Stronger relationship in Reading and English – curriculum vs. test? – lower CRB scores? Strongest relationship in Math – instruction in Algebra II? Weaker relationship in Science – process vs. content? Very few perform well on PLAN and miss CRB on ACT
71
Assessment Calendars
72
Time Elements of an Assessment Calendar Source: White, S. H. (2005). “Beyond the Numbers: Making Data Work for Teachers and School Leaders”. Lead and Learn Press: Englewood, CO When will we administer the assessment? When will we collect the data? When will we disaggregate the data? When will we analyze the data? When will we reflect upon the data? When will we make recommendations? When will we make the decisions about the recommendations? When will we provide written documentation about the decisions? When will we share the data with other stakeholders?
73
College and Career Readiness WORKSHOPS How important is the middle school to college readiness?
74
Understanding Assessment Reports: Performance Level, Scaled Score, Domain/Standard Score, Benchmark/GLCE Score. Validity rests on written curriculum alignment, analysis of the performance task, and analysis of student learning.
75
PLAN and EXPLORE Item Analysis Use test booklets from 11-12 testing – Order extra materials for your staff (no cost) – Review items from the booklet and the student responses
76
College Readiness Benchmark Standards Compare Standards in each subject area – below, at, and beyond benchmark Review written and taught curriculum – Which unit in the course/grade level? – How was it assessed in the classroom? – What were the students’ scores on the classroom assessment? Explore lesson plans and activities
77
College and Career Readiness Source: College and Career Readiness Workshops, Fall 2011 CRB=22
79
Identify students who need assistance with the testing formats. Needs identified by students on the PLAN test: Writing, Reading, Math, Study Skills. Supports include: writing using the ACT rubric; analyzing data in graphs, charts, and tables; use of released items from MDE; use of released practice items from ACT; strategies for completing timed portions of the ACT; close and critical reading strategies from the MS/HS Literacy Team.
80
ACT Moodle Course Using College and Career Readiness Standards as a Tool for School Improvement
81
College and Career Readiness WORKSHOPS What are the academic achievements, educational plans, and parental educational attainments of our students?
82
Procedures for Reports Current students in 2011-2012: – Grade 10 or 11 for PLAN reports – Grade 9 or 10 for EXPLORE reports Valid PLAN or EXPLORE in 2010-2011 Selected self-reported data sets related to college and career readiness
83
PLAN Student Profile Report Test Scale Scores – color coded Self-Reported GPA on 4.0 scale Self-Reported Post-Secondary plans Self-Reported High School Coursework Self-Reported Educational Attainment: – Mother and Father
84
EXPLORE Student Profile Report Test Scale Scores – color coded Self-Reported grades on coursework aligned with test scores Self-Reported Post-Secondary plans
85
[Table: for each student (last, first) — PLAN English, Reading, Math, and Science scores; self-reported GPA; post-secondary plans (e.g., 4-yr college/university, CTE school, undecided); high school coursework (college prep or CTE/other); and mother’s and father’s educational attainment; student-level values are only partially legible.]
86
Questions to Consider What is the relationship between CCR test scores and grades? What is the relationship between CCR test scores and post-secondary plans? What is the relationship between post-secondary plans and parents’ educational attainment?
87
What are some other data sets to consider for “College and Career Readiness”? Standards Met Credits Earned Courses Taken Grades Earned EDP Completion Resume Completion Interview Completion Application Completion
88
Open an EXPLORE, PLAN, or Elementary Report in Excel Determine College Readiness with color coding