Slide 1
Bill Breisch and Ed O’Connor, Midwest Instructional Leadership Council
Slide 2
Nonprofit organization formed in June 2010. The focus of the organization is “building and sustaining capacity for improving the achievement of all children.” It pursues that mission through:
◦ Professional development conferences and institutes
◦ Embedded coaching and support
◦ Bridging research and practice
Midwest Instructional Leadership Council
Slide 3
Ed O’Connor: Educational Consultant/Instructional Data Coach, Midwest Instructional Leadership Council (miLc)
◦ Trained as a school psychologist; MS/PhD, University of Wisconsin-Madison
◦ 15 years in the Monona Grove School District: school psychologist at all levels (4K-12), then Director of Continuous Improvement and Assessment
◦ Currently working with 15+ districts across MN/WI on continuous school improvement, RtI systems analysis, and using data to improve instruction
Slide 4
Bill Breisch: Recently retired after serving his entire professional career (44 years) in the Monona Grove School District
◦ 15 years as a fifth and sixth grade teacher
◦ 17 years as an elementary school principal
◦ 12 years as Director of Instruction
◦ Served on the School Administrators Alliance Assessment Project Team that drafted recommendations regarding the development of a new statewide student assessment
◦ Educational Consultant and Board of Directors, miLc
Slide 5
Participants in this session will:
◦ be able to connect the ACT measures conceptually to the larger context of Continuous School Improvement (CSI).
◦ become aware of (or review) the national correlations (grades 3-11) between Measures of Academic Progress (MAP) and WKCE/ACT.
◦ discuss the relevance of “summative assessments” for local school improvement efforts through examples of K-12 benchmark targets developed from local data and application examples from WI school districts.
Slide 6
The Big Picture
Slide 7
We are doing so much assessment there is no time for teaching.
We are being forced to “teach to the test” because there is too much emphasis on test results.
We are collecting so much assessment data, but I still don’t know what I am supposed to do with it.
I don’t know how all this assessment is helping us.
3/3/10 (c) Midwest Instructional Leadership Council
Slide 8
“Highly Effective Schools” research:
◦ Reliable and valid data collection procedures
◦ Regular review of outcomes against benchmark targets
◦ Routine reporting of results to stakeholders
◦ Structures for using data to determine how to allocate resources
◦ Structures for evaluating the relationship between action plans and observed outcomes
Slide 9
Quality assessment procedures are necessary but not sufficient to produce exceptional learning outcomes. “You can’t fatten the pig by weighing it.” But “you can’t tell if you are approaching your destination if you don’t have any data.”
Slide 10
The fundamental purpose of assessment is to allow us to make decisions:
◦ Big picture: Are we achieving our system’s purpose? What is that purpose?
◦ Intermediate picture: What is working? What is not working?
◦ Immediate picture: What should I/we do?
Slide 11
MAP, EXPLORE, PLAN, ACT, and WKCE: What are our targets?
Slide 12
David Conley, Director of the Center for Educational Policy Research at the University of Oregon: a broad definition of college and career readiness (CCR)
Slide 13
Results of this ACT study provide empirical evidence that, whether planning to enter college or workforce training programs after graduation, high school students need to be educated to a comparable level of readiness in reading and mathematics. Graduates need this level of readiness if they are to succeed in college-level courses without remediation and to enter workforce training programs ready to learn job-specific skills.
Slide 14
◦ Identifying the level of reading and mathematics skills students need to be ready for entry-level jobs that require less than a bachelor’s degree, pay a wage sufficient to support a family, and offer the potential for career advancement
◦ Comparing student performance on ACT tests that measure workforce readiness (WorkKeys: Reading for Information and Applied Mathematics) with those that measure college readiness (the ACT)
◦ Determining whether the levels of performance needed for college and workforce readiness are the same or different
Slide 15
WKCE “Proficient” and “Advanced” Scores, Measures of Academic Progress (MAP) RIT Scores, and National Percentiles (8th Grade)

| WKCE Level | MAP RIT Score | National Percentile |
| Reading – “Old Proficient” | 207 | 17th |
| Reading – “New Proficient” | 227 | 70th |
| Reading – “Old Advanced” | 224 | 60th |
| Reading – “New Advanced” | 242 | 94th |
| Math – “Old Proficient” | 218 | 24th |
| Math – “New Proficient” | 235 | 61st |
| Math – “Old Advanced” | 240 | 72nd |
| Math – “New Advanced” | 252 | 90th |

Source: Northwest Evaluation Association (May 2010 and August 2012). Wisconsin Linking Study: A Study of the Alignment of the NWEA RIT Scale with the Wisconsin Knowledge and Concepts Examination. Lake Oswego, OR: The Kingsbury Center at Northwest Evaluation Association.
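The old-versus-new cut scores in the table above amount to a simple lookup: the same 8th-grade RIT score can land in different WKCE categories depending on which standard is applied. A minimal sketch of that lookup, with function and dictionary names chosen for illustration (they are not from the linking study):

```python
# Cut scores from the NWEA Wisconsin Linking Study table above (8th grade).
# Structure and names here are illustrative, not part of the study itself.
CUTS = {
    "reading": {"old": {"proficient": 207, "advanced": 224},
                "new": {"proficient": 227, "advanced": 242}},
    "math":    {"old": {"proficient": 218, "advanced": 240},
                "new": {"proficient": 235, "advanced": 252}},
}

def wkce_category(subject, rit, standard="new"):
    """Map an 8th-grade MAP RIT score to a WKCE-linked category."""
    cuts = CUTS[subject][standard]
    if rit >= cuts["advanced"]:
        return "advanced"
    if rit >= cuts["proficient"]:
        return "proficient"
    return "below proficient"

# The same reading RIT of 230 is "advanced" under the old cuts
# but only "proficient" under the new, higher cuts.
print(wkce_category("reading", 230, "old"))  # advanced
print(wkce_category("reading", 230, "new"))  # proficient
```

This illustrates the slide's point: raising the cut scores reclassifies students without any change in the underlying RIT score.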
Slide 16
http://www.nwea.org/our-research/college-readiness
Slide 17
◦ Active NWEA districts that use EXPLORE, PLAN, and ACT were recruited.
◦ ACT data were matched to corresponding MAP data at the individual student level.
◦ Total matched record pairs: 108,000.
◦ No formal sampling strategies were employed other than trimming extreme residuals.
Slide 18
MAP Mathematics RIT Score as Predictor (Same Season): Cut Scores and Normative Percentile Ranks on MAP Corresponding to College Readiness Benchmarks

| Grade | College Readiness Test | Benchmark | MAP Cut Score | MAP Normative Percentile Rank |
| 8 | EXPLORE Math | 17 | 245 | 72 |
| 10 | PLAN Math | 19 | 251 | 77 |
| 11 | ACT Math | 22 | 258 | 84 |
Slide 19
MAP Reading RIT Score as Predictor (Same Season): Cut Scores and Normative Percentile Ranks on MAP Corresponding to College Readiness Benchmarks

| Grade | Reading Test | Benchmark | MAP Cut Score | MAP %ile Rank | English Test | Benchmark | MAP Cut Score | MAP %ile Rank |
| 8 | EXPLORE Reading | 15 | 230 | 70 | EXPLORE English | 13 | 220 | 44 |
| 10 | PLAN Reading | 17 | 234 | 73 | PLAN English | 15 | 227 | 58 |
| 11 | ACT Reading | 21 | 237 | 78 | ACT English | 18 | 232 | 68 |
Slide 20
MAP Language Usage RIT Score as Predictor (Same Season): Cut Scores and Normative Percentile Ranks on MAP Corresponding to College Readiness Benchmarks

| Grade | Reading Test | Benchmark | MAP Cut Score | MAP %ile Rank | English Test | Benchmark | MAP Cut Score | MAP %ile Rank |
| 8 | EXPLORE Reading | 15 | 229 | 72 | EXPLORE English | 13 | 219 | 43 |
| 10 | PLAN Reading | 17 | 232 | 73 | PLAN English | 15 | 225 | 56 |
| 11 | ACT Reading | 21 | 234 | 75 | ACT English | 18 | 228 | 62 |
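A district applying the cut-score tables above would simply compare each student's same-season MAP RIT score to the grade- and subject-specific cut. A minimal sketch, using the mathematics and reading cuts from the tables (the function and table names are illustrative, not NWEA's):

```python
# Same-season MAP cut scores linked to EXPLORE/PLAN/ACT college readiness
# benchmarks, taken from the tables above. Names here are illustrative.
MAP_CUTS = {
    "math":    {8: 245, 10: 251, 11: 258},  # EXPLORE / PLAN / ACT Math
    "reading": {8: 230, 10: 234, 11: 237},  # EXPLORE / PLAN / ACT Reading
}

def on_track(subject, grade, rit):
    """True if a student's MAP RIT meets the linked readiness cut score."""
    return rit >= MAP_CUTS[subject][grade]

print(on_track("math", 8, 248))      # True:  248 meets the grade 8 cut of 245
print(on_track("reading", 11, 235))  # False: 235 is below the grade 11 cut of 237
```

In practice a district would run this check each testing season so that students falling below the cut can be flagged for acceleration support well before they take the ACT.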
Slide 23
The Big Picture
Slide 24
The BIG picture: to achieve exceptional student learning outcomes we must:
◦ Employ continuous school improvement frameworks that organize and systematize our “acts of improvement” so that we can determine when we are making progress and when we are not.
◦ Establish clear and measurable goals.
◦ Review and discuss progress openly and often.
In other words, we must use data to drive instruction.
(c) miLc 12/01/2010
Slide 26
What framework guides your thinking?
Slide 27
System
System units
◦ Building
◦ Grade
◦ Classroom
◦ Department
Individual students
Slide 30
Summative assessment data (aggregated): Measures of Academic Progress, grades, EXPLORE/PLAN/ACT, Minnesota Comprehensive Assessments
◦ Identify strengths and opportunities for improvement (OFIs)
◦ Set goals: district, building, grade, student
Root cause analysis: Why do the observed strengths and OFIs (gaps) exist?
◦ Summative assessment data (disaggregated) and interim assessment data: identify and validate gaps at the unit level; suggest or validate contributing causes.
Action planning: “If we do _________ we should sustain strengths and reduce gaps.”
Action plan implementation: adequate training, resources, and support, along with fidelity monitoring; frequent formative assessment.
Interim assessment: indicates short-term effect and predicts impact on later summative assessment performance.
Slide 31
Tracking Student Progress Toward College and Career Readiness
Slide 32
◦ Small sample sizes: each grade group contains between 200 and 300 students.
◦ Observed probabilities: the proportion of students meeting or exceeding the cut score who also meet the predicted target (e.g., exceeds the MAP cut and also meets the CCR target on EXPLORE).
◦ Receiver Operating Characteristic (ROC) curve analysis: cut scores attempt to statistically maximize the number of students correctly identified as needing acceleration support (sensitivity) while at the same time maximizing the number of students correctly identified as “on track” (specificity).
◦ In other words, high sensitivity means a student who truly needs support is very likely to be flagged, and high specificity means a student who is truly on track is very unlikely to be flagged.
◦ We elect to prioritize sensitivity because we are most concerned about “missing” students who truly need additional attention.
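The sensitivity/specificity trade-off described above can be made concrete with a small sketch: flag every student below a candidate cut score, compare the flags to the students' true status, and watch both rates move as the cut shifts. The data and cut values here are made up purely for illustration:

```python
# Illustrative sketch of the ROC-style cut-score trade-off described above.
# Scores and "needs support" labels are invented for the example.

def confusion(scores, needs_support, cut):
    """Flag students scoring below `cut`; tally against the true labels."""
    tp = fp = tn = fn = 0
    for score, needy in zip(scores, needs_support):
        flagged = score < cut
        if flagged and needy:
            tp += 1          # correctly flagged for support
        elif flagged and not needy:
            fp += 1          # flagged but actually on track
        elif not flagged and not needy:
            tn += 1          # correctly left alone
        else:
            fn += 1          # missed a student who needs support
    return tp, fp, tn, fn

def rates(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn) if tp + fn else 0.0  # needy students caught
    specificity = tn / (tn + fp) if tn + fp else 0.0  # on-track left alone
    return sensitivity, specificity

scores        = [228, 231, 236, 240, 244, 246, 250, 255]
needs_support = [True, True, True, False, True, False, False, False]

for cut in (235, 245, 255):
    sens, spec = rates(*confusion(scores, needs_support, cut))
    print(f"cut={cut}: sensitivity={sens:.2f} specificity={spec:.2f}")
```

Raising the cut catches more students who truly need support (sensitivity rises) at the cost of flagging more students who were on track (specificity falls); ROC analysis searches for the cut that best balances the two, subject to the priority stated above.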
Slide 34
[Figure: 2x2 decision matrix showing true positives, true negatives, false positives, and false negatives]
Slide 38
Contact Information:
Ed O’Connor, (608) 516-0457
◦ eoconnor.milc@gmail.com
Bill Breisch, (608) 209-6973
◦ billmadison@gmail.com