Avoiding Data Analysis and Decision-Making Pitfalls: How an Evaluation Guide Can Help
Karin Aure Dixon, Ed.D.
87th Annual CERA Conference, December 4, 2008
How it all began…
- Asst. Superintendent of Curriculum concerned about short-term views and reactionary decisions
- Director of Research and Evaluation concerned about disproportionate reliance on CSTs
- TEAM
Data Analysis and Decision Making Pitfalls
We began by identifying problems in our data analysis and decision making…
- Misuse of data (e.g., CST data to place individual students)
- Misinterpretation of data (e.g., CST comparisons across grade levels)
- Disproportionate valuation of data (e.g., CELDT as the only measure of ELD)
- Inappropriate data for program purpose (e.g., CST data for the Afterschool Program)
- Lack of comprehensive data (e.g., all outcome data)
The Old Stupid vs. The New Stupid
The OLD Stupid: Resistance to performance measures
The NEW Stupid: Reflexive reliance on a few simple metrics
“Today's enthusiastic embrace of data has waltzed us directly from a petulant resistance to performance measures to a reflexive and unsophisticated reliance on a few simple metrics—namely graduation rates, expenditures, and the reading and math test scores of students in grades 3 through 8. The result has been a nifty pirouette from one troubling mind-set to another; with nary a misstep, we have pivoted from the ‘old stupid’ to the ‘new stupid.’” (Frederick M. Hess, 2008)
Solutions to the Pitfalls
Next, we began to identify solutions to the pitfalls…
- Address and improve the process of evaluation
- Create a protocol for evaluating programs and practices
- Create a guide describing the new protocol
- Educate stakeholders about the protocol
- Let the protocol become a habit of mind
Guidance for Using Data to Inform Decision Making
From the December 2008/January 2009 issue of Educational Leadership…
- Focus on questions, not data
- Be skeptical of easy answers
- Become assessment literate
- Think beyond test scores
- Use informed judgment
Evaluation Wheel
Then came the wheel… Actually, it started as a pie…
Evaluation Guide
Then came the guide…
Evaluation Steps
1. Identify Question
2. Identify System of Focus
3. Identify and Collect Data
4. Organize and Analyze Data
5. Interpret Data
6. Use Evaluation Findings
Step 1: Identify Question
What do you want to know?
Guiding questions:
- Focus your evaluation
- Define your purpose
- Determine data to be collected
Characteristics of good questions:
- Open-ended
- Allow for all possibilities
- Feasible
Evaluate these sample questions
- Are counseling groups at the high school worthwhile?
- How does participation in a girls’ group affect students’ understanding of bullying and harassment?
- Are LUSD graduates becoming contributing members of society?
- What is the effect of Head-Pollett math instruction on student performance in Measurement and Geometry?
- What are grade students’ favorite colors?
Step 2: Identify System of Focus
Systems: Cognitive System, Metacognitive System, Self System
- Focuses the evaluation
- Aligns with district professional development and strategic direction
Step 3: Identify and Collect Data
- Identify existing data that address your question (e.g., state tests, local tests, staff development records, observation records)
- Determine what information is lacking and design new data collection tools/techniques as necessary (e.g., survey, focus group, assessment)
- Consult the Evaluation Wheel to ensure comprehensive coverage (e.g., outcome and process data, multiple data sources); see the sketch below for combining existing sources
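To make this step concrete, here is a minimal sketch of joining an existing outcome source to an existing process source and flagging what is missing. It assumes hypothetical flat-file extracts and column names (cst_results.csv, program_roster.csv, student_id, and so on), not any actual LUSD data system.

```python
# Minimal sketch, not actual district data: file names and columns
# (student_id, ela_scale_score, sessions_attended) are hypothetical.
import pandas as pd

# Existing outcome data: state test results, one row per student
cst = pd.read_csv("cst_results.csv")        # columns: student_id, grade, ela_scale_score

# Existing process data: program participation records
roster = pd.read_csv("program_roster.csv")  # columns: student_id, sessions_attended

# Combine the sources so outcome and process data can be analyzed together;
# the merge indicator shows which students are missing from one source,
# i.e., where new data collection may be needed.
merged = cst.merge(roster, on="student_id", how="outer", indicator=True)
print(merged["_merge"].value_counts())
```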
Step 4: Organize and Analyze Data
- Summarize the data (e.g., percentage of students proficient, average student growth, common patterns in student perceptions)
- Disaggregate by important factors (e.g., CELDT proficiency level, grade, program status, implementation, level of support)
- Use charts and graphs to analyze data visually (a brief analysis sketch follows below)
- Identify limitations of tools and data collection strategies (e.g., no comparison group or data, small sample, imperfect assessments)
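A minimal sketch of the summarize, disaggregate, and chart sequence described in this step. The results file and columns (assessment_results.csv, grade, celdt_level, and proficient coded as 0/1) are hypothetical placeholders, not the district's actual analysis code, and the bar chart assumes matplotlib is installed.

```python
# Minimal sketch with hypothetical columns; 'proficient' is assumed to be coded 0/1.
import pandas as pd

scores = pd.read_csv("assessment_results.csv")  # columns: student_id, grade, celdt_level, proficient

# Summarize: overall percentage of students scoring proficient
print(f"Percent proficient (all students): {scores['proficient'].mean() * 100:.1f}%")

# Disaggregate: proficiency rate by grade and CELDT proficiency level
by_group = (
    scores.groupby(["grade", "celdt_level"])["proficient"]
    .mean()
    .mul(100)
    .rename("pct_proficient")
    .reset_index()
)
print(by_group)

# Visualize: simple bar chart of proficiency by grade (requires matplotlib)
ax = scores.groupby("grade")["proficient"].mean().mul(100).plot(kind="bar")
ax.set_ylabel("Percent proficient")
ax.set_title("Proficiency by grade")
```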
Step 5: Interpret Data
- Review summarized data
- Generate broad statements of results
- Consult the Evaluation Wheel to identify important context factors and relationships
- Formulate explanations of the data
- Determine recommendations
What accounted for these results, and what should be done in response?
Step 6: Use Evaluation Findings
- Create an action plan
- Include continued data collection
- Determine next review cycle
- Get to work!
Putting the Guide to Work: Examples
Systematic English Language Development
- Question: How has EL student academic performance changed since the implementation of SELD?
- Data: Outcome – CELDT, CST ELA, Express test; Process – Classroom observations, review of materials

READ 180
- Question: How are R180 students performing academically over time and compared to Non-R180 students in Reading? (A sketch of this comparison follows below.)
- Data: Outcome – CST, SRI lexiles, rSkills, independent reading quizzes; Process – Teacher survey, classroom observation protocol, principal response, software zones
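For the READ 180 example, a minimal sketch of the "over time and compared to Non-R180 students" comparison. It assumes a hypothetical SRI extract (sri_scores.csv) with columns student_id, r180_participant, test_window (a date), and sri_lexile; it is illustrative, not the district's actual evaluation protocol.

```python
# Minimal sketch with hypothetical file and columns; 'test_window' is assumed to be a date.
import pandas as pd

sri = pd.read_csv("sri_scores.csv", parse_dates=["test_window"])

# Average SRI lexile by test window, R180 vs. non-R180 students
trend = (
    sri.groupby(["test_window", "r180_participant"])["sri_lexile"]
    .mean()
    .unstack("r180_participant")
)
print(trend)

# Average per-student lexile growth from first to last test window, by group
growth = (
    sri.sort_values("test_window")
    .groupby(["student_id", "r180_participant"])["sri_lexile"]
    .agg(lambda s: s.iloc[-1] - s.iloc[0])
    .groupby("r180_participant")
    .mean()
)
print(growth)
```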
Putting the Guide to Work: Examples (cont.)
Dual Language Immersion Program
- Question: How are Dual Immersion students performing over time and compared to Non-dual students in ELA and SLA?
- Data: Outcome – CST ELA, STS, Aprenda, CAT/6, student writing samples (Spanish); Process – Student presentations, teacher observations

Reading Proficiency
- Question: How are elementary students performing in Reading?
- Data: Outcome – CST ELA Reading Comprehension, CAT/6 Reading, SRI, DRA, classroom reading data; Process – Principal response, literacy framework review
Questions and Comments
Thank you! Karin