1
Data Based Decision Making
2
Reading Review: Stanovich (2010); Fuchs & Fuchs, Progress Monitoring
3
"It ain't so much the things we don't know that get us into trouble
"It ain't so much the things we don't know that get us into trouble. It's the things we know that just ain't so." -Josh Billings Perhaps the second most famous humor writer and lecturer in the United States in the second half of the 19th century after Mark Twain
4
We never know for sure… Even practices with the best research base may not work for some students. So, if you are using a research-based intervention: implement & COLLECT DATA! And if you are struggling to identify a research-based intervention: implement & COLLECT DATA!
5
Critical Concept: Data Based Decision Making
A continuous, purposeful process of collecting, interpreting, presenting, and using data to inform actions that support positive educational outcomes. Data-based decision making considers the learner's progress within the contexts of instruction, curriculum, and environment.
6
Necessary components of Assessment
When a student is experiencing difficulty, several related & complementary types of assessment should be performed:
- Assessment of the Learner (Student)
- Assessment of Instruction (or Intervention)
- Assessment of the Curriculum
- Assessment of the Environment
[Diagram: four overlapping domains -- Learner, Instruction/Intervention, Curriculum, Environment]
7
Measuring -ICE Instruction, Curriculum, Environment
What questions might you have about the instruction/intervention or curriculum?
- Are the instructional/intervention methods research based?
- Is there implementation fidelity?
Is the classroom environment suitable to learning?
- Time on task
- Instructional time
- Academic engaged time
- Opportunities to respond & % correct responses
- Positive-to-negative ratio
- Student problem behavior
(See the observation sketch below for opportunities to respond, % correct, and the positive-to-negative ratio.)
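To make the environment indicators concrete, here is a minimal Python sketch of turning raw observation tallies into the three metrics named above. Everything in it is illustrative: the function name, the counts, and the 15-minute session length are all hypothetical, not from the slides.

```python
# Hypothetical sketch: summarizing one classroom observation session.
# Counts and session length below are illustrative only.

def environment_metrics(opportunities, correct, praise, corrections, minutes):
    """Compute common -ICE environment indicators from raw observation tallies."""
    otr_rate = opportunities / minutes            # opportunities to respond per minute
    pct_correct = 100 * correct / opportunities   # % correct responses
    pn_ratio = praise / max(corrections, 1)       # positive-to-negative ratio
    return otr_rate, pct_correct, pn_ratio

otr, pct, pn = environment_metrics(opportunities=45, correct=38,
                                   praise=12, corrections=4, minutes=15)
print(f"OTR/min: {otr:.1f}, % correct: {pct:.0f}%, P:N ratio: {pn:.1f}:1")
```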
8
Models for Data Based Decision Making
Problem Solving Models & Outcomes Driven Models
9
Supporting Social Competence & Academic Achievement
[Diagram: the four interlocking PBIS elements -- OUTCOMES (supporting social competence & academic achievement), DATA (supporting decision making), SYSTEMS (supporting staff behavior), and PRACTICES (supporting student behavior)]
10
Outcomes Driven Model
In an Outcomes Driven Model, the bottom line is achievement of essential educational or social outcomes. What are the desired outcomes? Are students attaining the necessary skills to be successful? If not, what changes can we make? Are the changes increasing student progress?
11
Research Based Frameworks Needed
How do we know what to measure & when?
- Reading: RTI & the Big 5 Ideas of Reading
- Math: RTI
- Behavior: PBIS, function of behavior, & ABA
12
Big 5 Ideas of Reading
- Phonemic Awareness
- Phonics (Alphabetic Principle)
- Oral Reading Fluency & Accuracy
- Vocabulary
- Reading Comprehension
For each skill, learning progresses through acquisition, fluency, and maintenance & generalization.
13
We must identify struggling students BEFORE they fall too far behind
Accurately identify those who are on track and those who will need more support, so that struggling students get help before they fall too far behind. Good, Simmons, & Smith (1998)
14
Response to Intervention
Academic Systems and Behavioral Systems (circa 1996)
[Diagram: the RTI triangle, mirrored for academic and behavioral systems]
- Intensive, Individual Interventions (1-5%): individual students; assessment-based; high intensity (academic) / intense, durable procedures (behavioral)
- Targeted Group Interventions (5-10%): some students (at-risk); high efficiency; rapid response
- Universal Interventions (80-90%): all students, all settings; preventive, proactive
15
Problem Solving Meeting Foundations
Team Initiated Problem Solving (TIPS) Model
The full TIPS model has two parts:
1. Implementation of Problem Solving Meeting Foundations
2. Use of the problem-solving process: Identify Problems → Develop Hypothesis → Discuss and Select Solutions → Develop and Implement Action Plan → Evaluate and Revise Action Plan, with Collect and Use Data at the center of every step
16
Purposes of Assessment
- Screening: “Which students need more support?”
- Progress Monitoring: “Is the student making adequate progress?”
- Diagnostic: “What and how do we need to teach this student?”
- Outcome: “Has our instruction been successful?”
17
Outcomes Driven Model
[Diagram: the Outcomes Driven Model as a cycle linking Screening, Diagnostic assessment, Progress Monitoring, and Outcome assessment, with outcome data feeding the next round of screening]
18
Effective Data Collection
19
Use the right tools for the right job
- Screening
- Progress Monitoring
- Diagnostic Assessment
- Outcomes
20
Use Good Tools: Technically Adequate
Reliability = consistency: the extent to which an assessment is consistent in finding the same results across conditions (across different administrators, across time, etc.). If the same measure were given several times to the same person, the scores would remain stable and not fluctuate randomly.
21
Use Good Tools: Technically Adequate
Validity = the extent to which an assessment measures what it is supposed to measure. First we need to know what we should be measuring! (Hence the need for research-based frameworks for measurement.) Students who do well on valid reading tests are proficient readers.
- Valid: assessing reading by having the student read a passage aloud and monitoring errors and rate
- Not valid: assessing reading by having a student match printed letters on a page (this assesses matching of visual figures)
[Example task: “Draw a line to match the letters,” with two columns of letters such as A, f, U, p, w]
22
Use Good Tools: A Concern for Self-Developed Assessments
Technical adequacy can be a problem with self-developed measures. This is a challenge for the Professional Learning Team model, which often relies on teacher-developed assessments to measure important student outcomes and guide decision making.
23
Low Inference
Students are tested using materials that are directly related to important instructional outcomes.
- Low inference: making judgments about a child's reading skills based on listening to them read out loud
- High inference: making judgments about a child's emotional state based on pictures they've drawn
24
Use the Tools Correctly: Standardized Administration
Administered, scored, and interpreted in the same way:
- Directions given to students are consistent
- Student responses are scored in the same way
- Every student has the exact same opportunity on the assessment
25
Efficiency
Time is precious in classrooms, so efficiency is an important consideration. When evaluating the efficiency of an assessment tool, we must consider the time and personnel required to design, administer, and score it:

Tool      | Design                                      | Administration & Scoring
PNRTs     | Already designed                            | Time intensive (1-2 hours/child)
CBA & CBM | Some already designed, some teacher-created | Quick and easy (1-10 min/child)
26
Screening
27
1. Compare ALL students to the same grade-level standard
ALL students are assessed against the grade-level standard, regardless of instructional level. “If you don't know where you are going, you will wind up somewhere else.” ~ Yogi Berra
28
2. Be efficient, standardized, reliable, and valid
- Robust indicator of academic health
- Brief and easy to administer
- Can be administered frequently
- Must have multiple, equivalent forms (if the metric isn't the same, the data are meaningless)
- Must be sensitive to growth
29
3. Accurately identify those who are on track and those who will need more support
We must identify struggling students BEFORE they fall too far behind. Good, Simmons, & Smith (1998)
30
4. Evaluate the quality of your schoolwide instructional system
Are 80% of your students proficient? Are 80% of students reaching benchmarks and “on track” for the next goal? If not, the core curriculum needs to be addressed. (A sketch of this check follows below.)
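A hedged sketch of this system-health check in Python. The benchmark cut score, student names, and scores are all invented for illustration; a real check would pull scores from your screening tool.

```python
# Hedged sketch: the 80% system-health rule applied to screening scores.
# The cut score and all scores below are illustrative only.
BENCHMARK = 52  # hypothetical grade-level cut score

scores = {"Ava": 61, "Ben": 44, "Cam": 58, "Dee": 39, "Eli": 70,
          "Fay": 55, "Gus": 48, "Hal": 66, "Ivy": 53, "Jo": 57}

on_track = [name for name, s in scores.items() if s >= BENCHMARK]
needs_support = [name for name in scores if name not in on_track]

pct = 100 * len(on_track) / len(scores)
print(f"{pct:.0f}% at benchmark; needs support: {needs_support}")
if pct < 80:
    print("Below 80% proficient -- examine the core (Tier 1) instruction.")
```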
31
What are Screening Tools?
Screening tools:
- DIBELS, Oral Reading Fluency, Maze, EasyCBM
- CBM Math Computation, CBM Writing (Story Starters), CBM Algebra, CBM Early Numeracy
Not screening tools:
- Quick Phonics Screener, QRI-IV, DRA2, Running Records
- Report cards, meeting OAKS standards, core curriculum weekly tests on skills just learned
32
[Figure: one page of a 3-page CBM in Math Concepts and Applications (24 total blanks)]
33
Previous Years Discipline data
Who needs to be on our radar from Day 1?
- Who had FBAs/BSPs last year?
- Which students moved on? Which are returning this year?
- Can we get data for our incoming class & new students?
- What decision rule will we use?
34
Progress Monitoring
35
Progress Monitoring Tools
- Brief & easy
- Sensitive to growth
- Equivalent forms
- Frequent
36
Where are we? What is our goal? What course should we follow? How are we doing?
Notes: In the Northwest, boating is an important recreation and livelihood. Whether you are on a whale-watching tour or fishing, sometimes finding your way back to your port is easy: the sky is clear, the ocean blue, and you can clearly see your home port and the course you should follow to reach a safe harbor. But sometimes the fog rolls in, and the journey to our goal becomes much more difficult and challenging. It is hard to tell where we are, where we want to be, what course to follow, and whether we are getting closer to safety or need to make a course adjustment. So we turn on the GPS and ask where we are. Of course, knowing where we are is only of limited help. The great philosopher Buckaroo Banzai once commented, “No matter where you go, there you are!” We also need to know where the port, our safe harbor, is, and what course to follow to get there. The GPS can tell us to point the boat at 117 degrees and proceed for 20 minutes at 10 knots to reach our goal. Now we have a good plan for reaching our safe harbor while avoiding the rocks and cliffs on either side. But sometimes our plans go awry, so we also need to check our progress in time to make course corrections. If we are off course, the time to modify our plan is early, while we can still reach the safe harbor and not end up on the rocks.
[Diagram: nautical chart showing “We are Here,” “Our Goal,” the desired course, and the actual course]
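In progress monitoring terms, the “desired course” is an aimline from the student's baseline score to the goal score. A minimal sketch, assuming a simple linear course; the baseline, goal, and timeline are illustrative numbers, not recommended targets.

```python
# Hedged sketch of the "desired course": an aimline from a baseline score
# to a goal score over a set number of weeks. Numbers are illustrative.

def aimline(baseline, goal, weeks):
    """Expected score for each week if the student stays on course."""
    weekly_growth = (goal - baseline) / weeks
    return [baseline + weekly_growth * w for w in range(weeks + 1)]

# e.g., 28 correct words per minute now, aiming for 52 in 12 weeks
expected = aimline(baseline=28, goal=52, weeks=12)
print([round(x, 1) for x in expected])  # the course to the "safe harbor"
```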
37
Progress Monitoring: The GPS for Educators!
38
Purpose of Progress Monitoring
Answers the questions: Are the children learning? How can we tell? Are they making enough progress? Can we remove some of our supports? Do we need to change or intensify our supports?
39
How often do you progress monitor students?
Determined by district decision rules and level of need. Best-practice recommendations: Intensive: 1-2x per week; Strategic: 1x or 2x per month.
40
How do we know if a student is making adequate progress?
[Graph: correct words per minute plotted over time against an aimline, evaluated using decision rules]
41
Questions to Consider
- How many data points below the line before you make a change in instruction/intervention?
- What do you change? Group size? Time? Curriculum? Other factors?
(A decision-rule sketch follows below.)
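One common convention, sketched below with illustrative numbers, is to flag an instructional change when the most recent N points all fall below the aimline. Districts set their own N (3-4 consecutive points is frequently cited), and the aimline values here are hypothetical.

```python
# Hedged sketch of a common decision rule: if the n most recent scores all
# fall below the aimline, consider changing the intervention. n varies by
# district decision rules; 3-4 consecutive points is a common convention.

def needs_change(scores, aimline_values, n=4):
    """True if the last n observed scores are all below the aimline."""
    recent = list(zip(scores, aimline_values))[-n:]
    return len(recent) == n and all(obs < aim for obs, aim in recent)

observed = [27, 31, 35, 30, 25, 32, 34, 38]   # weekly scores
aim      = [28, 30, 32, 34, 36, 38, 40, 42]   # illustrative aimline values
print(needs_change(observed, aim))  # True -> discuss an instructional change
```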
42
Progress Monitoring: Phonics for Reading
[Graph: weekly progress monitoring scores during the Phonics for Reading intervention -- 27, 31, 35, 30, 25, 32, 34, 38]
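A small sketch estimating the rate of improvement (ROI) for the scores above as a least-squares slope. The words-per-week interpretation assumes one probe per week, which is an assumption, not stated on the slide.

```python
# Illustrative sketch: rate of improvement (ROI) as the least-squares
# slope of the weekly scores shown above (assumes one probe per week).

scores = [27, 31, 35, 30, 25, 32, 34, 38]
weeks = range(len(scores))

n = len(scores)
mean_w = sum(weeks) / n
mean_s = sum(scores) / n
slope = (sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
         / sum((w - mean_w) ** 2 for w in weeks))
print(f"ROI ≈ {slope:.2f} words per week")  # ≈ 0.93 for these scores
```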
43
We do not use progress monitoring data to…
…select specific short-term instructional goals
…take a lot of time away from instruction
…diagnose educational problems
…assign grades to students
…evaluate teachers
44
What are Progress Monitoring Tools?
Progress monitoring tools:
- DIBELS, Oral Reading Fluency, Maze, EasyCBM
- CBM Math Computation, CBM Writing (Story Starters), CBM Algebra, CBM Early Numeracy
Not progress monitoring tools:
- Quick Phonics Screener, QRI-IV, DRA2, Running Records
- Report cards, meeting OAKS standards, core curriculum weekly tests on skills just learned
45
Progress Monitoring data tell us WHEN a change is needed
Progress Monitoring data do not always tell us WHAT change is needed
46
Point Card
47
Look at the individual student graph for targeted student(s)
48
Diagnostic Assessment
Answers the question: Why? WARNING! Critical thinking skills may be required.
49
Collecting Diagnostic Data
The major purpose for administering diagnostic tests is to provide information that is useful in planning more effective instruction. Diagnostic tests should only be given when there is a clear expectation that they will provide new information about a child’s difficulties learning to read that can be used to provide more focused, or more powerful instruction.
50
Diagnostic Assessment Questions
- “Why is the student not performing at the expected level?” (defining the problem)
- “What is the student's instructional need?” (designing an intervention)
51
Digging Deeper
In order to be “diagnostic”:
- We need to know the sequence of skill development
- Our content knowledge may need further development
52
Enabling Skills
Enabling skills are the prerequisite skills for demonstrating proficient performance on larger assessment measures. They represent the sub-skills of higher-order performance. Deficiencies in enabling skills will often result in lower performance on assessments.
53
Phonemic Awareness Developmental Continuum
Vital for the diagnostic process! From easiest to hardest:
1. Word comparison
2. Rhyming
3. Sentence segmentation
4. Syllable segmentation and blending
5. Onset-rime blending and segmentation
6. Blending and segmenting individual phonemes
7. Phoneme deletion and manipulation
If difficulty is detected at one level, check the next-easier skill first (see the sketch below).
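A minimal sketch of the “if difficulty detected here, check there” rule as a lookup on the continuum. The skill names come from the slide; the function itself is hypothetical.

```python
# Minimal sketch of "if difficulty detected here, check there": step back
# to the next-easier skill on the continuum. Skill names follow the slide.

CONTINUUM = [
    "word comparison",
    "rhyming",
    "sentence segmentation",
    "syllable segmentation and blending",
    "onset-rime blending and segmentation",
    "blending and segmenting individual phonemes",
    "phoneme deletion and manipulation",
]

def next_easier(skill):
    """Return the skill to probe when a student struggles with `skill`."""
    i = CONTINUUM.index(skill)
    return CONTINUUM[i - 1] if i > 0 else None  # None = already at easiest

print(next_easier("blending and segmenting individual phonemes"))
# -> "onset-rime blending and segmentation"
```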
54
Reading: Diagnostic assessments may include:
- In-curriculum assessments: Quick Phonics Screener, weekly assessment data, unit and benchmark assessment data
- Survey Level Assessments
- Error Analysis or Running Records
- Any formal or informal assessment that answers the question: Why is the student having a problem?
56
Survey Level Assessment
Start at the expected level and move backward until specific skill deficits are identified, then match interventions to those deficits. Example: a 2nd-grade math assignment -- a double-digit math facts sheet (+, -, ×, ÷) -- that the student cannot do. Progress backward in the assessment to see where the student can be successful: cannot do basic division or multiplication facts, or double-digit subtraction or addition; can do single-digit addition to +5 successfully. (A backward-search sketch follows below.)
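A hedged sketch of survey level assessment as a backward search through a skill sequence. The skill list and results mirror the slide's example, but the mastery data and function are illustrative, not a real assessment protocol.

```python
# Hedged sketch: survey level assessment as a backward search. Step down
# a skill sequence until the student meets a mastery criterion. The skill
# list and the can_do results are illustrative only.

skills_hard_to_easy = [
    "basic division facts",
    "basic multiplication facts",
    "double-digit subtraction",
    "double-digit addition",
    "single-digit addition to +5",
]

# Illustrative results mirroring the slide's example
can_do = {"single-digit addition to +5"}

def survey_level(skills):
    """Return the first (highest) skill the student performs successfully."""
    for skill in skills:                 # expected level first, then easier
        if skill in can_do:
            return skill                 # instruction starts just above here
    return None

print(survey_level(skills_hard_to_easy))  # -> "single-digit addition to +5"
```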
57
Error Analysis
1. Select a 250-word passage on which you estimate the student will be 80-85% accurate.
2. Record the student's errors on your copy of the reading probe.
3. Use at least 25 errors for students in grade 1 and at least 50 errors for students in grade 2 and above.
4. Use an error analysis sheet to conduct the error analysis.
(A tallying sketch follows below.)
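A hedged sketch of what an error analysis sheet computes: tally each recorded error by category and report percentages. The categories and sample errors are invented for illustration, not taken from any specific published sheet.

```python
# Hedged sketch of an error analysis sheet as a tally: categorize each
# recorded error and report category percentages. Categories and the
# sample errors below are illustrative only.
from collections import Counter

# (word_read_incorrectly, error_category) pairs from a scored probe
errors = [
    ("ship", "medial vowel"), ("that", "substitution"),
    ("jumped", "ending/suffix"), ("strap", "initial blend"),
    ("rain", "medial vowel"), ("walked", "ending/suffix"),
    ("ship", "medial vowel"), ("from", "substitution"),
]

tally = Counter(category for _, category in errors)
for category, count in tally.most_common():
    print(f"{category}: {count} ({100 * count / len(errors):.0f}%)")
# The dominant category suggests where to focus instruction.
```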
58
Error Analysis
[Image: sample error analysis sheet]
59
We do not use diagnostic data…
…for all students
…to monitor progress towards a long-term goal
…to compare students to each other
60
Outcome: Was the goal reached?
Often the same assessment as your screener. Can be CBM, state testing (OAKS), or other high-stakes assessments. Should be linked to district standards and benchmarks.