Strand G, Session 2 Boots on the Ground
Perspectives from District and School Leaders Implementing Intensive Intervention
Presenters:
Teri Marx, PhD, NCII Technical Assistance Liaison
Derrick Bushon, Director of Student Services, Swartz Creek School District, MI
Mary Anne Wolfmeyer, School Psychologist, Columbia Public Schools, MO
Melinda Gross
Nicole Bucka, NCII Coach, Northern Rhode Island Collaborative
Strand Session Overview
Overview of the National Center on Intensive Intervention
Introduction to Data-Based Individualization (DBI)
District and School Level Implementation
Learning Objectives
Participants will be able to…
Describe how lessons learned from “real-world” implementation of DBI may impact current practice.
Identify practical solutions and resources to support students who require intensive intervention.
National Center on Intensive Intervention (NCII)
Intensive Intervention through Data-Based Individualization
NCII’s Mission
To build district and school capacity to support implementation of data-based individualization in reading, mathematics, and behavior for students with severe and persistent learning and behavioral needs.
Multi-tiered Systems of Support (MTSS)
Policies and practices within schools should ensure that students with disabilities have access to intensive intervention. When students with disabilities do not respond to Tier 1 and Tier 2 supports, they must have access to intensive intervention if they require it. Students may receive services at different levels of support in different areas. For example, a student with a learning disability in reading may need intensive intervention in reading while needing only core instruction in mathematics.
Intervention Levels/Tiers
Primary (Tier 1): comprehensive, research-based curriculum; delivered class-wide (with some small-group instruction); progress monitored once per term; serves all students.
Secondary (Tier 2): standardized, targeted small-group instruction; groups of 3–7 students; progress monitored at least once per month; serves at-risk students.
Intensive (Tier 3): individualized instruction based on student data; groups of no more than 3 students; progress monitored weekly; serves students with significant and persistent learning needs.
This academic example is the one people are probably most familiar with. Review the table and highlight the differences between the tiers: an increasingly focused and tailored approach, decreasing group size, increasing frequency of progress monitoring, and increasing student need.
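As one illustration of how a team might keep these tier parameters handy, here is a minimal Python sketch. It is not an NCII tool; the TIER_PARAMETERS dictionary, the group_size_ok helper, and the assumption that a term is roughly three months are hypothetical choices made for the example.

```python
# Illustrative sketch only: the tier parameters above expressed as plain data,
# so a planned intervention group can be checked against the recommended
# group size and progress-monitoring frequency for its tier.

TIER_PARAMETERS = {
    "primary": {
        "approach": "Comprehensive, research-based curriculum",
        "max_group_size": None,        # class-wide
        "checks_per_month": 1 / 3,     # about once per term (assuming a ~3-month term)
        "population": "All students",
    },
    "secondary": {
        "approach": "Standardized, targeted small-group instruction",
        "max_group_size": 7,
        "checks_per_month": 1,         # at least once per month
        "population": "At-risk students",
    },
    "intensive": {
        "approach": "Individualized, based on student data",
        "max_group_size": 3,
        "checks_per_month": 4,         # weekly
        "population": "Students with significant and persistent learning needs",
    },
}


def group_size_ok(tier, group_size):
    """Return True if a planned group size fits the recommendation for the tier."""
    limit = TIER_PARAMETERS[tier]["max_group_size"]
    return limit is None or group_size <= limit


print(group_size_ok("intensive", 5))  # False: intensive groups should have no more than 3 students
```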
Who needs intensive intervention?
Students with disabilities who are not making adequate progress in their current instructional program
Students who present with very low academic achievement and/or high-intensity or high-frequency behavior problems (typically those with disabilities)
Students in a tiered intervention program who have not responded to secondary intervention programs delivered with fidelity
Note for second bullet: The decision to move a student directly to an intensive intervention should be made on an individual and case-by-case basis. In most cases, data should be collected over time to help demonstrate that the student’s low achievement/behavior challenges are both significant AND persistent.
Data-Based Individualization (DBI)
Data-Based Individualization (DBI) is a systematic method for using data to determine when and how to provide more intensive intervention.
DBI has its origins in data-based program modification and experimental teaching, first developed at the University of Minnesota (Deno & Mirkin, 1977) and expanded upon by others (Capizzi & Fuchs, 2005; Fuchs, Deno, & Mirkin, 1984; Fuchs, Fuchs, & Hamlett, 1989).
DBI is a process, not a single intervention program or strategy. It is not a one-time fix but an ongoing process comprising intervention and assessment adjusted over time.
NCII’s Approach
DBI: Integrating data-based decision making across academics and social behavior
Integrated Relationship
Integrating intensive behavioral intervention into tiered systems is complicated work. For students with both academic and behavioral needs, the two are most likely connected: students who lack proficiency in academic skills may demonstrate avoidance behaviors as a mechanism to avoid assigned tasks, creating a cycle of skill deficit, avoidance behavior, and removal from the task.
Academic & Behavior Intervention
Not all students respond to standardized, evidence-based interventions. Analysis of student response data from controlled studies suggests that approximately 3–5 percent of students, or about 20 percent of at-risk students, do not respond to standard, evidence-based intervention programs, even though those interventions are generally effective for students demonstrating difficulty (Fuchs et al., 2012; Wanzek & Vaughn, 2009; Conduct Problems Prevention Research Group, 2002). These students may demonstrate both related academic and behavioral needs.
Considerations for DBI Implementation
NCII provides intensive technical assistance to several districts to support DBI implementation. Through this work, NCII has identified several key elements for implementation. Within each area, some aspects are critical or “nonnegotiable”; we believe these pieces must be in place for successful implementation. Other aspects are “negotiable” and more flexible: some will vary with school context, and others reflect quantifiable aspects of implementation that are expected to increase over time (e.g., the number of student plans, and the grades and content areas to which DBI is applied).
Q&A with District and School Leaders
Practical Examples of DBI Implementation
“Pre” NCII
What processes did you have in place to identify students who needed additional intervention?
What was your approach to intensifying supports for learners?
What Did You Discover About Your Problem-Solving Processes?
Briefly review Handouts 4 (Agenda) and 5 (Note-Taking Template). We have provided our coaches with a set of tools that schools can use and modify as needed. These forms reflect important components of intervention planning, but these exact forms are not required; if you already have a system in place that is working well for you, that is great!
What Did You Learn About Fidelity?
Fidelity refers to the degree to which a program is implemented the way the program developer intended.
Fidelity = consistency and accuracy. Fidelity = integrity.
Fidelity refers to how closely prescribed procedures are followed and, in the context of schools, the degree to which teachers implement programs the way the program developers intended. It also relates to the quality of the implementation: teachers implement the intervention with consistency and accuracy and adhere to the instructional plan with integrity.
Note: Throughout discussions of fidelity, it is important to ensure that teachers believe they work in an open, non-threatening environment that values their skills and expertise and where they can learn from their colleagues. With a system of open communication and productive feedback, fidelity checks of classroom techniques and of the essential components of multi-tiered systems of support can be a useful and supportive way for teachers to collaborate and become a stronger teaching network. This may be a useful discussion point for some groups.
(Gersten et al., 2005; Mellard & Johnson, 2007; Sanetti & Kratochwill, 2009)
Five Elements of Fidelity
Adherence: How well do we stick to the plan, curriculum, or assessment?
Exposure/Duration: How often does a student receive an intervention? How long does an intervention last?
Quality of Delivery: How well is the intervention, assessment, or instruction delivered? Do you use good teaching practices?
Program Specificity: How well is the intervention defined and differentiated from other interventions?
Student Engagement: How engaged and involved are the students in this intervention or activity?
This framework provides one way to think about fidelity. Schools should have procedures in place to monitor the fidelity of their implementation of secondary interventions. These procedures do not have to be formal, but it is important to consider whether programs are being implemented the way they are intended to be delivered. In the midst of all the responsibilities educators carry, small checks can make a big difference in keeping services for students on track.
Adherence focuses on how well we stick to the plan, curriculum, or assessment, implementing it as intended based on research. For a secondary intervention, this may mean how well teachers implement all pieces of an intervention in the way they were intended. Teachers do not necessarily need to follow a script word for word, but covering the specified content with appropriate pacing, relevant language, and the intended techniques is important.
Exposure/duration refers to how often a student receives an intervention and how long it lasts, and whether that matches the recommendation of the curriculum’s author or publisher. Developers and researchers typically specify the exposure and duration needed for the intervention to be effective for most students. If the developer calls for the intervention 3 days a week for 45 minutes each day, is the student receiving this dosage?
Quality of delivery refers to how well the intervention, assessment, or instruction is delivered. For example, do you use good teaching practices? Quality delivery also means that teachers are engaged in what they are teaching and animated in their delivery, not simply reading from a script. Providing teachers with constructive feedback on their instructional delivery is one way to improve the quality of delivery for secondary interventions.
Program specificity refers to how well the intervention is defined and how different it is from other interventions. Clearly defined interventions and assessments allow teachers to adhere to the program more easily. Is the intervention a good match for the student’s needs, or does every low reader get the same intervention?
Student engagement, like quality of delivery, is critical: how engaged and involved are the students in the intervention or activity? Following a prescribed program alone is often not enough. Consider whether competing behaviors make it difficult for students to take part in the intervention as designed. During the delivery of secondary interventions, teachers may need to use behavior management strategies, such as providing choice, adding elements of competition, and offering frequent opportunities to respond.
(Dane & Schneider, 1998; Gresham et al., 1993; O’Donnell, 2008)
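To make the idea of a quick fidelity check concrete, here is a minimal, hypothetical Python sketch. The five element names mirror the list above, but the 0–2 rubric, the 80 percent review threshold, and the function name fidelity_percentage are illustrative assumptions, not an NCII scoring procedure.

```python
# Hypothetical sketch: score a quick fidelity self-check across the five
# elements above. The 0-2 rubric and the 80% threshold are illustrative
# assumptions, not NCII requirements.

FIVE_ELEMENTS = [
    "adherence",
    "exposure_duration",
    "quality_of_delivery",
    "program_specificity",
    "student_engagement",
]


def fidelity_percentage(ratings):
    """Ratings use a 0 (not in place) to 2 (fully in place) scale per element."""
    earned = sum(ratings[element] for element in FIVE_ELEMENTS)
    possible = 2 * len(FIVE_ELEMENTS)
    return 100 * earned / possible


observation = {
    "adherence": 2,
    "exposure_duration": 1,      # only two of three scheduled sessions delivered
    "quality_of_delivery": 2,
    "program_specificity": 2,
    "student_engagement": 0,     # competing behaviors limited participation
}

score = fidelity_percentage(observation)
print(f"Fidelity: {score:.0f}%")
if score < 80:
    print("Flag for a supportive follow-up conversation with the teacher.")
```

Keeping the check this lightweight fits the open, non-threatening, collaborative tone described above.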
How Did You Begin to Use Data to Make Decisions?
Use a valid, reliable progress monitoring tool.
Graph the data.
Collect data at regular, frequent intervals within intensive intervention.
DBI requires valid, reliable data on student performance to determine when and how an intervention should be changed. For these data to be useful, they need to be regularly collected and readily available to those making decisions. Progress monitoring tools must be sensitive to student growth, and data should be graphed. Clear decision rules should provide guidance on when an intervention needs to be changed.
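As one concrete illustration of what a clear decision rule can look like, here is a minimal Python sketch of a common "points below the goal line" rule. The four-point window, the linear goal line, and the function names goal_line and decision are assumptions made for this example, not a rule prescribed by this presentation.

```python
# Illustrative sketch of one common progress monitoring decision rule:
# if the four most recent scores all fall below the goal line, change the
# intervention; if they all fall above it, consider raising the goal.

def goal_line(baseline, goal, total_weeks, week):
    """Expected score at a given week if growth from baseline to goal is linear."""
    return baseline + (goal - baseline) * week / total_weeks


def decision(scores, baseline, goal, total_weeks):
    recent = scores[-4:]
    if len(recent) < 4:
        return "Keep collecting data"
    first_week = len(scores) - len(recent) + 1
    expected = [goal_line(baseline, goal, total_weeks, first_week + i)
                for i in range(len(recent))]
    if all(s < e for s, e in zip(recent, expected)):
        return "Change the intervention"
    if all(s > e for s, e in zip(recent, expected)):
        return "Consider raising the goal"
    return "Continue the intervention and keep monitoring"


# Weekly scores for a hypothetical student (e.g., words read correctly per minute).
weekly_scores = [42, 44, 43, 45, 44, 46]
print(decision(weekly_scores, baseline=40, goal=70, total_weeks=20))
# Prints "Change the intervention": growth is positive but well below the goal line.
```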
What can a graph tell you?
The first step in using progress monitoring data to inform instructional decisions is to examine the graph. Reviewing graphed data not only tells us how a student is responding to an intervention; it may also reveal patterns that indicate specific considerations for future data collection or intervention changes that may be necessary. For more information about reviewing graphed progress monitoring data and using progress monitoring data to make intervention decisions, view the NCII webinar “Data Rich, Information Poor? Making Sense of Progress Monitoring Data to Guide Intervention Decisions.”
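Two of the patterns a graph makes visible, the trend (rate of growth) and the variability (bounce) in the scores, can also be summarized numerically. The sketch below is an illustrative Python example rather than an NCII procedure; the function name trend_and_variability and the use of an ordinary least squares slope with mean absolute residuals are assumptions made for this example.

```python
# Illustrative sketch: summarize a progress monitoring graph numerically by
# its trend (ordinary least squares slope, i.e., average growth per week)
# and its variability (how far scores bounce around that trend line).

from statistics import mean


def trend_and_variability(scores):
    weeks = range(1, len(scores) + 1)
    x_bar, y_bar = mean(weeks), mean(scores)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(weeks, scores))
             / sum((x - x_bar) ** 2 for x in weeks))
    intercept = y_bar - slope * x_bar
    residuals = [y - (intercept + slope * x) for x, y in zip(weeks, scores)]
    bounce = mean(abs(r) for r in residuals)  # average distance from the trend line
    return slope, bounce


weekly_scores = [42, 44, 43, 45, 44, 46]
slope, bounce = trend_and_variability(weekly_scores)
print(f"Growth of about {slope:.2f} points per week; typical bounce of {bounce:.2f} points")
```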
Resources for Identifying Assessment Tools
NCII website: Academic Progress Monitoring Tools Chart and Behavioral Progress Monitoring Tools Chart
The NCII DBI Training Series has modules on both academic and behavioral progress monitoring.
What About Buy-In? Strategies for success
Establish a vision and goals for DBI.
Promote staff buy-in by making DBI relevant, shaping culture and expectations, and involving staff in decision making.
Provide supporting structures and resources, including assessments, interventions, professional development, and staff time.
How Did You Move from “Admiring a Problem” to Identifying Solutions?
Key considerations:
Accurate student data
Measurable goal(s) for the intervention
Timeline for executing and revisiting the intervention plan
Follow meeting routines
Ask clarifying questions
Prioritize intensification adaptations
Hypothesize to Identify Adaptations
Categories to consider: quantitative, instructional, environmental, setting events, triggering events, and function.
When making a plan for what you are doing with a child, we want to be systematic. We have to narrow the focus rather than do everything at once: brainstorm possible adaptations within these categories, then rank them.
Resources for Identifying Interventions
NCII website: Academic Intervention Chart and Behavioral Intervention Chart
The NCII DBI Training Series also has modules on both academic and behavioral progress monitoring.
What If Evidence-Based Interventions Aren’t Available?
NCII recommends that schools use evidence-based intervention programs when available and consider augmenting current offerings if feasible. If evidence-based programs are not available, also consider:
Remediation materials that came with your core program materials
Expert recommendations
Standards-aligned materials
Most importantly, always collect data to determine whether most students are profiting from the instruction you are providing. If the data reveal that most students are not profiting, refine your instruction until most students are.
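To show what "collect data to determine whether most students are profiting" could look like in practice, here is a minimal, hypothetical Python sketch. The growth-rate data, the 1.0-unit-per-week target, the 80 percent criterion, and the function name percent_profiting are all assumptions made for the example.

```python
# Hypothetical sketch: compare each student's weekly growth rate with a target
# rate and report how much of the group is profiting. The target rate and the
# 80% criterion are illustrative assumptions.

def percent_profiting(growth_rates, target_rate=1.0):
    """Percentage of students whose growth rate meets or exceeds the target."""
    profiting = sum(1 for rate in growth_rates if rate >= target_rate)
    return 100 * profiting / len(growth_rates)


# Weekly growth rates (e.g., words read correctly per minute gained per week)
# for a hypothetical intervention group.
group_growth = [1.2, 0.9, 1.5, 1.1, 0.4, 1.3]
share = percent_profiting(group_growth)
print(f"{share:.0f}% of the group is meeting the growth target")
if share < 80:
    print("Refine the instruction and recheck after several more weeks of data.")
```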
Final Thoughts
What Are Some Key Take-Aways?
DBI is an ongoing team process that comprises assessment, intervention, evaluation, and adjustment to maximize student outcomes.
Intensive interventions will not look the same for all students; they are individualized based on each student’s unique needs.
Connect to NCII
Sign up on our website, intensiveintervention.org, to receive our newsletter and announcements.
Follow us on YouTube (channel: National Center on Intensive Intervention) and Twitter.
Disclaimer
This presentation was produced under the U.S. Department of Education, Office of Special Education Programs, Award No. H326Q. Celia Rosenquist serves as the project officer. The views expressed herein do not necessarily represent the positions or policies of the U.S. Department of Education. No official endorsement by the U.S. Department of Education of any product, commodity, service, or enterprise mentioned in this presentation is intended or should be inferred.
References
Aud, S., Hussar, W., Johnson, F., Kena, G., Roth, E., Manning, et al. (2012). The condition of education 2012. Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.
Capizzi, A. M., & Fuchs, L. S. (2005). Effects of curriculum-based measurement with and without diagnostic feedback on teacher planning. Remedial and Special Education, 26(3), 159–174.
Cortiella, C. (2011). The state of learning disabilities. New York: National Center for Learning Disabilities.
Deno, S. L., & Mirkin, P. K. (1977). Data-based program modification: A manual. Minneapolis, MN: Leadership Training Institute for Special Education.
Fuchs, L. S., Deno, S. L., & Mirkin, P. K. (1984). The effects of curriculum-based measurement evaluation on pedagogy, student achievement, and student awareness of learning. American Educational Research Journal, 21(2), 449–460.
Fuchs, L. S., Fuchs, D., & Hamlett, C. L. (1989). Effects of instrumental use of curriculum-based measurement to enhance instructional programs. Remedial and Special Education, 10, 43–52.
Planty, M., Hussar, W., Snyder, T., Provasnik, S., Kena, G., Dinkes, R., et al. (2008). The condition of education 2008. Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.
Strand G, Session 2 Contact:
Teri Marx, PhD
1000 Thomas Jefferson Street NW, Washington, DC