
1 Matching Efforts to Outcomes: Are you really sure? Keynote Address for Nonpublic Special Education Programs Annual Conference Hank Bohanon, Ph.D. hbohano@luc.edu http://www.luc.edu/cseit Center for School Evaluation, Intervention and Training (CSEIT), Loyola University, Chicago

2 Thank yous: Sarah Sebert and Paul Nijensohn (ISBE); Barbara Simms (ISBE), Illinois State Technical Assistance Center; Kelly Raucher (ISBE), SEL; Kathy Cox (ISBE), Illinois ASPIRE; Dean David Prasse, Loyola University of Chicago

3 Additional Thank yous: Dr. Pamela Fenning, CSEIT; Sara Golomb, CSEIT; Agnieszka Kielian, CSEIT; Lisa Lewis, CSEIT; Dr. Diane Morrison, CSEIT; Audrey Shulruff, CSEIT

4 Goal and Objective Increase participants' awareness of how program data, decision making, and evaluation efforts can be integrated. Outline: data process, data integration, examples.

5 Data Process

6 Process What are your questions? What are your data sources? What reports do you need? Who needs access? What resources do you have?

7 Questions Align with targets/strategic plan. Align with state initiatives. Training objectives. Evaluation questions: 1. If you train, do people implement? 2. Do people implement with fidelity? 3. Do the interventions sustain? 4. What is the impact on your constituents?

8 Data: Process – instruments (e.g., fidelity tools), what is your process? Outcomes – performance data: curriculum-based measures, classroom checklists, office discipline referrals, transition information (e.g., post-secondary).

9 Instrument development Item development (align with questions) – Systems, Practice, Data, Outcomes; Schoolwide, Classroom, Non-Classroom, Individual. Judgmental validity (expert judgment). Concurrent validity (with a reliable tool).
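As a rough illustration of the concurrent validity step (not part of the original slides), the sketch below correlates total scores on a new fidelity instrument with totals from an established, reliable tool. The file and column names are hypothetical.

```python
# Illustrative sketch only: checking concurrent validity by correlating
# totals on the new instrument with totals from an established tool.
# File and column names are assumptions, not from the presentation.
import pandas as pd
from scipy.stats import pearsonr

scores = pd.read_csv("pilot_scores.csv")  # one row per school or program

r, p = pearsonr(scores["new_instrument_total"], scores["benchmark_total"])
print(f"Concurrent validity: r = {r:.2f} (p = {p:.3f})")
```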

10 Instrument development Pilot instrument Item analysis (factor analysis) Review and update Balance input with process
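For the pilot and item-analysis step, a lighter-weight companion to a full factor analysis is a corrected item-total correlation plus an internal consistency estimate. The sketch below is illustrative only; the item column names are assumptions.

```python
# Illustrative sketch only: simple pilot item analysis with corrected
# item-total correlations and Cronbach's alpha. Column names are assumptions.
import pandas as pd

items = pd.read_csv("pilot_scores.csv").filter(like="item_")  # item_1, item_2, ...
total = items.sum(axis=1)

for col in items.columns:
    # Correlate each item with the total of the remaining items
    r = items[col].corr(total - items[col])
    print(f"{col}: corrected item-total r = {r:.2f}")

k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var().sum() / total.var())
print(f"Cronbach's alpha = {alpha:.2f}")
```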

11 Access What types of decisions need to be made based on the data, and who makes them? – Statewide personnel – Administrators – Direct service providers

12 Reports In what format do you need the data? – Graphs: specific instruments, combinations (process and outcomes) – Summaries & reports – Output files (flat/rectangular format): common identifiers, clean your data (standard operating procedure, SOP)
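One way to picture the flat-file output with common identifiers and SOP-style cleaning is the sketch below. It is not from the presentation; the file and column names are assumptions.

```python
# Illustrative sketch only: one flat, rectangular output file combining
# process (fidelity) and outcome (referral) data on a common identifier,
# with light cleaning per an SOP. Names are assumptions.
import pandas as pd

fidelity = pd.read_csv("fidelity_checklists.csv")   # one row per school per year
referrals = pd.read_csv("office_referrals.csv")     # one row per school per year

for df in (fidelity, referrals):
    df["school_id"] = df["school_id"].astype(str).str.strip()   # consistent IDs
    df.drop_duplicates(subset=["school_id", "year"], inplace=True)

flat = fidelity.merge(referrals, on=["school_id", "year"], how="outer")
flat.to_csv("combined_flat_file.csv", index=False)
```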

13 Resources The more you can draw upon existing resources, the less time the data process will take.

14 Data Integration

15 Technology Data decision making. Data inputs – data that already exist (SOP for cleaning data); additional data from direct assessments (SOP for reliability). Data integration – requirements, common identifier; National Educational Technology Standards (NETS); a universal data system for educational data.
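A minimal sketch of the common-identifier requirement before integrating new direct-assessment data with an existing system is shown below; the file and column names are hypothetical.

```python
# Illustrative sketch only: before folding newly collected direct-assessment
# data into an existing system, confirm every record carries a common
# identifier the master file recognizes. Names are assumptions.
import pandas as pd

roster = pd.read_csv("master_roster.csv")             # export from existing system
assessments = pd.read_csv("direct_assessments.csv")   # newly entered data

known_ids = set(roster["student_id"].astype(str).str.strip())
assessments["student_id"] = assessments["student_id"].astype(str).str.strip()

unmatched = assessments[~assessments["student_id"].isin(known_ids)]
if len(unmatched):
    print(f"{len(unmatched)} records lack a matching identifier; resolve before integrating.")
else:
    merged = assessments.merge(roster, on="student_id", how="left")
    merged.to_csv("integrated_data.csv", index=False)
```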

16 Tips Data cleaning Reliability of data entry
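One common way to check reliability of data entry is double entry of a sample of records, then a field-by-field comparison. The sketch below is illustrative only; the files, columns, and 95% threshold are assumptions.

```python
# Illustrative sketch only: a double-entry reliability check in which a
# sample of records is keyed twice and compared field by field.
import pandas as pd

first = pd.read_csv("entry_first_pass.csv").set_index("record_id").sort_index()
second = pd.read_csv("entry_second_pass.csv").set_index("record_id").sort_index()

common = first.index.intersection(second.index)
fields = [c for c in first.columns if c in second.columns]

matches = first.loc[common, fields] == second.loc[common, fields]
print(f"Overall agreement across {len(common)} records: {matches.values.mean():.1%}")

# Flag fields below a chosen agreement threshold (e.g., 95%) for retraining
per_field = matches.mean()
print("Fields needing review:", list(per_field[per_field < 0.95].index))
```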

17 Reports Static – graphs, tables, etc., based on your questions. Dynamic – analysis-based; requires export of flat files.
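A static report graph tied to a single evaluation question might look like the sketch below, drawn from the exported flat file. The data file and column names are assumptions.

```python
# Illustrative sketch only: a static report graph (referrals per year)
# built from the exported flat file. Names are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

flat = pd.read_csv("combined_flat_file.csv")
per_year = flat.groupby("year")["referrals"].sum()

per_year.plot(kind="bar", title="Office discipline referrals by year")
plt.ylabel("Total referrals")
plt.tight_layout()
plt.savefig("referrals_by_year.png")   # static image for the report
```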

18 Decision Making Statewide – summative and formative: annual and quarterly reports. Administration – formative: professional development, tracking data collection, reports. Direct service – diagnostic: how much of the intervention is in place, what is the impact, and what changes do you need to make?

19 Integration of Data Example of questions Sample instrument Sample report

20 Considerations Key Components from Three-Tiered Intervention Programs

21 Students who respond to a continuum of supports (three-tier model, OSEP-PBS): Schoolwide support (national standard) – students who would respond to effective core academic and behavior curriculum (80-90%). Group support – students who respond to less intensive academic and behavior support (5-15%). Individual support – students who respond to intensive academic and behavior support (1-7%).
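To see how a program's own students distribute across these tiers, a commonly used convention sorts students by yearly office discipline referral (ODR) counts (roughly 0-1, 2-5, and 6+). The sketch below is illustrative only; those cutoffs and the file and column names are assumptions, not taken from the presentation.

```python
# Illustrative sketch only: estimating the share of students in each tier
# from yearly ODR counts, using assumed 0-1 / 2-5 / 6+ cutoffs.
import pandas as pd

odr = pd.read_csv("referrals_per_student.csv")   # one row per student
tiers = pd.cut(odr["odr_count"],
               bins=[-1, 1, 5, float("inf")],
               labels=["Schoolwide (0-1 ODRs)", "Group (2-5 ODRs)", "Individual (6+ ODRs)"])

# Compare against the reference ranges on the slide (~80-90%, 5-15%, 1-7%)
print((tiers.value_counts(normalize=True) * 100).round(1))
```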

22 Key Elements Systems – administrative commitment, priority for staff, representative team, audit of practices, action plan, data system, internal coaching, external coaching. Practices – based on evidence. Data – process and impact: what, and with whom?

23 Schoolwide Supports Identify expectations of the setting. Evaluate implementation and effectiveness of the core curriculum. Develop team/plan/support. Directly teach expectations. Consistent consequences. Acknowledgement. Collect data – process, academics, and behavior. Communicate with staff. Ongoing evaluation.

24 Questions to ask Does your current system provide information that is efficient, reliable, dependable, and user friendly? Who needs to see your data? Who are your stakeholders? What is your timeline?

25 Resource Visit our center website for more information –http://www.luc.edu/cseit/learning.shtml Useful example of reports –http://www.pbisillinois.org/ Example of coaching and data collection –http://flpbs.fmhi.usf.edu/coachescorner.asp

26 Thanks!

