Are We Making a Difference?

Are We Making a Difference? Application of a Five-Level Evaluation Model to Determine Impact of State Capacity Building. Paula D. Kohler, Western Michigan University; David W. Test, UNC Charlotte. Paper presented at the OSEP Project Directors' Conference, July 20, 2010.

What's an NSTTAC? National Secondary Transition Technical Assistance Center: a federally funded technical assistance and dissemination (TA&D) center, January 1, 2006 through December 31, 2010, funded by the U.S. Department of Education's Office of Special Education Programs (OSEP), Award #H326J050004.

Our Charge Assist SEAs in collecting data on federally mandated State Performance Plan Indicator 13 (content of the IEP) and using these data to improve transition education and services. Build state capacity to implement evidence-based secondary education and transition services for youth with disabilities.

Model for Extending Transition Research (diagram): Effective Transition Practices → Increase Capacity to Implement Effective Transition Practices → Facilitate Implementation of Effective Transition Practices, supported by Data-Based Decision Making, Professional Development, Policy Analysis and Change, and Technical Assistance.

The Guskey Model Evaluating the Impact of Professional Development Level 1 – Participants' reactions Level 2 – Participants' learning Level 3 – Organizational impact Level 4 – Participant implementation Level 5 – Student learning outcomes In evaluating professional development in education, there are five critical stages or levels of information to consider. These levels represent an adaptation of Kirkpatrick's 1959 evaluation model for judging the value of supervisory training programs in business and industry. Kirkpatrick's model, although widely applied, has seen limited use in education because of inadequate explanatory power. Guskey's model is designed to resolve that inadequacy. At NSTTAC we have adopted, and somewhat adapted, this model and used it to evaluate our capacity building efforts. The five levels in the model are arranged hierarchically, from simple to more complex. With each succeeding level, the process of gathering evaluation information is likely to require more time and resources. More importantly, each higher level builds on the ones that came before. In other words, success at one level is necessary for success at the levels that follow.

Level 1 – Participant Satisfaction Questions Did they like it? Was their time well spent? Did the material make sense? Will it be useful? What’s measured Initial satisfaction with the experience Level 1 is the participants' reactions to the experience. This is the simplest, the most common form of professional development evaluation. It is also the easiest type of information to gather and analyze. The questions addressed focus on whether or not participants liked it. When they walked out, did they feel their time was well spent? Did the material make sense to them? Were the activities meaningful? Was the instructor knowledgeable and helpful? Do they believe what they learned will be helpful? Also important are questions such as, Was the coffee hot and ready on time? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable? Questions such as these may seem silly and inconsequential. But, experienced professionals know the importance of attending to basic human needs. Measuring participants' initial satisfaction with the experience provides information that can help improve the design and delivery of programs or activities in valid ways. In addition, positive reactions from participants are usually a necessary prerequisite to higher-level evaluation results.

NSTTAC Examples - Level 1 Level 1 – Participant reactions Likert-like scale evaluations of institutes, cadre meetings, workshops Achievement of intended outcomes Usefulness of information Relevance of materials Qualitative open-ended questions What worked and what didn't Information on participants' reactions is generally gathered through questionnaires handed out at the end of a session or activity. These questionnaires typically include a combination of rating-scale items and open-ended response questions that allow participants to provide more personalized comments. Because of the general nature of this information, the same questionnaire is often used for a broad range of professional development experiences.

Participant Satisfaction At NSTTAC, we ask sets of questions about Usefulness, Relevance, and Quality at every professional development activity we conduct. The example shown on this slide is from a Kentucky State Transition Institute.
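As an aside, here is a minimal sketch of how Likert-type ratings like these might be summarized by category, assuming responses on a 1-5 scale grouped into Usefulness, Relevance, and Quality; the labels and data below are hypothetical, not NSTTAC's actual instrument:

import pandas as pd

# Hypothetical 1-5 satisfaction ratings; each row is one participant.
ratings = pd.DataFrame({
    "usefulness": [5, 4, 4, 5, 3],
    "relevance":  [4, 4, 5, 5, 4],
    "quality":    [5, 5, 4, 4, 4],
})

# Mean rating and share of 4-or-5 responses ("agree"/"strongly agree")
# for each category of questions.
summary = pd.DataFrame({
    "mean": ratings.mean().round(2),
    "pct_agree": (ratings >= 4).mean().mul(100).round(1),
})
print(summary)

Reporting both the mean and the share of high ratings guards against a few very low scores hiding behind an acceptable average.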

Your Thoughts?

Level 2 – Participant Learning Questions Did participants acquire the intended knowledge and skills? What’s measured New knowledge and skills of participants Level 2 focuses on measuring the knowledge, skills, and perhaps attitudes participants gained. Besides “liking it”, we would hope participants learned something from their professional development experience. Measures must be based on the learning goals prescribed for that particular program or activity. This means specific criteria and indicators of successful learning must be outlined prior to the beginning of the professional development experience. Analysis of this information provides a basis for improving the content, format, and organization of the program or activities.

NSTTAC Examples - Level 2 Level 2 – Participant learning Pre-post tests New knowledge and skills of participants: student, teacher, and parent instruments Analysis of products Development of IEPs Depending on the goals of the program or activity, evaluation can involve: a pencil-and-paper assessment (Can participants describe the critical attributes of an IEP required by Indicator 13?); a simulation or full-scale skill demonstration (Can participants simulate a student-led IEP meeting?); oral or written personal reflections; or examination of the portfolios participants assemble.

Participant Learning Example from Colorado: the state implemented Indicator 13 training at regional sites, and participants completed pre- and post-tests; the table on the slide presents results for each item. As part of their yearly goals for increasing capacity and compliance on Indicator 13, the Colorado Department of Education conducted Indicator 13 professional development for local school districts in South Central BOCES. Participants were asked to indicate "the number that best represents your knowledge and skills before and then after this training". To reduce the well-documented response-shift bias, participants responded in a post-then-pre approach: they first reported their present knowledge (post) and then rated how they perceived their knowledge just prior to the training (then-pre). NSTTAC staff provided technical assistance to analyze the evaluation data. Dependent t-tests were conducted on pretest and posttest means; all 13 components were statistically significantly different at the .05 level (see Table 1), and Figure 1 shows an upward trend in the rate of perceived learning. Note. Frequency (f) represents the number of participants who answered the item on both the pretest and posttest. A dependent t-test across all items revealed a significant difference between pretest and posttest scores, t(396) = -22.06, p < .0001.
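For readers who want to reproduce this style of analysis, here is a minimal sketch of a dependent (paired) t-test on post-then-pre ratings, assuming each participant's two ratings for an item are stored in paired arrays; the data are hypothetical, not the Colorado results:

import numpy as np
from scipy import stats

# Hypothetical 5-point self-ratings from the same participants:
# retrospective "then-pre" knowledge vs. "post" knowledge.
then_pre = np.array([2, 1, 3, 2, 2, 3, 1, 2])
post = np.array([4, 3, 4, 4, 3, 5, 3, 4])

# ttest_rel pairs each participant's two ratings, so only the
# within-person change drives the test statistic.
t_stat, p_value = stats.ttest_rel(then_pre, post)
print(f"t({len(post) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")

Because the same people supply both ratings, the paired test is the appropriate choice; an independent-samples test would ignore the pairing and lose power.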

Participant Learning This is a plot of pretest/posttest results from a workshop on transition assessment in Michigan. At the beginning of the workshop, participants completed a pre-test that asked them to identify several kinds of transition assessments; they completed the same test at the conclusion. The plot indicates the change in scores from pre to post in terms of the percentage of items correct.
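A minimal sketch of the computation behind such a plot, assuming we know each participant's number of correct items before and after the workshop; the item count and scores here are hypothetical:

import numpy as np

N_ITEMS = 10  # assumed length of the assessment

# Correct answers per participant, pre and post.
pre_correct = np.array([3, 5, 4, 6, 2])
post_correct = np.array([7, 8, 6, 9, 6])

# Per-participant gain in percentage of items correct,
# the quantity plotted in the Michigan workshop slide.
gain = 100 * (post_correct - pre_correct) / N_ITEMS
print(f"Mean gain: {gain.mean():.1f} percentage points")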

Your Thoughts?

Level 3 – Organization Factors Questions What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Were sufficient resources available? What's measured The organization's advocacy, support, accommodation, facilitation, and recognition At Level 3 our focus shifts to the organization, specifically to information on organization support and change. Organizational variables can be key to the success of any professional development effort. They also can hinder or prevent success, even when the individual aspects of professional development are done right (Sparks, 1996a). Gathering information on organization support and change is generally more complicated than at the previous levels. Questions focus on the organizational characteristics and attributes necessary for success (best practices for capacity building!). Procedures may involve analyses of district or school records, documents, or rules; examination of the minutes from follow-up meetings; compliance results; and the like.

NSTTAC Examples - Level 3 Level 3 – Organization support and change Analysis of teacher reports regarding curriculum implementation Identification of facilitators and barriers to curriculum implementation, including administrative support Analysis of annual performance reports (APRs) to determine Change in data collection procedures Alignment of strategic plans (from institutes) with improvement activities in "determination" areas Change in target indicators Questions may include: Was the advocated change aligned with the mission of the organization? Was change at the individual level encouraged and supported at all levels? Did the program or activity affect organizational climate and procedures? Was administrative support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available, including time for sharing and reflection? Were successes recognized and shared? Issues such as these can be major contributing factors to the success of any professional development effort.

Curriculum Implementation This evaluation is completed when we work with a local site to implement curriculum (this information is from NM). Participants provide feedback on facilitators and barriers regarding the implementation; this is useful in understanding the factors that influence the level and success of a new practice or curriculum. In this example, time to plan and implement was somewhat of a barrier, whereas materials and student reactions were quite positive. Adapted from: Klingner, J. K., Ahwee, S., Pilonieta, P., & Menendez, R. (2003). Barriers and facilitators in scaling up research-based practices. Exceptional Children, 69, 411-429.

Your Thoughts?

Level 4 -- Participant Implementation Questions Did participants effectively apply the new knowledge and skills? What's measured Degree and quality of implementation At Level 4 our central question is, "Did what participants learned make a difference in their professional practice?" The key to gathering relevant information at this level rests in clear indicators that reveal both the degree and quality of implementation. In other words, how can you tell whether what participants learned is being used, and being used well? Unlike Levels 1 and 2, information at Level 4 cannot be gathered at the completion of a professional development session. Measures of use must be made after sufficient time has passed to allow participants to adapt the new ideas and practices to their setting. Because implementation is often a gradual and uneven process, measures also may be necessary at several time intervals. Analysis of this information provides evidence on current levels of use and can help restructure future programs and activities to facilitate better and more consistent implementation.

NSTTAC Examples - Level 4 Level 4 – Participant use of new knowledge and skills Analysis of state and local strategic plans (from institutes) To document and improve the implementation of program content To assess growth from year to year Evaluation of local curriculum implementation To assess if and how participants applied their new knowledge at the classroom level Depending on the goals of the program or activity, this may involve: questionnaires for participants, students, or administrators; structured interviews with participants and their administrators; oral or written personal reflections; examination of participants' journals or portfolios; or direct observations, either with trained observers or by reviewing video or audio tapes.

Teacher Involvement This is feedback from the Windsor, CO team. The team members (including teachers, a special education administrator, a counselor, a transition specialist, and the school-to-work project (SWAP) coordinator) reviewed implementation of the goals from their team plan for that year. They provided feedback regarding their level of participation in the goals, objectives, and activities from their plan, and the barriers and facilitators to implementing them.

Your Thoughts?

Level 5 – Student Learning Questions What was the impact on students? Did it affect student performance or achievement? Did it influence students' physical or emotional well-being? Is student attendance improving? Are dropouts decreasing? What's measured Student learning outcomes: Cognitive, affective, psychomotor Level 5 addresses what is typically "the bottom line" in education: What was the impact on students? Did the professional development program or activity benefit students in any way? Summatively, this information can document a program or activity's overall impact. Formatively, it can be used to improve professional development, program design, implementation, and follow-up. In some cases, information on student learning outcomes is used to estimate the cost-effectiveness of professional development, sometimes referred to as "return on investment" or "ROI evaluation".

NSTTAC Examples - Level 5 Level 5 – Student learning Analysis of APRs and SPP/APR Indicators To determine school and student improvement on federal performance and compliance indicators To demonstrate the overall impact of capacity building To assess impact of the capacity building model at the state and local levels Student portfolios and oral reports To measure student learning outcomes Measures of student learning typically include indicators of student performance and achievement, such as assessment results, portfolio evaluations, marks or grades, scores from standardized examinations, and postsecondary outcomes. In addition, affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors) may be considered as well. Examples include: assessments of students' self-concepts, study habits, school attendance, homework completion rates, and classroom behaviors; and schoolwide indicators such as enrollment in advanced classes, memberships in honor societies, participation in school-related activities, disciplinary actions, detention, and dropout rates.

Student Participation in IEP LEVEL 5 EVALUATION: STUDENT LEARNING OUTCOMES Durant High School, Durant, OK, March 11, 2009. PURPOSE: To measure the extent to which self-determination courses have impacted student learning outcomes, as seen in student involvement in the IEP. QUESTIONS: Q1. Did the student attend their IEP? Q2. How much did the student contribute in the IEP meeting? DATA SOURCES: Tool 4: Student Self-Assessment of Student Involvement. OUTCOMES: Q1. 100% of students attended their IEP meeting. Q2. 100% of students rated their contribution as "somewhat" to "yes" on each of the following: identified their post-secondary goals; provided information about their strengths; provided information about their limitations or problem areas; provided information about their interests; provided information about the courses they want to take; reviewed their past goals and performance; asked for feedback or information from the other participants at their IEP meeting; identified the support they need; and summarized the decisions made at the meeting. IMPLICATION: A significant self-assessed increase in the number and extent of students involved in their own IEPs. (Durant High School, OK: a student self-assessment of outcomes for IEP involvement.)

Student Learning List 3 things you learned today (n = 16): Dress nice and appropriately (12); Be on time (4); Don't rush; Work hard (2); Respect (2); Turn off cell phones (3); Resumes (2); Different types of jobs (2); Don't chew gum (3); Be nice in the workplace; How to find jobs (6); How to interview (3); How to use community resources to find a job (3); How to apply for a job (2); How to act during an interview (5); How to look up jobs on the Internet (5).

Your Thoughts?

Challenges Easy to collect and analyze data regarding Level 1 – satisfaction Somewhat more difficult to collect and analyze data regarding Level 2 – participant learning More difficult to collect and analyze data regarding Levels 3, 4, and 5 – organization, application, and student learning

Questions?

Resources www.nsttac.org NSTTAC Evaluation Toolkit NSTTAC Indicator 13 Checklist NSTTAC's training materials NSTTAC Transition Institute Toolkit Paula Kohler (paula.kohler@wmich.edu) David Test (dwtest@uncc.edu)