Are We Making a Difference? Application of a Five-Level Evaluation Model to Determine Impact of State Capacity Building. Paula D. Kohler, Western Michigan University; David W. Test, UNC Charlotte. Paper presented at the OSEP Project Directors' Conference, July 20, 2010.
What's an NSTTAC? The National Secondary Transition Technical Assistance Center, a federally funded technical assistance and dissemination (TA&D) center, funded January 1, 2006 through December 31, 2010 by the U.S. Department of Education's Office of Special Education Programs (OSEP), Award #H326J050004.
Our Charge Assist SEAs in collecting data on federally mandated State Performance Plan Indicator 13 (content of the IEP) and using these data to improve transition education and services. Build state capacity to implement evidence-based secondary education and transition services for youth with disabilities.
Model for Extending Transition Research
[Figure: model diagram connecting Effective Transition Practices to two aims, Increase Capacity to Implement Effective Transition Practices and Facilitate Implementation of Effective Transition Practices, through Data-Based Decision Making, Professional Development, Policy Analysis and Change, and Technical Assistance.]
The Guskey Model: Evaluating the Impact of Professional Development
Level 1 – Participants' reactions
Level 2 – Participants' learning
Level 3 – Organizational impact
Level 4 – Participant implementation
Level 5 – Student learning outcomes
In evaluating professional development in education, there are five critical stages or levels of information to consider. These levels represent an adaptation of Kirkpatrick's (1959) evaluation model for judging the value of supervisory training programs in business and industry. Kirkpatrick's model, although widely applied, has seen limited use in education because of inadequate explanatory power; Guskey's model is designed to resolve that inadequacy. At NSTTAC we have adopted, and somewhat adapted, this model and used it to evaluate our capacity-building efforts. The five levels in the model are hierarchically arranged from simple to more complex. With each succeeding level, the process of gathering evaluation information is likely to require more time and resources. More importantly, each higher level builds on the ones before it: success at one level is necessary for success at the levels that follow.
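As an aside from us (not part of the original model or NSTTAC's materials), the hierarchy lends itself to a simple ordered representation; this minimal Python sketch makes the prerequisite relationship explicit:

```python
from enum import IntEnum

class GuskeyLevel(IntEnum):
    """Guskey's five evaluation levels, ordered from simple to complex."""
    PARTICIPANT_REACTIONS = 1
    PARTICIPANT_LEARNING = 2
    ORGANIZATIONAL_IMPACT = 3
    PARTICIPANT_IMPLEMENTATION = 4
    STUDENT_OUTCOMES = 5

def prerequisites(level: GuskeyLevel) -> list[GuskeyLevel]:
    # Success at each lower level is necessary for success at this one.
    return [GuskeyLevel(n) for n in range(1, level)]

print(prerequisites(GuskeyLevel.STUDENT_OUTCOMES))  # levels 1 through 4
```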
Level 1 – Participant Satisfaction
Questions: Did they like it? Was their time well spent? Did the material make sense? Will it be useful?
What's measured: Initial satisfaction with the experience.
Level 1 is the participants' reactions to the experience. This is the simplest and most common form of professional development evaluation, and the easiest type of information to gather and analyze. The questions focus on whether participants liked it. When they walked out, did they feel their time was well spent? Did the material make sense to them? Were the activities meaningful? Was the instructor knowledgeable and helpful? Do they believe what they learned will be helpful? Also important are questions such as: Was the coffee hot and ready on time? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable? Questions such as these may seem silly and inconsequential, but experienced professionals know the importance of attending to basic human needs. Measuring participants' initial satisfaction with the experience provides information that can help improve the design and delivery of programs or activities in valid ways. In addition, positive reactions from participants are usually a necessary prerequisite to higher-level evaluation results.
NSTTAC Examples - Level 1
Level 1 – Participant reactions:
Likert-type scale evaluations of institutes, cadre meetings, and workshops: achievement of intended outcomes, usefulness of information, relevance of materials
Qualitative open-ended questions: what worked and what didn't
Information on participants' reactions is generally gathered through questionnaires handed out at the end of a session or activity. These questionnaires typically combine rating-scale items with open-ended questions that allow participants to provide more personalized comments. Because of the general nature of this information, the same questionnaire is often used across a broad range of professional development experiences; a sketch of how such data might be summarized follows.
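A minimal sketch of summarizing rating-scale responses, assuming hypothetical item names rather than NSTTAC's actual instrument:

```python
import pandas as pd

# Hypothetical Likert-type ratings (1 = low, 5 = high), one row per participant;
# the item names mirror the categories above but are illustrative only
responses = pd.DataFrame({
    "usefulness": [5, 4, 4, 5, 3],
    "relevance":  [4, 4, 5, 5, 4],
    "quality":    [5, 5, 4, 4, 4],
})

print(responses.mean().round(2))  # mean rating per item
for item in responses.columns:
    # distribution of ratings for each item, lowest to highest
    print(item, responses[item].value_counts().sort_index().to_dict())
```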
Participant Satisfaction
At NSTTAC, we ask sets of questions about usefulness, relevance, and quality at every professional development activity we conduct. The example shown on this slide is from a Kentucky State Transition Institute.
Your Thoughts?
Level 2 – Participant Learning
Questions: Did participants acquire the intended knowledge and skills?
What's measured: New knowledge and skills of participants.
Level 2 focuses on measuring the knowledge, skills, and perhaps attitudes participants gained. Besides "liking it," we would hope participants learned something from their professional development experience. Measures must be based on the learning goals prescribed for that particular program or activity, which means specific criteria and indicators of successful learning must be outlined before the professional development experience begins. Analysis of this information provides a basis for improving the content, format, and organization of the program or activities.
NSTTAC Examples - Level 2
Level 2 – Participant learning:
Pre/post tests of new knowledge and skills of participants (student, teacher, and parent instruments)
Analysis of products (e.g., development of IEPs)
Depending on the goals of the program or activity, evaluation can involve:
a pencil-and-paper assessment (Can participants describe the critical attributes of an IEP required by Indicator 13?)
a simulation or full-scale skill demonstration (Can participants simulate a student-led IEP meeting?)
oral or written personal reflections
examination of the portfolios participants assemble
Participant Learning
Note: this was a post-then-pre assessment. Example from Colorado: Indicator 13 training was implemented at regional sites across the state, and participants completed pre- and post-tests; the table reports results for each of the 7 items. From the evaluation report: As part of their yearly goals for increasing capacity and compliance on Indicator 13, the Colorado Department of Education conducted Indicator 13 professional development for local school districts in South Central BOCES. Participants were asked to indicate "the number that best represents your knowledge and skills before and then after this training." To reduce the well-documented response-shift bias, participants responded in a post-then-pre approach: they first reported their present knowledge (post) and then rated how they perceived their knowledge just prior to the training (then-pre). NSTTAC staff provided technical assistance to analyze the evaluation data. Dependent t-tests were conducted on the pretest and posttest means; all 13 components differed significantly at the .05 level (see Table 1), and Figure 1 shows an upward trend in the rate of perceived learning.
Note. Frequency (f) represents the number of participants who answered the item on both the pretest and posttest. A dependent t-test across all items revealed a significant difference between pretest and posttest scores, t(396) = -22.06, p < .05.
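A minimal sketch of the dependent (paired) t-test described above, using scipy; the ratings are invented for illustration, since the actual instrument and data are not reproduced here:

```python
import numpy as np
from scipy.stats import ttest_rel

# Invented post-then-pre ratings for a single training item
# ("then-pre" = knowledge rated retrospectively, "post" = knowledge now)
then_pre = np.array([2, 3, 2, 1, 3, 2, 2, 3])
post     = np.array([4, 4, 3, 3, 5, 4, 3, 4])

# Paired t-test: each participant contributes one (then-pre, post) pair
result = ttest_rel(post, then_pre)
print(f"t({len(post) - 1}) = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```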
Participant Learning
This is a plot of pre-test/post-test results from a workshop on transition assessment in Michigan. At the beginning of the workshop, participants completed a pre-test that asked them to identify several kinds of transition assessments; they completed the same test at the conclusion. The plot indicates the change in scores from pre to post in terms of the percentage of items correct.
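For illustration, a sketch of how such a pre/post plot might be drawn with matplotlib; the scores below are placeholders, not the Michigan data:

```python
import matplotlib.pyplot as plt

# Placeholder percent-correct scores; each (pre, post) pair is one participant
pre  = [30, 45, 50, 40, 55, 35]
post = [70, 80, 75, 65, 90, 60]

fig, ax = plt.subplots()
for a, b in zip(pre, post):
    ax.plot([0, 1], [a, b], marker="o", color="gray")  # one line per participant
ax.set_xticks([0, 1])
ax.set_xticklabels(["Pre-test", "Post-test"])
ax.set_ylabel("% of items correct")
ax.set_title("Change in transition-assessment knowledge")
plt.show()
```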
Your Thoughts?
Level 3 – Organization Factors
Questions: What was the impact on the organization? Did it affect organizational climate and procedures? Was implementation advocated, facilitated, and supported? Were sufficient resources available?
What's measured: The organization's advocacy, support, accommodation, facilitation, and recognition.
At Level 3 our focus shifts to the organization: specifically, to information on organization support and change. Organizational variables can be key to the success of any professional development effort; they can also hinder or prevent success, even when the individual aspects of professional development are done right (Sparks, 1996a). Gathering information on organization support and change is generally more complicated than at the previous levels. Questions focus on the organizational characteristics and attributes necessary for success (best practices for capacity building!). Procedures may involve analyses of district or school records, documents, or rules; examination of the minutes from follow-up meetings; compliance results; and the like.
NSTTAC Examples - Level 3
Level 3 – Organization support and change:
Analysis of teacher reports regarding curriculum implementation
Identification of facilitators of and barriers to curriculum implementation, including administrative support
Analysis of annual performance reports (APRs) to determine changes in data collection procedures, alignment of strategic plans (from institutes) with improvement activities in "determination" areas, and changes in target indicators
Questions may include: Was the advocated change aligned with the mission of the organization? Was change at the individual level encouraged and supported at all levels? Did the program or activity affect organizational climate and procedures? Was administrative support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available, including time for sharing and reflection? Were successes recognized and shared? Issues such as these can be major contributing factors to the success of any professional development effort.
Curriculum Implementation
This evaluation is completed when we work with a local site to implement a curriculum (this information is from New Mexico). Participants provide feedback on facilitators and barriers regarding the implementation, which is useful in understanding the factors that influence the level and success of a new practice or curriculum. In this example, time to plan and implement was somewhat of a barrier, whereas materials and student reactions were quite positive.
Adapted from Klingner, J. K., Ahwee, S., Pilonieta, P., & Menendez, R. (2003). Barriers and facilitators in scaling up research-based practices. Exceptional Children, 69.
Your Thoughts?
Level 4 – Participant Implementation
Questions: Did participants effectively apply the new knowledge and skills?
What's measured: Degree and quality of implementation.
At Level 4 our central question is, "Did what participants learned make a difference in their professional practice?" The key to gathering relevant information at this level rests in clear indicators that reveal both the degree and quality of implementation. In other words, how can you tell whether what participants learned is being used, and being used well? Unlike Levels 1 and 2, information at Level 4 cannot be gathered at the completion of a professional development session. Measures of use must be made after sufficient time has passed to allow participants to adapt the new ideas and practices to their setting. Because implementation is often a gradual and uneven process, measures may also be necessary at several time intervals. Analysis of this information provides evidence on current levels of use and can help restructure future programs and activities to facilitate better and more consistent implementation.
NSTTAC Examples - Level 4
Level 4 – Participant use of new knowledge and skills:
Analysis of state and local strategic plans (from institutes), to document and improve the implementation of program content and to assess growth from year to year
Evaluation of local curriculum implementation, to assess whether and how participants applied their new knowledge at the classroom level
Depending on the goals of the program or activity, this may involve:
questionnaires for participants, students, or administrators
structured interviews with participants and their administrators
oral or written personal reflections
examination of participants' journals or portfolios
direct observations, either with trained observers or by reviewing video or audio tapes
Teacher Involvement. This is feedback from the Windsor, CO team.
The team members (including teachers, a special education administrator, a counselor, a transition specialist, and the school-to-work project (SWAP) coordinator) reviewed implementation of the goals from their team plan for that year. They provided feedback regarding their level of participation in the goals, objectives, and activities from their plan, and the barriers and facilitators to implementing them.
Your Thoughts?
Level 5 – Student Learning
Questions: What was the impact on students? Did it affect student performance or achievement? Did it influence students' physical or emotional well-being? Is student attendance improving? Are dropouts decreasing?
What's measured: Student learning outcomes (cognitive, affective, psychomotor).
Level 5 addresses what is typically "the bottom line" in education: What was the impact on students? Did the professional development program or activity benefit students in any way? Summatively, this information can document a program or activity's overall impact. Formatively, it can be used to improve professional development, program design, implementation, and follow-up. In some cases, information on student learning outcomes is used to estimate the cost-effectiveness of professional development, sometimes referred to as "return on investment" or "ROI evaluation"; conventionally, ROI is the ratio of net benefits (benefits minus costs) to costs.
NSTTAC Examples - Level 5
Level 5 – Student learning:
Analysis of APRs and SPP/APR indicators, to determine school and student improvement on federal performance and compliance indicators, to demonstrate the overall impact of capacity building, and to assess the impact of the capacity-building model at the state and local levels
Student portfolios and oral reports, to measure student learning outcomes
Measures of student learning typically include indicators of student performance and achievement, such as assessment results, portfolio evaluations, marks or grades, scores from standardized examinations, and postsecondary outcomes. In addition, affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors) may be considered as well. Examples include assessments of students' self-concepts, study habits, school attendance, homework completion rates, and classroom behaviors, as well as school-wide indicators such as enrollment in advanced classes, memberships in honor societies, participation in school-related activities, disciplinary actions, detentions, and drop-out rates.
Student Participation in IEP
Level 5 Evaluation: Student Learning Outcomes. Durant High School, Durant, OK, March 11, 2009.
Purpose: To measure the extent to which self-determination courses have impacted student learning outcomes, as seen in student involvement in the IEP.
Questions: Q1. Did the student attend their IEP? Q2. How much did the student contribute in the IEP meeting?
Data source: Tool 4: Student Self-Assessment of Student Involvement.
Outcomes: Q1. 100% of students attended their IEP. Q2. Percentage of students who felt they contributed "somewhat" to "yes" in the IEP meeting: 100% identified their post-secondary goals; 100% provided information about their strengths; 100% provided information about their limitations or problem areas; 100% provided information about their interests; 100% provided information about the courses they want to take; 100% reviewed their past goals and performance; 100% asked for feedback or information from the other participants at their IEP meeting; 100% identified the support they need; 100% summarized the decisions made at the meeting.
Implication: A significant increase in the number and extent of students involved in their own IEP, as self-assessed.
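As an illustration of how per-item percentages like these might be computed from raw self-assessment responses (the item keys and ratings below are hypothetical, not the actual Tool 4 fields):

```python
# Hypothetical raw responses: one dict per student, each value a rating of
# "yes", "somewhat", or "no" for a self-assessment item
responses = [
    {"identified_goals": "yes",      "described_strengths": "somewhat"},
    {"identified_goals": "yes",      "described_strengths": "yes"},
    {"identified_goals": "somewhat", "described_strengths": "yes"},
]

for item in responses[0]:
    # count students rating the item "somewhat" or "yes"
    hits = sum(r[item] in ("somewhat", "yes") for r in responses)
    print(f"{item}: {100 * hits / len(responses):.0f}% somewhat-to-yes")
```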
Student Learning. List 3 things you learned today (n=16):
Dress nice and appropriately (12)
Be on time (4)
Don't rush
Work hard (2)
Respect (2)
Turn off cell phones (3)
Resumes (2)
Different types of jobs (2)
Don't chew gum (3)
Be nice in the workplace
How to find jobs (6)
How to interview (3)
How to use community resources to find a job (3)
How to apply for a job (2)
How to act during an interview (5)
How to look up jobs on the Internet (5)
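Tallies like these can be produced from the raw free-text answers; a minimal sketch with invented responses:

```python
from collections import Counter

# Invented raw "3 things you learned" answers, lightly normalized before tallying
answers = [
    "Dress nice and appropriately", "Be on time", "How to find jobs",
    "dress nice and appropriately", "How to interview", "how to find jobs",
]

counts = Counter(a.strip().lower() for a in answers)
for answer, n in counts.most_common():
    print(f"{answer} ({n})")
```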
Your Thoughts?
Challenges
It is easy to collect and analyze data regarding Level 1 (satisfaction), somewhat more difficult for Level 2 (participant learning), and more difficult still for Levels 3, 4, and 5 (organization, application, and student learning).
Questions?
Resources
www.nsttac.org
NSTTAC Evaluation Toolkit
NSTTAC Indicator 13 Checklist
NSTTAC's training materials
NSTTAC Transition Institute Toolkit
Paula Kohler, David Test