
1 SPDG Directors' Webinar: Professional Development Series #1 Evaluation
Julie Q. Morrison, Ph.D. SPDG Evaluator (Ohio)

2 Purpose of the Session
Explore how the National Implementation Research Network's Implementation Drivers Framework has prioritized staff competence as essential for effective programs and practices.
Examine how Guskey's Five Critical Levels for Evaluating Professional Development can be used as a framework for designing effective professional development.

3 Core Implementation Components
(Diagram of the Implementation Drivers framework; visible label: Systems Interventions)

4

5 SPDG’s Focus on Competent Use
Implementation of evidence-based practices requires behavior change at the practitioner, supervisory, and administrative support levels. Training and coaching are the principal ways in which behavior change is brought about for carefully selected staff in the beginning stages of implementation and throughout the life of evidence-based practices and programs. (Fixsen et al., 2005, p. 29)

6 Best Practices in Selection
Job or role descriptions should be explicit about expectations and accountability for all positions (e.g., teachers, coaches, staff, administrators).
Use readiness measures to select at the school building or school district level.
Use an interactive interview process.
(Blase, Van Dyke, & Fixsen, 2010)

7 Best Practices in Training
Training must be…
Timely
Grounded in theory (adult learning)
Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching.
(Blase, Van Dyke, & Fixsen, 2010)

8 Best Practices in Coaching
Design a Coaching Service Delivery Plan.
Develop accountability structures for Coaching – Coach the Coach!
Identify ongoing professional development for coaches.
(Blase, Van Dyke, & Fixsen, 2010)

9 Best Practices in Performance Assessment (Fidelity)
Performance assessment must be a transparent process.
Use multiple data sources.
Fidelity of implementation should be assessed at the local, regional, and state levels.
Tie performance assessment to positive recognition.
Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers.

10 Best Practices in Decision Support Data Systems
Assess fidelity of implementation at all levels and respond accordingly.
Identify outcome measures that are…
Intermediate and longer-term
Socially valid
Technically adequate: reliable and valid
Relevant data that are feasible to gather, useful for decision making, widely shared, and reported frequently

11 Acknowledgements
The description of the Implementation Drivers Framework and its implications for best practices represents the work of members of the National Implementation Research Network. My professional experiences with the Implementation Drivers Framework have been informed through discussions with other SPDG Evaluators, most notably Pat Mueller (NH & MS), Amy Gaumer Erickson (KS & MO), and Pattie Noonan (KS).

12 Guskey’s Five Critical Levels for Evaluating Professional Development
Level 1: Participants' Reactions
Level 2: Participants' Learning
Level 3: Organizational Support and Change
Level 4: Participants' Use of New Knowledge and Skills
Level 5: Student Learning Outcomes

13 Level 1: Participants’ Reactions
Measuring participants' initial satisfaction with the experience provides information that can help improve the design and delivery of programs or activities. Positive reactions from participants are usually a necessary prerequisite to higher-level evaluation results (e.g., fidelity of implementation, impact on student achievement).

14 Implications for Increasing Participants’ Positive Reactions
Why Professional Development Fails:
Poor planning and organization
Lack of relevance to the day-to-day issues of the participants
Failure to differentiate the needs of individual schools and teachers
(Wood & Thompson, 1980)
Planning professional development to meet participants' needs will increase the likelihood that they will have positive perceptions of the experience and acquire the intended knowledge and skills.

15 Level 2: Participants’ Learning
Evidence of participants’ learning validates the relationship between what was intended and what was achieved

16 Implications for Maximizing Participants’ Learning
A clear understanding of the learning objectives targeted by the professional development is needed to promote learning.
Bloom's Taxonomy of Educational Objectives (Bloom, 1956)
The Instructional Hierarchy (Haring, Lovitt, Eaton, & Hansen, 1978)

17 Implications for Maximizing Participants’ Learning
Research on effective professional development also supports:
Opportunities to practice the skill or concept under simulated conditions
Timely, specific, constructive feedback
Coaching to refine implementation
(Loucks-Horsley, Harding, Arbuckle, Murray, Dubea, & Williams, 1987; Showers, 1996; Showers, Joyce, & Bennett, 1987)

18 Level 3: Organizational Support and Change
Organizational variables can be key to the success of any professional development effort. They also can hinder or prevent success, even when the individual aspects of professional development are done right (Sparks, 1996). Some of the best and most promising improvement strategies have been seriously stifled or halted completely because of seemingly immutable factors in the organization’s culture (Fullan, 1993)

19 Implications for Facilitating Organizational Support and Change
Organizational Policies
Resources
Protections from Intrusions

20 Implications for Facilitating Organizational Support and Change
Openness to Experimentation and Alleviation of Fears
Collegial Support Among Teachers
Principal's Leadership and Support

21 Implications for Facilitating Organizational Support and Change
Higher-Level Administrators' Leadership and Support
Recognition of Success
Provision of Time

22 Level 4: Participants’ Use of New Knowledge and Skills
Fidelity of Implementation: Are participants using the new knowledge and skills to implement the practice as it was intended to be implemented?
Critical Indicators: What would you expect to see if effective implementation were taking place?

23 Implications for Increasing Participants’ Use of New Knowledge and Skills
Allow sufficient time for participants to adapt the new practices to their setting.
How much fidelity? (replication vs. mutual adaptation)
Anticipate that implementation is often a gradual and uneven process.
Attend to depth of implementation (Coburn, 2003).

24 Level 5: Student Learning Outcomes
Teacher professional development must be explicitly linked to positive student outcomes. In many cases, changes in teacher practices and attitudes are sustained only when professional development and implementation are combined with evidence of improved student learning (Guskey, 1982, 1984).

25 Implications for Increasing the Impact of Professional Development on Student Learning Outcomes
Identify outcome measures that are…
Intermediate (formative assessment) and longer-term (summative assessment)
Socially valid
Technically adequate: reliable and valid
Relevant data that are feasible to gather, useful for decision making, widely shared, and reported frequently

26 Acknowledgements
The description of the five critical levels for evaluating professional development for teachers represents the work of Tom Guskey. My professional experiences applying Guskey's framework have been informed through discussions with other evaluators, most notably Stacey Farber (Cincinnati Children's Hospital), Kelly Hannum (Center for Creative Leadership), and Vanessa Moss-Summers (Xerox).

27 References
Bloom, B. S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co., Inc.
Blase, K. A., Van Dyke, M. K., & Fixsen, D. L. (2010). Implementation Drivers – Best Practices. Chapel Hill, NC: National Implementation Research Network.
Coburn, C. E. (2003). Rethinking scale: Moving beyond numbers to deep and lasting change. Educational Researcher, 32(6), 3-12.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).

28 Fullan, M. G. (1993). Change forces: Probing the depths of educational reform. Bristol, PA: Falmer.
Guskey, T. R. (1982). The effects of change in instructional effectiveness upon the relationship of teacher expectations and student achievement. Journal of Educational Research, 75(6),
Guskey, T. R. (1984). The influence of change in instructional effectiveness upon the affective characteristics of teachers. American Educational Research Journal, 21(2),
Guskey, T. R. (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press, Inc.

29 Haring, N. S., Lovitt, T. C., Eaton, M. D., & Hansen, C. L. (1978). The fourth R: Research in the classroom. Columbus, OH: Charles E. Merrill Publishing Co.
Loucks-Horsley, S., Harding, C. K., Arbuckle, M. A., Murray, L. B., Dubea, C., & Williams, M. K. (1987). Continuing to learn: A guidebook for teacher development. Andover, MA: Regional Laboratory for Educational Improvement of the Northeast & Islands.
Showers, B. (1996). The evolution of peer coaching. Educational Leadership, 53(6),
Showers, B., Joyce, B., & Bennett, B. (1987). Synthesis of research on staff development: A framework for future study and a state of the art analysis. Educational Leadership, 45(3),

30 Sparks, D. (1996, February). Viewing reform from a systems perspective. The Developer, pp. 2, 6.
Wood, F. H., & Thompson, S. R. (1980). Guidelines for better staff development. Educational Leadership, 37(5),

31 Contact Information
Julie Q. Morrison, Ph.D.
Assistant Professor
University of Cincinnati
College of Education, Criminal Justice, & Human Services
School of Human Services, School Psychology Program

