
1 Evaluating Educational Programs: Using Formative Evaluation to Develop and Monitor Your Programs
Georgia Compensatory Educational Leaders, Inc. 2016 Conference
Nicholas L. Handville, Operations Analyst III, Office of School Improvement

3 Session Description
This session will cover evaluation planning and formative evaluation. Formative evaluation occurs prior to or during program implementation with the aim of improving program design and performance. Topics covered in this session include an introduction to theory of change and logic models, needs assessments, planning and conducting formative evaluations, and using evaluation results to improve education programs during the school year. This session is designed to be beneficial to those working with educational programs, regardless of prior experience with evaluation.

4 Assessing Your Needs and Interests
- Expectations and goals for the workshop
- Experience with evaluation

5 Types of Evaluation

6 Two Main Types of Evaluation
- Formative evaluations strengthen or improve the object being evaluated; they help form it by examining the delivery of the program, the quality of its implementation, and the organizational context, personnel, procedures, inputs, and so on.
- Summative evaluations examine effects or outcomes: they describe what happens after the program is delivered, assess whether the program's services can be said to have caused the outcomes, determine the overall impact of the causal factor beyond the immediate target outcomes, and estimate the relative costs associated with the program.

7 Types of Evaluation: Summative vs. Formative
Descriptor        | Formative Evaluation                                | Summative Evaluation
Purpose           | Quality assurance, improvement                      | Providing an overall judgment of the program
Use               | Guiding decision making                             | Ascertaining accountability for successes and failures; promoting understanding of the program
Functions         | Provides feedback for improvement                   | Informs stakeholders about the program's value
When conducted    | During development or ongoing operations            | After completion of the program or program cycle
Types of services | Assists with goal setting, planning, and management | Assists management and stakeholders in decision making
Foci              | Goals, alternative courses of action, plans, implementation of plans, interim results | Completed projects, established programs, ultimate outcomes
Source: Stufflebeam (2014)

8 Types of Formative Evaluation
- Needs assessment: determines who needs the program, how great the need is, and what might work to meet the need
- Implementation evaluation: monitors the fidelity of the program
- Process evaluation: investigates the process of delivering the program; focus may include program description, program monitoring, and quality assurance
Source: Royse (2010)

9 Implementation Evaluation
- Seeks to influence the initial development of a program
- Focuses on qualities of the program itself, not program outcomes
- Allows for adjustments and improvements to the program

10 Process Evaluation
- Can be conducted at any time during a program's life cycle
- Used to identify what was learned during program implementation
- May help determine whether program failure was due to a poor program model or intervention, or to the way the program was implemented

11 Before the Evaluation
Documenting the Purpose and Functions of Your Program

12 Mission Statements, Goals, and Objectives
Mission statements
- Explain what the agency or program is all about
- Provide a common vision for the organization (along with the vision statement)
Goals
- Specify a program direction based on values, ideals, mandates, and program purpose
- Speak to the aspiration of the work
- Provide focus, orientation, and direction
Objectives
- Are specific and precise
- Allow for measurement of progress toward goals
- Should have a single aim or result that is clearly verifiable

13 Logic Models
A logic model depicts your program's inner workings:
- Explains how you believe your program works
- Lets you set short-term and intermediate objectives that you can check throughout the evaluation to determine the extent to which your program is working as envisioned
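To make the components concrete, here is a minimal Python sketch of a logic model captured as a simple data structure, using the inputs, activities, outputs, outcomes, and impact chain from the Kellogg Foundation guide cited below. The class, field names, and tutoring example are hypothetical illustrations, not material from the presentation.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic model: inputs -> activities -> outputs
    -> outcomes -> impact, per the Kellogg Foundation (2004) guide."""
    inputs: list = field(default_factory=list)        # resources invested
    activities: list = field(default_factory=list)    # what the program does
    outputs: list = field(default_factory=list)       # direct products of activities
    short_term_outcomes: list = field(default_factory=list)
    intermediate_outcomes: list = field(default_factory=list)
    impact: list = field(default_factory=list)        # long-term change sought

# Hypothetical example: an after-school math tutoring program
tutoring = LogicModel(
    inputs=["Title I funds", "two certified tutors", "classroom space"],
    activities=["small-group math tutoring, 90 minutes twice weekly"],
    outputs=["40 students served", "50 sessions delivered per semester"],
    short_term_outcomes=["improved homework completion rates"],
    intermediate_outcomes=["higher benchmark math scores"],
    impact=["increased on-time grade promotion"],
)
print(tutoring.short_term_outcomes)
```

Writing the model down this explicitly is what makes the short-term and intermediate objectives checkable during the school year.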

14 Logic Models (diagram adapted from Kellogg Foundation, 2004)

15 Logic Models (diagram source: U.S. Department of Education, 2014)

16 Benefits of Logic Models
- Help the evaluator conceptualize the important program components
- Assist with understanding what must happen for outcomes to be achieved
- Provide a map for programmatic or organizational change
- Clarify for stakeholders the sequence of events and processes that contribute to program performance
Source: Royse (2010)

17 Theory of Change (diagram adapted from Kellogg Foundation, 2004)

18 Theory of Action. Adapted from:

19 Planning a Formative Evaluation

20 Planning Your Formative Evaluation
- Decide what to evaluate
- Build a logic model of your program
- State your program implementation and participation objectives in measurable terms (see the sketch below)
- Identify the context for your evaluation
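As a sketch of what "measurable terms" can buy you, the snippet below compares interim actuals against objective targets. Every objective name and number is invented for illustration; the point is only that objectives stated as numbers can be checked mechanically during the year.

```python
# Hypothetical interim check of implementation and participation objectives.
objectives = {
    # objective: (target, actual to date)
    "students enrolled": (40, 34),
    "tutoring sessions delivered": (25, 25),
    "average attendance rate (%)": (85, 78),
}

for name, (target, actual) in objectives.items():
    status = "on track" if actual >= target else "needs attention"
    print(f"{name}: target {target}, actual {actual} -> {status}")
```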

21 Conducting a Formative Evaluation

22 Conducting a Formative Evaluation
- Approach 1: Locate model standards
- Approach 2: Get expert consultation
- Approach 3: Form an ad hoc evaluation committee
Source: Royse (2010)

23 Conducting an Evaluation
Step 1: Assemble an evaluation team.
Step 2: Prepare for the evaluation.
Step 3: Develop an evaluation plan.
Step 4: Collect evaluation information.
Step 5: Analyze your evaluation information.
Step 6: Prepare the evaluation report.
Source: U.S. Department of Health and Human Services
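As a minimal sketch of Steps 4 and 5, the snippet below summarizes fidelity-checklist scores collected across sites so low-fidelity sites can be followed up before the report in Step 6. The sites, scores, and threshold are all hypothetical assumptions, not data from the presentation.

```python
from statistics import mean

# Hypothetical fidelity-checklist scores (0-100) gathered in Step 4;
# Step 5 summarizes them to flag sites needing follow-up.
site_scores = {
    "Site A": [92, 88, 95],
    "Site B": [70, 65, 72],
    "Site C": [85, 90, 80],
}

THRESHOLD = 80  # assumed minimum acceptable average fidelity score

for site, scores in sorted(site_scores.items()):
    avg = mean(scores)
    flag = "" if avg >= THRESHOLD else "  <- below threshold; follow up"
    print(f"{site}: mean fidelity {avg:.1f}{flag}")
```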

24 Guidelines for Conducting a Successful Evaluation
- Invest heavily in planning
- Integrate the evaluation into the ongoing activities of the program
- Participate in the evaluation and show program staff that you think it is important
- Involve as many program staff as possible, as early as possible
- Be realistic about the burden on you and your staff
Source: Royse (2010)

25 Potential Challenges and Concerns

26 Potential Challenges
- Starting data collection too early in the life of a program
- Failing to clarify expectations about what can be learned from the evaluation
- Inadequately training data collectors
- Inappropriately involving program providers in data collection

27 Potential Concerns
- Evaluation diverts resources away from the program and therefore harms participants.
- Evaluation increases the burden on program staff.
- Evaluation is too complicated.
- Evaluation may produce negative results and lead to information that will make the program look bad or lose funding.
- Evaluation is just another form of monitoring.
- Evaluation requires setting performance standards, and this is too difficult.
If the evaluation is done correctly, none of these are true.
Source: U.S. Department of Health and Human Services

28 Group Activity: Designing and Implementing Your Formative Evaluation

29 Group Activity: Guiding Questions
- Is the program serving the right target group(s)?
- Are potential clients rejecting the program or dropping out? Why?
- Is the program being implemented in accordance with the program design?
- Is the program producing the expected outputs (such as services or products)?
- Is the program meeting its standards of quality?
- What implementation obstacles are being encountered?
- What implementation differences exist among site locations?
- Are significant internal or external events affecting the program, its staff, or its clients?

30 Closing Thoughts: What We Have Learned
- How has your understanding of evaluating a school-based program been enhanced?
- In what ways can you use what you have learned to improve your program?

31 Resources
Annie E. Casey Foundation. (2004). Theory of change: A practical tool for action, results, and learning.
Bennett, J. (2003). Evaluation methods in research. Continuum.
Dinsmore, P. (1993). The AMA handbook of project management. New York: AMACOM.
Harris, E. (2011). Afterschool evaluation 101: How to evaluate an expanded learning program. Harvard Family Research Project.
Holden, D., & Zimmerman, M. (2009). A practical guide to program evaluation planning: Theory and case examples. Sage.
Royse, D., Thyer, B., & Padgett, D. (2010). Program evaluation: An introduction. Wadsworth.
Stockmann, R. (Ed.). (2011). A practitioner handbook on evaluation. Edward Elgar.
U.S. Department of Education. (2014). Evaluation matters: Getting the information you need from your evaluation.
U.S. Department of Health and Human Services. The program manager's guide to evaluation (2nd ed.).
W.K. Kellogg Foundation. (2004). Logic model development guide: Using logic models to bring together planning, evaluation, and action.
Weiss, C. (1997). Evaluation (2nd ed.). New Jersey: Prentice Hall.
Wholey, J., et al. (2010). Handbook of practical program evaluation. San Francisco: John Wiley & Sons.

32 Nicholas L. Handville
Operations Analyst III, Office of School Improvement
Georgia Department of Education
Twin Towers East, 205 Jesse Hill Jr. Drive, SE
Atlanta, Georgia
Office: (404)

