Product Evaluation: The Outcome Phase
Do the magic bullets work?
How do you know when an innovative educational program has “worked”? What kinds of evidence are needed to make a judgment that something “works” in education?
Do the magic bullets work?
Input – Process – Product
–Has the process improved?
–Has teacher morale improved?
–Do teachers consider the Program a Resource or a Demand?
–Did the test scores improve?
–Did the test scores improve as a result of Program participation?
CIPPI Model
Context – Input – Process – Product – Impact
The goal is to make INFORMED judgments about Program merit or value.
The Nature of Evaluation Tasks
The goal is to make INFORMED judgments about Program merit or value. Making judgments requires a thorough understanding of whether Program goals and objectives were met. Were the needs of the target audience met?
Product Evaluation
The central focus of a Product evaluation is the documentation of Program achievements or accomplishments. The Product evaluation phase requires the evaluator to develop and implement a plan to collect the outcome information called for on the measurable objectives worksheets.
The Product Evaluation Plan
Define a typical service delivery cycle. Is there a typical unit of time that makes a natural timeframe for the Product evaluation? In education this is usually straightforward (e.g., the academic year).
The Product Evaluation Plan
Does the Program operate based on client “spells” or “incidents” rather than a service delivery cycle? Once the typical service delivery cycle is defined, decide when the first “fair test” service delivery cycle has taken place. Did a “pilot” phase take place?
The Product Evaluation Plan
The plan should include:
–Multiple measures of success
–Collected at multiple points in time
–Multiple methods of data collection
It should be based on the Measurable Objectives Worksheets outlined during the Input evaluation phase.
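As a rough illustration only (not part of the original worksheets), the shape of such a plan — multiple measures, collected at multiple time points, through multiple methods — can be sketched as a simple data structure. All measure names and time points below are hypothetical.

```python
# Hypothetical outline of a Product evaluation data-collection plan:
# each entry pairs one measure of success with its collection method
# and the points in the service delivery cycle at which it is gathered.
evaluation_plan = [
    {"measure": "Reading achievement", "method": "standardized test",
     "time_points": ["fall", "spring"]},
    {"measure": "Teacher morale", "method": "survey",
     "time_points": ["fall", "winter", "spring"]},
    {"measure": "Classroom practice", "method": "observation",
     "time_points": ["winter", "spring"]},
]

# Print the schedule so gaps (a measure with only one time point, or
# only one method overall) are easy to spot.
for item in evaluation_plan:
    print(f"{item['measure']}: {item['method']} at {', '.join(item['time_points'])}")
```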
Evaluation Strategies
–Surveys designed to document the subjective judgments of the stakeholders about the success of the Program.
–Interviews designed to provide rich supplements to the survey data.
Evaluation Strategies
–Standardized tests and objective measurements.
–Measures designed to assess knowledge gains, attitude changes, or behavior changes.
–Performance assessment data.
Evaluation Strategies
–Observations.
–Case studies of individual Program participants or subgroups. This method can help take a more in-depth look at the subjective experience of selected Program participants.
–Collections of Program products and other artifacts of Program success.
Side Effects
What were the intended and unintended consequences (side effects) of the Program, for each group of stakeholders, across a complete service delivery cycle? Document the positive and negative outcomes of participation in the Program.
Subgroup Analysis
Do the intended and unintended consequences of Program participation vary by subgroup? Ask, “Did it work?” But also ask, “For whom did it work, and under what circumstances?”
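One way such a breakdown might look in practice, assuming participant outcomes are available as pre- and post-scores with a subgroup label — a minimal sketch with hypothetical column names and data, not the Program's actual measures:

```python
import pandas as pd

# Hypothetical participant-level outcomes: pre/post scores plus a
# subgroup label (e.g., grade level, site, or demographic category).
data = pd.DataFrame({
    "subgroup": ["A", "A", "B", "B", "C", "C"],
    "pre":      [52,  61,  48,  55,  60,  58],
    "post":     [63,  70,  50,  57,  74,  71],
})

# Gain score for each participant.
data["gain"] = data["post"] - data["pre"]

# "For whom did it work?" -- average gain, spread, and n by subgroup.
summary = data.groupby("subgroup")["gain"].agg(["mean", "std", "count"])
print(summary)
```

A table like this makes it harder to report a single overall gain while a particular subgroup saw little or no benefit.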
Interpretation
Interpret the data gathered according to standards of growth and improvement that are accepted by the Program and stakeholders. Were the gains or accomplishments large enough to justify continuing the Program?
Interpretation
In addition to gain scores based on pre/post measures, can your interpretations be enhanced by information about:
–Continuous improvement
–Trends in improvements made between the measurements
–Periods of time when the Program worked better than others
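A minimal sketch of how gain scores and between-measurement trends might be summarized when scores are collected at several points across the cycle. The measurement schedule and score values here are illustrative assumptions, not Program data.

```python
import numpy as np

# Hypothetical mean scores at four measurement points across one
# service delivery cycle (e.g., quarterly assessments).
time_points = np.array([0, 1, 2, 3])
mean_scores = np.array([55.0, 58.5, 63.0, 64.0])

overall_gain = mean_scores[-1] - mean_scores[0]      # simple pre/post gain
period_gains = np.diff(mean_scores)                  # improvement between measurements
slope = np.polyfit(time_points, mean_scores, 1)[0]   # average trend per period

print(f"Overall pre/post gain: {overall_gain:.1f}")
print(f"Gains between measurements: {period_gains}")
print(f"Average improvement per period: {slope:.2f}")
print(f"Strongest period: between measurements {np.argmax(period_gains)} and {np.argmax(period_gains) + 1}")
```

The between-measurement gains and the trend line are what let you say more than “scores went up”: they show whether improvement was continuous and when the Program worked best.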
Cost Considerations
–Was all of the funding spent properly in accordance with the objectives?
–Did the Program follow all the financial guidelines from the funding agency?
–Could the same benefits be achieved for less expense?
Cost Considerations
Are these outcomes worth the investment? Formal cost-benefit or cost-effectiveness analysis may be part of a Product evaluation as needed. This may require consultation with financial professionals.
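If a formal analysis is warranted, the core arithmetic of a cost-effectiveness ratio is straightforward; the sketch below uses made-up figures, and a real analysis would rest on audited costs and the advice of financial professionals.

```python
# Hypothetical figures: total Program cost for one service delivery
# cycle and the average outcome gain per participant.
total_cost = 120_000.00   # dollars spent on the Program this cycle
participants = 300
avg_gain = 6.5            # average gain-score points per participant

cost_per_participant = total_cost / participants
cost_per_gain_point = cost_per_participant / avg_gain

print(f"Cost per participant: ${cost_per_participant:,.2f}")
print(f"Cost per gain-score point: ${cost_per_gain_point:,.2f}")

# Comparing this ratio with that of an alternative program is one way
# to address "Could the same benefits be achieved for less expense?"
```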
Summative Conclusions
Make a summary judgment about whether the Program is worth repeating for another service delivery cycle. Make a summary judgment about what outcomes can be attributed to the Program. Cause and effect are hard to establish.
Summative Conclusions
Make sure your interpretations of the Product evaluation are grounded in the results of the Context, Input, and Process phases of the evaluation.
Formative Conclusions
The Product phase consists of Summative evaluation activities. However, some Formative conclusions can come out of the Product phase. Include recommendations about Program modification, extension, or export to other settings.