Evaluation Planning II: Setting Boundaries and Analyzing the Evaluation Context Dr. Suzan Ayers Western Michigan University (courtesy of Dr. Mary Schutten)

2 Four considerations
- Identifying evaluation audiences
- Setting boundaries on whatever is evaluated
- Analyzing evaluation resources
- Analyzing the political context

3 1. Audience Identification
- An evaluation is adequate only if it collects information from, and reports to, all legitimate evaluation audiences
- Primary audience: sponsor and client
- Secondary audiences: depend on how the evaluator defines constituents
- It is common to limit the evaluation to too narrow an audience
- See Figure 11.1 (p. 202)
- Return to the list of audiences periodically
- Who will use the results, and how, is key to outlining the study

4 Potential Secondary Audiences
- Policy makers
- Managers
- Program funders
- Representatives of program employees
- Community members
- Students and their parents (or other program clients)
- Retirees
- Representatives of influence groups

5 2. Setting the Boundaries
- Starting point: a detailed description of the program being evaluated
- Program description: describes the critical elements of the program (goals, objectives, activities, target audiences, physical setting, context, personnel)
- The description must be thorough enough to convey the program's essence

6 Characterizing the Evaluand
- What problem was the program designed to correct?
- Of what does the program consist?
- What is the program's setting and context?
- Who participates in the program?
- What is the program's history? Its duration?

7 Characterizing the Evaluand (continued)
- When and under what conditions is the program implemented?
- Are there unique contextual events (contract negotiations, budget, elections...) that may distort the evaluation?
- What resources (human, material, time) are consumed by the program?
- Has there been a previous evaluation?

8 Program Theory
- Specification of what must be done to achieve the desired goals, what other impacts may be anticipated, and how the goals and impacts would be generated (Chen, 1990)
- Serves as a tool for:
  - Understanding the program
  - Guiding the evaluation
- Evaluators must understand the assumptions that link the problem to be resolved with the program's actions and characteristics, and those that link those actions and characteristics with the desired outcomes

9 Helpful in developing program theory (Rossi, 1971)
Three hypotheses are helpful in developing program theory:
1. Causal hypothesis: links the problem to a cause
2. Intervention hypothesis: links program actions to the cause
3. Action hypothesis: links the program activities with reduction of the original problem

Sample problem: declining fitness levels in children
- Causal hypothesis?
- Intervention hypothesis?
- Action hypothesis?

10 Methods for Describing the Evaluand
- Descriptive documents
  - Program documents, proposals for funding, publications, minutes of meetings, etc.
- Interviews
  - Stakeholders, all relevant audiences
- Observations
  - Observe the program in action to get a "feel" for what is really going on
  - Often reveal differences between how the program runs and how it is supposed to run

11 Describing the Evaluand (continued)
- Challenge of balancing different perspectives
  - Minor differences may reflect stakeholder values or positions and can be informative
  - Major differences require that the evaluator attempt to achieve some consensus description of the program before initiating the evaluation
- Redescribing the evaluand as it changes; changes may be due to:
  - Responsiveness to feedback
  - Implementation not quite aligned with the designers' vision
  - Natural historical evolution of the evaluand

12 3. Analyzing Evaluation Resources: Budget
- Cost-free evaluation: cost savings realized via the evaluation may pay for the evaluation over time
- If budget limits are set before the evaluation process begins, they will affect the planning decisions that follow
- Often the evaluator has no input into the budget
- Offer two or three levels of service (Chevy vs. BMW)
- Budgets should remain somewhat flexible so the evaluation can pursue new insights that emerge during the process

13 Analyzing Resources: Personnel
- Can the evaluator use 'free' staff on site?
  - Program staff could collect data
  - Secretaries could type and search records
  - Graduate students doing internships or course-related work
  - PTA members
- It is key that the evaluator ORIENT, TRAIN, and QUALITY-CHECK such volunteers to maintain the evaluation's integrity
  - Supervision and spot-checking are useful practices
  - Task selection is essential to maintain the study's validity and credibility

14 Analyzing Resources: Technology, Other Resources, and Constraints
- The more information that must be generated by the evaluator, the costlier the evaluation
- Are existing data, records, evaluations, and other documents available?
- Newer technology permits less expensive means of data collection
  - Web-based surveys, e-mail, conference calls, posting final reports on websites
- Time (avoid setting unrealistic timelines)

15 4. Analyzing the Political Context
- Politics begin with the decision to evaluate and influence the entire evaluation process
- Who stands to gain or lose the most from different evaluation scenarios?
- Who has the power in this setting?
- How is the evaluator expected to relate to different groups?
- From which stakeholders will cooperation be required? Are they willing to cooperate?
- Who has a vested interest in the outcomes?
- Who will need to be informed along the way?
- What safeguards need to be formalized (e.g., IRB approval)?

16 Variations Caused by the Evaluation Approach Used
- Variations in the evaluation plan will occur based on the approach taken by the evaluator
- Each approach has strengths and limitations
- Review Table 9.1 for the characteristics of each
- Use of a single approach tends to be limiting

17 To Proceed or Not?
- Based on information about the context, program, stakeholders, and resources, make the 'go/no-go' decision
- Chapter 10 describes conditions under which evaluation is inappropriate:
  - The evaluation would produce trivial information
  - The evaluation results will not be used
  - The evaluation cannot yield useful, valid information
  - The evaluation is premature for the stage of the program
  - The motives of the evaluation are improper
- Weigh ethical considerations (utility, feasibility, propriety, accuracy)
