Focusing Your Evaluation: A Planning Template
Discerning Readiness: Evaluate no program before its time
Internal Chemistry: Objectives
- Target program selected from a priority area
- Program concept and practice well-developed
- Logic model complete enough to explain to others
- Objectives reviewed as Specific, Measurable, Achievable, Realistic, Time-bound
Discerning Readiness: Evaluate no program before its time
Internal Chemistry: Capable, trained staff
- Dependable, adaptable
- Implement the program with fidelity
- Able and willing to accommodate evaluation
Discerning Readiness: Evaluate no program before its time
Internal Chemistry
- Recruitment of target audience
  - Consistent, sizeable group
  - Engaged at expected levels of intensity, frequency, duration
- Time to implement and refine the program
  - Integrate evaluation into or beyond activities
  - Use waves of feedback to demonstrate improvements and effects
Discerning Readiness: Evaluate no program before its time
Internal Chemistry
- Time and money to conduct the evaluation
  - Planning-implementation-evaluation priority reflected in the annual and daily plan of work and budget
- Expertise to plan, gather, and analyze data
  - Personal and staff skills and values adequate to perform
  - Appropriate consultants engaged in a timely manner
Discerning Readiness: Evaluate no program before its time
External Chemistry
- Clear objectives, understood by stakeholders
  - Stakeholders clear about program objectives
  - Program leader clear about stakeholder expectations
- Constructive relationships with staff
  - Staff clear about evaluation purpose and procedures
  - Program leader clear about staff expectations and abilities
Discerning Readiness: Evaluate no program before its time
External Chemistry
- Reasonable timeline for program development
  - Expectations consistent with research and reason
  - Evaluation questions match program maturity
- Resources and assistance to evaluate
  - 10% added or accommodated
  - For high-level evaluations (control or comparison groups), additional staff, consultant, and collaborator assistance (e.g., teachers help with parent consent and data collection)
Discerning Readiness: Evaluate no program before its time
External Chemistry
- Support for gathering, analyzing, and using data
  - Human resource support: staff time, consulting
  - Technology: hardware and software
- Organizational effectiveness: coordinated efforts resulting in
  - Greater understanding and support of the program
  - Increased capacity of individuals
  - Strengthened relationships among stakeholders
  - Increased organizational capacity to serve and lead
Determining Purpose: Sweeten to taste
Stakeholder Perspectives
Citizens: Does it make a difference?
- Short-term outcomes: how well this Program addresses priority state and local issues
- Mid-term outcomes: how this Program builds capacities for learning, leading, relating, and serving
- Long-term outcomes: how this Program will improve the status and culture of the community relative to health, education, and economic prosperity
Determining Purpose: Sweeten to taste
Stakeholder Perspectives
Funding Agency: Is it cost-effective?
- Short-term outcomes: how the money was used appropriately (consistent with goals) and efficiently (consistent with good management principles)
- Mid-term outcomes: how this Program builds capacities for sustaining and expanding learning
- Long-term outcomes: how this Program will improve the organization's ability to manage, innovate, build community capacity, and change systems
Determining Purpose: Sweeten to taste
Stakeholder Perspectives
Parents: Is my child safe? Learning?
- Short-term outcomes: How does the Program provide a safe, supportive setting led by high-skill, high-character staff? How well does the Program train youth in areas related to the advertised goals?
- Mid-term outcomes: How does this Program provide a sequence of learning opportunities and keep my child interested in belonging?
- Long-term outcomes: How does this Program help my child get into college or a career?
Determining Purpose: Sweeten to taste
Stakeholder Perspectives
Staff: Are we running a quality program? Is this worth my effort?
- Short-term outcomes: How does the Program equip me with training and support to follow best practices?
- Mid-term outcomes: How does our training (especially in evaluation) help us provide a sequence of learning opportunities and keep youth interested in belonging?
- Long-term outcomes: What keeps our Program "state of the art"?
Determining Purpose: Sweeten to taste
Stakeholder Perspectives
Youth: Are we having a good time?
- Short-term outcomes: Does the Program help me learn hands-on and have a good time with friends?
- Mid-term outcomes: What will I be able to do in another year? (Will it still be fun?)
- Long-term outcomes: What keeps our Program "state of the art"?
Determining Purpose: Sweeten to taste
Evaluation Standards that shape purpose
- Utility: How will this evaluation be useful?
- Feasibility: Is the evaluation realistic?
- Propriety: Can this be done ethically?
- Accuracy: Will the concept and technical skill of this evaluation produce sound, accurate findings?
Source: The Evaluation Center
Deciding on Level/Type: Checking the vintage (taste test)
Formative (performance testing):
- needs assessment
- evaluability assessment
- structured conceptualization
- implementation evaluation
- process evaluation
Summative (long-term effects):
- outcome evaluation
- impact evaluation
- cost-effectiveness/cost-benefit analysis
- secondary analysis
- meta-analysis
Deciding on Level/Type: Checking the vintage (taste test)
Formative Evaluation: ongoing feedback about performance
- Needs assessment: determines the target audience, needs, and best strategies
- Evaluability assessment: determines feasibility and usefulness
- Structured conceptualization: defines the program or technology, the target population, and the possible outcomes
- Implementation evaluation: monitors the fidelity of the program or technology delivery
- Process evaluation: investigates the process of delivering the program or technology, including alternative delivery procedures
Source: Research Methods Knowledge Base
Deciding on Level/Type: Checking the vintage (taste test)
Summative Evaluation: long-term effects or outcomes of a program
- Outcome evaluation: explores whether and how much a program produces targeted results
- Impact evaluation: examines the broader effects of the program or technology as a whole
- Cost-effectiveness and cost-benefit analysis: address questions of efficiency by standardizing outcomes in dollar costs and values
- Secondary analysis: reexamines existing data to address new questions or use methods not previously employed
- Meta-analysis: integrates the outcome estimates from multiple studies to arrive at an overall or summary judgement on an evaluation question
Source: Research Methods Knowledge Base
Distilling Questions: Made to Order
Program Quality and Best Practice
- Safety
- Support
- Structure
- Social acceptance
- Service-orientation
- Skill-building
- Self-efficacy
- Synergy with family and community
Distilling Questions: Made to Order
Program Improvement by adjusting
- Developmental appropriateness
- Learning style
- Topical focus/relevance
- Intensity
- Frequency
- Duration
Distilling Questions: Made to Order
Youth Outcomes (borrow from the logic model)
- Short-term: knowledge, attitude, skill, aspirations
- Mid-term: behavior change, application
- Long-term: social, economic, environmental, and cultural change
Developing a Plan: Tapping the Casks
Eat Right indicators
- Healthy diet
- Reduced junk food
Move More indicators
- Flexibility
- Lung capacity
- Reduced screen time
- Healthy weight
Developing a Plan: Tapping the Casks
Data Sources
- Child
- Youth leader
- Parents
- Others impacted by the child's work (team members or helpees)
- Generic data (school records, community data)
Developing a Plan: Tapping the Casks
Data Collection Options
- Test, survey, interview
- Portfolio, demonstration
- Observation, case study
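To show how the indicators, data sources, and collection options above can hang together in one place, here is a minimal Python sketch of a plan record built from the Eat Right / Move More example. The class names (Indicator, Outcome) and the particular pairings of sources with methods are hypothetical illustrations, not part of the template; a spreadsheet captures the same information equally well.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One measurable sign of progress toward an outcome."""
    name: str
    data_sources: list        # e.g., child, parents, school records
    collection_method: str    # e.g., survey, observation, test

@dataclass
class Outcome:
    """An objective from the logic model with its indicators."""
    name: str
    indicators: list = field(default_factory=list)

# Hypothetical plan built from the "Eat Right" / "Move More" slides
plan = [
    Outcome("Eat Right", [
        Indicator("Healthy diet", ["child", "parents"], "survey"),
        Indicator("Reduced junk food", ["child"], "survey"),
    ]),
    Outcome("Move More", [
        Indicator("Flexibility", ["youth leader"], "observation"),
        Indicator("Lung capacity", ["youth leader"], "test"),
        Indicator("Reduced screen time", ["parents"], "interview"),
        Indicator("Healthy weight", ["school records"], "generic data"),
    ]),
]

# Quick check that every indicator has at least one source and a method
for outcome in plan:
    for ind in outcome.indicators:
        assert ind.data_sources and ind.collection_method, ind.name
        print(f"{outcome.name}: {ind.name} <- "
              f"{', '.join(ind.data_sources)} ({ind.collection_method})")
```

Keeping the focusing decisions in a form like this makes it easy to spot an indicator with no data source or collection method before fieldwork begins.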
Developing a Plan: Tapping the Casks
Data Collection Protocol
- Ethical
- Educational
- Effective
Developing a Plan: Tapping the Casks
Analyzing Data
- Qualitative: trends, anecdotes, comments
- Quantitative: individual growth, group patterns, correlated elements
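For the quantitative side, the sketch below shows one way to turn "individual growth, group patterns, correlated elements" into concrete calculations, assuming pre/post scores for two indicators have already been collected. The column names and numbers are invented for illustration, and pandas is simply a common choice for this kind of tabular work, not a requirement of the template.

```python
import pandas as pd

# Hypothetical pre/post scores for two indicators (invented numbers)
df = pd.DataFrame({
    "participant": ["A", "B", "C", "D"],
    "diet_pre":  [2, 3, 1, 4],
    "diet_post": [4, 3, 3, 5],
    "activity_pre":  [1, 2, 2, 3],
    "activity_post": [3, 4, 2, 4],
})

# Individual growth: change score for each participant
df["diet_growth"] = df["diet_post"] - df["diet_pre"]
df["activity_growth"] = df["activity_post"] - df["activity_pre"]

# Group patterns: average change across the whole group
print(df[["diet_growth", "activity_growth"]].mean())

# Correlated elements: do gains on one indicator track gains on the other?
print(df["diet_growth"].corr(df["activity_growth"]))
```

With real data, sample size and measurement quality would need checking before reading anything into the correlation; the point here is only how the three quantitative questions map onto simple calculations.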
Directing Implementation: Managing the Vineyard
- Step-by-step procedures
- Budgeting
- Timeline
- Monitoring roles
- Critique
You can’t find out what you want to know until you know what you want to find.