Too expensive
Too complicated
Too time consuming
Not a priority
Just don’t know where to start
Lack of research/statistics skills
Lack of time
Lack of resources
Other priorities
Lack of incentive
Fear
Don’t see value
The process of determining the merit, worth, or value of a program (Scriven, 1991)
Systematic inquiry that describes and explains policies’ and programs’ operations, effects, justifications, and social implications (Mark, Henry, & Julnes, 2000)
The systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs (Rossi & Freeman, 1989)
In simpler terms: the collection of information to determine the VALUE of a program (eVALUation)
Auditing
Personnel assessment
Monitoring (although this can be part of an evaluation process)
Used to end or shut down programs
Evaluation is an extraneous activity that generates lots of boring data with useless conclusions
Evaluation is about proving the success or failure of a program
Evaluation is a unique and complex process that occurs at a certain time in a certain way, and almost always includes the use of outside experts.
Demonstrate program effectiveness or impacts
Better manage limited resources
Document program accomplishments
Justify current program funding
Support need for increased funding
Satisfy ethical responsibility to clients to demonstrate positive and negative effects of participation
Document program development and activities to help ensure successful replication
To improve program performance, which leads to better value for your resources
No evidence that your program is working or how it works
Lack of justification for new or increased funding
No marketing power for potential clients
Lack of credibility
Lack of political and/or social support
No way to know how to improve
Development → Implementation → Evaluation → Revision → Sustainability
Types of Evaluation
› Outcome (summative)
› Process (formative)
Outcomes
Indicators
Measures
Benchmarks
Quantitative vs. qualitative
Engage stakeholders
Clearly define program
Written evaluation plan
Collect credible/useful data
Analyze data
Share/use results
Those involved in program design, delivery, and/or funding
Those served by the program
Users of the evaluation results
Resources, activities, outcomes
Context in which program operates
Logic model
› Explicit connections between “how” and “what”
› Helps with program improvement
› Good for sharing program idea with others
› Living, breathing model
IF I take an aspirin, THEN my headache will go away
IF = Inputs & Activities THEN = Outcomes
Outcomes
Indicators
Tools
Timelines
Person(s) responsible (optional)
PROGRAM OUTCOME: Training participants know how to recognize a seizure
INDICATOR(S): Percent of training participants who correctly identify 10 out of 13 possible symptoms of a seizure
DATA COLLECTION TOOL: Participant pre, post, and follow-up surveys
DATA COLLECTION SCHEDULE: Pre survey given prior to training; post survey given immediately after training; follow-up survey given 30 days after training
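As a rough sketch of how an indicator like this might be tallied once surveys are scored (a minimal Python example; the scores and participant counts below are hypothetical stand-ins, not results from the actual evaluation):

```python
# Minimal sketch: tallying the example indicator above.
# Assumes each participant's survey has already been scored as the
# number of seizure symptoms (out of 13) correctly identified.
# All values here are hypothetical.

post_scores = [12, 9, 11, 13, 8, 10, 11, 7, 12, 10]  # one score per participant

THRESHOLD = 10  # indicator: correctly identify 10 of the 13 symptoms

met = sum(1 for score in post_scores if score >= THRESHOLD)
percent_met = 100 * met / len(post_scores)

print(f"{met} of {len(post_scores)} participants "
      f"({percent_met:.0f}%) met the indicator")
```

Run separately against the pre, post, and follow-up surveys, the same tally shows whether knowledge gains hold 30 days after training.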
Valid and reliable tools
› Valid = measures what it is intended to measure
› Reliable = consistent results over time
Qualitative and quantitative
Will answer your evaluation questions and inform decision-making
Quantitative
› Surveys
› Tests
› Skill assessments
Qualitative
› Focus groups
› Interviews
› Journals
› Observations
Many methods
Answer evaluation questions
Engage stakeholders in interpretations
Justify conclusions and recommendations
Get help if needed!
Reporting format
Getting results into the right hands
Framing the results
Collaborative vs. confrontational approach
Keeping users “in the loop”
Debriefs and follow-up
Purpose
Audience
Resources
Data
Timeline
Expertise
Planning is key
Staff to perform work
› Expertise
› Available
Credibility
Technological support
› Collect data
› Analyze data
Time frame
Training program for caretakers of seniors with epilepsy/seizures
ADC staff and primary care providers
Training provided by affiliates
Delivery varies but content is consistent
Meeting with EF staff to learn about the program
Collaboration with affiliate staff to design logic model
Decisions regarding which outcomes to measure
Decisions regarding how to best collect data
Designed data collection tools
Pilot testing and revision
What impact did the training program have on knowledge of seizures in seniors?
› Pre and post knowledge assessment
› Post-training survey
What impact did the training program have on participants’ confidence and comfort in working with seniors?
› Post-training survey
Our benchmark is a rating of 7.0 or higher
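For context, a minimal sketch of checking survey results against that benchmark (Python; the ratings and the 1-to-10 scale are assumed for illustration only, as the source does not specify them):

```python
# Minimal sketch: comparing mean confidence/comfort ratings from the
# post-training survey against the 7.0 benchmark.
# Ratings and the 1-10 scale are hypothetical assumptions.

ratings = [8, 7, 9, 6, 8, 7, 9, 8]

BENCHMARK = 7.0
mean_rating = sum(ratings) / len(ratings)

verdict = "meets" if mean_rating >= BENCHMARK else "falls below"
print(f"Mean rating {mean_rating:.2f} {verdict} the {BENCHMARK} benchmark")
```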
Kathleen Dowell, Ph.D., President
EvalSolutions
6408 Whistling Wind Way
Mt. Airy, MD