Published by Rudolph Powell, modified over 8 years ago
1
Elspeth Slayter, Ph.D., Assistant Professor School of Social Work, Salem State University
2
Administrative matters & check-in
Research as a second language
Assignment #1
On over-arching questions, et alia
Implementing evidence-supported interventions
Introduction to program evaluation
Intro to theory
Dyad time
3
Reading on implementation
Novel Unsupported Treatments (NUTS)
Why should we care about NUTS?
You tell me what you want to check in about!
4
Critical consumption of research AND skills to evaluate practice
Learn to critically consume research
Learn to develop practice evaluation plans
Consider the process of evidence-based practice beyond evidence-supported interventions
5
Research: consumption and production
Program evaluation: production
Process of EBP: what to do with clients
6
Over-arching research question (umbrella)
Study aims (points on the umbrella)
7
Area of interest
Problem area
Existing knowledge/theory
Research question
Specific aims, hypotheses
8
Research (exploratory, descriptive, explanatory, constructing measurement instruments): qualitative or quantitative
Program evaluation (can include descriptive, constructing measurement instruments): process/formative or outcome/summative
9
Process of evidence-based or informed practice
A specific evidence-supported intervention for a unique setting/population
10
11
1. Formulate a question to answer practice needs
2. Search for the evidence
3. Critically appraise the relevant studies you find
4. Determine which evidence-based intervention is most appropriate for your clients
5. Apply the evidence-based intervention
6. Evaluate your intervention (feedback loop)
The PROCESS of evidence-based practice is different from evidence-based practices themselves; it may be easier to call the latter "evidence-supported interventions."
12
13
Choose intervention thoughtfully (with or without research) → Implement → Evaluate
Research: qualitative or quantitative
Program evaluation: process/formative or outcome/summative
14
Area of interest (a.k.a. problem area)
Research vs. evaluation
Purposes of research
Matching questions and methods
Evidence-supported interventions
Process of evidence-based practice
Over-arching research question
Study aims
Hypothesis/hypotheses
15
Area of interest: childhood obesity
Research question: What factors are predictive of childhood obesity?
Hypothesis: Community factors are more likely to predict higher rates of obesity than demographic factors
Null hypothesis: Community factors are no more likely to predict obesity than demographic factors
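The childhood-obesity example above can be made concrete with a quick hypothesis test. The counts below are invented purely for illustration (not real data), and a two-proportion z-test is just one of several tests that could pit a community factor against the null hypothesis of "no difference":

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical counts (illustrative only, not real data):
# obese children / total, split by presence of a community risk
# factor (e.g., no safe play spaces) vs. a comparison group.
obese_exposed, n_exposed = 78, 300    # community risk factor present
obese_control, n_control = 54, 300    # community risk factor absent

p1 = obese_exposed / n_exposed
p2 = obese_control / n_control
p_pool = (obese_exposed + obese_control) / (n_exposed + n_control)

# Two-proportion z-test of H0: p1 == p2 (the null hypothesis)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_exposed + 1 / n_control))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"z = {z:.2f}, p = {p_value:.4f}")
# Reject the null at alpha = 0.05 only if p_value < 0.05
```

With these invented counts the test rejects the null, but the point is the workflow: the research question fixes the variables, the hypothesis fixes the direction, and the null is what the test actually evaluates.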
20
What are the potential implementation challenges to my chosen intervention in my chosen setting?
29
Before social work intervention
30
During beginning of social work intervention
31
Towards end of social work intervention
32
At end or after social work intervention
33
What happens if there is too much water?
What happens if the water is tainted?
What happens if there is not enough water?
What happens if there is not enough sun?
What happens if the bulb gets dug up?
34
Measure inputs:
Enough/safe water used?
Enough sun provided?
Ground not dug up?
Lawnmower/deer/rabbits didn't eat green shoots?
Measure outcomes:
How was the flower? How long did it last?
35
Was steel delivered on time?
Was the steel faulty?
Was there a worker strike?
Were there unexpected design/building challenges?
36
Measure inputs:
Correct steel used?
Rivets installed correctly?
Rust inhibitor applied correctly?
Design not faulty?
Measure outcomes:
Completed on time?
Actually a sturdy structure?
Works as planned?
How long did it last before needing repair?
37
Was the chosen treatment delivered according to the treatment plan?
Were adjustments needed to the treatment plan?
How did the young man respond to treatment?
Was a course correction needed?
How did the young man function at the end of treatment?
38
Measure inputs:
Treatment delivered as planned
Order of treatment made sense
Regular meetings with therapist
Measure outcomes:
Goal reached at end of treatment?
Retention of goal functioning?
Relapse?
39
What is needed?
Are you accomplishing your goals along the way?
Are your clients achieving their goals?
How do costs/inputs factor into the process?
40
Document program processes (implementation) and outcomes (success)
Identify program strengths and weaknesses
Improve the program (effectiveness, impact)
Support program planning and development
Demonstrate how the use of resources justifies the investment of time, money, and labor
Meet local, state, and federal accountability measures
41
Evaluation helps you monitor the resources you've put into a program: money, time, expertise, energy
Assessment of goals, objectives, and reality
Helps determine the value of a product, process, or program (eVALUation)
42
Dorothea Dix: treatment for people with mental illness
Seeking to define recovery, "discharge" was used as the operationalization (a 90% success rate!)
Growth of program evaluation post-WWII: an "age of accountability" driven by funding cuts
Impact of managed care: evaluation embraced to control costs, promoting efficiencies in treatment
Critiqued for poor matches between methods and questions
43
Vested interests abound
Not wanting to hear "bad news," even in the guise of data for program improvement (financial incentives)
Use of unskilled/inexperienced researchers who may not apply the best critical thinking about research methods: question-method match, instruments, data collection approaches
44
1. Identify stakeholders and learn about them
2. Involve all in planning the evaluation (obtain buy-in)
3. Develop a logic model
4. Assure everyone that feedback is built in
5. Determine the format of the report needed
6. Present negative data thoughtfully
7. Make realistic recommendations with a positive spin
(See page 328)
45
A graphic portrayal depicting the essential elements of a program:
How goals/objectives link to those elements
Links to short-term process measures (measurable indicators of success)
Links to longer-term outcome measures (measurable indicators of success)
(See pages 342-343)
46
Type depends on purpose & timing
Formative: process, implementation, needs assessment
Summative: outcome, cost-effectiveness, cost-benefit
47
Formative: used before and while the program is running; collect and analyze data at various intervals; make program improvements along the way
Summative: used at the end of the program; summarize outcomes and results
48
Ideally, more than one method is used:
Survey key informants
Hold a community forum
Examine existing data (rates under treatment)
Examine existing data (social indicators)
Conduct a targeted survey
49
Measuring progress along the way
Intermediate goals
Can be a repeated measure (think: tracking)
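The "repeated measure" idea above can be sketched in a few lines: record the same measurement at regular intervals and look at the trend toward the intermediate goal. The weekly scores below are invented for illustration (a lower score is assumed to mean improvement, as on a depression screen):

```python
# Hypothetical repeated-measure tracking for process evaluation:
# a client's weekly symptom-screen score (lower is better).
weekly_scores = [21, 19, 18, 15, 14, 12]  # illustrative values only

def change_per_week(scores):
    """Average change between consecutive measurements."""
    deltas = [b - a for a, b in zip(scores, scores[1:])]
    return sum(deltas) / len(deltas)

trend = change_per_week(weekly_scores)
print(f"Average change per week: {trend:+.1f}")
if trend < 0:
    print("Intermediate goal on track: scores are falling.")
```

Tracking the same instrument at each session is what makes the measure "repeated"; the trend, not any single score, is the evidence of progress along the way.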
50
Ensure that all program components are being properly and consistently implemented
Use when introducing a new program
Standardized implementation? Are all sites using program components in the proper way?
51
Identify the results or effects of your program
Measure how your program participants' knowledge, attitudes, and behaviors have changed as a result of your program
52
Cost-benefit: outcomes considered use monetary units, e.g., victimization, criminal justice expenses, receipt of social-welfare-derived income transfers
Cost-effectiveness: assesses the relative efficiency of alternative approaches to improving outcomes; classically uses health conditions as outcomes; such studies create indices to relate non-financially-defined outcomes to the costs of alternatives
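The cost-effectiveness idea reduces to simple arithmetic: divide each program's cost by its non-monetary outcome, then compare. The two programs, their costs, and their remission counts below are entirely hypothetical:

```python
# Illustrative cost-effectiveness comparison (all figures hypothetical):
# two treatment programs whose outcome is measured in
# "cases reaching remission" rather than dollars.
programs = {
    "Program A": {"cost": 120_000, "remissions": 40},
    "Program B": {"cost": 150_000, "remissions": 60},
}

for name, p in programs.items():
    ratio = p["cost"] / p["remissions"]  # cost per remission achieved
    print(f"{name}: ${ratio:,.0f} per remission")

# Incremental cost-effectiveness ratio (ICER): the extra cost of B
# over A per additional remission gained.
a, b = programs["Program A"], programs["Program B"]
icer = (b["cost"] - a["cost"]) / (b["remissions"] - a["remissions"])
print(f"ICER (B vs. A): ${icer:,.0f} per additional remission")
```

A cost-benefit analysis would instead convert the remissions themselves into dollars (avoided victimization, reduced criminal justice expenses, and so on) so that both sides of the ratio are monetary.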
53
Community Resources for Justice, Inc.: implementation of a treatment paradigm among all line-level staff; client satisfaction survey for needs assessment
Youth Opportunities Upheld (YOU), Inc.: effectiveness of a new therapeutic approach for major depression among women
54
Work in dyads; start by explaining your assignment ideas to someone new!