Elspeth Slayter, Ph.D., Associate Professor, School of Social Work, Salem State University.

Presentation transcript:

Elspeth Slayter, Ph.D., Associate Professor, School of Social Work, Salem State University

Administrative matters & check-in
Questions about Assignment #1?
Research as a second language
Implementing evidence-supported interventions
Introduction to program evaluation
Short consultation sessions

…questions about syllabus, assignments, break time, other announcements?

Textbook readings
Article reading
Assignment #1
Other (you tell me!)

Today's class: critical consumption of research AND skills to evaluate practice
Learn to critically consume research
Learn to develop practice evaluation plans
Consider the process of evidence-based practice beyond evidence-supported interventions

All are types of research:
Looking up references, compiling existing information
Practice evaluation (a.k.a. program evaluation)
Social research that informs social work practice in some way

Research: ??
Program evaluation: ??

Qualitative: ?
Quantitative: ?

Choose intervention thoughtfully (with or without research) → Implement → Evaluate
Research: qualitative, quantitative
Program evaluation: process/formative, outcome/summative

Process of evidence-based or evidence-informed practice
An evidence-supported intervention for a unique setting/population

All are types of research:
Common speech: looking up references, compiling existing information
Formal speech: practice evaluation (a.k.a. program evaluation)
Formal speech: social research that informs social work practice in some way

Area of interest → Problem area → Existing knowledge/theory → Research question → Specific aims, hypotheses

Overarching research question (the umbrella)
Study aims (points on the umbrella)

Review of this week’s article

Before social work intervention

During beginning of social work intervention

Towards end of social work intervention

At end or after social work intervention

What happens if there is too much water? What happens if the water is tainted? What happens if there is not enough water? What happens if there is not enough sun? What happens if the bulb gets dug up?

Measure inputs: Enough/safe water used? Enough sun provided? Ground not dug up? Lawnmower/deer/rabbits didn't eat the green shoots?
Measure outcomes: How was the flower? How long did it last?

Was steel delivered on time? Was the steel faulty? Was there a worker strike? Were there unexpected design/building challenges?

Measure inputs: Correct steel used? Rivets installed correctly? Rust inhibitor applied correctly? Design not faulty?
Measure outcomes: Completed on time? Actually a sturdy structure? Works as planned? How long did it last before needing repair?

Was chosen treatment delivered according to treatment plan? Were adjustments needed to treatment plan? How did young man respond to treatment? Was a course correction needed? How did young man function at end of treatment?

Measure inputs: Treatment delivered as planned? Order of treatment made sense? Regular meetings with therapist?
Measure outcomes: Goal reached at end of treatment? Retention of goal-level functioning? Relapse?

What is needed? Are you accomplishing your goals along the way? Are your clients achieving their goals? How do costs/inputs factor into the process?

Document program processes (implementation) and outcomes (success)
Identify program strengths and weaknesses
Improve the program (effectiveness, impact)
Support program planning and development
Demonstrate how the use of resources justifies the investment of time, money, and labor
Meet local, state, and federal accountability measures

Evaluation helps you monitor the resources you've put into a program: $$$, time, expertise, energy
Assessment of goals, objectives, and reality
Helps determine the value of a product, process, or program (eVALUation)

Dorothea Dix – treatment for people with mental illness; seeking to define recovery, she used "discharge" as the operationalization (a 90% success rate!)
Growth of program evaluation post-WWII – an "age of accountability" through $$$ cuts
Impact of managed care – evaluation embraced to control costs, promoting efficiencies in treatment
Critiqued for a poor match between methods and questions

Vested interests abound: not wanting to hear "bad news," even in the guise of data for program improvement ($$$ incentives)
Use of unskilled/inexperienced researchers who may not apply the best critical thinking to research methods:
Question-method match
Instruments
Data collection approaches

1. Identify stakeholders and learn about them
2. Involve all in planning the evaluation (obtain buy-in)
3. Develop a logic model
4. Assure all that feedback is built in
5. Determine the format of report needed
6. Present negative data thoughtfully
7. Make realistic recommendations with a positive spin
(See page 328)

Graphic portrayal depicting the essential elements of a program:
How goals/objectives link to those elements
Links to short-term process measures, with measurable indicators of success
Links to longer-term outcome measures, with measurable indicators of success
(See pages )
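Though usually drawn as a diagram, the same structure can be sketched as data. A minimal illustration in Python; the goal, activities, measures, and targets below are all hypothetical, invented for illustration rather than taken from any actual program:

```python
# A minimal sketch of a logic model as a data structure.
# Every element below (goal, activities, indicators, targets)
# is a hypothetical example, not from an actual program.
logic_model = {
    "goal": "Reduce depressive symptoms among adult clients",
    "activities": ["Weekly CBT sessions", "Monthly case-management check-ins"],
    # Short-term process measures: was the program delivered as planned?
    "process_measures": [
        {"indicator": "CBT sessions delivered per client", "target": 12},
        {"indicator": "Client attendance rate", "target": 0.80},
    ],
    # Longer-term outcome measures: did participants change?
    "outcome_measures": [
        {"indicator": "Mean drop in depression score", "target": 5},
        {"indicator": "Share below clinical cutoff at 6 months", "target": 0.50},
    ],
}

# Each measure carries a measurable indicator of success,
# mirroring the links in the graphic version.
for m in logic_model["process_measures"] + logic_model["outcome_measures"]:
    print(f"{m['indicator']}: target {m['target']}")
```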

Type depends on purpose & timing
Formative: process, implementation, needs assessment
Summative: outcome, cost-effectiveness, cost-benefit

Formative: used before the program and while the program is running; collect and analyze data at various intervals, make changes as needed, and make program improvements along the way
Summative: used at the end of the program to summarize outcomes and results

Ideally, more than one method is used:
Survey key informants
Hold a community forum
Examine existing data – rates under treatment
Examine existing data – social indicators
Conduct a targeted survey

Measuring progress along the way
Intermediate goals
Can be a repeated measure (think: tracking)
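A repeated measure can be as simple as a score logged per client each week and compared against baseline. A minimal sketch of what that tracking might look like, with invented client IDs and scores:

```python
# A minimal sketch of tracking a repeated measure over time.
# Client IDs and weekly scores are invented for illustration;
# lower scores are assumed to mean improvement.
from statistics import mean

weekly_scores = {
    "client_a": [18, 16, 15, 12, 10],
    "client_b": [20, 19, 17, 17, 14],
}

for client, scores in weekly_scores.items():
    change = scores[-1] - scores[0]  # latest score minus baseline
    trend = "improving" if change < 0 else "not improving"
    print(f"{client}: baseline={scores[0]}, latest={scores[-1]}, "
          f"mean={mean(scores):.1f}, {trend}")
```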

Ensure that all program components are being properly and consistently implemented
Use when introducing a new program
Standardized implementation? Are all sites using program components in the proper way?

Identify the results or effects of your program
Measure how your program participants' knowledge, attitudes, and behaviors have changed as a result of your program
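One common way to quantify that change is a pre/post comparison on the same participants. A minimal sketch, assuming scipy is available and using invented scores:

```python
# A minimal pre/post outcome comparison using a paired t-test.
# All scores are invented for illustration; requires scipy.
from scipy.stats import ttest_rel

pre = [10, 12, 9, 15, 11, 13, 8, 14]     # scores before the program
post = [14, 15, 13, 18, 12, 17, 11, 16]  # same participants, after

t_stat, p_value = ttest_rel(post, pre)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean change = {mean_change:.2f} points, t = {t_stat:.2f}, p = {p_value:.3f}")
```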

Cost-benefit: outcomes considered are expressed in monetary units, e.g., victimization, criminal justice expenses, receipt of social-welfare-derived income transfers
Cost-effectiveness: assesses the relative efficiency of alternative approaches to improving outcomes; classically uses health conditions as outcomes; such studies create indices relating non-financially-defined outcomes to the costs of the alternatives
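As a concrete illustration of such an index: a cost-effectiveness ratio divides each alternative's cost by the outcome gain it produces, so programs are compared per unit of outcome rather than by raw cost. A minimal sketch with invented figures:

```python
# A minimal cost-effectiveness comparison of two hypothetical programs.
# Costs and outcome counts are invented for illustration.
programs = {
    "Program A": {"cost": 50_000, "clients_improved": 25},
    "Program B": {"cost": 80_000, "clients_improved": 50},
}

for name, p in programs.items():
    ratio = p["cost"] / p["clients_improved"]  # cost per improved client
    print(f"{name}: ${ratio:,.0f} per improved client")

# Program B costs more in total but less per improved client ($1,600 vs
# $2,000), which is why the ratio, not raw cost, guides the comparison.
```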

Community Resources for Justice, Inc.: implementation of a treatment paradigm among all line-level staff; client satisfaction survey for needs assessment
Youth Opportunities Upheld (YOU), Inc.: effectiveness of a new therapeutic approach for major depression among women

Work in dyads: design a process measure and an outcome measure …as if you were doing program evaluation in your work setting, field setting, or past work setting.