Elspeth Slayter, Ph.D., Assistant Professor, School of Social Work, Salem State University


- Administrative matters & check-in
- Research as a second language
- Assignment #1
- On over-arching questions, et alia
- Implementing evidence-supported interventions
- Introduction to program evaluation
- Intro to theory
- Dyad time

- Textbook readings
- Article reading
- Assignment #1
- Other (you tell me!)

Critical consumption of research AND skills to evaluate practice
- Learn to critically consume research
- Learn to develop practice evaluation plans
- Consider the process of evidence-based practice beyond evidence-supported interventions

Research (exploratory, descriptive, explanatory, constructing measurement instruments)
- Qualitative
- Quantitative
Program evaluation (can include descriptive, constructing measurement instruments)
- Process/formative
- Outcome/summative

Over-arching research question (umbrella)
Study aims (points on the umbrella)

Process of evidence-based or informed practice
An evidence-supported intervention for a unique setting/population

Choose intervention thoughtfully – with or without research → Implement → Evaluate
Research
- Qualitative
- Quantitative
Program evaluation
- Process/formative
- Outcome/summative

- Area of interest, a.k.a. problem area
- Research vs. evaluation
- Purposes of research
- Matching questions and methods
- Evidence-supported interventions
- Process of evidence-based practice
- Over-arching research question
- Study aim/s

- Area of interest
- Problem area
- Existing knowledge/theory
- Research question
- Specific aims, hypotheses

Hypothesis vs. null hypothesis
- Hypothesis: what you think the answer to your question will be
- Null hypothesis: the opposite of what you think the answer will be
Why do we do this?

Type I error (false positive)
- Example: we observe a difference between groups when, in reality, there may not be one
Type II error (false negative)
- Example: we don't observe a difference between groups when, in reality, there may be one
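In standard statistical notation (a conventional summary, not drawn from the slides), these two error rates are written in terms of the null hypothesis $H_0$:

\[ \alpha = P(\text{Type I error}) = P(\text{reject } H_0 \mid H_0 \text{ true}) \]
\[ \beta = P(\text{Type II error}) = P(\text{fail to reject } H_0 \mid H_0 \text{ false}) \]

Statistical power, the probability of detecting a difference that really exists, is \(1 - \beta\); this is also why we test against the null hypothesis rather than trying to "prove" our own hypothesis directly.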

Area of interest: childhood obesity
Research question: What factors are predictive of childhood obesity?
Hypothesis: community factors are more likely to predict higher rates of obesity than demographic factors
Null hypothesis: community factors are less likely to predict obesity than demographic factors

Before social work intervention

At the beginning of the social work intervention

Towards end of social work intervention

At end or after social work intervention

- What happens if there is too much water?
- What happens if the water is tainted?
- What happens if there is not enough water?
- What happens if there is not enough sun?
- What happens if the bulb gets dug up?

Measure inputs
- Enough/safe water used?
- Enough sun provided?
- Ground not dug up?
- Lawnmower/deer/rabbits didn't eat green shoots?
Measure outcomes
- How was the flower?
- How long did it last?

- Was steel delivered on time?
- Was the steel faulty?
- Was there a worker strike?
- Were there unexpected design/building challenges?

Measure inputs
- Correct steel used?
- Rivets installed correctly?
- Rust inhibitor applied correctly?
- Design not faulty?
Measure outcomes
- Completed on time?
- Actually a sturdy structure?
- Works as planned?
- How long did it last before needing repair?

- Was the chosen treatment delivered according to the treatment plan?
- Were adjustments needed to the treatment plan?
- How did the young man respond to treatment?
- Was a course correction needed?
- How did the young man function at the end of treatment?

Measure inputs
- Treatment delivered as planned?
- Order of treatment made sense?
- Regular meetings with therapist?
Measure outcomes
- Goal reached at end of treatment?
- Retention of goal functioning?
- Relapse?

- What is needed?
- Are you accomplishing your goals along the way?
- Are your clients achieving their goals?
- How do costs/inputs factor into the process?

- Document program processes (implementation) and outcomes (success)
- Identify program strengths and weaknesses
- Improve the program (effectiveness, impact)
- Program planning and development
- Demonstrate how the use of resources justifies the investment of time, money, and labor
- Meet local, state, and federal accountability measures

Evaluation helps you monitor the resources you've put into a program:
- $$$
- Time
- Expertise
- Energy
Assessment of goals, objectives, and reality
Helps determine the value of a product, process, or program (eVALUation)

- Dorothea Dix – treatment for people with mental illness; seeking to define recovery, she used "discharge" as the operationalization (a 90% success rate!)
- Growth of program evaluation post-WWII – an "age of accountability" through $$$ cuts
- Impact of managed care – evaluation embraced to control costs, promoting efficiencies in treatment
- Critiqued for poor matches between questions and methods

- Vested interests abound
- Not wanting to hear "bad news," even in the guise of data for program improvement ($$$ incentives)
- Use of unskilled or inexperienced researchers who may not apply the best critical thinking re: research methods:
  - Question-method match
  - Instruments
  - Data collection approaches

1. Identify stakeholders and learn about them
2. Involve all in planning the evaluation (obtain buy-in)
3. Develop a logic model
4. Assure all stakeholders that feedback is built in
5. Determine the format of the report needed
6. Present negative data thoughtfully
7. Make realistic recommendations with a positive spin
(See page 328)

A graphic portrayal depicting the essential elements of a program:
- How goals/objectives link to elements
- Link to short-term process measures: measurable indicators of success
- Link to longer-term outcome measures: measurable indicators of success
(See pages )
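For instance, a hypothetical after-school tutoring program (an illustrative example, not one from the course) might be portrayed as: inputs (funding, tutors) → activities (weekly tutoring sessions) → outputs (sessions delivered, students served) → short-term outcomes (homework completion rates) → long-term outcomes (improved grades), with a measurable indicator attached at each step.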

Type depends on purpose & timing
Formative:
- Process
- Implementation
- Needs assessment
Summative:
- Outcome
- Cost-effectiveness
- Cost-benefit

Formative
- Before the program or while the program is running; make changes as needed
- Collect and analyze data at various intervals
- Make program improvements along the way
Summative
- Use at the end of the program
- Summarize outcomes and results

Ideally, more than one method is used:
- Survey key informants
- Community forum
- Examine existing data – rates under treatment
- Examine existing data – social indicators
- Conduct a targeted survey

- Measuring progress along the way
- Intermediate goals
- Can be a repeated measure (think: tracking)

- Ensure that all program components are being properly and consistently implemented
- Use when introducing a new program
- Standardized implementation? Are all sites using program components in the proper way?

- Identify the results or effects of your program
- Measure how your program participants' knowledge, attitudes, and behaviors have changed as a result of your program

Cost-benefit: outcomes considered use monetary units
- Victimization
- Criminal justice expenses
- Receipt of social welfare-derived income transfers
Cost-effectiveness: assess the relative efficiency of alternative approaches to improving outcomes
- Classically: health conditions as outcomes
- Such studies create indices to relate non-financially-defined outcomes to costs for alternatives
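As a rough sketch of the arithmetic (standard formulas, not specific to any program named here): cost-benefit analysis expresses both sides in dollars, while cost-effectiveness analysis relates the extra cost of one alternative to the extra non-monetary outcome it produces:

\[ \text{Net benefit} = \sum \text{monetized benefits} - \sum \text{costs} \]
\[ \text{ICER} = \frac{C_A - C_B}{E_A - E_B} \]

where \(C\) and \(E\) are the costs and effects (e.g., symptom-free days) of alternative programs \(A\) and \(B\); this incremental cost-effectiveness ratio (ICER) is one common index of the kind described above.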

Community Resources for Justice, Inc.
- Implementation of a treatment paradigm amongst all line-level staff
- Client satisfaction survey for needs assessment
Youth Opportunities Upheld (YOU), Inc.
- Effectiveness of a new therapeutic approach for major depression amongst women

Theory
- Can test a proposed/documented theory
- Can be justified by a documented theory
- Can relate results to existing theory
- Sometimes a theory is not used

- Justify: theory tells us we need to do this study because…
- Structure: overtly named as part of the research design (used to structure the study process)
- Interpretation: used in the discussion of findings (relating findings back to theory)
- Creating theory and/or grounded theory

- Look in the existing literature: what theories or concepts drive the research you are reading?
- Look at the sociological or public health literature

Frameworks & theories in social work
Practice frameworks:
- Perspectives
- Theories
- Models
Orienting conceptual frameworks:
- Social capital
Orienting theories:
- Diffusion of innovation
- Theory of reasoned action
- Street-level bureaucracy

Theory
- Analytic structure
- Identifies distinct observations
- Makes assertions about the underlying reality that brings about or affects something
Conceptual frameworks
- Intermediate theory
- Potential to connect to all aspects of inquiry
- Outline of possible courses of action, ways of being, relationships

- Policymakers
- Managers
- Workers (street-level bureaucrats)

Sex & Drugs & Rock 'n' Roll: Implementing the dignity of risk among community-based adults with intellectual disabilities

Interest in choice-making among people with ID:
- "Twinkies" for breakfast
- Substance abuse does happen
- Psychotropic medications
- "Home alone" policies
- Sex education, pregnancy scares, and legal competency
- Going out to shows and non-disabled friends

Not much literature on sex & drugs & rock 'n' roll:
- Some focus on prevalence/treatment, not much on parenting or "management"
Central policy goals:
- "Dignity of risk"
- Community inclusion
- Self-determination
- Not really discussed together

- Many people with ID/DD live in the community: families, group homes/congregate care settings
- Group home workers are at the "front lines"
- Leads me to ask: are group home workers implementing the dignity of risk? If yes, how? If no, how and why?
- We haven't heard their voices

Theory is sometimes used to set up a study, sometimes not… in this case:
Lipsky: Street-Level Bureaucracy
- "Policy implementation in the end comes down to the people who actually implement it"
- Workers' own views influence their work with clients
Justification for the approach:
- What can we learn from street-level bureaucrats on this topic?
Used to explain results:
- Policy vs. "lived" policy

Work in dyads; start by explaining your ideas to someone new!