Program Evaluation Using Quantitative & Qualitative Methods

Presentation transcript:

Program Evaluation Using Quantitative & Qualitative Methods

Program evaluations measure a program's effectiveness, efficiency, quality, and participants' satisfaction with the program.

Program evaluation can also measure how or why a program is or is not effective.

Program evaluation looks at a program or a component of a program. It is not used to measure the performance of individual workers or teams. Consequently, it differs from performance evaluation.

The program’s goals and objectives serve as the starting point for a program evaluation. Objectives must be measurable and time-limited, contain an evaluation mechanism, be developed in relation to a specific program or intervention plan, specify the processes and tasks to be completed, and incorporate the program’s theory of action – a description of how the program works and what it is expected to accomplish (outcomes). To start an evaluation, the evaluator must find out what program participants identify as the goal (an evaluability assessment).

A theory of action for a hunger program might be: an advisory committee is formed to improve food bank services; this improves service delivery; more food is provided; families miss fewer meals; there is less hunger.

Evaluations can measure process or outcomes. Qualitative methods are used to answer how and why questions (process). Quantitative methods are used to answer what questions: what outcomes were produced; was the program successful, effective, or efficient (outcome).

Differences between the two methods:
Logic: quantitative is deductive; qualitative is inductive.
Values/bias: quantitative aims to be objective; qualitative is subjective.
Role of the researcher: quantitative treats the researcher as an expert; qualitative treats the researcher as a partner with research subjects.
Source of research questions: quantitative draws on theory and previous research; qualitative questions can be grounded in the experiences of researchers and participants.
Methodology: quantitative uses structured measurement instruments; qualitative uses semi-structured surveys, interviews, or observation.

Quantitative and qualitative approaches include: experimental designs, quasi-experimental designs, pre- and post-test studies, time series analysis, social indicator analysis, longitudinal studies, surveys, client satisfaction surveys, goal attainment, program monitoring, ethnographic studies, feminist research, constructivist evaluation, process analysis, implementation analysis, and focus groups.

Most common types: Outcome evaluation (quantitative; may or may not use control groups to measure effectiveness). Goal attainment (have the objectives been achieved?). Process evaluation (qualitative; looks at how or why a program works or doesn't work). Implementation analysis (mixed methods; was the program implemented in the manner intended?). Program monitoring (mixed methods; is the program meeting its goals? Conducted while the program is in progress).

Outcome evaluations can include: randomized experimental designs; comparisons of pre- and post-test scores for each participant on one or more outcome indicators; using all members of pre-existing groups to serve as experimental and control groups; using social indicator data collected by government agencies (for example, using U.S. Census data on poverty rates in a specific community to determine whether an economic development program has increased the income of neighborhood residents); time series analysis, using repeated measures over a number of time periods to track social indicators or caseload data; using statistical controls, such as cross-tabulation or regression analysis, to hold constant the effects of confounding variables; and quasi-experimental designs in which participants are separated into groups and different levels of the intervention are compared (Chambers et al., 1992; Royce & Thyer, 1996).
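As a minimal sketch (not drawn from the slides), the snippet below illustrates two of these options in Python: a paired pre/post comparison and a regression that holds a confounding variable constant. The data, column names, and group assignments are hypothetical.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Hypothetical participant-level data: one outcome indicator measured before and
# after the program, plus a possible confounder (household income) and a
# quasi-experimental group indicator.
df = pd.DataFrame({
    "pre_score":  [12, 15, 9, 14, 11, 16, 10, 13],
    "post_score": [15, 18, 11, 17, 14, 19, 12, 16],
    "income":     [22, 35, 18, 40, 27, 50, 20, 31],  # in $1,000s
    "treated":    [1, 1, 1, 1, 0, 0, 0, 0],
})

# 1. Paired t-test: did scores change from pre to post for the same participants?
t_stat, p_value = stats.ttest_rel(df["post_score"], df["pre_score"])
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")

# 2. Regression with a statistical control: compare treated vs. comparison groups
#    on the post score while holding income constant.
model = smf.ols("post_score ~ treated + income", data=df).fit()
print(model.params)
```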

Time Series Analysis Examines Data Trends: School Breakfast Program
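A minimal sketch of a simple interrupted time series, assuming hypothetical monthly counts of missed meals before and after a school breakfast program begins. The segmented regression estimates the change in level and trend once the program starts; all numbers and variable names are illustrative.

```python
import pandas as pd
import statsmodels.formula.api as smf

months = list(range(1, 25))             # 24 months of caseload data
program = [0] * 12 + [1] * 12           # program begins in month 13
missed_meals = [80, 82, 79, 81, 83, 80, 78, 82, 81, 79, 80, 82,   # pre-program
                70, 68, 65, 63, 60, 58, 57, 55, 52, 50, 49, 47]   # post-program

df = pd.DataFrame({"month": months, "program": program, "missed_meals": missed_meals})
df["months_since_start"] = (df["month"] - 12).clip(lower=0)

# 'program' captures the immediate level change; 'months_since_start' captures the
# change in trend after the program begins.
model = smf.ols("missed_meals ~ month + program + months_since_start", data=df).fit()
print(model.summary())
```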

Client satisfaction surveys are often used as one component of a program evaluation. They can provide valuable information about how clientele perceive the program and may suggest how the program can be changed to make it more effective or accessible. However, client satisfaction surveys also have methodological limitations.

Limitations include: It is difficult to define and measure “satisfaction.” Few standardized satisfaction instruments that have been tested for validity and reliability exist. Most surveys find that 80-90% of participants are satisfied with the program; most researchers are skeptical that such levels of satisfaction exist, so most satisfaction surveys are believed to be unreliable. Since agencies want to believe their programs are good, question wording may be biased. Clients who are dependent on the program for services or who fear retaliation may not provide accurate responses.

Problems with client satisfaction surveys can be addressed by: pre-testing to ensure face validity and reliability; asking respondents to indicate their satisfaction level with various components of the program; and ensuring that administration of the survey is separated from service delivery and that the confidentiality of clients/consumers is protected.
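One way to act on the pre-testing point is to check an instrument's internal-consistency reliability with pilot data. The sketch below computes Cronbach's alpha for hypothetical responses to a five-item satisfaction scale; the data and function are illustrative and not part of the original slides.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x survey-items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical pilot responses, five respondents x five items on a 1-5 scale.
pilot = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 3, 2],
    [4, 4, 5, 4, 4],
])

# Values around 0.7 or higher are commonly treated as acceptable reliability.
print(f"Cronbach's alpha = {cronbach_alpha(pilot):.2f}")
```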

Process and most implementation evaluations assume that the program is a “black box” with inputs, throughputs, and outputs. They use some mixture of interviews, document analysis, observations, or semi-structured surveys, and gather information from a variety of organizational participants: administrators, front-line staff, and clients. These evaluations also examine communication patterns, program policies, and the interaction among individuals, groups, programs, or organizations in the external environment.

Use the following criteria to determine the type of evaluation: the research question to be addressed; the amount of resources and time that can be allocated for research; ethics (can you reasonably construct control groups or hold confounding variables constant?); whether the evaluation will be conducted by an internal or external evaluator; who the audience for the evaluation is; how the data will be used; and who will be involved in the evaluation.

Types of evaluation approaches that involve organization constituents: participatory action research, empowerment evaluation, and self-evaluation.

Differences in the approaches:
Role of researcher: in participatory action research, a consultant and partner with participants; in empowerment evaluation, a consultant who works for participants; in self-evaluation, a consultant who works for the agency/funder.
Purpose: social change (participatory action research); self-determination (empowerment evaluation); evaluating agency services (self-evaluation).
Outcome: alleviating oppression (participatory action research); increased participant skills and control (empowerment evaluation); improved service quality (self-evaluation).

Advantages of these methods: increased feelings of participant ownership of the process and programs; increased likelihood that the data will be used; increased likelihood that the resulting program or intervention will meet the needs of stakeholders and be culturally appropriate; and participants develop skills and confidence, gaining knowledge and information and thus becoming empowered.

Disadvantages of these methods: distrust and conflict among participants; the length of time needed to develop consensus around goals, mission, and methods; the need for training in research methods, data collection, and analysis; the need for skilled facilitation, coordination, and follow-up on task completion; the money and organizational structure needed to do all of these things; and the requirement that the group be able to apply the findings in order to achieve an outcome.