Program Evaluation Planning & Data Analysis

ScWk 242 – Session 11 Slides

Evaluation in Social Work
In social services, evaluation is guided primarily by a decision-making framework, but it also encompasses cost-effectiveness and cost-benefit analysis.
- "Evaluation research is a means of supplying valid and reliable evidence regarding the operation of social programs or clinical practices--how they are planned, how well they operate, and how effectively they achieve their goals" (Monette, Sullivan, & DeJong, 1990, p. 337).
- "Program evaluation is done to provide feedback to administrators of human service organizations to help them decide what services to provide to whom and how to provide them most effectively and efficiently" (Shaughnessy & Zechmeister, 1990, p. 340).
In short, evaluation research is the use of scientific research methods to plan intervention programs, to monitor the implementation of new programs and the operation of existing ones, and to determine how effectively programs or clinical practices achieve their goals.

Basic vs. Evaluation Research
- In basic research, researchers can afford to be tentative and conduct more research before drawing strong conclusions from their results (Cozby, 1993).
- In evaluation research, the researcher usually recommends immediate action on the basis of the results and must determine clearly whether a program is successful and valuable enough to be continued.
- According to Shaughnessy and Zechmeister (1990), the purpose of program evaluation is practical, not theoretical.

The Need for Program Evaluation
According to Monette, Sullivan, and DeJong (1990), evaluation research is conducted for three major reasons:
1. Administrative purposes, such as fulfilling an evaluation requirement demanded by a funding source, improving service to clients, or increasing the efficiency of program delivery.
2. Impact assessment, in which a program is assessed to see what effects, if any, it is producing.
3. Testing hypotheses or evaluating practice approaches.

Program Evaluation Design
A design is a plan that specifies when, and from whom, measurements will be gathered during the course of the program evaluation.
Three types of evaluators:
- Monitoring evaluator: tracks progress through regular reporting; usually focuses on activities and/or expenditures.
- Summative evaluator: responsible for a summary statement about the effectiveness of the program.
- Formative evaluator: helper and advisor to the program planners and developers.
The critical characteristic of any evaluation study is that it provide the best possible information that could have been collected under the circumstances, and that this information meet the credibility requirements of its evaluation audience.

Evaluation Implementation Steps
Initial planning: deciding what to measure.
- What are the program's critical characteristics?
- How much supporting data do you need?
Steps for planning data collection:
- Choosing data collection methods
- Determining whether appropriate measures already exist
- Creating a sampling strategy
- Thinking about validity and reliability
- Planning for data analysis

Evaluation Data Collection Options
Sources of information: surveys, interviews, observations, and record reviews.
- All have limitations and benefits.
- All can be used to collect either quantitative or qualitative data.
- All require preparation on the front end:
  - Instrument development and testing
  - Administration plan development
  - Analysis plan development

Standards for Questions and Answers
- Questions need to be consistently understood.
- Questions need to be consistently administered or communicated to respondents.
- What constitutes an adequate answer should be consistently communicated.
- Unless measuring knowledge is the goal of the question, all respondents should have access to the information needed to answer the question accurately.
- Respondents must be willing to provide the answers called for in the question.

Info Needed Prior to Evaluation
- What are the purposes of the program?
- What stage is the program in (new, developing, mature, phasing out)?
- Who are the program clients?
- Who are the key program staff (and, where applicable, in which department is the program)?
- What specific strategies are used to deliver program services?
- What outcomes are program participants expected to achieve?
- Are there any other evaluation studies currently being conducted regarding this program?
- Who are the funders of the program?
- What is the total program budget?
- Why was this program selected for evaluation?

Assessing Organizational Effectiveness
Organizational effectiveness is the ability of an organization to fulfill its mission through sound management, strong governance, and persistent rededication to achieving results.
Outcomes are changes in attitudes, behavior, skills, knowledge, condition, or status. Outcomes must be:
- Realistic and attainable
- Related to core organizational goals
- Within the program's sphere of influence

Examples of Indicators
Outcomes:
- Improved communication skills
- Improved relationships
- Increased positive behaviors
- Improved life skills
Process indicators:
- Number of meetings (individual/family/school)
- Duration of meetings
- Meeting attendance
- Quality of staff and materials
Outcome indicators:
- Effective expression of thoughts and feelings
- More positive interaction with peers and adults
- Reduced or no indication of illegal or inappropriate behavior

Data Collection Questions
Who will you collect data about?
- Clients, caregivers, other service providers working with clients, staff, or some other group? Who are considered participants of your program? Be sure to clearly specify your evaluation target population.
What instruments do you need?
- Surveys, interview guides, observation checklists and/or protocols, record extraction or record review protocols?
- Are there any pre-tested instruments (e.g., scales for measuring human conditions and attitudes)? If not, how will you confirm validity? (One starting point is sketched below.)
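When no pre-tested instrument exists, one common first check on a home-grown scale (alongside a broader validity assessment) is its internal consistency. Below is a minimal sketch, assuming a hypothetical five-item scale scored 1-5 and a hand-rolled Cronbach's alpha rather than any particular survey library:

```python
# Minimal sketch: internal consistency (Cronbach's alpha) for a hypothetical scale.
# The items and responses are invented for illustration only.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are the scale items."""
    items = items.dropna()
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-item satisfaction scale, scored 1-5 by six respondents
responses = pd.DataFrame({
    "q1": [4, 5, 3, 4, 2, 5],
    "q2": [4, 4, 3, 5, 2, 5],
    "q3": [3, 5, 2, 4, 1, 4],
    "q4": [4, 5, 3, 4, 2, 5],
    "q5": [5, 4, 3, 5, 2, 4],
})
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")  # >= 0.70 is a common rule of thumb
```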

Sample Size Considerations
- The sample should be as large as practical constraints and precision requirements demand.
- If the population is smaller than 100, generally include all of it.
- Once a sample is comparatively large, adding cases yields little additional precision.
- Small populations require relatively large sampling proportions, and large populations require relatively small ones.
- Always draw a larger sample than needed to accommodate refusals:
  number to draw = desired sample size ÷ (1 − expected refusal proportion)
  (worked example below)
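A minimal worked example of the refusal adjustment in the last bullet; the desired sample size of 150 and the 20% expected refusal rate are hypothetical figures, not values from the slides:

```python
# Oversampling to offset expected refusals: draw = desired / (1 - refusal rate)
import math

desired_n = 150        # analyzable responses needed (hypothetical)
refusal_rate = 0.20    # expected proportion who decline or drop out (hypothetical)

draw_n = math.ceil(desired_n / (1 - refusal_rate))
print(f"Invite {draw_n} people to end up with about {desired_n} respondents.")
# -> Invite 188 people to end up with about 150 respondents.
```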

Analyzing Quantitative Data
1. Develop an analysis plan.
2. Code and enter data.
3. Verify data entry.
4. Prepare data for analysis.
5. Conduct analyses according to the plan.
6. Develop tables, figures, and narrative summaries to display the results of the analysis.
Steps 2-4 are sketched below.
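As a concrete illustration of steps 2-4, here is a minimal pandas sketch; the file name, column names, and code values are hypothetical:

```python
# Code, verify, and prepare survey data before analysis (illustrative only).
import pandas as pd

raw = pd.read_csv("intake_survey.csv")  # hypothetical file; codebook: 1 = male, 2 = female

# Code the data: map numeric codes to labels
raw["gender"] = raw["gender_code"].map({1: "male", 2: "female"})

# Verify data entry: flag out-of-range satisfaction scores (valid range 1-5)
bad_rows = raw[~raw["satisfaction"].between(1, 5)]
print(f"{len(bad_rows)} rows have out-of-range satisfaction scores")

# Prepare for analysis: drop unusable rows and fix types
clean = raw.dropna(subset=["gender", "satisfaction"]).astype({"satisfaction": int})
clean.to_csv("intake_survey_clean.csv", index=False)
```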

Sources of Record Review Data

Important Data-Related Terms
Data can exist in a variety of forms:
- Records: numbers or text on pieces of paper
- Digital/computer: bits and bytes stored electronically
- Memory: perceptions, observations, or facts stored in a person's mind
Key distinctions:
- Qualitative vs. quantitative
- Primary vs. secondary data
- Variables (items)
- Unit of analysis
- Duplicated vs. unduplicated counts
- Unit record (client-level) vs. aggregated data

More Data Definitions
- Case: an individual record (e.g., one participant, one day, one activity).
- Demographics: descriptive characteristics (e.g., gender).
- Disaggregate: to separate or group information (e.g., to look at data for males separately from females); running crosstabs is one strategy for disaggregating data (see the sketch below).
- Partition (v.): another term for disaggregate.
- Unit of analysis: the major entity of the analysis, i.e., the what or the whom being studied (e.g., participants, groups, activities).
- Variable: something that changes (e.g., number of hours of attendance).
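To make "disaggregate" concrete, here is a minimal sketch that splits a hypothetical attendance variable by gender using pandas; the data are invented for illustration:

```python
# Disaggregation: summarize one variable separately for each demographic group.
import pandas as pd

participants = pd.DataFrame({
    "gender": ["male", "female", "female", "male", "female", "male"],
    "attendance_hours": [12, 20, 18, 8, 25, 15],
})

print(participants.groupby("gender")["attendance_hours"].agg(["count", "mean", "median"]))
```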

Options for Analyzing Quantitative Data
Items to look at or summarize:
- Frequencies: how often a response or status occurs (total and valid)
- Percentages: (frequency ÷ total) × 100
- Measures of central tendency: mean, median, (mode)
- Distribution: minimum, maximum, groupings
- Cross-tabulations: the relationship between two or more variables (also called contingency analyses; can include significance tests such as chi-square) (see the sketch below)
Useful second-level procedures:
- Means testing and analysis (ANOVA, t-tests)
- Correlations
- Regression analyses
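A minimal sketch of the first-level summaries above (frequencies, percentages, central tendency, distribution, and a cross-tabulation with a chi-square test), using pandas and scipy on hypothetical data:

```python
# First-level summaries for a small, invented program dataset.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.DataFrame({
    "gender":   ["male", "female", "female", "male", "female", "male", "female", "male"],
    "improved": ["yes", "yes", "no", "yes", "yes", "no", "yes", "no"],
    "score":    [72, 85, 64, 78, 90, 55, 88, 61],
})

# Frequencies and percentages
print(df["improved"].value_counts())
print(df["improved"].value_counts(normalize=True) * 100)

# Central tendency and distribution
print(df["score"].agg(["mean", "median", "min", "max"]))

# Cross-tabulation with a chi-square test of independence
table = pd.crosstab(df["gender"], df["improved"])
chi2, p, dof, _ = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```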

Good Evaluation Designs Include:
- Summary information about the program
- Questions to be addressed by the evaluation
- Data collection strategies that will be used
- The individuals who will undertake the activities
- When the activities will be conducted
- Products of the evaluation (who will receive them and how they should be used)
- Projected costs of the evaluation

Using Multiple Info Sources
- Develop an overall analysis plan, guided by your evaluation questions.
- Clarify why you are using multiple methods:
  - Sequence: one form of data collection informs the design of the next (e.g., informant interviews prior to record review).
  - Different questions: the methods may be designed to answer different evaluation questions. For example, record review may be your source of information on caseworker compliance with protocols, while interviews may be your source on how caseworkers are using the new protocols.
  - Triangulation: both sources may answer the same question. Interviews with caseworkers are one source of information on compliance with the new protocols; record reviews could be another source used to substantiate the interview findings.
- Plan how you will join the analysis across methods and determine the overall findings. For example, analyze the interview data to determine the extent to which caseworkers are using the new protocols, then check these results against the record review data, and also examine the records for specific examples of the types of protocol use that caseworkers report in the interviews.

Projecting Timelines
Timelines can be constructed separately or embedded in a chart. To project timelines:
- Assign dates to your level of effort, working backward from overall timeline requirements.
- Be sure the number of days required for a task and the date by which it must be completed are in sync and feasible.
- Check that the evaluation calendar is aligned with the program calendar:
  - Don't plan to do a lot of data collection around program holidays.
  - Don't expect to collect data only between 9 a.m. and 5 p.m.

Communicating Findings
Who is your audience? Staff, funders, board, participants, or several of these?
What presentation strategies work best?
- PowerPoint
- Newsletter
- Fact sheet
- Oral presentation
- Visual displays
- Video
- Storytelling
- Press releases
Report formats: full report, executive summary, or stakeholder-specific report?

Preparing a Strong Evaluation Report
- Description of the subject program
- Clear statement of the evaluation questions and the purpose of the evaluation
- Description of the actual data collection methods
- Summary of key findings (including tables, graphs, vignettes, quotes, etc.)
- Discussion or explanation of the meaning and importance of key findings
- Suggested action steps
- Next steps (for the program and the evaluation)
- Issues for further consideration (loose ends)

Linking Evaluation with Planning
Organizations that regularly use evaluation should also:
- Think carefully about developing and assessing programs and other actions.
- Incorporate program evaluation findings and other assessment findings into program and other planning.
- Involve significant others in planning and revising plans.
- Develop written, logical plans.
- Follow those plans.
- Have strategies in place to modify plans.