Chapter 11 Planning the Intervention Effects Evaluation

Chapter 11 Planning the Intervention Effects Evaluation © 2009 Jones and Bartlett Publishers

Effect Evaluation in the Planning and Evaluation Cycle

Basis for Decisions about Evaluation Focus and Purpose
- Effect theory
- Logic model
- Outcome objectives
- Who the evaluation is for, e.g., funders, stakeholders, research

Characteristics of the Right Question
- Relevant data can be collected
- More than one answer is possible
- Produces information that decision makers want and feel they need

Outcome Documentation, Assessment, and Evaluation
- Documentation: To what extent were the outcome objectives met?
- Assessment: To what extent is any noticeable change or difference in participants related to having received the program interventions?
- Evaluation: Were the changes or differences due to participants having received the program and nothing else?

Three Levels of Intervention Effects Evaluations

Outcome documentation
- Purpose: Show that outcome and impact objectives were met
- Relationship to program effect theory: Confirms reaching the targets set in the objectives that were based on the theory
- Level of rigor: Minimal
- Data collection: Data type and collection timing based on the objectives being measured

Outcome assessment
- Purpose: Determine whether participants in the program experienced any change or benefit
- Relationship to program effect theory: Supports the theory
- Level of rigor: Moderate
- Data collection: Data type based on the effect theory; timing based on feasibility

Outcome evaluation
- Purpose: Determine whether the program caused a change or benefit for the recipients
- Relationship to program effect theory: Verifies the theory
- Level of rigor: Maximum
- Data collection: Data type based on the effect theory; baseline and post-intervention data are required

Evaluation vs. Research

Goal or purpose
- Research: Generating new knowledge for prediction
- Evaluation: Social accounting and program or policy decision making

Questions addressed
- Research: The scientist's own questions
- Evaluation: Questions derived from program goals and impact objectives

Problem addressed
- Research: Areas where knowledge is lacking
- Evaluation: Program impacts and outcomes

Guiding theory
- Research: Theory used as the basis for hypothesis testing
- Evaluation: Theory underlying the program interventions; theory of evaluation

Appropriate techniques
- Research: Sampling, statistics, hypothesis testing, etc.
- Evaluation: Whichever research techniques fit the problem

Setting
- Research: Anywhere appropriate to the research question
- Evaluation: Anywhere evaluators can access the program recipients and controls

Dissemination
- Research: Scientific journals
- Evaluation: Internal and external reports; scientific journals

Allegiance
- Research: The scientific community
- Evaluation: The funding source, policy preferences, the scientific community

Rigor and Identifying a Program’s Net Effects

Three Theories Comprising the Program Effect Theory
- Causal theory: the existing and causal factors, moderators and mediators, and the health outcome
- Intervention theory: how the interventions affect the causal, moderating, and mediating factors
- Impact theory: how immediate outcomes become long-term impact
At minimum, the evaluation should measure causal factors and outcomes.

Nomenclature for Effect Evaluation Variables

Dependent (y) Variables
- Choose the most important outcome objectives, not a “fishing expedition”
- Typically drawn from the six health and well-being domains: knowledge, lifestyle behaviors, cognitive processes, mental health, social health, and resources

Independent (x) Variables
- Called “independent” because they are not influenced by the outcome
- Start by measuring the causal factors
- May be measured before and/or after the program, in participants and/or controls

Moderating and Mediating Variables
- Mediating variables intervene between x and y
- Moderating variables change the strength or direction of the relationship between x and y
- Including them in the evaluation helps in understanding what influences intervention effectiveness
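To make the x, y, and moderator distinction concrete, here is a minimal sketch, assuming Python with numpy, pandas, and statsmodels available; all variable names and effect sizes are hypothetical. It illustrates moderation by adding an interaction term to a regression; a mediation analysis would instead model the x-to-mediator and mediator-to-y paths separately.

```python
# Hedged sketch (hypothetical variables): a moderator enters the model as an
# interaction with x, so its interaction coefficient shows whether it changes
# the strength of the x -> y relationship.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
x = rng.normal(size=n)                    # independent (x) variable, e.g., intervention dose
moderator = rng.integers(0, 2, size=n)    # hypothetical moderator, e.g., site or subgroup
y = 0.5 * x + 0.4 * x * moderator + rng.normal(size=n)  # simulated outcome (y)

df = pd.DataFrame({"y": y, "x": x, "moderator": moderator})
model = smf.ols("y ~ x * moderator", data=df).fit()
print(model.summary())                    # the x:moderator term estimates the moderation effect
```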

Measurement Considerations
- Unit of observation must match the level of the program, e.g., individuals, schools, communities
- Levels of measurement for variables: nominal, ordinal, interval
- Measurement timing
- Sensitivity of measures
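As a brief illustration of matching the unit of observation to the level of the program, the sketch below (hypothetical column names and values, assuming pandas) aggregates individual-level records up to the school level before analysis.

```python
# Hedged sketch: rolling individual records up to the program's unit of
# observation (here, schools). Column names and values are made up.
import pandas as pd

individuals = pd.DataFrame({
    "school_id": ["A", "A", "B", "B", "B"],
    "knowledge_score": [62, 71, 55, 60, 58],
    "met_objective": [1, 1, 0, 1, 0],
})

# One row per school: mean score and proportion meeting the outcome objective
school_level = individuals.groupby("school_id").agg(
    mean_knowledge=("knowledge_score", "mean"),
    pct_met_objective=("met_objective", "mean"),
    n_students=("knowledge_score", "size"),
)
print(school_level)
```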

Pros and Cons of Levels of Measurement

Nominal (categorical)
- Examples: ZIP code, race, yes/no
- Advantage: Easy to understand
- Disadvantage: Limited information from the data

Ordinal (rank)
- Examples: Social class, Likert scale, a “top 10” list (worst to best)
- Advantage: Considerable information; can be collapsed into nominal categories
- Disadvantage: Sometimes statistically treated as a nominal variable; ranking can be a difficult task for respondents

Interval (continuous)
- Examples: Temperature, IQ, distances, dollars, inches, age
- Advantage: Most information; can be collapsed into nominal or ordinal categories
- Disadvantage: Can be difficult to construct valid and reliable interval variables

Examples of Nominal, Ordinal, and Interval Variables

Childhood immunization
- Nominal: Up to date, yes/no
- Ordinal: None required, 1 immunization required, more than 1 required
- Interval: Rubella titer

Breastfeeding
- Nominal: Breastfed, yes/no
- Ordinal: Category for how long breastfed: <2 weeks, 2-6 weeks, >6 weeks
- Interval: Number of days breastfed

Housing situation
- Nominal: Homeless or not
- Ordinal: Housing autonomy (own, rent monthly, rent weekly, homeless)
- Interval: Number of days living at current residence
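How a variable's level of measurement is encoded also shapes which statistics are meaningful. A hedged sketch, using made-up values for the breastfeeding example above and assuming pandas:

```python
# Hedged sketch: the same outcome encoded at three levels of measurement.
# Values are fabricated for illustration.
import pandas as pd

df = pd.DataFrame({
    # Nominal: ever breastfed (yes/no)
    "breastfed": pd.Categorical(["yes", "no", "yes", "yes"]),
    # Ordinal: duration category, with an explicit order
    "bf_duration_cat": pd.Categorical(
        ["<2 weeks", "2-6 weeks", ">6 weeks", "2-6 weeks"],
        categories=["<2 weeks", "2-6 weeks", ">6 weeks"],
        ordered=True,
    ),
    # Interval/continuous: number of days breastfed
    "bf_days": [10, 30, 75, 28],
})

print(df.dtypes)
print(df["bf_days"].mean())          # means are meaningful only for the interval variable
print(df["bf_duration_cat"].min())   # ordered categories support comparisons like min/max
```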

Example Timeline of Intervention and Evaluation Activities

Month 1
- Intervention: Pilot the intervention with a small group
- Evaluation: Conduct a focus group to refine intervention acceptability and elements of the services utilization plan

Month 2
- Intervention: Recruit into the program; screen for eligibility
- Evaluation: Randomly assign to the program or a wait list; collect baseline and comparison data (participants n = 150; wait-listed controls n = 150)

Month 3
- Intervention: Provide the intervention to the first group of participants
- Evaluation: Analyze baseline, pre-intervention data

Month 4
- Evaluation: Collect post-intervention data (time 1 participants who completed the program n = 125; new nonparticipant controls from the wait list n = 130)

Month 5
- Intervention: Repeat the intervention
- Evaluation: Analyze data

Month 6
- Evaluation: Collect data from previous program participants (time 1, n = 95), current program participants (time 2, n = 120), and current nonparticipant controls (n = 110)
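The month-2 random assignment step can be as simple as shuffling the eligible roster and splitting it. A minimal sketch, assuming numpy and using hypothetical participant IDs and group sizes:

```python
# Hedged sketch of the month-2 step: randomly assigning eligible recruits to
# the program or the wait-list control group. IDs are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=42)          # fixed seed so the assignment is reproducible
eligible_ids = [f"P{i:03d}" for i in range(300)]

shuffled = rng.permutation(eligible_ids)
program_group = sorted(shuffled[:150])        # receive the intervention now
wait_list_controls = sorted(shuffled[150:])   # serve as comparison, receive it later

print(len(program_group), len(wait_list_controls))
```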

Threats to Data Quality
- Missing data
- Reliability: instrument issues, day-to-day individual variability, inter-rater agreement, data entry
- Validity
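Reliability is typically checked empirically once pilot data are available. As one illustrative option (not prescribed by the chapter), internal consistency for a multi-item scale can be estimated with Cronbach's alpha; the sketch below computes it directly from its definition using fabricated item scores.

```python
# Hedged sketch: Cronbach's alpha for a multi-item scale,
# alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
# Item scores are fabricated for illustration.
import numpy as np

items = np.array([
    [4, 5, 4, 3],
    [3, 4, 3, 3],
    [5, 5, 4, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
])  # rows = respondents, columns = scale items

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```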

Contextual Considerations in Evaluation Planning
- Evaluation budget: roughly 10-20% of the implementation budget
- Evaluation standards
- Evaluation ethics
- Stakeholders’ interests

Summary of Evaluation Elements

What to evaluate
- Science considerations: Impact and outcome variables most likely to demonstrate the strength of the evidence for the effect theory
- Program considerations: Highest-priority impact and outcome objectives; variables that meet funding agency requirements

Who to evaluate
- Science considerations: Sample representativeness and comparability to non-participants; ethics of assignment to the program or not
- Program considerations: Accessibility of program participants; availability of easily accessed target audience members

When to evaluate
- Science considerations: Effect onset and duration
- Program considerations: Convenience and accessibility of program participants

Why evaluate
- Science considerations: Scientific contributions and knowledge generation
- Program considerations: Program promotion, program refinement, funding agency requirements

How to evaluate
- Science considerations: Maximize rigor through the choice of measures, design, and analysis
- Program considerations: Minimize intrusion of the evaluation into the program by making the evaluation seamless with program implementation

Effect Evaluation across the Pyramid
Direct services level
- Evaluation of individuals may be the most straightforward
- Questionnaire construction and secondary data analysis are the main considerations
Enabling services level
- Similar to the direct services level
- Identifying participants and choosing the right unit of observation are the main issues

Effect Evaluation across the Pyramid, Continued
Population-based services level
- Major issues are aggregation of data and selecting the unit of observation
Infrastructure level
- Evaluation itself is an infrastructure process
- If the program affects infrastructure, individual-level data may need to be collected
- Infrastructure measures may need to be developed

Coming Up…
- April 19: Chapter 12 (Chapter 13 covered in Research Methods)
- April 26: Chapter 14 (Chapter 15 covered in Research Methods)
- May 3: Final group presentation. You will present the entire proposal in 30 minutes; be creative; Q&A by me as well. Course evaluation and group evaluation.
- May 11: Final exam