Health Program Effect Evaluation Questions and Data Collection Methods
CHSC 433 Module 5 / Chapter 9
L. Michele Issel, PhD, UIC School of Public Health

Objectives
1. Develop appropriate effect evaluation questions
2. List pros and cons for various data collection methods
3. Distinguish between types of variables

Involve evaluation users so they can:
- Judge the utility of the design
- Know the strengths and weaknesses of the evaluation
- Identify differences in criteria for judging evaluation quality
- Learn about methods
- Debate the issues BEFORE the data are in

Terminology
The following terms are used in reference to essentially the same set of activities and for the same purpose:
- Impact evaluation
- Outcome evaluation
- Effectiveness evaluation
- Summative evaluation

Differences between Research and Evaluation
- Nature of the problem addressed: new knowledge vs. assessing outcomes
- Goal of the research: new knowledge for prediction vs. social accounting
- Guiding theory: theory for hypothesis testing vs. theory for the problem
- Appropriate techniques: sampling, statistics, hypothesis testing, etc. vs. fit with the problem

Research-Evaluation Differences

| Characteristic | Research | Evaluation |
| --- | --- | --- |
| Goal or purpose | Generate new knowledge for prediction | Social accounting and program or policy decision making |
| The questions | Scientist's own questions | Derived from program goals and impact objectives |
| Nature of problem addressed | Areas where knowledge is lacking | Assess impacts and outcomes related to the program |
| Guiding theory | Theory used as the base for hypothesis testing | Theory underlying the program interventions; theory of evaluation |

Research-Evaluation Differences (continued)

| Characteristic | Research | Evaluation |
| --- | --- | --- |
| Appropriate techniques | Sampling, statistics, hypothesis testing, etc. | Whichever research techniques fit the problem |
| Setting | Anywhere appropriate to the question | Usually wherever the program recipients and non-recipient controls can be accessed |
| Dissemination | Scientific journals | Internal and external program reports; scientific journals |
| Allegiance | Scientific community | Funding source, policy preference, scientific community |

Evaluation Questions…
- What questions do the stakeholders want answered by the evaluation?
- Do the questions link to the impact and outcome objectives?
- Do the questions link to the effect theory?

From Effect Theory to Effect Evaluation
- Consider the effect theory as a source of variables
- Consider the effect theory as guidance on design
- Consider the effect theory as informing the timing of data collection

From Effect Theory to Variables
The next slide is an example of using the effect theory components to identify possible variables on which to collect evaluation data.

Impact vs. Outcome Evaluations
- Impact evaluation is more realistic because it focuses on the immediate effects, and participants are probably more accessible.
- Outcome evaluation is more policy oriented, longitudinal, and population based, and therefore more difficult and costly. Also, causality (the conceptual hypothesis) is fuzzier.

Effect Evaluation
Draws upon and uses what is known about how to conduct rigorous research:
- Design -- the overall plan, such as experimental, quasi-experimental, longitudinal, or qualitative
- Method -- how the data are collected, such as telephone survey, interview, or observation

Methods --> Data Sources
- Observation --> logs, video
- Record review --> client records, patient charts
- Survey --> participants/non-participants, family
- Interview --> participants/non-participants
- Existing records --> birth & death certificates, police reports

Comparison of Data Collection Methods
Characteristics of each method to be considered when choosing a method:
1. Cost
2. Amount of training required for data collectors
3. Completion time
4. Response rate
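
To make the trade-offs concrete, the toy sketch below (not from the slides; every method, score, and weight is a hypothetical placeholder) ranks candidate methods by a weighted score over these four characteristics.

```python
# Hypothetical decision aid: score candidate data collection methods on the
# four characteristics above (1 = poor, 5 = good; cost and training burden
# are reverse-scored so cheaper/easier = higher). All numbers are invented.
methods = {
    #                      cost  training  completion_time  response_rate
    "telephone survey":    (4,      4,          4,              2),
    "mail survey":         (5,      5,          2,              1),
    "in-person interview": (2,      2,          2,              5),
    "record review":       (3,      3,          3,              4),
}

weights = (0.3, 0.2, 0.2, 0.3)  # set by the evaluation team's priorities

# Rank methods by their weighted total score, best first.
ranked = sorted(methods.items(),
                key=lambda kv: sum(w * s for w, s in zip(weights, kv[1])),
                reverse=True)
for name, scores in ranked:
    total = sum(w * s for w, s in zip(weights, scores))
    print(f"{name:20s} weighted score = {total:.2f}")
```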

Validity and Reliability
- Method must use valid indicators/measures
- Method must use reliable processes for data collection
- Method must use reliable measures
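
Reliability can be checked numerically once data are in hand. Below is a minimal sketch, using simulated data (nothing here comes from the lecture), of two standard checks: test-retest reliability as a Pearson correlation, and internal consistency as Cronbach's alpha.

```python
# Two common reliability checks on made-up survey data.
import numpy as np

rng = np.random.default_rng(0)

# Test-retest: the same 30 participants answer the same item twice.
true_score = rng.normal(50, 10, size=30)
time1 = true_score + rng.normal(0, 3, size=30)
time2 = true_score + rng.normal(0, 3, size=30)
test_retest_r = np.corrcoef(time1, time2)[0, 1]

# Cronbach's alpha: 30 participants x 5 items tapping one construct.
items = np.column_stack([true_score + rng.normal(0, 5, size=30)
                         for _ in range(5)])
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"test-retest r    = {test_retest_r:.2f}")  # consistency over time
print(f"Cronbach's alpha = {alpha:.2f}")          # consistency across items
```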

Variables, Indicators, Measures
- A variable is the "thing" of interest and how that thing gets measured
- Some agencies use "indicator" to mean the number that indicates how well the program is doing
- A measure is the way that the variable is known

It's all just language… stay focused on what is needed.

Levels of Measurement

| Level | Examples | Advantage | Disadvantage |
| --- | --- | --- | --- |
| Nominal, categorical | Zip code, race, yes/no | Easy to understand | |
| Ordinal, rank | Social class, Likert scale, "top ten" list (worst to best) | | Limited information from the data |
| Interval/ratio (continuous) | Temperature, IQ, distances, dollars, inches, dates of birth | Gives the most information; can collapse into nominal or ordinal categories; used as a continuous variable | Can be difficult to construct a valid and reliable interval variable |
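
As an illustration, here is a minimal sketch (assuming pandas is available; all values are invented) of how the three levels map onto data types, and of the one-way street noted in the table: interval/ratio data can be collapsed into ordinal categories, but not the reverse.

```python
import pandas as pd

# Nominal/categorical: labels with no inherent order.
race = pd.Series(["A", "B", "A", "C"], dtype="category")

# Ordinal/rank: ordered categories; distances between levels are unknown.
social_class = pd.Series(
    pd.Categorical(["low", "high", "middle", "low"],
                   categories=["low", "middle", "high"], ordered=True))

# Interval/ratio: continuous numbers; the most informative level.
income = pd.Series([18_000, 42_500, 67_000, 125_000])

# Collapsing continuous -> ordinal loses information and cannot be undone.
income_bracket = pd.cut(income, bins=[0, 30_000, 80_000, float("inf")],
                        labels=["low", "middle", "high"])
print(income_bracket.tolist())  # ['low', 'middle', 'middle', 'high']
```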

Types of Effects as Documented through Indicators
- Indicators of physical change
- Indicators of knowledge change
- Indicators of psychological change
- Indicators of behavioral change
- Indicators of resources change
- Indicators of social change

Advice
"It is more productive to focus on a few relevant variables than to go on a wide-ranging fishing expedition." -- Carol Weiss (1972)

Variables
- Intervening variable: any variable that forms a link between the independent variable and the dependent variable (outcome), and without which the independent variable is not related to the dependent variable.
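
A quick way to see what "forms a link" means is to simulate it. In this minimal sketch (simulated data, not from the slides), the entire effect of X on Y flows through the intervening variable M; once M is held (nearly) constant, X and Y are unrelated.

```python
# An intervening (mediating) variable M carries the entire effect of X on Y
# (X -> M -> Y). All data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

x = rng.normal(size=n)            # independent variable
m = 0.8 * x + rng.normal(size=n)  # intervening variable: the link
y = 0.8 * m + rng.normal(size=n)  # dependent variable (outcome)

print(np.corrcoef(x, y)[0, 1])    # ~0.45: X is related to Y...

stratum = np.abs(m) < 0.1         # ...but only through M: with M held
print(np.corrcoef(x[stratum], y[stratum])[0, 1])  # nearly constant, ~0.0
```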

Variables
- A confounding variable is an extraneous variable that accounts for all or part of the effects on the dependent variable (outcome), masking the underlying true associations.
- A confounder must be associated with the dependent variable AND the independent variable.

Confounders
- Exogenous (outside of individuals) confounding factors are uncontrollable: selection bias, coverage bias.
- Endogenous (within individuals) confounding factors are equally important: secular drift in attitudes/knowledge, maturation (children or the elderly), seasonality, interfering events that alter individuals.

Variable story…
To get from Austin to San Antonio, there is one highway. Between Austin and San Antonio there is one town, San Marcos. San Marcos is the intervening variable because it is not possible to get to San Antonio from Austin without going through San Marcos. The highway is often congested, with construction and heavy traffic. The highway conditions are the confounding variable because they are associated both with the trip (my car, my state of mind) and with arriving (alive) in San Antonio.
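
The confounding half of the story can also be simulated. In this minimal sketch (invented numbers, not from the slides), a confounder Z, playing the role of the highway conditions, drives both the exposure X and the outcome Y, so X and Y look associated even though X has no effect on Y; stratifying on Z makes the spurious association vanish.

```python
# A confounder Z drives both X and Y, creating a spurious X-Y association
# that disappears once Z is held (nearly) constant. Simulated data only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

z = rng.normal(size=n)            # confounder ("highway conditions")
x = 0.8 * z + rng.normal(size=n)  # exposure: associated with Z
y = 0.8 * z + rng.normal(size=n)  # outcome: caused by Z, not by X

print(np.corrcoef(x, y)[0, 1])    # ~0.39: spurious association

stratum = np.abs(z) < 0.1         # stratify on the confounder
print(np.corrcoef(x[stratum], y[stratum])[0, 1])  # ~0.0
```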

Measure Program Impact Across the Pyramid