A “Dose-Response” Strategy for Assessing Program Impact in Naturalistic Contexts
Megan Phillips, Antioch University New England
George Tremblay, Antioch University New England
Michael Duffin, Program Evaluation and Educational Research Associates, Inc.
Presented at the Annual Convention of the Association for Behavioral and Cognitive Therapies, November 18, 2006

RATIONALE
• With increasing accountability pressures in virtually all service sectors (American Psychological Association, 2005), we need evaluation strategies that can be utilized outside of highly controlled research contexts (cf. Strosahl et al., 1998, on the “manipulated training research method”).
• Evaluation of dose-response relationships, while not a “strong” form of causal evidence (McCabe, 2004), has nevertheless been recognized as providing some support for a causal relationship (O’Neill, 2002).

REQUIREMENTS
• Variability in exposure to the intervention
• Identification and measurement of targeted outcomes

ADVANTAGES
• Provides a relatively efficient probe for active program effects, which can warrant further, more rigorous controlled analyses.
• Uses a single measurement event while allowing for the collection of a wide range of dose values.
• Data can be readily aggregated across time or settings.
• Is sensitive enough to detect effects that are small but statistically significant.

LIMITATIONS
• Measurement of dose may be somewhat indirect (e.g., estimates of time exposed to the intervention). Evaluators must be open to site-specific operationalization of the dose measure, which may complicate comparison across programs.
• Requires a deeper understanding of statistics than users of the evaluation data may be accustomed to.
• Evaluators need to provide users of the data with some benchmark for interpreting the significance of observed effect sizes (one common convention is sketched after Figure 2 below).

AN ILLUSTRATION
• The Place-based Education Evaluation Collaborative (PEEC) represents several innovative educational programs that share common themes, such as:
  ◦ enhanced community-school connections
  ◦ increased understanding of and connection to local place
  ◦ increased civic participation
• PEEC maintains an ongoing, cross-program, multi-method evaluation effort.

EVALUATION QUESTION
• Is variability in the dose of a place-based education program (the independent variable) associated with variability in the behaviors and attitudes the program is attempting to impact, i.e., the response (the dependent variable)?
• Sample: 338 educator and 721 student surveys from 55 schools, collected over one year; representative of a wide range of demographic characteristics, grade ranges, and program intensities.

METHOD: Measures
• Dose measures
  ◦ Composite dose was calculated from survey items including:
    - extent of program implementation, measured on a scale of 0 to 4
    - total number of hours of exposure to program elements, with raw hour scores rescaled to a 0-to-4 metric comparable to the program-implementation scale
  ◦ The distribution of composite dose scores across the sample covered the entire range from 0 to 4, offering suitable variability in the independent variable for dose-response calculations.
• Response measures
  ◦ Broad conceptual categories (modules) were developed to match desired program outcomes.
  ◦ Each module was composed of indices designed to capture specific dimensions of that module.
  ◦ Individual survey questions were developed for each index, drawing on items from existing surveys where possible to maximize the validity of comparisons between current and future results and previously collected data.
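A minimal sketch of one way such a composite dose might be computed, assuming a linear rescaling of raw hours and an equal-weight average of the two components (the 100-hour cap and the averaging rule are illustrative assumptions, not taken from the poster):

```python
import numpy as np

def rescale_hours(hours, max_hours=100.0):
    """Rescale raw exposure hours to a 0-to-4 metric comparable to the
    'extent of implementation' item (max_hours is a hypothetical cap)."""
    return 4.0 * np.clip(hours, 0.0, max_hours) / max_hours

def composite_dose(implementation_0_4, hours, max_hours=100.0):
    """Combine the two dose items into a single 0-to-4 composite.
    An equal-weight average is assumed; the poster does not specify
    the exact combination rule."""
    return (implementation_0_4 + rescale_hours(hours, max_hours)) / 2.0

# Example: an educator reporting implementation = 3 and 40 hours of exposure
print(composite_dose(3.0, 40.0))  # -> 2.3
```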

METHOD: Analysis
• Dose-response analysis: multiple regression analyses were used to estimate the percentage of variance in the outcome variables (modules and indices) that could be accounted for by the predictor variable (program dose).
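A minimal sketch of this kind of analysis in Python (the data below are made up, and this single-predictor example only illustrates how the variance-explained and significance figures are obtained; the full evaluation ran regressions across many modules and indices):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical respondent-level data: composite dose (0-4) and a module score
dose = np.array([0.5, 1.2, 2.3, 3.1, 3.8, 0.9, 2.7, 1.6])
module_score = np.array([2.1, 2.4, 3.0, 3.3, 3.9, 2.2, 3.1, 2.6])

# Regress the outcome (module or index score) on program dose
model = sm.OLS(module_score, sm.add_constant(dose)).fit()

print(f"R-squared (variance accounted for by dose): {model.rsquared:.2f}")
print(f"p-value for the dose coefficient: {model.pvalues[1]:.4f}")
```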

RESULTS
• Statistically significant relationships (p < .01) were found between program dose and all outcome measures except two student-level indices and one educator-level index.
• This analysis allowed for the identification of more and less active ingredients of the program (Figures 1 and 2).

Figure 1 (indication of an active ingredient). Overall educator practice was analyzed at the superordinate level by combining average Likert-scale responses for 12 items. The best-fit regression line shows that 19% of the variability in survey responses is predicted by program dose.

Figure 2 (indication of a less active ingredient). Student attachment to place was analyzed at the superordinate level by combining average Likert-scale responses for 15 student survey items. The best-fit regression line shows that 6% of the variability in survey responses is predicted by dose.
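As a possible benchmark for interpreting these figures (the poster does not prescribe one), regression R-squared can be converted to Cohen's f2 = R2 / (1 - R2), conventionally read as small around 0.02, medium around 0.15, and large around 0.35:

```python
def cohens_f2(r_squared):
    """Cohen's f^2 effect size for a regression model:
    f^2 = R^2 / (1 - R^2); ~0.02 small, ~0.15 medium, ~0.35 large."""
    return r_squared / (1.0 - r_squared)

print(cohens_f2(0.19))  # educator practice (Figure 1): ~0.23, medium-to-large
print(cohens_f2(0.06))  # student attachment to place (Figure 2): ~0.06, small
```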

DISCUSSION
• Benefits of the dose-response strategy in the PEEC evaluation context:
  ◦ The data set can now be cumulative year to year.
  ◦ Once an initial investment in survey instrument design and administration was made, future evaluation costs should decline.
• Limitations of the dose-response strategy in the PEEC evaluation context:
  ◦ Relied on self-report data as opposed to more empirically verifiable observations.
  ◦ The psychometric properties of the survey instruments have yet to be validated.

REFERENCES
American Psychological Association. (2005, August). Policy statement on evidence-based practice in psychology. Retrieved February 19, 2006, from
McCabe, O. L. (2004). Crossing the quality chasm in behavioral health care: The role of evidence-based practice. Professional Psychology: Research and Practice, 35,
O’Neill, R. T. (2002, June). A perspective on exposure-response relationships. Paper presented at the annual meeting of the American Association of Pharmaceutical Scientists, Arlington, VA. Retrieved October 25, 2006, from
Strosahl, K. D., Hayes, S. C., Bergan, J., & Romano, P. (1998). Assessing the field effectiveness of acceptance and commitment therapy: An example of the manipulated training research method. Behavior Therapy, 29,

• The data presented here were collected as part of an evaluation conducted by Program Evaluation and Educational Research Associates, Inc., under the supervision of Michael Duffin. The project was undertaken with the support of the Place-Based Education Evaluation Collaborative (PEEC). For more information about PEEC go to:
• An electronic version of this poster can be downloaded from: