Using Qualitative Comparative Analysis to Identify Evidence-based Best Practices in Program Evaluation Brandy Farrar 1 and Jennifer Craft Morgan 2 1 American Institutes for Research 2 Georgia State University

Background
Challenge: evaluating programs whose component programs are non-standard or vary in how they leverage key program outcomes.
Strategy: comparative case study or process evaluation.

Background
Limitations of traditional approaches:
1. It is difficult to identify which programmatic conditions (or combinations of conditions) influence program outcomes.
2. The likelihood that multiple co-occurring programmatic conditions lead to a desired outcome complicates attempts to identify evidence-based best practices.

Objective
The purpose of this presentation is to illustrate the utility of Qualitative Comparative Analysis (QCA) in program evaluations involving mixed-method case study data.

Purpose: create education and career advancement opportunities for frontline health care workers
Partnerships:
- Health care employers
- Educational institutions
- Community organizations

Sample program description: Medical Assistant
- Medical Terminology
- Medical Interpreting I
- Medical Interpreting II
- Job Shadowing
Outcomes: Certified Auxiliary Interpreter; 9 hours college credit; wage premium

Data and Methods
- Case study methodology
- Process and outcome evaluation
- Mixed methods (e.g., surveys, interviews)
- Longitudinal
- Formative and summative reports to sites and funder

Data
Table 1. Jobs to Careers Evaluation Data Elements

Type                                           | # of data elements | # of people
Semi-structured interviews with key informants |                    |
Focus groups with frontline workers            | 39                 | 282
Focus groups with frontline supervisors        | 33                 | 184
Organizational surveys                         | 100                | N/A
Implementation assessment tool                 | 17                 | N/A
Frontline worker survey - baseline             | N/A                | 1129
Frontline worker survey - follow-up            | N/A                | 576
Local economic indicators                      | 17                 | N/A

Qualitative Comparative Analysis
- Allows researchers to quantify and empirically test qualitative data without losing its richness and substantive complexity
- Allows evaluators to assess the significance, magnitude, and parsimony of qualitatively assessed concepts

Qualitative Comparative Analysis
1. Identify the causal conditions that are likely to be related to the outcome of interest
Conditions:
- Career self-efficacy
- Barrier reduction
- Career guidance
- Experiential learning
- Socio-emotional support
- Positive role models
- Organizational learning culture
- Community context

Qualitative Comparative Analysis
2. Calibration
- Barrier reduction. Indicator: educational release time. Description: paid work time to attend class or complete coursework. Calibration metric: 1 = significant paid release time that includes study time and was widely granted by supervisors.
- Experiential learning. Indicator: N/A. Description: extent to which experiential learning was used in curriculum delivery (e.g., discussions of real clients in class; class assignments performed within the context of work responsibilities; credit for prior learning/work experience; precepting; job shadowing). Calibration metric: 1 = experiential learning occurred more often than traditional delivery methods such as lecture.
- Organizational learning culture. Indicator: N/A. Description: aggregated mean of an 11-item survey scale (see Appendix I). Calibration metric: 1 = aggregate mean greater than or equal to 3.0.
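The calibration rule above for organizational learning culture (coded present when the aggregate mean of the 11 survey items is at least 3.0) can be sketched as a small crisp-set calibration function. The function name and the example item scores are invented for illustration; only the 3.0 threshold and the 11-item scale come from the slide.

```python
# Hypothetical sketch of crisp-set calibration for one condition.
# The threshold (3.0) and the 11-item scale follow the slide; the
# data values below are assumptions for illustration only.

def calibrate_learning_culture(item_scores, threshold=3.0):
    """Code the condition as present (1) if the aggregate mean of the
    survey items meets the threshold, otherwise absent (0)."""
    mean = sum(item_scores) / len(item_scores)
    return 1 if mean >= threshold else 0

# Example: one site's (assumed) responses to the 11-item scale
site_items = [3, 4, 3, 2, 4, 3, 3, 4, 2, 3, 4]
print(calibrate_learning_culture(site_items))  # mean ~3.18, so prints 1
```

Each qualitatively assessed condition gets an analogous rule, so that every case can be reduced to a row of 0/1 memberships like the raw data table that follows.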

Qualitative Comparative Analysis
Sample raw data table
Columns: case | barrier | guidance | experiential | support | cohort | orgculture | community | efficacy
Rows: cases A through R (the binary cell values were not captured in this transcript)

Qualitative Comparative Analysis
3. Analysis: tests of consistency and coverage
- Consistency is high if, among the cases displaying a particular combination of conditions, the outcome of interest is also present in the vast majority (80% or more of the time).
- Coverage is high if a particular set of conditions is one of a few, rather than one of many, sets of conditions present when the outcome is present.
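The two tests above can be sketched over calibrated 0/1 case data. This is a minimal illustration, not the authors' implementation: consistency is computed as the share of combination-matching cases that show the outcome, and coverage as the share of outcome-positive cases that the combination accounts for. The condition names and the four mini cases are assumed for the example.

```python
def consistency(cases, combo, outcome="efficacy"):
    """Share of cases displaying the combination that also show the outcome."""
    hits = [c for c in cases if all(c[k] == v for k, v in combo.items())]
    if not hits:
        return 0.0
    return sum(c[outcome] for c in hits) / len(hits)

def coverage(cases, combo, outcome="efficacy"):
    """Share of outcome-positive cases that display the combination."""
    positives = [c for c in cases if c[outcome] == 1]
    if not positives:
        return 0.0
    covered = [c for c in positives if all(c[k] == v for k, v in combo.items())]
    return len(covered) / len(positives)

# Invented mini data set in the shape of the slide's raw data table
cases = [
    {"barrier": 1, "guidance": 1, "efficacy": 1},
    {"barrier": 1, "guidance": 1, "efficacy": 1},
    {"barrier": 1, "guidance": 0, "efficacy": 0},
    {"barrier": 0, "guidance": 1, "efficacy": 1},
]
combo = {"barrier": 1, "guidance": 1}
print(consistency(cases, combo))  # 1.0: both matching cases show the outcome
print(coverage(cases, combo))     # ~0.67: 2 of 3 outcome cases display it
```

A combination can be perfectly consistent yet cover only a sliver of the outcome cases, which is why the analysis reports both measures side by side.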

Qualitative Comparative Analysis
3. Analysis: tests of consistency and coverage
[Figure: hypothetical cases comparing combinations of conditions against the outcome; the blue combination has the best consistency and the better coverage]

Qualitative Comparative Analysis
4. Analysis: parsimony
- Distinguish necessary versus sufficient combinations of conditions associated with the outcome.
- A condition is necessary if it is present in all instances of the outcome.
- A condition is sufficient if its presence can produce the outcome, even though it is not the only condition that can do so.
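The necessity and sufficiency tests above reduce to two set-theoretic checks on the calibrated data. The sketch below uses invented case data and condition names; it is illustrative rather than the presenters' own analysis code.

```python
def is_necessary(cases, condition, outcome="efficacy"):
    """Necessary: the condition is present in every case showing the outcome."""
    return all(c[condition] == 1 for c in cases if c[outcome] == 1)

def is_sufficient(cases, condition, outcome="efficacy"):
    """Sufficient: whenever the condition is present, the outcome occurs
    (other conditions may also produce the outcome)."""
    present = [c for c in cases if c[condition] == 1]
    return bool(present) and all(c[outcome] == 1 for c in present)

# Assumed mini data set for illustration
cases = [
    {"guidance": 1, "barrier": 1, "efficacy": 1},
    {"guidance": 1, "barrier": 0, "efficacy": 1},
    {"guidance": 0, "barrier": 1, "efficacy": 0},
]
print(is_necessary(cases, "guidance"))   # True: present in all outcome cases
print(is_sufficient(cases, "guidance"))  # True: outcome occurs whenever present
print(is_sufficient(cases, "barrier"))   # False: present in a no-outcome case
```

In practice the same checks are run on combinations of conditions rather than single ones, which is how the four sufficient combinations reported next were identified.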

Combinations of causal conditions associated with high career self-efficacy
- No necessary conditions
- 4 combinations of sufficient conditions

Combinations of causal conditions associated with high career self-efficacy
[Results table not captured in this transcript]

Discussion
- A comprehensive approach is needed to engender career self-efficacy
- But one size does not fit all
- Community context was not as important as we suspected
- Focused efforts in a few, rather than all, areas can still produce high career self-efficacy among program participants

Conclusion
QCA allows more precise specification of:
- The importance and relevance of qualitatively assessed process evaluation constructs
- How constructs work together to facilitate or constrain desired program outcomes
- Which conditions are necessary and which are sufficient, helping resource-strapped program planners develop effective, lower-resource programs

Questions? For further information contact: Brandy Farrar (202)