Conducting Successful Program Evaluations: Researchers Reveal Their Insights Sue Palsbo, PhD Thilo Kroll, PhD NRH Center for Health & Disability Research

Why Do an Evaluation? Does the program work? Why? How? Who benefits? Who loses? Who pays? How does it compare to other programs?

Evaluation Process State the Performance Measure → Collect Data → Analyze Data → Determine the Outcome
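To make the cycle concrete, here is a minimal sketch in Python of one pass through the four steps. The performance measure, target, and scores are invented for illustration; they are not from the MnDHO evaluation discussed later.

```python
# Hypothetical one-pass walkthrough of the evaluation cycle.
# Measure, target, and data below are illustrative assumptions only.

# 1. State the performance measure.
MEASURE = "share of enrollees rating overall satisfaction 4+ on a 5-point scale"
TARGET = 0.80  # program counts as meeting the goal at 80% or above

# 2. Collect data (stand-in survey responses).
satisfaction_scores = [5, 4, 3, 5, 4, 4, 2, 5, 4, 3]

# 3. Analyze the data.
share_satisfied = sum(1 for s in satisfaction_scores if s >= 4) / len(satisfaction_scores)

# 4. Determine the outcome against the stated target.
outcome = "met" if share_satisfied >= TARGET else "not met"
print(f"{MEASURE}: {share_satisfied:.0%} (target {TARGET:.0%}) -> goal {outcome}")
```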

Who Cares? Different stakeholders are interested in different outcomes Different outcomes require different methods Different methods require different data sources

Funding Considerations Public or private? Contracts or grants? Single source or multiple? Lead time Scope and/or size Funding organization’s priorities

Diverse Stakeholders Diverse interests in outcomes require a diverse set of measures, quantitative and qualitative, blended together for a total picture

Evaluation Types Formative –Immediate feedback allows program modifications and mid-course corrections Summative –Post-program perspective –Comprehensive overview of outcomes –Total impact of the intervention or program

Design Considerations Unit of analysis –Person, Provider, State Funding for the study Single vs. multi-method approach Short-term vs. long-term outcomes Legacy impacts, e.g., sustainability and duplication

How to Select the Evaluators Your evaluation questions and methods indicate the skill set you need In-house or external? Sole-source contracting Competitive bidding

Example: Minnesota Disability Health Options (MnDHO)

Stakeholders

Evaluation Funding

Evaluation Design Consortium model Prospectively designed Contemporaneous (formative) Highly leveraged 3-year cumulative

MnDHO Goals 1. Create and maintain satisfaction 2. Promote overall well-being of enrollees 3. Meet cost and utilization goals.

Measurement Domains 1. Satisfaction 2. Quality of care 3. Utilization and patterns of care 4. Costs and rate setting
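Echoing the “Who Cares?” slide, each domain tends to call for its own method and data source. A simple crosswalk keeps them aligned; the entries below are hypothetical examples, not the actual MnDHO measurement plan.

```python
# Illustrative domain -> (method, data source) crosswalk.
# Entries are hypothetical, not the actual MnDHO plan.
measurement_plan = {
    "Satisfaction":        ("CAHPS-based survey", "enrollee self-report"),
    "Quality of care":     ("focus groups",       "enrollee narratives"),
    "Utilization":         ("claims analysis",    "administrative data"),
    "Costs/rate setting":  ("actuarial review",   "payment records"),
}

for domain, (method, source) in measurement_plan.items():
    print(f"{domain}: {method} using {source}")
```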

Measurement Selection Criteria 1. Use existing data. 2. Avoid duplication of information. 3. Avoid recall bias. 4. Minimize respondent burden.

Preliminary Findings From Mixed-Methods Evaluation Longitudinal CAHPS®-based survey –At MnDHO enrollment –1 year post-enrollment Focus groups –MnDHO enrollees –Eligible people who chose not to enroll

Survey Content General satisfaction Care coordination experience Access to a wide variety of health and long-term-care services Self-directed care Quality of interactions with health care providers Quality of interactions with MA or AXIS staff

Longitudinal Survey 100 MnDHO participants – 35 enrollees have completed both baseline and follow-up surveys. Baseline –Health care experiences in the year before enrollment. Follow-up –Health care experiences in MnDHO.
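With baseline and follow-up responses from the same 35 enrollees, the natural analysis is a paired comparison. A minimal sketch using SciPy’s paired t-test, on simulated 0–10 satisfaction ratings (the data are made up, not the actual survey results; real CAHPS-style items are often analyzed with nonparametric or categorical methods instead):

```python
# Paired baseline vs. follow-up comparison on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 35  # enrollees with both baseline and follow-up surveys

baseline = rng.integers(3, 9, size=n).astype(float)            # 0-10 ratings
follow_up = np.clip(baseline + rng.normal(0.5, 1.5, size=n), 0, 10)

t_stat, p_value = stats.ttest_rel(follow_up, baseline)
print(f"mean change: {(follow_up - baseline).mean():+.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```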

4 Focus Groups (October): 2 with MnDHO enrollees, 2 with fee-for-service Medical Assistance program participants

Care Coordination. In the year before / after you enrolled in AXIS/UCare Complete, did anyone help manage the health care services you received from different doctors, nurses, therapists, PCAs, or equipment providers?

Focus Group Statements The MnDHO / AXIS Experience: –“She’ll (health coordinator) line me up with my appointments and stuff. She’ll find my therapists and my physical equipment…, all that stuff.” –“My stress level has been relieved somewhat. I’m able to focus on more vocational and future issues, as opposed to the day-to-day healthcare issues…”

Focus Group Statements The fee-for-service experience: –“I spend most of my time on the phone calling people, setting up appointments, or trying to get services myself, and that is very tiring. I get exhausted because I do have MS.” –“Our experience was so bad we had to hire our own coordinator. It seems like if the person is stable…, people don’t return their calls, people don’t seem to care…”

Self-direction in health care. In the year before / after you enrolled in AXIS/UCare Complete, were you involved as much as you wanted in making decisions about your health care?

Focus Group Statement “My healthcare coordinator, she always involves me. It is our lives. They can give us all the information, but ultimately it comes down to us, what we want to do… it comes to us making the final decision… The healthcare coordinators are always there to give you the pros and cons.”

Recommendations It’s worth the time and money to do a program evaluation Do a formative evaluation so you can make changes in real time A summative evaluation tells you what worked and what didn’t, so you can improve the next effort Mixed methods together provide more information than any single method