Program Evaluation: A Pseudo-Case Study

Presentation transcript:

Program Evaluation: A Pseudo-Case Study
Juliana M. Blome, Ph.D., MPH
Office of Program Analysis and Evaluation, National Institute of General Medical Sciences
MORE Program Directors Meeting, Colorado Springs, Colorado, June 12, 2009

We’re looking for……

But we often get…..

An evaluation plan should include:
- Program description
- Purpose & rationale for evaluation
- Evaluation design
- Data collection & analyses
- Products of evaluation & their use
- Project management
- Budget estimate

Program Description
Include program goals and baseline data. Program goals are the intended effects of a program, and activities should be organized to achieve specific goals. Types of goals: process, intermediate, long-term.

Long-term goal example:
- Weak: To train a diverse biomedical workforce
- Strong: To significantly increase the # of URMs graduating with baccalaureate STEM degrees and persisting through to graduate study

Program goals are the intended effects of a program, as noted in authorizing legislation or other documents written when the program was established. In some cases, additional program goals that are not listed in official documents may be included in the evaluation. For a program that is not yet established, the program goals should summarize the anticipated effects of the new program. There are three types of program goals:
- Process goals – describe how the program should operate and what levels of output should be expected.
- Intermediate goals – describe specific outcomes the program should achieve in the near term.
- Long-term goals – describe the ultimate outcomes the program is designed to achieve.

Purpose & Rationale of Evaluation
- Type of evaluation: Needs assessment, feasibility study, process evaluation, or outcome evaluation?
- Timing: Why is right now the time to conduct an evaluation?
- Program maturity: Is it reasonable to expect certain levels of output or measurable changes at this stage?

Evaluation Design
An evaluation design should include:
- Study questions
- Target population
- Key variables
- Conceptual framework, if applicable

Study Questions
What are the key questions the evaluation is designed to answer? Key questions link to the stated purpose of the evaluation and to program activities. Include any hypotheses that will be tested.
Examples:
- How is the training program being implemented? (process)
- What factors have inhibited the achievement of goals? (process)

Examples (continued):
- What has been the impact of the training program on the participants? (outcome)
- What is the quality and character of the mentorship being provided in the program? (outcome)
- How and to what extent does the program increase student skills and knowledge about laboratory research? (outcome)

Institutional Impact Questions
How has the training program affected your institution? Institutions have structures defined by formal rules (laws, regulations, policies) and informal rules (culture, tradition, trust, implied codes of conduct) that shape people's behavior.
Where might we see institutional change?
- Curriculum development
- Policies and practices
- Services and support offered to students and faculty
- Increased faculty awareness of and responsibility for diversity
- Impact on students not supported by the training program

Reported Effects on Institutions – NSF Louis Stokes Alliances for Minority Participation (LSAMP) Program
LSAMP has affected institutions in multiple ways. Interviewees report that LSAMP has enhanced institutional capacity for student talent development and brought about changes in institutional culture as well as in institutional policies and practices. Through LSAMP services and support, institutions assist students in their efforts to continue through the STEM pipeline. All three case study Alliances, along with other Alliances, report increases in minority and nonminority STEM enrollment and STEM degree attainment.
In all three of the case studies, interviewees observed a change in institutional culture. For example, some COAMP interviewees spoke about greater faculty awareness, understanding, and responsibility for diversity. In the case of FGAMP, some credited the project with increasing dialogue among faculty about effective teaching and learning strategies and with opening up research labs to undergraduates. Similarly, some of the NYC LSAMP interviewees spoke about how more professors now see research as an integral part of the undergraduate experience and how institutions are placing a greater focus on affirming the equal opportunity clause.
In addition, across the three case study sites, significant changes in practice and policies are attributed to LSAMP. For instance, projects such as the NYC LSAMP are heavily pursuing course restructuring; over 18,000 students are reported to have enrolled in LSAMP-restructured courses. Data drawn from the telephone interviews show that over half of the LSAMP projects are engaged in course reform efforts. The case study data reveal the varying nature of LSAMP-inspired changes taking place across the partner sites, including a new emphasis on student participation in research grant proposals, the pursuit of research expositions by individual schools, development of a schoolwide research opportunity database, improvements in advisement procedures, creation of a standardized campus scholarship/funding procedure, and enhancement of community outreach and recruitment. Some participants noted that LSAMP serves as a "great recruitment tool" for schools and that the prestige and recognition it brings help participating institutions secure funding to bring other intervention programs to campus.

Target Population
What group or groups do you need information about in order to answer the study questions? People or institutions? Size, general characteristics, any subgroups, etc.?
Examples:
- Trainees and students
- Project managers
- Academic coordinators
- Faculty
- High-ranking administrators

Key Variables
What specific information is needed to answer the study questions?
Examples:
- Program resources – funding, staffing, infrastructure
- Population characteristics – demographics
- Program activities – operations, processes, other activities
- Program goals, performance measures, and comparison measures. For example:
  Program goal: Provide training opportunities for participants
  Performance measure: Minimum # of workshops held per year
  Comparison measure: At least 4 workshops held per year (recognized standard of performance)
- External factors – factors beyond the control of the program that may influence program success
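As a minimal sketch (not from the slides), the goal / performance measure / comparison measure triplet above can be kept as a small record and checked programmatically; the names and figures below are illustrative only, drawn from the workshop example.

# Hypothetical sketch: a program goal with its performance measure and
# comparison standard, plus a check of observed performance against the standard.

from dataclasses import dataclass

@dataclass
class Measure:
    goal: str                  # intended effect of the program
    performance_metric: str    # what is counted
    comparison_standard: int   # recognized standard of performance

def meets_standard(observed: int, measure: Measure) -> bool:
    """Return True if the observed count meets or exceeds the standard."""
    return observed >= measure.comparison_standard

workshops = Measure(
    goal="Provide training opportunities for participants",
    performance_metric="Workshops held per year",
    comparison_standard=4,  # "At least 4 workshops held per year"
)

print(meets_standard(5, workshops))  # True: 5 workshops meets the standard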

Conceptual Framework
Consider developing a conceptual framework (logic model) to illustrate how the program is supposed to achieve its goals.
What is a logic model? "A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve."
– W.K. Kellogg Foundation Logic Model Development Guide (www.wkkf.org/Pubs/Tools/Evalaution/Pub3669.pdf)

Model of a Training Program
- Resources (Inputs) – What is invested? Faculty & staff; money; equipment & technology; research base
- Activities (Outputs) – What is done? Workshops & seminars; training in scientific methods; mentoring by a faculty member
- Impact (Outcomes) – What are the changes or benefits?
  Short term: knowledge, skills, attitudes
  Intermediate: behaviors, practices
  Long term: enter a PhD program
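A logic model like the one above can also be written down as a simple data structure, which makes the input-activity-outcome chain explicit. The sketch below is illustrative only; it just mirrors the elements listed on the slide and is not a prescribed format.

# Illustrative sketch: the training-program logic model as plain data.

training_program_logic_model = {
    "resources (inputs)": [           # What is invested?
        "Faculty & staff",
        "Money",
        "Equipment & technology",
        "Research base",
    ],
    "activities (outputs)": [         # What is done?
        "Workshops & seminars",
        "Training in scientific methods",
        "Mentoring by a faculty member",
    ],
    "impact (outcomes)": {            # What are the changes or benefits?
        "short term": ["Knowledge", "Skills", "Attitudes"],
        "intermediate": ["Behaviors", "Practices"],
        "long term": ["Enter PhD program"],
    },
}

# Print each stage of the model with its elements.
for stage, elements in training_program_logic_model.items():
    print(stage, "->", elements)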

Conceptual Framework: Why should we use one?
- Increases understanding of the program
- Provides a common language & framework
- Links activities to results
- Helps identify variables to measure
- Reflects group process and shared understanding
- Strengthens the case for program investment
"The bane of evaluation is a poorly designed program." – Ricardo Miller, Director, Evaluation Unit, W.K. Kellogg Foundation

Data Collection and Analysis
- Will you use new data or secondary data?
- Will it be quantitative, qualitative, or mixed?
- Are there appropriate comparison groups?
- How will you collect the data?
- Are there ethical or IRB considerations?
- What are the limitations of the data?

Typical Data Collection Strategies
- Bibliometric analysis – Pro: quantitative; useful in aggregate as a tool to assess the quality of medical research. Con: measures only quantity; can be artificially influenced.
- Case studies – Pro: provide understanding of the interaction of various influences on the research process. Con: cases are not necessarily representative within or across programs.
- Database extractions, document reviews – Pro: useful for analyzing archival data (databases, program records, literature reviews, etc.). Con: records may be incomplete.

Typical Data Collection Strategies (cont.)
- Expert panel – Pro: useful in research fields, especially when few quantifiable indicators exist. Con: difficult to obtain a systematic, objective assessment.
- Focus groups – Pro: provide understanding of attitudes and thoughts on a subject; group dynamics can help elicit honest responses. Con: results cannot be statistically generalized to larger populations; not quantifiable.
- Interviews – Pro: offer insight from the perspective of specific program roles and expertise. Con: limited perspective; time-intensive.
- Surveys – Pro: generate statistically reliable data (ratings of services, behavior, demographics, etc.). Con: require a statistically representative sample & an adequate response rate.

Project Management: Who Participates?
- Program manager and staff – Contributions: program knowledge. Challenges: vested interest.
- Evaluator (internal vs. external) – Contributions: evaluation expertise, independence. Challenges: limited program knowledge.
- Evaluation Advisory Committee – Contributions: program familiarity, organizational context.
- Senior leader/decision-maker – Contributions: resources.
Contributions and challenges are not exhaustive – just key highlights. Organizational context includes knowledge of budget constraints, the NIH and/or IC scientific portfolio, and external pressures (Congress, advocacy groups). For example, a program might be doing well, but if it is duplicative of another program, it may need to be modified or terminated. Organizational context is critical to using evaluation results effectively.

Budget Estimate
"Rule of thumb" – about 10% of the project's total budget.
Common pitfalls:
- Failure to consider evaluation in up-front planning
- Lack of resources for analysis & interpretation
- Lack of time – be realistic & consider the time needed for each step
- Qualitative evaluation – more costly to implement
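As a worked example (illustrative figures, not from the slides): under this rule of thumb, a project with a total budget of $500,000 would plan on roughly $500,000 × 0.10 = $50,000 for the evaluation.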

Products of Evaluation
What reports or products are planned?
- Executive summary & final report
- Briefings – for students, faculty, & administrators
How will the results be used?

The Evaluation Design Matrix: A Tool for Discussion
- Key question(s) – What do you want to know? Clear and specific; measurable; doable; key terms defined; scope (timeframe, population)
- Information required – What do you need to answer the question? Program goals; evidence; program criteria; participant rates; cost information; funding levels
- Information source(s) – Where are you going to get it? Program officials or participants; external stakeholders; documents; databases; journals
- Data collection methods – How are you going to get it? Structured interviews; focus groups; structured surveys; case studies; data extractions; document retrieval
- Data analysis methods – What will you do with it once you get it? Descriptive statistics; inferential statistics (t-test, regression); cost/benefit analysis; qualitative analysis
- Limitations – What can't you do (caveats)? Data quality or reliability; access to records; staffing/funding constraints
- Conclusions – What can you say? Generalizations; unexpected findings; anecdotal information; precise statements about the sample; impact of program changes
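One lightweight way to use the matrix in practice is to keep each key question as a record whose fields are the seven columns. The sketch below is a hypothetical example, with entries drawn from the prompts and examples earlier in this presentation; the field names are not prescribed anywhere in the slides.

# Hypothetical sketch: one row of the evaluation design matrix as a record.

design_matrix_row = {
    "key_question": "How is the training program being implemented?",
    "information_required": ["Program goals", "Participant rates", "Funding levels"],
    "information_sources": ["Program officials or participants", "Documents", "Databases"],
    "data_collection_methods": ["Structured interviews", "Document retrieval"],
    "data_analysis_methods": ["Descriptive statistics", "Qualitative analysis"],
    "limitations": ["Records incomplete", "Staffing/funding constraints"],
    "conclusions": ["Precise statements about the sample"],
}

# The full matrix is then just a list of such rows, one per key question.
evaluation_design_matrix = [design_matrix_row]
for row in evaluation_design_matrix:
    print(row["key_question"], "->", row["data_collection_methods"])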

Why do we ask for program evaluation? Because we're accountable.
- What gets measured gets done
- If you don't measure success, you can't reward it
- If you can't reward success, you're probably rewarding failure
- If you can't see success, you can't learn from it
- If you can't recognize failure, you can't correct it
- If you can demonstrate results, you can win public support
Osborne & Gaebler (1992), Reinventing Government, as summarized by Ellen Taylor-Powell, University of Wisconsin Extension