Data is your friend: how to incorporate evaluation into your research
Allison Karpyn, Ph.D., Associate Director & Assistant Professor of Education, Behavioral Health and Nutrition. May 23, 2016
Goals
- Have a meaningful, engaged discussion about research and evaluation
- Think more about the intersection of research and evaluation
- Encourage process evaluation data collection in research (feedback loops, reporting)
Evaluation Evokes Different Feelings
When I say "evaluation" to you, what feelings does it evoke? Write your answer on your sticky note.
What is evaluation?
Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness (American Evaluation Association). It is "the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of ... programs" (Rossi & Freeman).
Issues in Evaluation
Evaluation or research?
There is not a clear answer.
"Social science research does not establish standards or values and then integrate them with factual results to reach evaluative conclusions. In fact, the dominant social science doctrine for many decades prided itself on being value free. So for the moment, social science research excludes evaluation." (Scriven)
"Research aims to produce knowledge and truth; useful evaluation supports action." (Cronbach & Suppes)
"In short, how to define evaluation and what to call a particular evaluation are matters for discussion, clarification and negotiation. Therefore I find it useful to distinguish research from evaluation to facilitate this very discussion." (Patton)
What is evaluation?
Turn to neighbor… Share your most recent evaluation experience!
What was most challenging? What did you need to know or have that you did not?
Public Health Models to Guide Program Planning and Evaluation
CDC's Evaluation Framework is a concise, thorough approach that can be easily understood and applied. The framework, which has been adapted for use with communities of practice (CoPs), emphasizes six logical steps and can be used as a starting point for CoP evaluation. Each of the six steps is described below.

Engage Stakeholders. The evaluation cycle begins by engaging stakeholders (i.e., the persons or organizations with an investment in what will be learned from an evaluation and what will be done with the knowledge). Public health work, including CoPs, involves partnerships; therefore, any assessment of a public health program requires considering the value systems of the partners. Stakeholders should be engaged in a way that ensures their perspectives can be heard and understood. When stakeholders are not engaged in this way, evaluation findings might be ignored, criticized, or resisted because they do not address the stakeholders' questions or values. After becoming involved, stakeholders help to execute the remaining steps.

Describe the Community. Community descriptions convey the mission and objectives of the CoP being evaluated. Descriptions should be sufficiently detailed to ensure understanding of the CoP's goals and strategies. The description should discuss the CoP's capacity to effect change, its stage of development, and how it fits into the larger public health community. CoP descriptions set the frame of reference for all subsequent decisions in an evaluation. The description enables comparisons with similar CoPs and facilitates attempts to connect community components to their effects. Moreover, stakeholders may have differing ideas regarding the CoP's goals and purpose. Evaluations done without agreement on the community definition are likely to be of limited use. Sometimes, negotiating with stakeholders to formulate a clear and logical description will bring benefits before data are available to evaluate CoP effectiveness.

Focus the Evaluation Design. The direction and process of the evaluation must be focused to assess the issues of greatest concern to stakeholders, while using time and resources as efficiently as possible. Not all design options are equally well-suited to meeting the information needs of stakeholders. After data collection begins, changing procedures might be difficult or impossible, even if better methods become obvious. A thorough plan anticipates intended uses and creates an evaluation strategy with the greatest chance of being useful, feasible, ethical, and accurate.

Gather Credible Evidence. Persons involved in an evaluation should strive to collect information that will convey a well-rounded picture of the CoP and be seen as credible by the evaluation's intended audience. Information (i.e., evidence) should be perceived by stakeholders as believable and relevant for answering their questions. Such decisions depend on the evaluation questions being posed and the motives for asking them. Credible evidence strengthens evaluation conclusions and the recommendations that follow.

Justify Conclusions. Evaluation conclusions are justified when they are linked to the evidence gathered and judged against agreed-upon values or standards set by the stakeholders. Stakeholders must agree that conclusions are justified before they will use the evaluation results with confidence.

Ensure Use and Share Lessons Learned. Lessons learned during an evaluation do not automatically translate into informed decision-making and appropriate action. To make sure evaluation findings are used correctly, deliberate effort is needed to ensure that the evaluation processes and findings are disseminated and interpreted appropriately. Preparing for use involves strategic thinking and continued vigilance of the changing environment, both of which begin in the earliest stages of stakeholder engagement and continue throughout the evaluation.

Assess the Quality of Your Evaluation Activities
- Utility: Does the evaluation serve the information needs of intended users?
- Feasibility: Is the evaluation realistic, prudent, diplomatic, and frugal?
- Propriety: Has the evaluation been conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results?
- Accuracy: Will the evaluation reveal and convey technically adequate information about the features that determine the worth or merit of the CoP?
CDC Evaluation Framework
Ecological Model: Changing Systems
The Logic Model
Logic is extremely good for action planning.
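The slides do not include code, but as a minimal sketch of the idea, a logic model can be written down as a simple data structure that makes the chain from inputs to outcomes explicit and suggests what process-evaluation data to collect. Every program name and value below is hypothetical, not taken from the presentation.

```python
# A hypothetical logic model for a nutrition-education program.
# The stage names follow the standard logic-model chain; the
# contents are invented for illustration only.
logic_model = {
    "inputs": ["staff time", "grant funding", "community partners"],
    "activities": ["weekly cooking classes", "grocery store tours"],
    "outputs": ["classes held", "participants reached"],
    "short_term_outcomes": ["improved nutrition knowledge"],
    "long_term_outcomes": ["increased fruit and vegetable intake"],
}

# Process evaluation focuses on the middle of the chain: were the
# activities delivered as planned, and did they yield the outputs?
for stage in ("activities", "outputs"):
    print(f"{stage}: {', '.join(logic_model[stage])}")
```

Writing the model out this way is one (hypothetical) way to build the feedback loop the talk encourages: each stage names something to measure and report back to stakeholders.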
How can evaluation be integrated into research?
How are data used for action and impact?
New Stanford Social Innovation Review Article
Article: Common Evaluation Questions
How much did we spend? How much did we do? How much did it matter?
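As a concrete illustration of these three questions (a sketch with invented numbers, not data from the article), the snippet below turns program records into cost-per-output figures and a simple pre/post change:

```python
# Hypothetical program records; all figures are made up for illustration.
spend_dollars = 48_000       # How much did we spend?
classes_delivered = 120      # How much did we do?
participants = 300

pre_score = 2.1              # mean vegetable servings/day at baseline
post_score = 2.8             # mean servings/day after the program

cost_per_class = spend_dollars / classes_delivered
cost_per_participant = spend_dollars / participants
change = post_score - pre_score  # How much did it matter?

print(f"Cost per class: ${cost_per_class:,.2f}")
print(f"Cost per participant: ${cost_per_participant:,.2f}")
print(f"Change in mean daily vegetable servings: {change:+.1f}")
```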
Thinking with a colleague…
Spend a few minutes talking about data on:
- How much we spend
- How much we do
- How much it matters
Could these issues be better leveraged in your work?
Share thoughts
Final Thoughts
- Evaluation is about use; research sometimes is not.
- Ensuring that information comes full circle and makes sense to all who are involved is important.
- Evaluation and research use can be scary, and not always welcome.
- Advocacy needs data: how do we get the data we have into the hands of policy-makers?
- Problem-centered thinking vs. promoting what is working.
Thank you!
Allison Karpyn
Karpyn@udel.edu
610-909-3154