New Directions in Community Strengthening
LCSA Conference, September 2011
Sue Leahy
Agenda
1. ARTD – directions in good evaluation
2. Community Services – evaluation framework and strategy
3. LCSA – monitoring data collection and reporting
4. Discussion (10 mins)
5. Q & A (20 mins)
A new direction
- CS evaluation framework and strategy set a new direction for monitoring and evaluation
- Previously ad hoc, no aggregate data, and the challenge of measuring outcomes not addressed
- New approach to build a stronger evidence base
- Framework and monitoring system based on a program performance matrix
- Informed by key literature
- Collaborative / partnership approach
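To make the idea of aggregate monitoring data concrete, here is a minimal sketch of how service-level monitoring records might be rolled up into sector-wide figures. The field names, services and numbers are hypothetical, invented purely for illustration; they are not drawn from the Community Services monitoring system or the program performance matrix.

    # Minimal sketch only: hypothetical monitoring records, not real data.
    from collections import defaultdict

    records = [
        {"service": "Neighbourhood Centre A", "activity": "community event", "participants": 40},
        {"service": "Neighbourhood Centre A", "activity": "skills workshop", "participants": 12},
        {"service": "Neighbourhood Centre B", "activity": "community event", "participants": 25},
    ]

    # Aggregate participation by activity type across all services,
    # the kind of roll-up that an ad hoc, per-service approach cannot give.
    totals = defaultdict(int)
    for record in records:
        totals[record["activity"]] += record["participants"]

    for activity, count in sorted(totals.items()):
        print(f"{activity}: {count} participants across services")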
Let's talk evaluation...
Evaluation is complex
- a technical and political process
- dealing with stakeholders with different expectations about methods and evidence
- in an increasingly complex public policy context
- and not just evaluation studies, but frameworks, strategies, designs, plans
We wish...
Early documented evaluation
Genesis 1:1 "In the beginning God created the heaven and the earth..."
Genesis 1:31 "And God saw every thing that he had made, and, behold, it was very good. And the evening and the morning were the sixth day."
Some influences over time
- 1960s: US Great Society; Donald Campbell; experimental evaluation
- 1970s: evaluation not used; MQ Patton; utilisation focus
- 1980s: paradigm wars; quantitative vs qualitative
- 1990s: new public management; performance indicators
- 2000s: look inside the black box; program logic, theory-based evaluation; “Nothing works”; Tilley and Pawson; realist evaluation
- 2010s: complexity; MQ Patton; systems thinking, developmental evaluation
Different approaches and methods
Evaluation is just one form of evidence
- synthesis of evidence
- research
- consultation
- political research
- program evaluation
- killer facts!
Monitoring, evaluation or research?

Purpose
- Monitoring: accountability; assurance the program is on track / early warning signs
- Evaluation: accountability; inform decisions; determine worth or value
- Research: new knowledge

Methods
- Monitoring: administrative data collection
- Evaluation: systematic inquiry
- Research: scientific inquiry

Reporting
- Monitoring: for funding bodies; for service providers
- Evaluation: to agreed stakeholders
- Research: scientific publication
Defining evaluation
- the systematic assessment, on the basis of evidence, of the appropriateness, effectiveness, efficiency and/or economy of a program
- the systematic acquisition and assessment of information to provide useful feedback about some object (Trochim 2006)
- the production of knowledge based on systematic enquiry to assist decision-making about a program (Owen 2009)
- etc., etc.
So program evaluation is
- Systematic -> research methods -> data from monitoring
- Judges -> assessment based on evidence
- Applied -> to public policy purposes
But evaluation is also a social and political activity
A negotiated process between stakeholders with varied and competing interests in the program:
- different evaluation purposes
- varied views of credible evidence
- diversity of orientations, designs, methods, disciplines
Common features of evaluations
- questions about efficiency / effectiveness / appropriateness
- scope shaped by the purpose and questions
- reach conclusions – judgements using evidence and evaluative arguments
Contested features of evaluations
- methods – qualitative vs quantitative
- what constitutes credible evidence, or when are experimental methods (RCTs) feasible and worthwhile?
- orientations – experimental, constructivist, theory-based, realist, pragmatic
More complex policy world
- wicked problems
- programs increasingly complex
- multiple actors
- multiple / unclear outcomes
Greater interest in economics
- program evaluation often gives limited attention to cost effectiveness
- economic evaluation often needs measured outcomes and an effect size (e.g. from an RCT)
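As a worked illustration of why economic evaluation needs measured outcomes and an effect size, the sketch below computes an incremental cost-effectiveness ratio and a standardised effect size (Cohen's d). All figures and names are hypothetical, chosen only to show the arithmetic; they do not come from any actual evaluation.

    # Illustrative arithmetic only: hypothetical program and comparison figures.

    def incremental_cost_effectiveness(cost_program, cost_comparison,
                                       outcome_program, outcome_comparison):
        """Extra dollars spent per extra unit of outcome achieved."""
        extra_cost = cost_program - cost_comparison
        extra_outcome = outcome_program - outcome_comparison
        return extra_cost / extra_outcome

    def cohens_d(mean_program, mean_comparison, pooled_sd):
        """Standardised effect size: difference in means, in standard deviation units."""
        return (mean_program - mean_comparison) / pooled_sd

    # Hypothetical numbers: a $500,000 program that supports 120 people into
    # sustained participation, versus $300,000 and 70 people without it.
    icer = incremental_cost_effectiveness(500_000, 300_000, 120, 70)
    print(f"Cost per additional participant: ${icer:,.0f}")   # $4,000

    # Hypothetical survey scores on a 0-10 connectedness scale, pooled SD of 2.0.
    print(f"Effect size: {cohens_d(6.8, 6.0, 2.0):.2f}")      # 0.40

The point is simply that without a credible estimate of the outcome difference (the denominator, and the effect size behind it), cost figures alone cannot support a cost-effectiveness judgement.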
Understanding outcomes
Role of program logic
- the causal links between elements of a program and its intended outcomes
- focus on outcomes, not processes
- a diagram showing this
- a tool, not an end in itself
Use program logic in evaluation to
- surface assumptions, test logic
- clarify intermediate outcomes
- frame evaluation questions
- design systematic data collection
- assess progress along the causal chain
- performance story / evaluative arguments
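As a sketch of what framing evaluation questions, designing data collection and assessing progress along the causal chain can look like in practice, the snippet below represents a program logic as an ordered chain of steps, each with an evaluation question and a data source, and reports where the evidence currently stops. The steps, questions and sources are invented for illustration and are not taken from the Community Builders framework.

    # Hypothetical program-logic chain; steps, questions and data sources
    # are illustrative, not drawn from the actual CS evaluation framework.
    from dataclasses import dataclass

    @dataclass
    class LogicStep:
        outcome: str          # what this link in the chain claims will happen
        question: str         # evaluation question framed from it
        data_source: str      # where evidence would come from
        evidence_found: bool  # has monitoring or evaluation confirmed it yet?

    chain = [
        LogicStep("Activities delivered as planned",
                  "Were activities delivered to the intended reach?",
                  "service monitoring data", True),
        LogicStep("Residents participate",
                  "Who participated, and how often?",
                  "attendance records", True),
        LogicStep("Participants build networks",
                  "Do participants report new connections?",
                  "participant survey", False),
        LogicStep("Community connectedness increases",
                  "Has connectedness changed, and for whom?",
                  "repeat community survey", False),
    ]

    # Assess progress along the causal chain: the first link without evidence
    # is where the performance story currently stops.
    for step in chain:
        status = "evidence" if step.evidence_found else "no evidence yet"
        print(f"{step.outcome:40s} [{status}] <- {step.data_source}")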
Many terms for program logic
Program logics look different
- style and direction
- scale and detail
- purpose – planning or evaluation
- focus – implementation or impact
- intended or actual
Use of outcome hierarchy
(Higher levels are more important but less within the program's control; external factors also influence outcomes.)
- Policy / longer-term outcomes: what the program is contributing to
- Intermediate outcomes: what differences are made – what impact, for whom, how much
- Immediate outcomes: what direct changes result – for whom, how much, what quality
- Service delivery (activities and outputs): how much was done, how well was it done
- Resources (inputs)
Issues for Community Builders
- focus of analysis – multiple and overlapping communities
- outcomes difficult to measure: respect, trust, participation, connectedness, resourcefulness
- benefits expected for participants, communities?
- different stakeholder needs
- implications for framework, strategy, methods
What makes good evaluation?
1. Clear purpose
2. Appropriate methods
3. Comprehensive monitoring data
4. Collaboration, ownership and involvement of stakeholders
References
- Davidson, E. Jane. Evaluation Methodology Basics. Thousand Oaks, CA: Sage, 2005.
- Donaldson, Stewart I. Program Theory-Driven Evaluation Science: Strategies and Applications. Mahwah, NJ: Erlbaum, 2007.
- Funnell, Sue and Patricia Rogers. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. Jossey-Bass, 2011.
- Owen, John M. Program Evaluation: Forms and Approaches, third edition. Thousand Oaks, CA: Sage.
- Milne, C. Issues in Public Policy Evaluation. For the Office of Environment and Heritage, 2011.
- Patton, Michael Quinn. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford, 2010.
- Patton, Michael Quinn. Utilization-Focused Evaluation. Sage, 2008.
- Patton, Michael Quinn. Qualitative Research and Evaluation Methods, 3rd edition. Sage, 2002.
- Pawson, R. and N. Tilley. Realistic Evaluation. London: Sage, 1997.
- Shaw, Ian, Jennifer Greene and Melvin Mark (eds). The SAGE Handbook of Evaluation: Policies, Programs and Practices. Thousand Oaks, CA: Sage, 2006.
- See also the UK government's evaluation guidelines (the Magenta Book): http://www.hm-treasury.gov.uk/data_magentabook_index.htm
Contact details
Sue Leahy, Principal Consultant
sue.leahy@artd.com.au
artd.com.au