
1 Hints and tips for evaluating programmes and providing evidence for long-term sustainability
A Practitioner's Perspective
Mark Sanderson, Suffolk Fire & Rescue Service

2 Future challenges:
- CSR = more from less
- Efficiency agenda
- CAA = outcome focus for delivery = UoR
- LAA = more focus on delivery through partnerships
- Service improvement = public value
- Duty to involve

3 Over the last few years there has been a growing awareness that evaluation should attempt to determine how and why measures and activities had an impact. Is it still enough only to say that they did have an impact? Is it sufficient only to count the number of interventions delivered?

4 Understanding the value of fire prevention activities is at the heart of being able to sustain and justify investment. Evaluation is important to support continuous improvement and innovation.

5 What is it?
- A strategy to determine the value and effect of something
- An assessment of organisational activities, aiming to understand how and why outputs and outcomes have been achieved
But also:
- A process for understanding and learning from experience
- Support for organisational improvement and innovation

6 Inputs: the resources that contribute to production and delivery (labour, ICT, physical and financial assets)
Outputs: the final products or goods and services produced and delivered to citizens (e.g. number of interventions)
Outcomes: the impacts or consequences for the community; what we are trying to achieve, i.e. reduction in deaths and injuries, longer life expectancy etc.
Evaluation → Knowledge

7 It's about asking questions:
- What are the likely results?
- Have we met the original objectives?
- Have we achieved the outcomes?
- Could we achieve these in another way?
- What would have happened without the intervention?
- What are the likely costs versus benefits? (a rough sketch follows below)
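As a minimal illustration of the last question, the sketch below works through a simple benefit-cost calculation. Every figure is a hypothetical assumption for illustration only, not data from the presentation.

```python
# Hypothetical benefit-cost sketch for a prevention programme.
# Every figure below is an illustrative assumption, not data from the presentation.

programme_cost = 100_000          # total spend on the intervention (GBP)
incidents_avoided = 12            # estimated reduction attributed to the programme
cost_per_incident = 15_000        # estimated cost to society of one incident (GBP)

benefit = incidents_avoided * cost_per_incident
benefit_cost_ratio = benefit / programme_cost   # a ratio above 1 suggests value for money

print(f"Estimated benefit: £{benefit:,}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```

The point of the arithmetic is simply that the answer depends on what is assumed about incidents avoided and their cost to society, which is why those assumptions need to be evidenced.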

8 Where are we?
- Good practice evident, along with opportunities for development
- Diverse approaches
- Much interest in evaluation at present
- Research and development
- A backward-looking approach is favoured
- Recording systems and information needs vary, as does delivery

9 But:
- Evaluation should not be another 'add-on' layer of bureaucracy
- It should be integral to the development of programmes and activities, and
- It should be embedded within general management systems, business and activity planning, performance management and feedback systems.

10 The 'ROAMEF' planning cycle (Rationale, Objectives, Appraisal, Monitoring, Evaluation, Feedback)

11–13 [Image-only slides; no transcript available]

14 Evaluation techniques
It is clear there are many evaluation techniques, because:
- Circumstances and contexts differ
- The time available to undertake the analysis varies
- The amount or nature of data may vary
- The complexity of programmes varies
- Analytical skills vary
- The culture and resources of organisations vary
- External requirements vary.

15 Broad types of evaluation
1. Qualitative – the outcomes achieved, i.e. reduction in fires; have behaviours been changed or modified?
2. Quantitative – the inputs and outputs related to the delivery of a programme
3. Process – what is done by a programme of activity and how effective it was. Is the policy correct? Measurement of what is done and for whom services are provided. Is the programme meeting the needs of citizens, and did it reach the right group of people?

16 When to do it?
1. In advance of the programme – 'prospective' evaluation, looking forward to what may be achieved and the resources, costs and benefits predicted
2. Throughout – 'interim' or 'formative' evaluation, linked to milestones to assess whether an activity has achieved its objectives by the dates set and is on track to achieve them
3. At the end of a programme – 'final', 'summative' or 'retrospective' evaluation.

17 Programmes and interventions may be prioritised by:
- Most significant spend
- Most important to the achievement of objectives
- Where it is vital to obtain evidence that expansion of a programme is likely to have the desired impact
- Those that provide the best evidence of overall impact for spending review discussions.
There is little value in devoting scarce resources to evaluating legacy arrangements where there is little opportunity to influence them!

18 Evaluation prioritisation matrix (horizontal axis: retrospective vs prospective; vertical axis: high spend/mission critical vs low spend/not mission critical)
- High spend, mission critical / retrospective: legacy programmes – high value, high impact
- High spend, mission critical / prospective: new programmes – high value, high impact
- Low spend, not mission critical / prospective: planned programmes – low value, low impact
- Low spend, not mission critical / retrospective: legacy programmes – low value, low impact

19 Common pitfalls
- Not understanding past performance against which to measure success
- Not setting a baseline, or not understanding future performance against which to record success, before the start of a project (see the sketch after this list)
- Little thought given to the recording of activities and information required for effective evaluation
- Not being integrated within the business planning cycle or culture
- Sole reliance on national data as opposed to feedback from local activities
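A minimal sketch of the baseline point above: record performance for the years before the programme starts, then express post-intervention performance as a change against that baseline. The years and incident counts below are invented purely for illustration.

```python
# Hypothetical baseline comparison: incidents per year before and after a programme.
# Years and counts are invented purely for illustration.

baseline_years = {"2006": 148, "2007": 152, "2008": 145}   # recorded before the programme
post_years = {"2009": 131, "2010": 124}                    # recorded after delivery began

baseline_avg = sum(baseline_years.values()) / len(baseline_years)
post_avg = sum(post_years.values()) / len(post_years)
change_pct = (post_avg - baseline_avg) / baseline_avg * 100

print(f"Baseline average: {baseline_avg:.1f} incidents per year")
print(f"Post-programme average: {post_avg:.1f} incidents per year")
print(f"Change against baseline: {change_pct:+.1f}%")
```

Without the pre-programme figures, there is nothing to compare the post-programme figures against, which is exactly the pitfall the slide describes.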

20 Evaluation principles
- Be clear about what is to be evaluated
- Identify the groups of people who will benefit from the intervention/programme
- Identify other service providers working with the same group, and liaise with them
- Establish indicators for success
- Establish data collection and feedback methods
- Carry out the interventions
- Collect data as you go and liaise with partners
- Interpret and report.

21 Logic model – a tool for effective evaluation
- A logic model describes the main elements of an intervention and how they work together to reduce risk in a specific population
- It is displayed in a flow chart, map or table to show the steps leading to intervention outcomes
- The elements connected within a logic model vary, but generally include inputs, activities, outputs, immediate and intermediate outcomes, and long-term impacts

22 Logic model components (component: definition / example)
- Problem: factors which put a population at risk, i.e. attitudes, behaviours, environmental factors. Example: young drivers are more at risk of injury and fatality on the roads.
- Inputs: resources – staff, money, materials. Example: £100K for salaries (3 FTE); 3 vehicles.
- Activities: services the intervention provides to accomplish the objectives. Example: conduct 500 road safety sessions.
- Outputs: direct products of the intervention. Example: 500 interventions completed; 5,000 young people completed all sessions.
- Immediate outcomes: immediate results, such as a change in knowledge. Example: perception of road risk improved; driving skills improved.
- Intermediate outcomes: intervention results expected in the medium term. Example: % reduction in injuries and fatalities relating to young people; % reduction in offending rates.
- Impact: results expected in the longer term. Example: % reduction in injuries and fatalities; % reduction in offending and re-offending rates; achievement of national targets; reduction in cost to society.
- Evaluation methods: outline of the evaluation methods to be employed. Example: Realistic Evaluation method; small-scale pilot to test effectiveness; questionnaires (before and after); cost-benefit realisation; partners' performance information.
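One way to make the logic model concrete is to hold its components in a small data structure. The sketch below restates the road-safety example from this slide; the class and field names are my own invention, not part of the presentation or any standard library.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One field per component of the logic model table above (names are my own)."""
    problem: str
    inputs: list[str]
    activities: list[str]
    outputs: list[str]
    immediate_outcomes: list[str]
    intermediate_outcomes: list[str]
    impact: list[str]
    evaluation_methods: list[str]

# The road-safety example from the slide, restated as data.
road_safety = LogicModel(
    problem="Young drivers are more at risk of injury and fatality on the roads",
    inputs=["£100K for salaries (3 FTE)", "3 vehicles"],
    activities=["Conduct 500 road safety sessions"],
    outputs=["500 interventions completed",
             "5,000 young people completed all sessions"],
    immediate_outcomes=["Perception of road risk improved", "Driving skills improved"],
    intermediate_outcomes=["% reduction in injuries and fatalities among young people",
                           "% reduction in offending rates"],
    impact=["% reduction in injuries, fatalities, offending and re-offending",
            "Achievement of national targets", "Reduction in cost to society"],
    evaluation_methods=["Realistic Evaluation method", "Small-scale pilot",
                        "Before/after questionnaires", "Cost-benefit realisation",
                        "Partners' performance information"],
)
```

Writing the model down in this way forces each component to be stated explicitly before delivery starts, which is what makes the later evaluation possible.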

23 Challenges
- Difficulty attributing observed changes to the work of the service alone – the impact of partners' work, etc.
- Low levels of effect can make detection of change difficult – use of detailed modelling (a simple illustrative check follows below)
- Context dependency – interventions are applied in varying localities and involve different people
- Practitioners want quick decisions about whether a measure has been effective, while policy makers stress the need to do things properly, which takes time
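On the "low levels of effect" point: the slide refers to detailed modelling, but even a simple significance test on before/after incident counts gives a first indication of whether a small observed drop could just be chance. The sketch below uses SciPy's chi-square test on a 2x2 table; the household and incident figures are hypothetical.

```python
# Hypothetical significance check on before/after incident counts.
# Household and incident figures are invented; requires scipy.
from scipy.stats import chi2_contingency

# Rows: before / after the programme; columns: households with an incident / without one.
observed = [
    [180, 49_820],   # before: 180 incidents among 50,000 households
    [160, 49_840],   # after:  160 incidents among 50,000 households
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"p-value: {p_value:.3f}")   # a large p-value means the drop could easily be chance
```

With effects this small relative to the population, the test will often be inconclusive, which is exactly why the slide points towards more detailed modelling and longer observation periods.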

24 Benefits of effective evaluation
- Identifies what worked well and what did not (outputs and outcomes) and how it was achieved (processes)
- Identifies where public value may be added
- Increases predictability in the delivery of quality initiatives
- Better steers the planning, management and implementation of activities towards desired goals
- Strengthens information sharing and the mainstreaming of good practice

25 Benefits of effective evaluation
- Builds on successes and avoids repeating mistakes – learning
- Helps justify or protect investments or disinvestments to Councillors
- Minimises the risk of failure
- Reduces the risk of optimism bias
- Supports evidence-based policy

26 Hints and tips for evaluating programmes and providing evidence for long-term sustainability
A Practitioner's Perspective
Mark Sanderson, Suffolk Fire & Rescue Service

