
1 Evaluation - Nicola Bowtell

2 Some myths about evaluation
Evaluation:
- always involves extensive questionnaires
- means employing expensive consultants
- is about counting everything
- should be tagged onto the end of a project
- always has to involve a control group

3 What is evaluation?
"Evaluation is attributing value to an intervention by gathering reliable and valid information about it in a systematic way, and by making comparisons, for the purpose of making more informed decisions or understanding causal mechanisms or general principles." (Ovretveit, J., 1998)
"Evaluation is concerned with assessing an activity against values and goals in such a way that results can contribute to future decision making and/or policy." (Tones & Tilford, 1994)

4 More definitions…
- Judgement based on careful assessment and critical appraisal (WHO, 1981)
- Making a judgement about the value of something by looking critically (Ewles and Simnett, 1995)
- Concerned with making judgements about the value of activities (Downie, Tannahill and Tannahill, 1996)
- Identifying and ranking criteria (values and aims) and gathering information which makes it possible to assess the extent to which criteria are being met (Peberdy, 1997)

5 Monitoring, Evaluation, Research..?
Monitoring: the process of appraising and assessing work activities through performance monitoring. WHAT? How are we doing so far? Have we done what we said we'd do?
Evaluation: a formal and systematic activity where assessment is linked to original intentions / outcomes and is fed back into the planning process. SO WHAT? What difference have we made?
Research: a formal and systematic activity where the intervention, method and context of activity are constructed by the researchers.

6 Why bother with Evaluation?
Evaluation and evidence of outcomes are essential for sustainability and learning. If we don't evaluate:
- projects won't know whether they are making a difference or not
- projects won't know how to improve their work
- projects may become ineffective or irrelevant
- projects may lose funding if they can't show results
Evaluation also serves:
- to inform practice and contribute to the evidence base
- to optimise use of resources
- to ensure ethical practice

7 What can we evaluate in public health..?
- services / projects
- screening programmes
- surveillance systems
- policy changes
- media campaigns
- outbreak investigations
- communications, e.g. newsletters, websites
- IT systems
- a training course
… pretty much anything!

8 Questions evaluation can help answer
- What impact is the service having on the community?
- Have service users benefited? Are we meeting their needs?
- Are we reaching the right people? Who are we not reaching?
- Do services provide good value for money?
- What do people think of services?
- How well are we working with partners?

9 Quantitative data
- How many? Where? In what proportion?
- Breadth
- 'hard' data
Acquired from:
- existing sources, e.g. demographic data, service records
- new data collection tools, e.g. surveys, assessment tools

10 Qualitative data
- Views, experiences and opinions
- Why? How? What for?
- Depth
- 'soft' data
Acquired by conversation, observation or written accounts from stakeholders, e.g.:
- service users
- service staff or volunteers
- other agencies
- funders

11 To be useful, outcomes must…
- be in line with national, local and programme priorities
- make sense to service users
- be measurable
- lie within the scope of the service

12 Example Outcomes
People and communities…
- have improved mental well-being, e.g. looked-after children have improved social networks
- are more physically active, e.g. older adults increase their weekly participation in moderate physical activity
- eat more healthily, e.g. young parents are able to prepare healthy meals

13 Steps towards a robust evaluation design...
Types of evaluation:
- Formative: during a project's development stage
- Process: implementation and delivery
- Impact/Outcome: changes for individuals / communities

14 Process evaluation
Development → Intervention → Outcomes
Aims to provide an explanation of how or why intended outcomes of the project were (or were not) brought about.

15 Benefits and costs of process evaluation
Benefits:
- can show if you've done what you said you'd do
- what worked well, and what didn't
- improve the scheme by using the learning
- demonstrate progress to stakeholders
- can inform practice

16 Outcome evaluation
Focuses on the various impacts of a project over time.

17 Outcome evaluation: challenges
Size of effect: gains may be modest because of:
- the size of the target population
- the nature of outcome measures and endpoints
- the scope of the intervention
- the timescale
Attributing outcomes to the intervention.

18 Study designs
- Post-intervention
- Pre-post (before and after)
- Quasi-experimental
- Experimental (e.g. RCT)
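To illustrate the second design above, here is a minimal sketch (not part of the original slides) of a pre-post comparison in Python. The scores, participant count and "well-being" measure are all hypothetical; a real evaluation would also test statistical significance (e.g. a paired t-test) and consider a comparison group to strengthen attribution.

```python
# Pre-post (before-and-after) comparison on paired scores for the
# same participants, measured before and after an intervention.

def mean(values):
    """Arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

# Hypothetical well-being scores (1-10) for ten participants.
pre  = [4, 5, 3, 6, 4, 5, 4, 3, 5, 4]
post = [6, 6, 5, 7, 5, 6, 6, 4, 6, 5]

# Per-participant change: pairing controls for between-person variation.
diffs = [after - before for before, after in zip(pre, post)]

print(f"Mean score before: {mean(pre):.1f}")
print(f"Mean score after:  {mean(post):.1f}")
print(f"Mean change:       {mean(diffs):+.1f}")
```

The mean change summarises the observed shift, but on its own it cannot show that the intervention caused it, which is the attribution challenge discussed on slide 17.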

19 Quantitative methods
Collect numerical data:
- surveys
- existing quantitative data sources, e.g. HES (Hospital Episode Statistics)
- mathematical modelling

20 Qualitative methods
- Interviewing: structured / semi-structured; individual or group (focus group)
- Observation: participant observer
- Documentary and textual analysis: official documents, historical records, newspapers
- Case studies

