
Evaluation & measuring impact


1 Evaluation & measuring impact

2 Why zoo education is important
“Long-term conservation success will be linked to how zoos and aquariums engage with their visitors and change behaviour.” The WAZA 2015 strategy, Committing to Conservation, makes the case for zoos to engage visitors, and particularly to focus on behaviour change.

3 Why evaluate?

4 Why do we evaluate? To find out what is working?
To find out what can be improved? To ensure you are meeting your learning aims? To ensure your participants’ needs are met? To measure if you’re meeting your targets? Because it’s ‘best practice’? To create evidence? If so, what sort? To tell people how good the programme is? To monitor how things are going? DISCUSSION Get the group to quickly discuss in 2s or 3s why they think we evaluate. Each group shares two reasons, trying not to repeat each other. Following this, go through the list one at a time, making sure each makes sense and asking if this would be a reason for them. Keep note of anything they suggest that is not on this list, for future presentations.

5 Who is it for? Why are they interested?
Funders Yourselves Senior management Visitor audiences Scientific publication Conservation organisations DISCUSSION Get the groups to quickly discuss in 2s or 3s who they would share their evaluation with and why those audiences are interested. Each group feeds back two audiences, trying not to repeat each other. Following this, go through the list on the slide, asking the group ‘why are they interested?’ for each audience type. Once you have decided who the evaluation is for, you can then think about what kind of reporting is needed; this will feed into your decision about what data to collect and how. Keep note of anything they suggest that is not on this list, for future presentations.

6 Who is it for? Why are they interested?
What kind of reporting is needed? How much proof do you need? What sort of data will you need to collect?

7 Theory of Change Now that you’ve thought about why you’re evaluating and who it is for, you can start to think about what measures you will use and how you will implement them. Planning your evaluation should start with the theory of change that you have produced, to make sure that you’re actually measuring whether you’re achieving what you set out to achieve. You should try to capture information related to each of the sections. Who – keep a record of the audiences that you engage. How – keep a record of the drivers supporting you to deliver your activities (this can be the difference between why something did or didn’t work well). What – keep a record of the activities that you deliver. Why – this is the most important part, as it measures your impact: did you make a difference through your activities? You should collect information before and after, and always question what else might have made the difference.

8 What to measure Messages or objectives are what you want to say – you can measure whether these are delivered and/or whether they are received. Outcomes are what your learner goes away with – you can only measure these by interacting with or observing learners directly.

9 What to measure Numbers of sessions delivered or people participating – at what level of detail (age, gender)? Quality of presentations, presenters, activities, resources? Return on investment – what is the cost of each activity, and is it value for money?

10 When and why evaluate Programme stage | Evaluation type | Question asked
Before programme begins | Needs assessment | To what extent is the need being met? What can be done to address this need? New programme | Process/implementation evaluation | Is the programme operating as planned? You also need to consider when to evaluate, and the reason why. Evaluation before you begin will help you to confirm whether there is a need for what you have planned. Evaluation at the start of your delivery will help you to check whether what you are doing is working – you could consider doing this as a pilot if you want to be sure before delivering to your full audience. Evaluation of an established programme will assess whether you’re meeting your objectives – to assess this accurately you should consider pre and post evaluation so that you can compare. Long-term evaluation will assess the impact of your activities. Formative assessment is a method of continually evaluating needs and using the feedback as input into programme design, review and implementation. Summative assessment (or summative evaluation) refers to the assessment of participants where the focus is on the outcome of a programme, summarising participants’ development at a particular point in time.

11 When and why evaluate Programme stage | Evaluation type | Question asked
Established programme | Outcomes evaluation | Is the programme achieving its objectives? Mature programme | Impact evaluation | What predicted and unpredicted impacts is the programme having?

12 What are your constraints?
Your time: planning & design, collecting the data, analysing the data, reporting on the evaluation. Participants’ time. Access to participants – ease, sample size. Timing – when can people take part in evaluation activity? Budget for materials, staff time or software. Motivation of participants. Availability of resource. DISCUSSION Before you rush ahead and try to gather as much evidence as possible, make sure you also consider your constraints. How much can you manage? Do you really need to collect that type of data AND that type of data, or will one of them do? You need to do as much evaluation as you need, but not so much that you can’t manage it. You can decide when what you have planned is good enough, as you know what you want to achieve. Things to consider include… go through the list.

13 Methods of evaluation

14 Types of evidence Photographic Video
Sound Questionnaires Observations – structured or semi-structured Peer reviews Q&As Drawings Notes from activities Numerical data Written work Interviews DISCUSSION Group discussion prior to running through the list

15 Social Research: Quantitative
Seeks to measure the prevalence of something, e.g. how many people hold a certain view

16 Social Research: Qualitative Seeks to understand why
E.g. ‘I decided to do that because…’ It doesn’t matter how many people think a certain thing, and one reason or opinion is not valued more than another. Why do people think differently about the same thing?

17 Qualitative versus Quantitative
Qualitative evaluation helps you to find out the reasons behind the answers that you’re getting and to gain a more in-depth understanding

18 Qualitative versus Quantitative
Qualitative methods: focus groups, essays, video diaries, portfolios, student presentations, drawings, activities, open questions and interviews. Evaluation methods are split into quantitative and qualitative. Quantitative data are measures of values or counts and are expressed as numbers – data about numeric variables (e.g. how many, how much, or how often). Quantitative research is used to quantify the problem by generating numerical data, or data that can be transformed into usable statistics; it is used to quantify attitudes, opinions, behaviours and other defined variables, and to generalise results from a larger sample to a population. Qualitative research is primarily exploratory research. It is used to gain an understanding of underlying reasons, opinions and motivations. It provides insights into the problem or helps to develop ideas or hypotheses for potential quantitative research, and it is also used to uncover trends in thought and opinion and to dive deeper into the problem. Qualitative data collection methods vary, using unstructured or semi-structured techniques; common methods include focus groups (group discussions), individual interviews and participant observation. The sample size is typically small, and respondents are selected to fulfil a given quota.

19 Qualitative versus Quantitative
Quantitative methods: collecting numbers and data; questionnaires and quizzes (with closed answers); categorised information – which could come from qualitative methods (e.g. drawings); interactive activities, e.g. quiz-style questions; simple choices.

20 Qualitative versus Quantitative
Qualitative – how and why. Tells a story – quotes and images. Gives context – can tell you why something happens. Harder to interpret and categorise. Can be more time consuming. Not representative and hard to replicate. Gives a deeper understanding of a smaller number of subjects or activities. Requires skill to administer and interpret – e.g. interviewing skills.

21 Qualitative versus Quantitative
Quantitative – how many, how much, how long. Quicker to analyse and relatively easy. Mass data – large numbers of subjects or variables. Can have limited depth if detailed analysis of variables is not undertaken. With the right sample size you can generalise to whole populations.

22 Gathering your data

23 How to gather evidence There are two main approaches:
Observing people (in a structured way) Talking to people or asking them to give their opinion (in a structured or unstructured way)

24 Use observation when… You want to understand physical behaviour in a geographic space (asking people is not valid; you need to observe). You want to understand real-time reactions to stimuli (it is often hard to articulate how you thought and felt about something after the fact). You want to look at the things that might predict visitors’ behaviour.

25 Observation Benefits Easy, quick and repeatable method
Quantitative data – good for analysis. Allows you to see what visitors are actually doing, rather than what they tell you! Minimal bias.

26 Observation Problems Does not tell you what somebody is thinking! (unless you are ‘observing’ someone’s conversation)

27 Examples: Structured observation: this can be any type of measure of behaviour. What do people stop at? How long for? What do they do when they stop? What are they saying? Who to? Essentially this is animal behaviour research. Almost always quantitative.

28 Talk to people when… You want to find out what visitors think and feel about things. You want to understand change in some kind of cognitive variable (knowledge, attitudes, etc.). You want in-depth (probably qualitative) data.

29 Talking to people Benefits
Is really the only way to measure what people know, think or feel. Data collection can be quick, depending on the method. Quantitative data (rating scales, for example) can be easy to input and analyse.
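As a sketch of how quickly rating-scale data can be summarised, the snippet below takes a set of 1–5 ratings and reports the average and the spread across the scale. The question wording and the numbers are invented for illustration, not real survey data:

```python
from collections import Counter
from statistics import mean

# Hypothetical 1-5 ratings for "How much did you enjoy today's session?"
ratings = [5, 4, 4, 5, 3, 4, 5, 2, 4, 5, 3, 4]

average = round(mean(ratings), 2)                # central tendency
distribution = sorted(Counter(ratings).items())  # count per scale point

print(average)       # 4.0
print(distribution)  # [(2, 1), (3, 2), (4, 5), (5, 4)]
```

Reporting the distribution alongside the mean matters: an average of 4.0 could hide a split audience of very high and very low ratings.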

30 Talking to people Problems
Poorly designed surveys are worse than no survey at all! You need to know what you are doing. Qualitative data is probably the most difficult and time consuming to analyse – it is not the easy option!

31 Examples: Talking to people (or people talking to you): Many methods
Surveys (could be online) Interviews Unstructured feedback (comments board) Focus groups Drawings Data can be quantitative and qualitative.

32 Social media use (2015) Information doesn’t have to be gathered face to face. Social media may be a very powerful tool for collecting data.

33 Example from Chester Zoo

34 Example questionnaire
Here are some examples of questionnaires that we completed with teachers and students, which included both multiple-choice (quantitative) and open-ended (qualitative) questions. Some of the questions were different for students and teachers, to make sure the language and the question were appropriate. We have been able to use this evaluation to show Zoo Trustees and funders the impact of this area of our work. This resulted in greater financial support, and we’re now able to do more of this work.

35 Analysis of drawings (related to lesson content)
Qualitative evaluation such as drawings can also be turned into quantitative data through coding. This can then be used to show whether there is a significant difference between the pre-test and the post-test. In this example we observed more native species and relevant conservation actions in the post-test drawings than in the pre-test drawings – a significant difference (p<0.001). The images on screen are the pre- and post-test drawings of the same child.

36 Pupil questionnaires results
Detailed evaluation of the POW project shows that significant learning was achieved with the project-based approach. It also showed greater shifts in attitude and greater learning outcomes from pupils who took part in a zoo visit as part of their programme (8 of the 14 schools). However, this is primarily due to the lower starting point of those schools, which were all in more deprived areas. Our working theory is that targeting areas of deprivation with a focused, project-based programme not only develops audiences but also achieves greater conservation impact. Newspaper readership and the sources of information used in those households are less likely to include environmental content, and those households are less likely to engage with conservation organisations such as the RSPB or the National Trust. This is a good opportunity for us, as we have higher than average numbers of visitors from lower socio-economic groups.

37 Over to you

38 How would you evaluate these things …
Driving skills Knowledge of how to make an omelette How much a teacher inspires their pupils Singing ability Number of people who watch soap operas

39 Tool trial ACTIVITY
Provide examples of different evaluation tools and allow the group time to move around and discuss how/when/if they might use them. Card sort – review what you’re doing already, whether it meets the objectives that you have now set, and whether it will be easy or difficult to achieve (an organisational review, not an evaluation of impact). Peer review – provide feedback to the delivery team for ongoing improvements. Observational evaluation – observing people and recording their behaviour using codes. Where do you stand – with a group, using questions and physical responses (an example of activity-based feedback). Drawings – drawing an answer to a question; the image is then coded for analysis. Questionnaires – we’ll discuss these later. Everyone should think of one activity/resource they are currently using and how they would evaluate it. Your activity?

40 Are you measuring the right thing?

41 Tool rules – being robust
Use your Theory of Change to plan what you will measure. Collect information before and after. Account for what else made the difference. Be careful with the language, order and nature of questions. Use representative samples. What matters to many (everyone?) is that the evidence is robust (and later on we talk about engaging). Theory of Change. Before-and-after information. Adjust for context – what would or could have happened anyway? (deadweight and displacement) – and for contribution – how much was due to others? (attribution). Avoid leading questions. Consider the order. Ask open or closed questions. Disguise the scale and calibrate afterwards. Make sure samples are representative.

42 Validity (internal) Are you measuring what you think you are measuring? And are you using the right tool to do the measuring?

43 Reliability Are you measuring the same thing each time – consistency?

44 Sampling and generalisability
Random (probability) sampling: each member of the population has an equal chance of being selected for the sample
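A minimal sketch of drawing such a sample, assuming you have a sampling frame (a list with one entry per member of the population – the visitor list and sample size here are invented for illustration):

```python
import random

# Hypothetical sampling frame: one entry per visitor on a given day.
population = [f"visitor_{i}" for i in range(1, 501)]

random.seed(42)  # fixed seed so the draw can be reproduced
sample = random.sample(population, k=50)  # each visitor equally likely; no repeats

print(len(sample), len(set(sample)))  # 50 distinct visitors
```

In practice the hard part is the frame itself: if some visitors never make it onto the list (e.g. online-only bookers), no amount of random selection will make the sample representative.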

45 Survey design

46 Understanding Bias Response Bias:
Where a survey question is constructed in such a way that one answer is much more likely than another, e.g. ‘Don’t you agree that zoo animals are amazing?’

47 Common types of Response Bias
Demand characteristics: in surveys, this is where the question itself ‘cues’ a participant in to what they think the study is about, or what the researcher wants to hear. E.g. ‘Do you think that the zoo is an important conservation organisation?’

48 The problem of self-report
All surveys are essentially self-report (people are responding about themselves), but some questions are not really suitable: Any question that relies on people stating an intention to do something, e.g. “as a result of your visit, will you now recycle at home?” (note: other biases too). Any question that relies on the person understanding a difficult concept in order to answer accurately, e.g. “do you think you have learnt a lot today?” (what does the person understand by ‘learnt’, and how do they judge what ‘a lot’ is? – this might be different for everyone).

49 Being careful with using proxy measures
For example: asking a teacher to comment on the learning outcomes of their students, or asking an animal professional to comment on the behaviour of an animal. What is actually being measured?

50 Closed/Open questions

51 Rating scales

52 Other things to look out for
Avoid asking two questions in one – e.g. “This experience increased my interest in learning more about dolphins and the oceans.” (What if I think it increased my interest in dolphins but not the oceans?)

53 Other things to look out for
Avoid ambiguous questions – e.g. “Do you regularly visit the zoo?” (What does ‘regularly’ mean?)

54 Other things to remember
You must have questions that will describe your sample, otherwise you will not be able to demonstrate how representative your survey responses are. For example: age, gender, education level, ethnicity, postcode.

55 Activity: In pairs, write an example survey question. Review your example questions with each other and give feedback. ACTIVITY

56 Next steps…..

57 Evaluation for your activities…
Who is it for? Why are you doing it? What constraints do you have? What do you want to know? What methods might be most appropriate?

58 Any questions? We’re happy to help you after this training too.

