Evaluation & measuring impact


Why zoo education is important
“Long-term conservation success will be linked to how zoos and aquariums engage with their visitors and change behaviour.”
The WAZA 2015 strategy, Committing to Conservation, makes the case for zoos to engage visitors, and particularly to focus on behaviour change.
http://www.waza.org/en/site/conservation/conservation-strategies

Why evaluate?

Why do we evaluate?
To find out what is working
To find out what can be improved
To ensure you are meeting your learning aims
To ensure your participants’ needs are met
To measure if you’re meeting your targets
Because it’s ‘best practice’
To create evidence – if so, what sort?
To tell people how good the programme is
To monitor how things are going
DISCUSSION: Ask the group to quickly discuss in 2s or 3s why they think we evaluate. Each group shares two reasons, trying not to repeat each other. Then go through the list one at a time, making sure each makes sense and asking whether it would be a reason for them. Keep note of anything they suggest that is not on this list for future presentations.

Who is it for? Why are they interested?
Funders
Yourselves
Senior management
Visitor audiences
Scientific publication
Conservation organisations
DISCUSSION: Ask the groups to quickly discuss in 2s or 3s who they would share their evaluation with and why those audiences are interested. Each group feeds back two audiences, trying not to repeat each other. Then go through the list on the slide, asking the group ‘why are they interested?’ for each audience type. Once you have decided who the evaluation is for, you can think about what kind of reporting is needed, and this will feed into your decision of what data to collect and how. Keep note of anything they suggest that is not on this list for future presentations.

Who is it for? Why are they interested? What kind of reporting is needed? How much proof do you need? What sort of data will you need to collect?

Theory of Change
Now that you’ve thought about why you’re evaluating and who it is for, you can start to think about what measures you will use and how you will implement them. Planning your evaluation should start with the theory of change that you have produced, to make sure that you’re actually measuring whether you’re achieving what you set out to achieve. Try to capture information related to each of the sections:
Who – keep a record of the audiences that you engage.
How – keep a record of the drivers supporting you to deliver your activities (this can be the difference between why something did or didn’t work well).
What – keep a record of the activities that you deliver.
Why – this is the most important part, as it measures your impact: did you make a difference through your activities? Collect information before and after, and always question what else might have made the difference.

What to measure
Messages or objectives are what you want to say – you can measure whether these are delivered and/or whether they are received.
Outcomes are what your learner goes away with – you can only measure these by interacting with or observing learners directly.

What to measure
Numbers of sessions delivered or people participating – at what level of detail (age, gender)?
Quality of presentations, presenters, activities, resources?
Return on investment – what is the cost of each activity, and is it value for money?

When and why evaluate
Before programme begins – Needs assessment – To what extent is the need being met? What can be done to address this need?
New programme – Process/implementation evaluation – Is the programme operating as planned?
You also need to consider when to evaluate and why. Evaluation before you begin will help you confirm whether there is a need for what you have planned. Evaluation at the start of your delivery will help you check whether what you are doing is working – you could do this as a pilot if you want to be sure before delivering to your full audience. Evaluation of an established programme will assess whether you’re meeting your objectives – to assess this accurately, consider pre- and post-evaluation so that you can compare. Long-term evaluation will assess the impact of your activities. Formative assessment is a method of continually evaluating needs and using the feedback to inform programme design, review and implementation. Summative assessment (or summative evaluation) is the assessment of participants that focuses on the outcome of a programme, summarising participants’ development at a particular point in time.

When and why evaluate
Established programme – Outcomes evaluation – Is the programme achieving its objectives?
Mature programme – Impact evaluation – What predicted and unpredicted impacts is the programme having?

What are your constraints?
Your time: planning and design, collecting the data, analysing the data, reporting on the evaluation
Participants’ time
Access to participants – ease, sample size
Timing – when can people take part in evaluation activity?
Budget for materials, staff time or software
Motivation of participants
Availability of resources
DISCUSSION: Before you rush ahead and try to gather as much evidence as possible, make sure you also consider your constraints. How much can you manage? Do you really need to collect both types of data, or will one of them do? Do as much evaluation as you need, but not so much that you can’t manage it. You can decide when what you have planned is good enough, as you know what you want to achieve. Go through the list of things to consider.

Methods of evaluation

Types of evidence
Photographic
Video
Sound
Questionnaires
Observations – structured or semi-structured
Peer reviews
Q&As
Drawings
Notes from activities
Numerical data
Written work
Interviews
DISCUSSION: Group discussion prior to running through the list.

Social research: quantitative
Seeks to measure the prevalence of something, e.g. how many people hold a certain view.

Social research: qualitative
Seeks to understand why, e.g. ‘I decided to do that because…’. It doesn’t matter how many people think a certain thing, and no one reason or opinion is valued more than another. Why do people think differently about the same thing?

Qualitative versus Quantitative
Qualitative evaluation helps you to find out the reasons behind the answers that you’re getting and to gain a more in-depth understanding.

Qualitative versus Quantitative
Qualitative methods include:
Focus groups
Essays
Video diaries
Portfolios
Student presentations
Drawings
Activities
Open questions – interviews
Evaluation methods are split into quantitative and qualitative. Quantitative data are measures of values or counts and are expressed as numbers – data about numeric variables (e.g. how many, how much, or how often). Quantitative research is used to quantify the problem by generating numerical data, or data that can be transformed into usable statistics; it quantifies attitudes, opinions, behaviours and other defined variables, and generalises results from a larger sample population. Qualitative research is primarily exploratory. It is used to gain an understanding of underlying reasons, opinions and motivations; it provides insights into the problem and helps to develop ideas or hypotheses for potential quantitative research. It is also used to uncover trends in thought and opinion, and to dive deeper into the problem. Qualitative data collection methods vary, using unstructured or semi-structured techniques; common methods include focus groups (group discussions), individual interviews, and participation/observation. The sample size is typically small, and respondents are selected to fulfil a given quota.

Qualitative versus Quantitative
Quantitative methods include:
Collecting numbers and data
Questionnaires and quizzes (with closed answers)
Categorised information – could be derived from qualitative methods (e.g. drawings)
Interactive activities – e.g. quiz-style questions
Simple choices

Qualitative versus Quantitative
Qualitative – how and why:
Tells a story – quotes and images
Gives context – can tell you why something happens
Harder to interpret and categorise
Can be more time-consuming
Not representative, hard to replicate
Deeper understanding of a smaller number of subjects or activities
Requires skill to administer and interpret – e.g. interviewing skills

Qualitative versus Quantitative
Quantitative – how many, how much, how long:
Quicker to analyse
Relatively easy
Mass data – large numbers of subjects or variables
Can have limited depth if detailed analysis of variables is not undertaken
With the right sample size, can generalise to whole populations
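The last point – that a large enough sample lets you generalise – can be made concrete with the standard margin-of-error formula for a sample proportion. A minimal sketch; the figures are illustrative, not from the presentation:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate margin of error for a sample proportion.

    p: observed proportion (e.g. 0.6 means 60% answered 'yes')
    n: sample size
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative: 60% of visitors say 'yes' in samples of 100 vs 1,000
small = margin_of_error(0.6, 100)    # about +/- 0.096 (9.6 points)
large = margin_of_error(0.6, 1000)   # about +/- 0.030 (3.0 points)
```

Quadrupling the sample size halves the margin of error, which is why quantitative surveys need fairly large samples before their percentages generalise to a whole population.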

Gathering your data

How to gather evidence
There are basically two main approaches:
Observing people (in a structured way)
Talking to people, or asking them to give their opinion (in a structured or unstructured way)

Use observation when…
You want to understand physical behaviour in a geographic space (asking people is not valid – you need to observe).
You want to understand real-time reactions to stimuli (it is often hard to articulate how you thought and felt about something after the fact).
You want to look at the things that might predict visitors’ behaviour.

Observation – benefits
Easy, quick and repeatable method
Quantitative data – good for analysis
Allows you to see what visitors are actually doing, rather than what they tell you!
Minimal bias

Observation – problems
Does not tell you what somebody is thinking! (Unless you are ‘observing’ someone’s conversation.)

Examples: structured observation
This can be any type of measure of behaviour:
What do people stop at?
How long for?
What do they do when they stop?
What are they saying? To whom?
Essentially this is animal behaviour research, and it is almost always quantitative.
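In practice, structured observation like this is recorded as coded counts. A minimal sketch; the exhibit names, behaviour codes and records below are invented for illustration:

```python
from collections import Counter

# Each record: (exhibit the visitor stopped at, behaviour code, seconds stopped)
# Hypothetical codes: R = reads sign, P = points, T = talks to companion
observations = [
    ("elephants", "R", 45),
    ("elephants", "T", 30),
    ("penguins", "P", 12),
    ("penguins", "R", 20),
    ("elephants", "R", 60),
]

stops_per_exhibit = Counter(exhibit for exhibit, _, _ in observations)
behaviour_counts = Counter(code for _, code, _ in observations)
mean_stop_seconds = sum(t for _, _, t in observations) / len(observations)
```

Because the output is plain counts and durations, data recorded this way drops straight into quantitative analysis.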

Talk to people when…
You want to find out what visitors think and feel about things.
You want to understand change in some kind of cognitive variable (knowledge, attitudes, etc.).
You want in-depth (probably qualitative) data.

Talking to people – benefits
Really the only way to measure what people know, think or feel
Data collection can be quick, depending on the method
Quantitative data (rating scales, for example) can be easy to input and analyse

Talking to people – problems
Poorly designed surveys are worse than no survey at all! You need to know what you are doing.
Qualitative data is probably the most difficult and time-consuming to analyse – it is not the easy option!

Examples: talking to people (or people talking to you)
There are many methods:
Surveys (could be online)
Interviews
Unstructured feedback (comments board)
Focus groups
Drawings
Data can be quantitative and qualitative.

Social media use (2015)
Information doesn’t have to be gathered face to face. Social media may be a very powerful tool for collecting data.

Example from Chester Zoo

Example questionnaire
Here are some examples of questionnaires that we have completed with teachers and students, which included both multiple-choice (quantitative) and open-ended (qualitative) questions. Some of the questions were different for students and teachers, to make sure the language and questions were appropriate for each. We have been able to use this evaluation to show Zoo Trustees and funders the impact of this area of our work. This resulted in greater financial support, and we’re now able to do more of this work.

Analysis of drawings (related to lesson content)
Qualitative evaluation like drawings can also be turned into quantitative data through coding. This can then be used to show whether there is a significant difference between the pre-test and the post-test. In this example we observed more native species and relevant conservation actions in the post-test drawings than in the pre-test drawings – a significant difference (p<0.001). The images on screen are the pre- and post-test drawings of the same child.
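Once drawings are coded, the pre/post comparison becomes a simple test on proportions. A sketch; the counts below are invented, and the two-proportion z-test is just one reasonable choice – the presentation does not say which test produced its p<0.001:

```python
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """z statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical coding: drawings featuring a native species,
# 12 of 50 pre-test vs 34 of 50 post-test
z = two_proportion_z(12, 50, 34, 50)
significant = abs(z) > 1.96  # roughly the 95% confidence threshold
```

A z well beyond 1.96 would justify reporting a significant pre/post difference; with small samples an exact test (e.g. Fisher's) would be the safer choice.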

Pupil questionnaire results
Detailed evaluation of the POW project shows that significant learning was achieved with the project-based approach. It also showed greater shifts in attitude and greater learning outcomes from pupils who took part in a zoo visit as part of their programme (8 of the 14 schools). However, this is primarily due to the lower starting point of those schools, which were all in more deprived areas. Our working theory is that targeting areas of deprivation with a focused project-based programme not only develops audiences but also achieves greater conservation impact. The newspapers read and other sources of information used in those households are less likely to include environmental content, and the households are less likely to engage with conservation organisations such as the RSPB or the National Trust. This is a good opportunity for us, as we have higher than average numbers of visitors from lower socio-economic groups.

Over to you

How would you evaluate these things?
Driving skills
Knowledge of how to make an omelette
How much a teacher inspires their pupils
Singing ability
Number of people who watch soap operas

Tool trial
ACTIVITY: Provide examples of different evaluation tools and allow the group time to move around and discuss how/when/if they might use them.
Card sort – review what you’re doing already, whether it meets the objectives that you have now set, and whether it will be easy or difficult to achieve (an organisational review, not an evaluation of impact)
Peer review – provide feedback to the delivery team for ongoing improvements
Observational evaluation – observing people and recording their behaviour using codes
Where do you stand – with a group, using questions and physical responses (an example of activity-based feedback)
Drawings – drawing an answer to a question; the image is then coded for analysis
Questionnaires – we’ll discuss these later
Everyone should think of one activity or resource they are currently using and how they would evaluate it. Your activity?

Are you measuring the right thing?

Tool rules – being robust
Use your Theory of Change to plan what you will measure
Collect information before and after
Account for what else made the difference
Be careful with the language, order and nature of questions
Use representative samples
What matters to many (everyone?) is that the evidence is robust (later on we talk about it being engaging). Gather before-and-after information. Adjust for context – what would or could have happened anyway? (deadweight and displacement) – and for contribution – how much was due to others? (attribution). Avoid leading questions, consider the order of questions, choose between open and closed questions, disguise the scale and calibrate afterwards, and make sure samples are representative.
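Adjusting an observed change for deadweight and attribution is simple arithmetic. The discounting convention below (multiply the raw change by the share that wouldn't have happened anyway, then by your share of the credit) is a common one rather than anything prescribed here, and the figures are invented:

```python
def adjusted_impact(pre, post, deadweight=0.0, attribution=1.0):
    """Observed change, discounted for deadweight (share of the change
    that would have happened anyway, 0-1) and scaled by attribution
    (share of the remaining change credited to you, 0-1)."""
    return (post - pre) * (1 - deadweight) * attribution

# Hypothetical: recycling rose from 40% to 70% of participants, a third
# of that change would have happened anyway, and we claim 80% of the rest
impact = adjusted_impact(0.40, 0.70, deadweight=1 / 3, attribution=0.8)
# impact is a 16-percentage-point change attributable to the programme
```

Reporting the raw change alongside the adjusted figure keeps the discounting assumptions visible to funders and reviewers.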

Validity (internal)
Are you measuring what you think you are measuring? And are you using the right tool to do the measuring?

Reliability
Are you measuring the same thing each time – consistency?

Sampling and generalisability
Random (probability) sampling: each member of the population has an equal chance of being selected for the sample.
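Drawing a simple random sample is straightforward once you have a sampling frame (a list of everyone in the population). A minimal sketch; the visitor list is invented:

```python
import random

# Hypothetical sampling frame: every visitor who booked a session
visitors = [f"visitor_{i}" for i in range(1, 501)]

random.seed(42)                       # fixed seed so the draw is repeatable
sample = random.sample(visitors, 50)  # every visitor equally likely; no repeats
```

`random.sample` selects without replacement, so each member of the frame has the same chance of appearing exactly once – the defining property of a simple random sample.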

Survey design

Understanding bias
Response bias: where a survey question is constructed in such a way that one answer is much more likely than another. E.g. ‘Don’t you agree that zoo animals are amazing?’

Common types of response bias
Demand characteristics: in surveys, this is where the question itself ‘cues’ a participant in to what they think the study is about or what the researcher wants to hear. E.g. ‘Do you think that the zoo is an important conservation organisation?’

The problem of self-report
All surveys are essentially self-report (people are responding about themselves), but some questions are not really suitable:
Any question that relies on people stating their intention to do something, e.g. “As a result of your visit, will you now recycle at home?” (note: other biases apply too)
Any question that relies on the person understanding a difficult concept in order to answer accurately, e.g. “Do you think you have learnt a lot today?” (what does the person understand by ‘learnt’, and how do they judge what ‘a lot’ is? This might be different for everyone)

Be careful when using proxy measures
For example:
Asking a teacher to comment on the learning outcomes of their students
Asking an animal professional to comment on the behaviour of an animal
What is actually being measured?

Closed/open questions
(We can talk through this one verbally.)

Rating scales
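Rating scales (for example a 1–5 Likert scale) are one of the quickest ways to turn opinions into quantitative data. A minimal sketch of summarising invented responses:

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to "I enjoyed today's session"
# (1 = strongly disagree ... 5 = strongly agree)
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

average = mean(ratings)          # central tendency
distribution = Counter(ratings)  # how many chose each point
top_box = sum(r >= 4 for r in ratings) / len(ratings)  # share of 4s and 5s
```

Reporting the distribution (and a ‘top box’ share) alongside the mean avoids a common rating-scale pitfall: two very different spreads of answers can share the same average.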

Other things to look out for
Avoid asking two questions in one – e.g. “This experience increased my interest in learning more about dolphins and the oceans.” (What if I think it increased my interest in dolphins but not the oceans?)

Other things to look out for
Avoid ambiguous questions – e.g. “Do you regularly visit the zoo?” (What does ‘regularly’ mean?)

Other things to remember
You must include questions that describe your sample, otherwise you will not be able to demonstrate how representative your survey responses are. For example:
Age
Gender
Education level
Ethnicity
Postcode

ACTIVITY: In pairs, write an example survey question. Review your example questions with each other and give feedback.

Next steps…

Evaluation for your activities…
Who is it for?
Why are you doing it?
What constraints do you have?
What do you want to know?
What methods might be most appropriate?

Any questions? We’re happy to help you after this training too