Evaluation: Find out what visitors think. Improve what we do. Nicola Bell, MA. culture ~ evaluation ~ learning

What is Evaluation? Evaluation is the systematic collection of information about the content, characteristics, and outcomes of a programme to make judgements about it, improve its effectiveness, and/or inform decisions about future activities. (Kirby, P. and Bryson, S. (2002) Measuring the Magic? Evaluating and Researching Young People's Participation in Public Decision Making, London: Carnegie Young People's Initiative) See also: Consulting with young people - literature review (young-people-13.pdf)

What is evaluation? It is Quality Control, to find out: Is it good? How can we make it better?

Why do we want to carry out an evaluation? To find out how to develop a project To find out if the project was successful To find out if the money was well spent To get money for future activities To find out what non-visitors think

If you don’t evaluate, you might: waste time waste money produce something that you can't change lose interest from your target audience lose an opportunity to learn something useful lose an opportunity to tell people about the good work you are doing lose funding

When planning an evaluation, remember: Plan the evaluation at the beginning of the project Allow enough time to collect and analyse the data Allow enough money for the evaluation Consider using a freelance evaluator who can bring an independent view to the project

Who is the evaluation for? Museum staff Project partners Project participants Funders Local or national government, future partner organisations Visitors

Types of evaluation Front end - e.g. planning a new exhibit Prototyping - e.g. for a new interactive Baseline - e.g. visitors' knowledge at the beginning Formative - e.g. how the programme is developing Summative - e.g. was the programme effective? Usually evaluations only cover a short time; longitudinal studies (over several years) would give stronger evidence of long-term impact

Methodology: Link the evaluation to the project's aims and objectives Aims and objectives must be made clear at the start Aim = what you are going to do and what changes will happen (to participants, to the museum, to staff) Objectives = how you are going to do it Objectives should be SMART

SMART objectives: S - specific M - measurable A - agreed R - realistic T - time-based

Outputs, Outcomes and Impact Logic Framework model: Inputs > Activities > Outputs > Outcomes > Impacts Examples:
Inputs: money, staff time, ideas
Activities: activities, events, exhibitions
Outputs: number of visitors at events, number of events
Outcomes: changes which happen, e.g. people learn skills, people have more confidence
Impacts: changes which happen beyond the outcomes, e.g. changes for visitors, changes for the museum

Outcomes can be: Positive Negative Expected Unexpected They can happen: During the project At the end of the project

What sort of data do you need to collect? Quantitative data: How much? How many? Qualitative data: What was it like? Opinions and views

Decide how many people you need to gather data from. [Table not reproduced: required sample size by population size, at 90%, 95% and 99% confidence levels, assuming a 3% margin of error]
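The figures from the slide's table are not reproduced above, but tables of this kind follow the standard sample-size formula for estimating a proportion (worst case p = 0.5) with a finite population correction. A minimal sketch, assuming the slide's 3% margin of error and a hypothetical population of 5,000 exhibition visitors:

```python
import math

# z-scores for the confidence levels shown on the slide
Z_SCORES = {"90%": 1.645, "95%": 1.960, "99%": 2.576}

def sample_size(population, z, margin=0.03, p=0.5):
    """Sample size needed to estimate a proportion, with a finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2        # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))

for level, z in Z_SCORES.items():
    print(level, sample_size(5000, z))               # approx. 654, 880 and 1347 respondents
```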

Sampling The sample should be representative of the whole population (e.g. everyone who attended the exhibition) Random (systematic) sampling: e.g. every 5th or 15th person who leaves the exhibition Convenience sampling: e.g. people who are available Snowball sampling: e.g. for people who are hard to reach - ask some people, then ask them to suggest other people (e.g. their friends) to interview
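A minimal sketch of the "every 5th person" approach, using a hypothetical exit list; the starting offset is randomised so the same positions are not always chosen:

```python
import random

def systematic_sample(visitors, k):
    """Take every k-th visitor from an exit list, starting at a random offset."""
    start = random.randrange(k)
    return visitors[start::k]

exit_list = [f"visitor_{i}" for i in range(1, 101)]  # hypothetical list of leaving visitors
print(systematic_sample(exit_list, 5))               # roughly 20 of the 100 visitors
```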

Make sure that the data collection method is suitable for the people and the information Triangulation = using several different methods, to get strong evidence (see: partnerships.com/data/files/consulting-young-people-13.pdf) Make sure that you have appropriate permissions for photography, audio recording, interviewing children

Methods of collecting data: Mediated (with a member of staff): questionnaires (with interviewer), interviews, focus groups, observation or tracking, participant observation, video, Personal Meaning Mapping (concept maps)

Need not (necessarily) be mediated: questionnaires (self-completion), diaries, journals, scrapbooks, comments books, comments cards, drawings, mind maps, online surveys, blogs, SMS, Facebook, Twitter

What do you know, think, feel about: Etnografski Muzej Zagreb Concept map: visitors write what they know etc. about the topic before they visit the exhibition. Afterwards, they write (in a different colour) what they now know, think, feel etc. Analyse the comments using e.g. the Generic Learning Outcomes.
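One simple way to analyse the coded comments is a count per Generic Learning Outcome category. A minimal sketch with hypothetical comments (the category names come from the Inspiring Learning for All framework):

```python
from collections import Counter

# Hypothetical comments, each manually coded against a Generic Learning Outcome
coded_comments = [
    ("I learned how the loom works", "Knowledge and Understanding"),
    ("I want to try weaving myself", "Activity, Behaviour and Progression"),
    ("The textiles were beautiful", "Enjoyment, Inspiration and Creativity"),
    ("Now I can read the old labels", "Skills"),
]

tally = Counter(glo for _, glo in coded_comments)
for glo, count in tally.most_common():
    print(f"{glo}: {count}")
```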

Make an Evaluation Plan Who are the participants? What is the activity? Who is the evaluation for? What are the aims and objectives? Inputs, activities, outputs, outcomes, impacts? Baseline, formative, summative? Which methods? (for each part of the project and for each group of people: participants, staff, artists etc.) How will you present the evaluation?

Analysing the data The analysis must reflect the evaluation aims and objectives Allow plenty of time (and therefore money) for analysis A written report is usually necessary for managers, funders and other stakeholders The report should give recommendations for the future The report can be in alternative formats …

Alternative formats: Website, blog DVD or video Performance Presentations Set of reference cards Book

What will you do with the evaluation when it is finished? Present it in a suitable format Tell the relevant people about it Ensure that recommendations are acted upon Don’t leave it on the shelf!

Best Practice in Educational and Cultural Action Conception of the project Achievement Evaluation Remediation / Improvement

Inspiring Learning for All Generic Learning Outcomes Generic Social Outcomes Ways of measuring hard-to-measure things Shows other organisations what museums do to benefit their visitors

Generic Learning Outcomes

Generic Social Outcomes

Working with community groups Youth groups Services for disadvantaged people Services for people with disabilities Communities of place Communities of interest Work with partner organisations that already have a good relationship with local people

Why talk to teenagers and children? United Nations Convention on the Rights of the Child, article 12: “assure to the child who is capable of forming his or her own views the right to express those views freely in all matters affecting the child, the views of the child being given due weight in accordance with the age and maturity of the child.” Every child should be able to express their views and influence decisions affecting them. “Child” is anyone under 18 years old, so includes teenagers.

Why talk to teenagers and children? They have views in their own right - they are not just part of a family, part of a school, or on the way to being an adult It develops a relationship between the teenagers and the museum They can learn skills, e.g. decision making, interviewing, analysing data They can have a positive influence on their community It can increase their confidence and social skills

Ways of involving teenagers in the museum: Volunteer tour guides Provide training for them as volunteers in the shop and café - good skills for jobs Young people's forum Curating exhibitions Creating interpretation (e.g. film, guide) for an exhibition Projects to develop skills (e.g. reading, writing, art)

Young people as evaluators It is better if young people are involved in planning and doing the evaluation They can learn and grow as people Everyone (any age) brings their own knowledge and world-view Everyone makes their own meaning UNICEF guide to participatory evaluation with teenagers: participatory-evaluation.pdf

Maslow’s Hierarchy of Needs

Outcomes Star: a way to measure things like increase in self-confidence

Do… Make it fun! Explain the process and what influence they can have Provide a welcoming place Provide food which young people like Respect young people's views, confidentiality and privacy Celebrate achievements Treat young people as individuals, not as a group Use e.g. artists to help with creative evaluation Be aware of the needs of people with physical or learning disabilities Tell young people the outcomes of the project Provide help with transport to the museum

Don't… Make assumptions about what they know (or don't know), want or can do Promise too much Forget about child protection (especially in mixed age groups) Do adult things, e.g. boring meetings Consult too much - people will get bored Do it too quickly - good consultation takes time