Evaluating Social Marketing Campaigns


Evaluating Social Marketing Campaigns
Emerson College Summer Institute for Social Marketing and Health Communication
June 9, 2016

Wotan’s Ravens

Types of Evaluation
* Formative
* Process
  * Delivery/Implementation
  * Exposure/Reach
* Outcome (also known as summative or impact)

Important Principles of Evaluation
* Start thinking about and planning for evaluation from the first day
* Build it into your budget
* Look for ways to conduct evaluation economically

Purpose of Formative Evaluation
* Gathering information to make good choices about audiences, concepts, messages, channels, and materials
* Formative research becomes the foundation for all important decisions made throughout a social marketing initiative

Formative Evaluation Methods
* Surveys
* One-on-one exploratory interviews
* Intercept interviews
* Expert reviews
* Readability testing
* Focus groups

Cutting Costs with Focus Groups
* If you need to gather information beyond a single geographic region, or gathering people proves a challenge, consider conducting telephone, online, or webcam groups
* Get a partner to donate something other than cash that can be used as an incentive

Cutting Costs with Focus Groups (cont.)
* Send someone from your own team for moderator training
* Find an inexpensive location to conduct the groups (church, community center, etc.)
* If taping is needed, contact a local educational institution

Common Mistakes with Formative Evaluation
* Not enough data
* Not enough of the right kind of data
* Using convenience samples to draw firm conclusions

Process Evaluation

Colonel Brighton: “And what are you to do for the Arab Bureau?”
Lawrence: “I’m to appreciate the situation.”

What Is Process Evaluation?
* An extension of formative evaluation
* A prelude and companion to outcome evaluation
* Concerned with monitoring and collecting data on the fidelity and implementation of campaign activities

Why Conduct Process Evaluation?
* Detecting small problems before they become large
  * Is the social marketing program on track?
  * Are midcourse corrections necessary?

Why Conduct Process Evaluation? (cont.)
* Providing immediate evidence that the system is working
  * Are we keeping our partners engaged?
  * Do we need to boost morale?

Why Conduct Process Evaluation? (cont.)
* Helping with the interpretation of other evaluation results
  * What context have we provided for further assessment?
  * What have we learned to help us understand successes and disappointments?

Delivery/Implementation Evaluation

Delivery/Implementation Evaluation: Key Questions
* Is an effective system in place for communicating a message?
* Are activities being carried out as planned?
* Is the timetable being followed?
* Are the project staff, partners, and volunteers doing their jobs as assigned?

Delivery/Implementation Evaluation (cont.)
Problems that delivery/implementation evaluation can uncover:
* Shortage of resources
* Lack of motivation
* Poor leadership
* Confusion about the implementation plan
* Intervening events

Methods of Evaluating Delivery/Implementation
* Log/tracking systems
* Systematic checks with implementers, partners, and media

Exposure/Reach Evaluation: Key Questions
* Was the message received as intended?
* How many people heard and/or saw the message?
* Did the primary and secondary target audiences hear and/or see the message?

Exposure/Reach Evaluation (cont.)
Problems that exposure/reach evaluation can uncover:
* Lack of appropriate assessment of delivery/implementation
* Intervening events
* Poor planning

Methods of Evaluating Exposure/Reach
* Monitoring mass media
  * Monitoring services
  * Content analysis
* Materials inventory
  * Checking stock
  * Checking distribution points

Methods of Evaluating Exposure/Reach (cont.)
* Monitoring responses and inquiries
  * Calls to telephone hotlines
  * Written requests for materials
  * Web site hits and inquiries
* Surveys
  * Exposure to campaigns
  * Awareness of campaigns
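Monitoring data of this kind is often summarized with standard media-exposure measures such as reach, average frequency, and gross rating points (GRPs). A minimal sketch, using invented figures (the audience size, reach count, and impression total below are hypothetical, not from any campaign in this deck):

```python
# Illustrative only: summarizing exposure monitoring data. All figures invented.
# Gross rating points (GRPs) = reach (% of target audience exposed at least
# once) multiplied by average frequency (mean exposures per person reached).

target_audience = 200_000   # people in the target audience (hypothetical)
reached = 80_000            # exposed to the campaign at least once
total_exposures = 240_000   # total impressions delivered to those reached

reach_pct = 100 * reached / target_audience   # share of audience reached
avg_frequency = total_exposures / reached     # mean exposures per person
grps = reach_pct * avg_frequency              # combined exposure summary

print(f"Reach: {reach_pct:.0f}%, frequency: {avg_frequency:.1f}, GRPs: {grps:.0f}")
```

Survey-based awareness measures complement these delivery-side numbers, since impressions delivered do not guarantee messages noticed.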

Limitations of Process Evaluation
* Data can be misleading
* Doesn’t tell you how people reacted to a message

Outcome Evaluation

Purpose of Outcome Evaluation
Is there evidence that our intervention worked?
* We need to know for ourselves if the effort was worth it
* Partners and funders will expect signs of success
* We need to know how to build on success (or avoid repeating failures)

Grounding Outcome Evaluation in Objectives
* Outcome evaluation should be guided by objectives that have been refined from the early stages of planning
* Objectives should be SMART (Specific, Measurable, Achievable, Realistic, and Time-phased)

Examples of SMART Objectives
* Belief objective: “By September 2017, increase by 20% the number of middle school students in Barnstable County who believe they are capable of using resistance strategies against drinking at appropriate moments.”
* Knowledge objective: “By September 2017, increase by 30% the number of middle school students in Barnstable County who know at least three resistance strategies that can be used when pressured to drink alcohol.”
* Behavioral objective: “By September 2017, increase by 10% the number of middle school students in Barnstable County who consistently use resistance strategies when pressured to use alcohol.”
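Because SMART objectives are measurable, checking them at follow-up reduces to simple arithmetic. A minimal sketch, with hypothetical counts (not actual Barnstable County data), of testing the 10% behavioral objective:

```python
# Illustrative only: checking a SMART objective against survey counts.
# The counts below are invented for the sketch.

baseline_count = 500    # students consistently using resistance strategies at baseline
followup_count = 560    # students doing so at the September 2017 follow-up
target_increase = 0.10  # the objective: "increase by 10%"

# Relative increase in the number of students exhibiting the behavior
actual_increase = (followup_count - baseline_count) / baseline_count
objective_met = actual_increase >= target_increase

print(f"Observed increase: {actual_increase:.0%} -> objective met: {objective_met}")
```

Note the objective must specify whether "increase by 10%" means a relative change (as computed here) or a 10-percentage-point change; ambiguity here undermines the "Measurable" in SMART.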

Classic Experiment: Gold Standard for Outcome Evaluation
Experimental group*: Pretest → Intervention → Posttest
Control group*: Pretest → No intervention** → Posttest
*Assumes random assignment (considered quasi-experimental if random assignment doesn’t exist)
**Or at least a modified version of the intervention
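With this design, one common analysis compares the pre-to-post change in the experimental group against the change in the control group (a difference-in-differences estimate). A minimal sketch, using invented survey scores:

```python
# Illustrative only: hypothetical mean-score analysis of a pretest-posttest
# control group design. All scores are invented for the sketch.

def mean(xs):
    return sum(xs) / len(xs)

# Outcome scores (e.g., % of each surveyed site reporting the target behavior)
experimental = {"pretest": [42, 45, 40, 43], "posttest": [55, 58, 53, 56]}
control = {"pretest": [41, 44, 42, 43], "posttest": [43, 45, 44, 42]}

# Change within each group
exp_change = mean(experimental["posttest"]) - mean(experimental["pretest"])
ctl_change = mean(control["posttest"]) - mean(control["pretest"])

# Difference-in-differences: change attributable to the intervention,
# net of whatever shifted both groups (e.g., intervening events)
effect = exp_change - ctl_change
print(f"Estimated intervention effect: {effect:.1f} points")
```

The control group's change absorbs secular trends and intervening events, which is exactly why the design is the gold standard; a real analysis would add significance testing.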

Not Meeting the Gold Standard: Random Assignment Not Possible
* Random assignment might not be possible for logistical reasons
* Try to match to a control group that is similar to the experimental group

Not Meeting the Gold Standard: Comparison Group Not Possible
* Using a comparison group might be cost prohibitive or logistically impossible due to access
* A time series design is still very rigorous but requires several data collections: PR PR PR → TR → PO PO PO (repeated pretests, treatment, repeated posttests)
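The repeated observations in a time series design let you estimate the shift after treatment from the data itself. A minimal sketch of the level-change summary for the PR PR PR → TR → PO PO PO pattern, with invented monthly measurements:

```python
# Illustrative only: a simple interrupted time-series summary.
# Observations (e.g., monthly % performing the behavior) are invented.

pre = [31.0, 33.0, 32.0]    # PR PR PR: repeated baseline observations
post = [40.0, 42.0, 41.0]   # PO PO PO: repeated follow-up observations

pre_mean = sum(pre) / len(pre)
post_mean = sum(post) / len(post)

# Basic level-change estimate; a fuller analysis would also model any
# pre-existing trend (e.g., via segmented regression)
level_change = post_mean - pre_mean
print(f"Baseline mean: {pre_mean:.1f}, follow-up mean: {post_mean:.1f}")
print(f"Level change after treatment: {level_change:.1f}")
```

The multiple pretests are what make the design rigorous: a stable baseline rules out the possibility that the post-treatment level was simply a continuation of an existing trend.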

Not Meeting the Gold Standard: Pretest Not Possible
* Intervention might be under way before there is time to collect baseline data, or the team can’t afford a pretest
* Can do a posttest-only control group design, but you’ll never know if the groups were equivalent at the beginning
* Might be possible to use secondary data as a baseline

Not Meeting the Gold Standard: Can’t Afford to Collect Any Data
* Is there a partner that can collect data for you?
* With increased access to databases online, look for secondary data sources (you might have to wait some time to identify change)
* Are there reasonable proxy measures to use?