Effective Outreach: Lessons Learned from LakeSmart
Christine Smith, Lakes Education Coordinator, Maine DEP
Getting In Step Process
1. Define driving forces, goals, and objectives
2. Analyze target audience
3. Create tools
4. Package program
5. Distribute program
6. Evaluate outreach
7. Improve and implement
Scientific Method vs. Effective Outreach
1. Problem -> Driving Forces
2. Observations -> Analyze Target Audience
3. Question/Hypothesis -> Goals and Objectives
4. Research Methods -> Create Tools
5. Conduct Research -> Package/Deliver Program
6. Analyze Results -> Evaluate and Interpret
7. Conclusion -> Improve and Implement
Step 1. Define driving forces, goals, and objectives
Driving force: declining water quality due to urban/suburban landscaping
Goal: lake-friendly land use practices statewide
Objectives: defined in later steps (don't forget them)
Step 2. Identify and analyze target audience
Target audience: lakeshore residents
McKenzie-Mohr approach
- Barriers and incentives to action
- Identify "spark plugs" and "Jacobs" in the community
2000 Lake User survey, Omnibus surveys, focus groups
- Concerned, lacking knowledge of cause and effect, looking for easy fixes, retired
Survey Data
- 2000 Lake User survey, Omnibus survey, and focus groups: ps/2005_report.pdf
- 1999 Omnibus Survey: npscamp4year.htm
Step 3. Create the tools
- Training workshops
- Evaluation form for site visits
- Awards as an incentive and to increase visibility
Objectives
- Hold 5 workshops/year
- Measure workshop success
- Track number of awards and recognitions/year
- Long-term measurement by redoing watershed surveys
Step 4. Package the program
- Name
- Logo
- Slogan: "Living lightly on the land for the sake of our lake"
All developed by surveying the audience
Step 4. Package the program (continued)
- Presenters with credentials
- Agendas for the workshops
  - Pledges, prompts
- SWCD (Soil and Water Conservation District) to do property evaluations
- Promotion
Step 5. Distribute the program
- 2-year pilot
Step 6. Evaluate
Process indicators, '03-'04:
- 6 of 10 planned workshops held (well received, but expensive)
- 61% of those who signed up showed up in '04
- 68 property evaluations
- 27 awards, 39 recognitions
Summer 2005
- Evaluations on 22 lakes
- Many requests for workshops
- Lack of response to the first workshops
- Performance Partnership Agreement
  - Measurable objectives
  - More evaluation
New Objectives
- 15% of properties on project lakes are LakeSmart
- 75% of workshop participants take action
Do More Evaluation
- Process: numbers (the familiar kind)
- Impact: measuring action, water quality
- Context: who, what, where, why (includes success stories)
Surveys
- Written survey of target audience
- Phone survey of '04 workshop registrants
  - Tell them you will call
  - Call no-shows
- Phone interviews of participants, spark plugs, evaluators, collaborators
Impact Evaluation of 2004 Workshops
- 72% learned something new
- 37% had a property evaluation in '04
- 83% took action (planting, diversions, had an evaluation, etc.)
Context Evaluation
- Who is getting awards?
- Where and when are awards given?
- What support is needed?
- Why are some lakes successful and others not?
Context Evaluation Example
Analyze the location of property evaluations in relation to workshops.
Lakes with 3+ Evaluations, '03-'05
- 16 lakes had 3+ evaluations
- 10 lakes with a workshop: 70 evaluations (7.0 per lake)
- 6 lakes with no workshop: 58 evaluations (about 9.7 per lake)
Conclusion: a workshop is not a requirement for property evaluations
Step 7. Improve and implement
- Focus on fewer lakes
- Offer support (with social marketing tools) to lake associations for longer
- Offer shorter trainings
- Choose lakes with key elements
Elements for a Successful LakeSmart Program
1. A local "spark plug"
2. Active lake association
3. No other big projects
4. A minimum 2-year commitment
5. Lake association offers incentives: $, plants, YCC (Youth Conservation Corps)
Elements for a Successful LakeSmart Program (continued)
6. Local interest
7. Evaluators (SWCD or other)
8. High percentage of year-round or summer residents
9. High percentage of shorefront owners in the lake association
10. Sense of community
Lessons Learned
- EVALUATE: process, impact, context
- Measurable objectives
- Implement changes
Questions?