
1 Evaluating Your Regional Competition
Laura Florence, University of Michigan Regional Coordinator, Great Lakes Bowl

2 Why Evaluate Your Regional?
Are we meeting our goals and objectives?
Are we using the “right approach”? (Are we focusing on the correct objectives and methods?)
Show successes and strengths of the program
Demonstrate concrete results to funding sources
Improve your competition
A good place to start in program development

3 Developing a Program = PIE: Planning, Implementation, Evaluation

4 My Environmental Education Evaluation Resource Assistant: MEERA www.meera.snre.umich.edu

5 Evaluation Process
Used the MEERA website as a guide:
Step 1: Before You Get Started
Step 2: Clarify Program Logic
Step 3: Set Goals and Indicators
Step 4: Choose Design and Tools
Step 5: Collect Data
Step 6: Analyze Data
Step 7: Report Results
Step 8: Improve Program

6 Step 1: Before You Get Started
What types of resources will I need to invest in the evaluation?
How do I find and work with an internal or external evaluator?
How do I involve program managers, staff, and others?
How do I obtain approval for the evaluation and consent from participants?

7 Step 2: Clarify Program Logic
What is a logic model?
Why should I develop a logic model?
How do I get started?

8 Step 3: Set Goals and Indicators
What are your goals for the evaluation?
How do I develop evaluation questions?
How do I answer my evaluation questions?

9 NOSB Evaluation Goals
Focus: Improving recruitment and retention of teams to the Great Lakes Bowl
1) What motivates teachers to participate in NOSB?
2) What factors are important for continued participation?
3) What factors would prevent teams from returning to NOSB?
4) What do NOSB teachers think are effective ways of advertising NOSB?
5) What can be the role of current NOSB teachers in recruiting new teams?

10 Step 4: Choose Design & Tools
What type of data should I collect?
Which tool should I use to collect data?
When and from whom should I collect data?
Step 5: Collect Data
How can I best manage the data collection process?
How do I select evaluation participants?
How big should my sample be? (see the sketch after this list)
How do I prepare my data for analysis?
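One common way to answer the sample-size question is Cochran's formula with a finite population correction. The sketch below is illustrative only; the 95% confidence level (z = 1.96), 5% margin of error, conservative default p = 0.5, and population of 14 are all assumptions, not figures from the NOSB evaluation.

import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's sample-size formula with a finite population correction.

    z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the most
    conservative assumption about response variability.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)                   # finite population correction
    return math.ceil(n)

# With a small regional population (e.g., 14 teachers), the formula
# effectively says to survey everyone.
print(sample_size(14))  # -> 14

For a small regional bowl, the result is essentially a census: collect data from every teacher you can reach.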

11 For the NOSB evaluation:
Qualitative and quantitative data were collected from 2009 Great Lakes Bowl teachers:
Focus group at the event
Follow-up survey completed after the event
14 teachers total: 7 participated in the focus group, 10 completed the survey, and only 1 teacher was not included in either the focus group or the survey

12 Step 6: Analyze Data
What type of analysis do I need?
How do I analyze quantitative and qualitative data?
What software can I use?
For the NOSB evaluation:
Content analysis for qualitative focus group and survey data
Descriptive statistics (counts, frequencies, percentages, mean, median, mode, variability) for quantitative survey data (see the sketch below)
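These descriptive statistics can be produced with a spreadsheet or any statistics package. A minimal sketch in Python with pandas, using hypothetical column names (years_participating, satisfaction) and made-up values rather than the actual NOSB survey data:

import pandas as pd

# Hypothetical survey responses; real data would typically be loaded from a file,
# e.g. pd.read_csv("survey_2009.csv")
survey = pd.DataFrame({
    "years_participating": [1, 1, 2, 3, 3, 3, 5, 7, 10, 12],  # quantitative item
    "satisfaction":        [4, 5, 5, 3, 4, 5, 5, 4, 5, 4],    # 1-5 Likert item
})

# Counts and frequencies (percentages) for the Likert item
counts = survey["satisfaction"].value_counts().sort_index()
percentages = survey["satisfaction"].value_counts(normalize=True).sort_index() * 100

# Mean, median, mode, and variability for the quantitative item
years = survey["years_participating"]
summary = {
    "mean": years.mean(),
    "median": years.median(),
    "mode": years.mode().tolist(),   # mode() can return more than one value
    "std_dev": years.std(),
    "range": years.max() - years.min(),
}

print(counts)
print(percentages.round(1))
print(summary)

Content analysis of the focus group and open-ended survey responses is a manual coding task; software only helps tally and organize the codes.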

13 Step 7: Report Results
How do I develop conclusions and recommendations?
How should I report my results and conclusions?
How should I organize my report?
How do I use graphics to illustrate results?

14 Key results from the NOSB evaluation:
100% of teachers surveyed are willing to help promote NOSB; 80% are willing to make a presentation at a professional meeting
Need for additional preparation materials was not cited as a factor in retention or recruitment
Focus recruitment efforts at teacher meetings and workshops
Advertising: electronic and paper mailings are important

15 Key results from the NOSB evaluation:
Factors motivating participation:
Personal interest
Student enrichment
Low/no cost
NOSB supports the science curriculum
Augments existing club activities
Student interest

16 Key results from the NOSB evaluation:
Educational opportunities are desirable to teachers and students, but not vital for retention. However, once teachers or their students participate in educational opportunities (teacher workshops, internships, scholarships), they greatly value them as a program component.
Early-career teachers are more likely to participate in professional development opportunities that offer CEUs.

17 Step 8: Improve Program
How can I use my evaluation results to benefit my program?
How can I ensure the evaluation is used to improve my program?
In what other ways can I get the most from the evaluation?

18 NOSB Program Improvement Plans:
Adjust 2009-2010 recruitment efforts to reflect the results of this evaluation:
- Empower teachers to promote NOSB
- Re-institute paper mailings and newsletter articles
- Explore opportunities to present and/or exhibit at teacher meetings and workshops
Plan a 2010 evaluation to provide additional data on the efficacy of these adjustments.
Share evaluation results with other NOSB regions and the NOSB national office.

19 More about MEERA:
What I liked about MEERA
Recommendations for use:
- Great for inexperienced evaluators and in-house evaluations
- Good to have a contact during the process
Aspects that were particularly helpful:
- Logic model
- Data analysis
- Reporting results
- Program improvement

20 Questions/Discussion
Questions about MEERA or my evaluation?
Share regional evaluation experiences
Success stories? What hasn't worked?
Ideas for what you would want to evaluate?
Share tips, strategies, techniques

