Evaluating Your Regional Competition
Laura Florence, University of Michigan, Regional Coordinator, Great Lakes Bowl



Why Evaluate Your Regional?
- Are we meeting our goals and objectives?
- Are we using the "right approach"? (Are we focusing on the correct objectives and methods?)
- Show the successes and strengths of the program
- Demonstrate concrete results to funding sources
- Improve your competition
- A good place to start in program development

Developing a Program = PIE: Planning, Implementation, Evaluation

My Environmental Education Evaluation Resource Assistant: MEERA

Evaluation Process
Used the MEERA website as a guide:
Step 1: Before You Get Started
Step 2: Clarify Program Logic
Step 3: Set Goals and Indicators
Step 4: Choose Design and Tools
Step 5: Collect Data
Step 6: Analyze Data
Step 7: Report Results
Step 8: Improve Program

Step 1: Before You Get Started
- What types of resources will I need to invest in the evaluation?
- How do I find and work with an internal or external evaluator?
- How do I involve program managers, staff, and others?
- How do I obtain approval for the evaluation and consent from participants?

Step 2: Clarify Program Logic
- What is a logic model?
- Why should I develop a logic model?
- How do I get started?

Step 3: Set Goals and Indicators
- What are your goals for the evaluation?
- How do I develop evaluation questions?
- How do I answer my evaluation questions?

NOSB Evaluation Goals
Focus: Improving recruitment and retention of teams to the Great Lakes Bowl
1) What motivates teachers to participate in NOSB?
2) What factors are important for continued participation?
3) What factors would prevent teams from returning to NOSB?
4) What do NOSB teachers think are effective ways of advertising NOSB?
5) What can be the role of current NOSB teachers in recruiting new teams?

Step 4: Choose Design & Tools
- What type of data should I collect?
- Which tool should I use to collect data?
- When and from whom should I collect data?

Step 5: Collect Data
- How can I best manage the data collection process?
- How do I select evaluation participants?
- How big should my sample be?
- How do I prepare my data for analysis?

For the NOSB evaluation, qualitative and quantitative data were collected from 2009 Great Lakes Bowl teachers:
- Focus group at the event
- Follow-up survey completed after the event
Of the 14 teachers, 7 participated in the focus group and 10 completed the survey; only 1 teacher was not included in either the focus group or the survey.

Step 6: Analyze Data
- What type of analysis do I need?
- How do I analyze quantitative and qualitative data?
- What software can I use?

For the NOSB evaluation:
- Content analysis for qualitative focus group and survey data
- Descriptive statistics (counts, frequencies, percentages, mean, median, mode, variability) for quantitative survey data
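As a minimal sketch of the descriptive statistics mentioned above, the snippet below uses Python's standard library. The response values are hypothetical illustrations, not the actual NOSB survey data.

```python
from collections import Counter
from statistics import mean, median, mode

# Hypothetical Likert-scale responses (1-5) from 10 teachers;
# illustrative values only, not the actual NOSB survey data.
responses = [5, 4, 4, 5, 3, 4, 5, 2, 4, 4]

counts = Counter(responses)                          # count per response value
n = len(responses)
frequencies = {k: v / n for k, v in counts.items()}  # relative frequencies

print("Counts:", dict(sorted(counts.items())))
print("Mean:", mean(responses))      # 4.0
print("Median:", median(responses))  # 4.0
print("Mode:", mode(responses))      # 4
print("Range (variability):", max(responses) - min(responses))  # 3
```

For a real survey, the same summaries scale up with a spreadsheet or a library such as pandas, but the statistics themselves are exactly these.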

Step 7: Report Results
- How do I develop conclusions and recommendations?
- How should I report my results and conclusions?
- How should I organize my report?
- How do I use graphics to illustrate results?

Key results from the NOSB evaluation:
- 100% of teachers surveyed are willing to help promote NOSB; 80% are willing to make a presentation at a professional meeting
- The need for additional preparation materials was not cited as a factor in retention or recruitment
- Focus recruitment efforts at teacher meetings and workshops
- Advertising: electronic and paper mailings are important

Key results from the NOSB evaluation, factors motivating participation:
- Personal interest
- Student enrichment
- Low/no cost
- NOSB supports the science curriculum
- Augments existing club activities
- Student interest

Key results from the NOSB evaluation: Educational opportunities are desirable to teachers and students, but not vital for retention. However, once teachers or their students participate in educational opportunities (teacher workshops, internships, scholarships), they greatly value them as a program component. Early-career teachers are more likely to participate in professional development opportunities that offer CEUs.

Step 8: Improve Program
- How can I use my evaluation results to benefit my program?
- How can I ensure the evaluation is used to improve my program?
- In what other ways can I get the most from the evaluation?

NOSB Program Improvement Plans:
- Adjust recruitment efforts to reflect the results of this evaluation:
  - Empower teachers to promote NOSB
  - Re-institute paper mailings and newsletter articles
  - Explore opportunities to present and/or exhibit at teacher meetings and workshops
- Plan a 2010 evaluation to provide additional data on the efficacy of these adjustments
- Share evaluation results with other NOSB regions and the NOSB national office

More about MEERA:
What I liked about MEERA, and recommendations for use:
- Great for inexperienced evaluators and in-house evaluations
- Good to have a contact during the process
Aspects that were particularly helpful:
- Logic model
- Data analysis
- Reporting results
- Program improvement

Questions/Discussion
- Questions about MEERA or my evaluation?
- Share regional evaluation experiences: success stories? What hasn't worked?
- Ideas for what you would want to evaluate?
- Share tips, strategies, and techniques