Evaluating Your STEM Outreach Program
MISO Spring Workshop, May 7, 2012
MISO Data Analytics Team: Jeni Corn, Tricia Townsend, Alana Unfried

Goals of the Session
- Understand how the program evaluation process can help you improve your STEM programs
- Ask evaluation questions that are useful to your STEM programs
- Identify a variety of useful data sources for your STEM programs

Agenda
- Introductions
- The Evaluation Process (presentation)
- Asking Good Evaluation Questions (small group discussion)
- Identifying Data (presentation, small group discussion)
- Q & A (whole group discussion)

Formative and Summative Evaluation
Just like formative and summative assessments:
"When the cook tastes the soup, that's formative evaluation. When the guests taste the soup, that's summative evaluation." ~ Bob Stake

Keep it Simple and Focused
- An evaluation doesn't have to be big!
- Match the number of evaluation questions in your plan to your resources.
- Focus on efficient, effective data collection strategies.
- A basic chart is a great way to organize your own thinking and to easily share the plan with others:

Evaluation Questions | Data Sources
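Such a chart often lives in a spreadsheet, but to make the structure concrete, here is a minimal sketch in Python of an evaluation plan mapping questions to data sources. The questions and sources shown are hypothetical examples, not from an actual MISO plan.

```python
# A minimal sketch of an evaluation plan as a question-to-data-source map.
# All questions and data sources below are hypothetical examples.
evaluation_plan = {
    "How many teachers attended PD regularly?": [
        "Attendance counts from each PD session",
    ],
    "How do teachers rate the PD sessions?": [
        "Post-session feedback forms",
        "Mid-year teacher interviews",
    ],
}

# Print the plan as a simple two-column outline.
for question, sources in evaluation_plan.items():
    print(question)
    for source in sources:
        print(f"  - {source}")
```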

Program Evaluation Steps 1-5, Repeat!
1. Identify the critical elements of the STEM program (logic model).
2. Ask important questions about your STEM program (evaluation questions).
3. Identify what data are available to help answer your questions and determine the additional data needed (MISO surveys, NCERDC data, program-level data).
4. Collect, analyze, and interpret data to answer your questions.
5. What changes to your STEM program should you make based on your results?
Repeat steps 1-5!

What is a logic model?
A logic model is a graphic representation of the relationships among the key elements of a project: inputs, strategies, objectives, and long-term goals. A logic model:
- Helps articulate the key elements of the project.
- Makes evaluation more efficient and effective.
- Promotes stakeholder buy-in by helping clarify how the project works.
Drafting one can also be a great way to involve stakeholders in planning.

What is a logic model? [slide shows a sample logic model diagram]
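To make the four elements concrete, here is a minimal sketch in Python of a logic model as a simple data structure. The field names and example entries are illustrative assumptions, not part of the MISO materials.

```python
from dataclasses import dataclass, field

# A minimal sketch of a logic model's key elements.
# The example entries are hypothetical, not from an actual MISO program.
@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)            # resources invested
    strategies: list[str] = field(default_factory=list)        # activities carried out
    objectives: list[str] = field(default_factory=list)        # short-term outcomes
    long_term_goals: list[str] = field(default_factory=list)   # ultimate impacts

model = LogicModel(
    inputs=["grant funding", "university STEM faculty"],
    strategies=["face-to-face and online teacher PD"],
    objectives=["increased teacher self-efficacy for STEM teaching"],
    long_term_goals=["greater student interest in STEM careers"],
)
print(model.strategies)
```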

Program Evaluation Steps 1-5, Repeat!
1. Identify the critical elements of the program (logic model).
2. Ask important questions about your program.
3. Identify what data are available to help answer your questions and determine the additional data needed.
4. Collect, analyze, and interpret data to answer your questions.
5. What changes to your program should you make based on your results?
Repeat steps 1-5!

Developing evaluation questions
The process for identifying the questions to be answered by the evaluation is critical: evaluation questions provide the direction and foundation for the entire evaluation.
EVALUATION QUESTIONS → Data Collection → Data Analysis → Results

Developing evaluation questions
Why do we need to ask good questions? To:
- Determine what is really important, and to whom (project leaders, program participants: teachers, students and their parents, etc.)
- Focus data collection efforts: What do we need to find out? How can we collect that information? Who is the best person to collect that information?

Developing evaluation questions
The main types of evaluation questions are:
1. Questions about STRATEGIES: these ask how well the strategies were implemented.
2. Questions about OBJECTIVES: these ask about impacts.
Logic models are great guides for developing evaluation questions.

Developing evaluation questions
Quick tips for writing good questions:
1. Try to avoid simple yes-or-no questions.
2. Consider QUANTITY questions, e.g., "How many," "How much," "How often."
3. Consider QUALITY questions, e.g., "How well," "How effectively," "In what ways."
4. Stay tuned in to unexpected results.

Developing evaluation questions
An everyday example (here, a personal health goal of sleeping more):
IMPLEMENTATION questions:
- How many hours of sleep am I getting each week? (quantity)
- How soundly am I sleeping? (quality)
IMPACT questions:
- How much weight have I lost? (quantity)
- How has my stress level changed? (quality)

Developing evaluation questions
Not every evaluation question can be answered; finding the answers costs time, money, and people. Pick the most important questions, those that provide the most valuable information to users.

Small Group Activity
With a partner(s) at your table, use the sample STEM program logic model to:
1. Brainstorm 2-3 implementation questions about the program's strategies.
   Example strategy: Teachers will engage in face-to-face and online professional development.
   Quantity questions: What percentage of teachers attend the PD regularly? What was the total number of hours teachers attended PD each month? Over the course of the program?
   Quality question: How do teachers rate the professional development?
2. Brainstorm 2-3 impact questions to evaluate how well the outcomes are being met.
Whole group share-out.

STEM Program Implementation Questions (worksheet): Quantity Questions | Quality Questions

STEM Program Impact Questions (worksheet): Quantity Questions | Quality Questions

Program Evaluation Steps 1-5, Repeat!
1. Identify the critical elements of the program (logic model).
2. Ask important questions about your program.
3. Identify what data are available to help answer your questions and determine the additional data needed.
4. Collect, analyze, and interpret data to answer your questions.
5. What changes to your program should you make based on your results?
Repeat steps 1-5!

Data in Evaluations
For each evaluation question, what information are you going to gather in order to answer it?
1. Consider a wide variety of data types and sources, both quantitative and qualitative.
2. What data do you already have?
3. What data do you need?
4. How much time, money, and/or other resources will it cost to collect the data?
5. Make a calendar of what you'll need, from whom, by when.
REMEMBER: Data must be interpreted, not just analyzed.

Data in STEM Evaluations
MISO Instruments:
- Student STEM Attitudes Surveys (Upper Elementary; Middle/High)
- Teacher STEM Attitudes Surveys (Elementary; Science; Technology; Engineering; Mathematics)
Evaluation Questions:
- To what extent did students' interest in STEM careers increase?
- Did inquiry-based learning increase student engagement?
- How did teachers' self-efficacy for teaching STEM content change?
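As an illustration of how a question like the first one might be answered from pre/post survey data, here is a minimal sketch using a paired t-test. The scores are invented, and the MISO instruments themselves may call for different analyses.

```python
from scipy import stats

# Hypothetical pre- and post-program STEM career interest scores
# (e.g., each student's mean across Likert-scale items); invented data.
pre_scores  = [2.8, 3.1, 2.5, 3.4, 2.9, 3.0, 2.7, 3.2]
post_scores = [3.2, 3.4, 2.9, 3.6, 3.1, 3.5, 2.8, 3.6]

# Paired t-test: did the same students' scores change from pre to post?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```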

Data in Evaluations
"Qualitative data are measurements that cannot be measured on a numerical scale; they can only be classified into one of a group of categories."
Qualitative data are analyzed for patterns or themes. There are many sources of qualitative data; common in education evaluation:
- Interviews and focus groups with teachers
- Interviews and focus groups with students
- Open-ended questions on surveys or questionnaires
- Open-ended assessments
- Portfolios of student work or other performance artifacts
- Open-ended classroom observation notes
- Journals, logs, or other artifacts of project activities
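One simple way to start finding patterns in qualitative data is to tally how often coded themes appear across interview notes. Here is a minimal sketch with hypothetical theme codes; real qualitative coding is a more careful, iterative process.

```python
from collections import Counter

# Hypothetical coded excerpts from teacher interviews:
# each excerpt has been tagged with one or more theme codes.
coded_excerpts = [
    {"themes": ["hands-on activities", "student engagement"]},
    {"themes": ["time constraints"]},
    {"themes": ["student engagement", "confidence"]},
    {"themes": ["hands-on activities", "time constraints"]},
]

# Tally how often each theme appears across all excerpts.
theme_counts = Counter(
    theme for excerpt in coded_excerpts for theme in excerpt["themes"]
)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```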

Data in Evaluations
"Quantitative data are measurements that are recorded on a naturally occurring numerical scale."
Quantitative data are analyzed using descriptive or inferential statistics. There are many sources of quantitative data; common in education evaluation:
- Demographics
- Grade-level information
- Years of teaching experience
- Standardized assessment scores
- Scaled questions on surveys
- Scale-scored classroom observations
- Graduation rates
- Rates of course-taking or course completion
Descriptive statistics are the most common and straightforward: absolute numbers, percentages, averages, etc.
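Descriptive statistics of this kind need nothing beyond the standard library. Here is a minimal sketch on hypothetical PD attendance data.

```python
import statistics

# Hypothetical data: hours of PD attended by each participating teacher.
pd_hours = [12, 15, 9, 18, 15, 6, 20, 15, 11, 14]

total_teachers = len(pd_hours)
attended_10_plus = sum(1 for h in pd_hours if h >= 10)

print(f"teachers: {total_teachers}")  # absolute number
print(f"attended 10+ hours: {100 * attended_10_plus / total_teachers:.0f}%")  # percentage
print(f"mean hours: {statistics.mean(pd_hours):.1f}")  # average
print(f"median hours: {statistics.median(pd_hours)}")
```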

Data in Evaluations
Beware of common data traps!
- Biting off more than you can chew
- Not collecting the data needed to answer important questions
- Collecting data that is not really useful
- Neglecting hard-to-quantify data
- Not formalizing "informal data" (e.g., anecdotes, unrecorded observations)
- Not using valuable data after it has been collected

Small Group Activity
1. With a partner(s) at your table, select either the evaluation questions you developed or 4-5 questions shared during report-out.
2. Brainstorm 1-3 data sources and collection strategies (how and when) you could use to answer each question.
Examples:
- How many teachers attended the PD regularly? Attendance counts collected after each PD session throughout the entire evaluation period.
- How do participants rate the PD sessions? Interviews with teachers conducted once or twice during the evaluation period; feedback forms administered and collected after each session; observations of each session.
Whole group share-out.

STEM Evaluation Data Sources (worksheet)

Q & A
Jeni, Tricia, Alana
Thank You!