Program Evaluation
Debra Spielmaker, PhD
Utah State University, School of Applied Sciences, Technology & Education - Graduate Program Advisor
USDA-NIFA, Agriculture in the Classroom - Project Director

The Southern Region wants to know: “How do I Prove the Value of My Program to Teachers and Industry Partners?” We are already keeping track of the numbers of teachers and students reached, but we would like to know the following:
• Do teachers continue to use AITC materials after they leave our workshops?
• How are teachers using AITC materials in their classrooms?
• What concepts are teachers covering with AITC materials received at workshops or ordered online from our websites?

Using program evaluation to answer questions
• What is the purpose of a program evaluation?
  – To gain insights or to determine necessary inputs
  – To find areas in need of improvement and to change practices
  – To assess program effectiveness
  – To conduct a cost analysis
  – To determine sustainability
• What is a program? How is it different from a project?
• Who cares about the data?
• What difference will it make?

Who are the stakeholders and who are the boundary partners?
• Stakeholders: People who have a stake or a vested interest in the program, policy, or product being evaluated, and who also have a stake in the evaluation (Mertens & Wilson, 2012, p. 562).
• Boundary partners: The individuals, groups, or organizations with whom the program works directly (Buskens & Early, 2008, p. 174).

What is the role of stakeholders in program evaluation?
• Stakeholders help to build a program theory, that is, an account of how the program should work.
  1. Stakeholders identify the elements they believe are necessary to achieve their desired results.
  2. They help to build a theory-based model (logic model, log frame, or program theory model) with specific outcomes.
  3. They help with outcome mapping:
     – articulating the program’s vision (and identifying the target population)
     – identifying boundary partners
     – identifying available inputs
     – prioritizing outcomes
     – outlining how program initiatives will be evaluated

Logic Models
• A program theory about how outputs (accomplishments) or interventions work to achieve outcomes (impacts).
• Elements of a logic model (a minimal data-structure sketch follows below):
  – Situation
  – Audience
  – Inputs (resources)
  – Outputs (activities or delivery)
  – Outcomes (short and long term)
  – Impact
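To make the relationships between these elements concrete, the sketch below represents a logic model as a small Python data structure. The class and field names, and the Agriculture in the Classroom (AITC) example content, are assumptions for illustration only; they are not taken from the Kellogg Foundation materials.

```python
# A minimal logic-model sketch, assuming the element names listed on this slide.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    situation: str                                       # the need the program addresses
    audience: str                                        # who the program serves
    inputs: List[str] = field(default_factory=list)      # resources invested
    outputs: List[str] = field(default_factory=list)     # activities and deliverables
    short_term_outcomes: List[str] = field(default_factory=list)
    long_term_outcomes: List[str] = field(default_factory=list)
    impact: str = ""                                     # broader change the program hopes to produce

# Hypothetical AITC workshop example:
aitc_model = LogicModel(
    situation="Teachers lack ready-to-use agricultural literacy resources",
    audience="K-12 teachers in the Southern Region",
    inputs=["AITC curriculum materials", "workshop facilitators", "funding"],
    outputs=["Summer workshops delivered", "lesson plans distributed"],
    short_term_outcomes=["Teachers gain knowledge of and confidence with AITC materials"],
    long_term_outcomes=["Teachers integrate AITC lessons into their regular curriculum"],
    impact="Students develop agricultural literacy",
)
```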

W. K. Kellogg Foundation (2004)

W. K. Kellogg Foundation (2004)*

W. K. Kellogg Foundation (2004) Evaluating with a Logic Model

So, with this knowledge, let’s evaluate a few logic models (examples from the W. K. Kellogg Foundation, 2004).

Methods - Evaluation Designs
• Pre-post designs
• Experimental designs (random assignment)
• Quasi-experimental designs (groups selected on criteria)
• Ex post facto designs (typically summative evaluations)
Strengths and Weaknesses* (a small random-assignment sketch follows below)
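As a quick illustration of what separates the experimental option from the quasi-experimental one, the sketch below randomly assigns a hypothetical list of workshop teachers to treatment and comparison groups. The teacher labels and group sizes are assumptions for illustration.

```python
# A minimal random-assignment sketch for an experimental pre-post design.
# In a quasi-experimental design, the comparison group would instead be
# selected on criteria (e.g., matched schools) rather than at random.
import random

teachers = ["Teacher A", "Teacher B", "Teacher C",
            "Teacher D", "Teacher E", "Teacher F"]

random.seed(42)            # fixed seed so the assignment is reproducible
random.shuffle(teachers)

midpoint = len(teachers) // 2
treatment_group = teachers[:midpoint]    # receive the AITC workshop now
comparison_group = teachers[midpoint:]   # receive it later (or not at all)

print("Treatment:", treatment_group)
print("Comparison:", comparison_group)
```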

Data Collection
• Who is the target population?
• What outcomes will be measured, and how will they be measured (knowledge, behaviors, attitudes, skills)?
• What types of measures?
  – Perceptions: self-reported survey assessments, concept maps
  – Content knowledge: tests
  – Inventories
• Designing an instrument (see wiki; a small instrument sketch follows after this list)
• Logistical requirements
  – Time
  – Money
  – Expertise
  – Access
• Data collection tools (see wiki)
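One way to keep instrument design and logistics in one place is to store the items, their scales, and the logistical notes in a small structure before the survey is built. Everything below, including item wording and the logistics values, is a hypothetical sketch rather than any instrument referenced on the wiki.

```python
# A minimal instrument sketch: perception items on a 5-point Likert scale and
# a multiple-choice content-knowledge item. All wording is hypothetical.
LIKERT_5 = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

instrument = [
    {"id": "P1", "type": "perception", "scale": LIKERT_5,
     "text": "I feel confident teaching agricultural concepts."},
    {"id": "P2", "type": "perception", "scale": LIKERT_5,
     "text": "AITC materials fit the standards I am required to teach."},
    {"id": "K1", "type": "knowledge",
     "text": "Which of the following is a renewable agricultural resource?",
     "choices": ["Topsoil", "Sunlight", "Phosphate rock", "Natural gas"],
     "answer": "Sunlight"},
]

# Logistical requirements captured alongside the instrument:
logistics = {
    "time": "about 10 minutes per respondent",
    "money": "delivered online; no printing or postage",
    "expertise": "items reviewed by an evaluator before use",
    "access": "email list collected at workshop registration",
}
```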

Analysis
• Conducting the analysis
  – Stats primer on means, SDs, statistical significance, and effect sizes (see wiki; a worked sketch follows below)
• How will you use the results?
• What are your performance targets?
• Reporting
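To make the stats primer concrete, the sketch below computes means, standard deviations, a paired t statistic, and a Cohen's d on made-up pre/post scores using only the Python standard library. The scores are hypothetical, and turning the t statistic into a p-value would still require a t-table or a package such as SciPy.

```python
# A minimal pre/post analysis sketch: means, SDs, paired t, and Cohen's d.
# The scores below are hypothetical.
from math import sqrt
from statistics import mean, stdev

pre  = [62, 55, 70, 48, 66, 59, 73, 61]   # hypothetical pre-test scores
post = [74, 60, 78, 55, 71, 68, 80, 69]   # hypothetical post-test scores

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

print(f"Pre:  mean = {mean(pre):.1f}, SD = {stdev(pre):.1f}")
print(f"Post: mean = {mean(post):.1f}, SD = {stdev(post):.1f}")

# Paired t statistic: mean difference divided by its standard error.
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))
print(f"Paired t = {t_stat:.2f} with {n - 1} degrees of freedom")

# One common effect size for paired data: mean difference / SD of the differences.
cohens_d = mean(diffs) / stdev(diffs)
print(f"Cohen's d = {cohens_d:.2f}")
```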

“How do I Prove the Value of My Program to Teachers and Industry Partners?”
• Do teachers continue to use AITC materials after they leave our workshops?
• How are teachers using AITC materials in their classrooms?
• What concepts are teachers covering with AITC materials received at workshops or ordered online from our websites?
Method, Data Collection, Analysis*

Courses offered for credit can assess all three questions as a course requirement.
• The Instructional Hours Reflection requires the following information (ex post facto); a sketch of a matching data record follows below:
  1. Number of classroom instructional hours spent on this lesson
  2. Number of students in the classroom
  3. Strengths of the lesson and/or suggestions for improvement
  4. Additional classroom activities conducted, additional classroom resources used, and teaching strategies or methods used to deliver this lesson
  5. A minimum of three photographs of the activities, student work, or additional electronic files created to support the lesson (e.g., worksheets, PowerPoint presentations, reading guides)
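The five reflection items map naturally onto a flat record that can be exported to a spreadsheet for analysis. The field names and sample values below are assumptions chosen to mirror the slide; they are not the actual course form.

```python
# A minimal sketch of one Instructional Hours Reflection record.
# Field names are assumed from the five items on this slide.
import csv

FIELDS = [
    "teacher_id", "lesson_title",
    "instructional_hours",                          # item 1
    "students_in_classroom",                        # item 2
    "strengths_or_improvements",                    # item 3
    "additional_activities_resources_strategies",   # item 4
    "evidence_files",                               # item 5: photos or other files
]

sample_record = {
    "teacher_id": "T-0042",
    "lesson_title": "Where Does Your Breakfast Come From?",
    "instructional_hours": 2.5,
    "students_in_classroom": 27,
    "strengths_or_improvements": "Strong hands-on activity; needs more discussion time.",
    "additional_activities_resources_strategies": "Farm field trip; think-pair-share",
    "evidence_files": "photo1.jpg; photo2.jpg; photo3.jpg",
}

with open("reflections.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow(sample_record)
```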

A final Strategy Report, at the end of a credit course or at the end of a professional development workshop:
• Outline your strategy for implementing the Agriculture in the Classroom concepts, lesson plans, and activities in your classroom in the future. Your response should include specifics about the lessons, activities, teaching and instructional strategies, and other integration tactics you plan to use in your curriculum during the next year.

Epilogue
• Is there a way to develop uniform questions, and a uniform electronic method, for following up with teachers involved in our programs? (One possible structure is sketched below.)
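One hedged answer is to maintain a single, shared bank of follow-up items keyed to the evaluation questions, so every state program asks the same things electronically. The structure and item wording below are only a sketch of that idea, not an existing instrument.

```python
# A minimal sketch of a uniform follow-up question bank keyed to the three
# Southern Region evaluation questions. Item wording is hypothetical.
FOLLOW_UP_BANK = {
    "continued_use": {
        "evaluation_question": "Do teachers continue to use AITC materials after the workshop?",
        "item": "How many AITC lessons did you teach in the past school year?",
        "response_type": "number",
    },
    "classroom_use": {
        "evaluation_question": "How are teachers using AITC materials in their classrooms?",
        "item": "Briefly describe how you adapted an AITC lesson for your students.",
        "response_type": "open text",
    },
    "concepts_covered": {
        "evaluation_question": "What concepts are teachers covering with AITC materials?",
        "item": "Select all agricultural literacy concepts you addressed this year.",
        "response_type": "checklist",
    },
}
```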