Measuring Success & Impact: Challenges & Opportunities for STEM Early Learning Programs
Tiffany R. Lee, University of Colorado Boulder / University of Washington


Introductions
- Name, organization, role
- Short project description
- Experience with measuring success and impact (evaluation), for this project or others
(Lee, 2014)

Words (Wordle?)
Design, Implement, Explore, Evaluation, Assessment, Goals, Measure, Impact, Success, Refine, Test, Scale, Learning, Reach, Data, Change, Outcomes, Feedback. Others?
Others suggested: Evidence, Scope, Analysis, Time

What outcomes would indicate to you that your program is achieving success?

First, let's think about these questions.
- What resources go into your program? (e.g., funding, staff, partners)
- What activities does your program develop, support, and implement?
- What is produced from the activities and the program's efforts?

What outcomes would indicate to you that your program is achieving success? What are the changes or performances that result from the program? Be specific.
- Development of early childhood STEM lessons/activities?
- Increased effectiveness of program facilitators?
- Child learning of particular content or skills?
- Change in parents'/caregivers' behavior?
- Increased community reach?

Program Logic Model
- Inputs: What resources go into your program? (e.g., funding, staff, partners)
- Activities: What activities does your program develop, support, and implement?
- Outputs: What is produced from the activities and the program's efforts?
- Outcomes/Goals: What are the changes or performances that result from the program?
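The four logic-model components above can be sketched as a plain data structure. This is a minimal illustration only; the example entries (beyond the resources named on the slides) are hypothetical placeholders.

```python
# A minimal sketch of a program logic model as a plain data structure.
# Example entries are hypothetical, not drawn from any real program.
logic_model = {
    "inputs": ["funding", "staff", "partners"],
    "activities": ["weekly STEM activity sessions", "facilitator training"],
    "outputs": ["sessions delivered", "families reached"],
    "outcomes_goals": ["increased facilitator effectiveness",
                       "child learning of targeted skills"],
}

def summarize(model):
    """Return one line per logic-model component, counting its entries."""
    return [f"{component}: {len(items)} item(s)"
            for component, items in model.items()]

for line in summarize(logic_model):
    print(line)
```

Writing the model down in this form makes it easy to check that every outcome can be traced back to a supporting activity and input before any measurement begins.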

Some Considerations for Measuring Success…

Some Questions to Consider
About your participants:
- Does your program have repeat vs. one-time interactions over time with participants (children and/or caregivers)?
- Do you have a way to contact and follow up with your program participants?
About your program:
- Do you have the same facilitators leading lessons/activities throughout the duration of your program?
- Does your program implement a lesson/activity once, or do you refine it over multiple iterations?
Your answers to these questions make a difference in what (and how) you can measure success!

Questions About Your Participants
Repeat vs. one-time interactions over time with children and/or caregivers:
- Can you measure changes in child/caregiver knowledge and skills?
- Alternatives: changes in group behavior, facilitation, work products (e.g., child artwork)

Questions About Your Participants
Participant contact information:
- Can you collect information about what the child/caregiver does that is related to a program experience?
- Alternatives: exit tickets, repeat interactions

Questions About Your Program
Same facilitators leading lessons/activities:
- Can you measure changes in an individual's facilitation skills?
- Alternatives: differences between facilitators

Questions About Your Program
One-time vs. multiple iterations of a lesson/activity:
- What is the rationale for making changes to a lesson/activity? Do the changes result in changes in participant engagement (as a group)?
- Alternatives: design principles

Methods for Measuring Success
Methods:
- Descriptive statistics
- Time sampling
- Event sampling
- Field notes
- Exit tickets
- Work samples
- Photographs
- Video/audio recordings
- Conversations with caregivers
- Team reflections
Considerations:
- Time
- Timing/frequency
- Staffing
- Permissions
- Materials
- Subjectivity/objectivity
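As a concrete illustration of the first method listed (descriptive statistics), a program team might summarize exit-ticket ratings like this. The ratings below are invented for the sketch; they do not come from any program data.

```python
from statistics import mean, median

# Hypothetical exit-ticket ratings (1-5 scale) from one program session.
ratings = [4, 5, 3, 4, 5, 2, 4]

summary = {
    "n": len(ratings),                      # number of responses
    "mean": round(mean(ratings), 2),        # average rating
    "median": median(ratings),              # middle rating
    "range": (min(ratings), max(ratings)),  # lowest and highest rating
}
print(summary)  # -> {'n': 7, 'mean': 3.86, 'median': 4, 'range': (2, 5)}
```

Even a summary this simple surfaces the considerations on the slide: timing (when tickets are collected), staffing (who tallies them), and subjectivity (self-reported ratings vs. observed behavior).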

Refining Outcomes and Identifying Measures of Success

Identifying Measures of Success
Review outcomes/goals:
- Can you measure these given the constraints of your program? (Think about your access to participants and program design.)
- What does it look like when your program goals are met?
Brainstorm possible measures:
- What resources will be needed? (e.g., an additional staff member on site, developing protocols)
- What are some challenges and considerations?

Measuring Success
- Indicators: What does it look like when your program goals are met?
- Methods/Measures: What methods/measures will be used to measure your outcomes/goals?
- Inputs: What resources will be needed to implement your methods/measures?
- Challenges/Considerations: What are some anticipated challenges and considerations for implementing these measures?

Exit Ticket
1. What do you think will be the biggest challenge when implementing an evaluation plan?
2. What is most exciting to you about evaluation?
3. What is the most helpful thing you learned today?

Thank you! Tiffany Lee