Amy Corron, United Way of Greater Houston; Roger Durand, University of Houston and Durand Research and Marketing Associates, LLC**; Julie Johnson, Communities in Schools


 Amy Corron, United Way of Greater Houston
 Roger Durand, University of Houston and Durand Research and Marketing Associates, LLC**
 Julie Johnson, Communities in Schools
 Kevin Kebede, Alief YMCA
 Jennifer Key, Alief Independent School District
 Joseph Le, formerly of the Joint City/County Commission
 Linda Lykos, YMCA of Greater Houston
 Cheryl McCallum, Children’s Museum of Houston
 Katherine von Haefen, United Way of Greater Houston
** Presenter

 The Houston’s Kids program
 Evaluation methodology
 Goals and evaluation results
 Findings: digging deeper
 Discussion and conclusions – implications for program managers and evaluators
 Helpful resources

 Hurricanes Katrina and Rita – the program’s impetus
 Developing the assets (Search Institute) of at-risk children
 The collaborating partners: the Alief Independent School District; the Children’s Museum of Houston; Communities in Schools; the Joint City-County Commission on Children; the United Way of Greater Houston; the YMCA of Greater Houston; and America’s Promise
 Elements of the program
 The goals of Houston’s Kids ( ) and “success standards”

 Process and outcomes assessments
 Multiple evaluation designs, multiple measurement tools, multiple observations
 Evaluation research subjects – program participants; a matched sample of nonparticipating children and youth; parents of participants; program staff; collaborating partners’ staff; employers of participants; and program “alums”
 A distinctive feature: merging school records, program attendance data, surveys, and qualitative evidence
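The record-merging step above can be sketched in plain Python. The student IDs, field names, and values here are illustrative placeholders, not data or identifiers from the Houston’s Kids evaluation:

```python
# Hedged sketch: joining school records, program attendance, and survey
# responses on a shared student identifier. All names and values are
# hypothetical examples, not the evaluation's actual data.

school_records = {101: {"gpa": 2.8}, 102: {"gpa": 3.4}}
attendance = {101: {"days_attended": 60}, 103: {"days_attended": 12}}
surveys = {101: {"asset_score": 27}, 102: {"asset_score": 31}}

def merge_sources(*sources):
    """Outer-join several {student_id: fields} mappings into one record set."""
    merged = {}
    for source in sources:
        for sid, fields in source.items():
            # Create the student's record on first sight, then layer in
            # whatever fields this source contributes.
            merged.setdefault(sid, {}).update(fields)
    return merged

records = merge_sources(school_records, attendance, surveys)
# Student 101 now carries fields from all three sources.
```

An outer-join style merge keeps students who appear in only some sources, which is what lets an evaluation spot participants who are missing from, say, the school-records match.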

 The Search Institute developmental assets
 Goal 1: Program participants will seek positive social relationships (with adults and peers), will be prepared for success in those relationships, and will have an improved self-image
 Goal 2: Participants will be prepared for success in school
 Goal 3: Participants in the employment program will be prepared for success in the job market
 Evidence showed all goals were achieved

 Program effects are not constant throughout but can be improved with process evaluations and management
 Unless the randomness (including unreliability) of measures is taken into account, a program’s true impact will likely be misidentified
 Improved performance comes from identifying those not reached by a program
 (See accompanying tables)
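One standard way to account for measure unreliability, offered here only as an illustrative sketch (the slides do not specify the presenters’ actual analysis procedures), is Spearman’s correction for attenuation, which estimates how much an observed correlation is dragged down by imperfect reliability:

```python
import math

def disattenuated_correlation(observed_r, rel_x, rel_y):
    """Spearman's correction for attenuation: estimate the true-score
    correlation given the reliabilities of the two measures."""
    return observed_r / math.sqrt(rel_x * rel_y)

# Hypothetical numbers: an observed correlation of 0.40 between two
# measures with reliabilities 0.70 and 0.80 implies a true-score
# correlation of about 0.53.
r_true = disattenuated_correlation(0.40, 0.70, 0.80)
```

The direction of the bias matters for evaluators: ignoring unreliability systematically understates relationships, so a program’s impact can look weaker than it is.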

 The out-of-school-time program worked well for at-risk children and youth
 Effective collaboration was a key
 Improving outcomes 1: variable effects and mid-course corrections
 Avoiding flawed findings: taking into account the randomness (including unreliability) of measures is essential
 Improving outcomes 2: identifying participants who are “resisters” and “backsliders”

 We will be happy to share designs, measurement tools, analysis procedures, and…
 Contact:
◦ Roger Durand, Ph.D.
◦ 3507 E. Plum Street
◦ Pearland, TX
◦