Technology is everywhere…and that is part of the problem!
Outcomes, Tools, Technology
"Thriving Through Technology" Lunch Series, Tuesday, Nov. 15, 2011
Thank you: Mike Behney, ISRA, Penn State Hbg; Jolynn Haney, Deerfield Consulting LLC
All material property of ConnectSynergy

Outcomes Tools Technology: objectives
 Discuss three key questions to ask to begin measuring your program's success
 Apply a measurement plan model: "play with" the model's three critical parts (an activity); consider the AH-Ha Project LCCF (an example)
 Review some "Measuring Success" TECH TOOLS: websites for evidence-based benchmarking

"So What?" Do we have a systematic way to assess the extent to which a program has achieved its intended results?

But how do you measure? Typically we…
 Focus on what we do (activities)
 Count how many we do (outputs)
But these don't tell us "So What?" They don't describe success and progress for those we serve.

Measuring Success Consortium of South Central PA
A partnership of human and health services, government agencies, and academia that collaborates to measure and report program impacts for clients, stakeholders, and communities. You are invited to join the CONSORTIUM (see Skip). The AH-Ha Project is working with the Measuring Success CONSORTIUM…let's see it!

AH-Ha Seed Fund: the Lancaster County Community Foundation's BIG ISSUES for its programs
 What positive difference did your project make in the lives of people living in Lancaster County?
 How did this project develop leadership skills/qualities in the lives of youth/teens in Lancaster County?
 Does this project have the potential of acting as a catalyst for broader change in the future? If so, how? If not, why not?

AH-Ha Projects LCCF: Outcomes & Indicators
Outcome: Participants increase their ability to interact positively with adults.
Indicator: 85% (21/25) of youth increase the number of weekly, positive interactions they have with their mentors from the beginning to the end of the program.
Outcome: Participants improve their self-esteem.
Indicator: 65% (16/25) of youth report an increase in self-esteem as indicated by the Rosenberg Self-Esteem Scale. (Based on "benchmarked" data?)
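Indicators like these reduce to a simple pre/post comparison: count the participants whose post-program measure exceeds their baseline measure, then divide by the cohort size. A minimal sketch in Python, using made-up interaction counts; the cohort, scores, and function name are illustrative, not the project's real data:

```python
# Hypothetical (pre, post) counts of weekly positive mentor interactions
# for a small illustrative cohort; NOT real AH-Ha Project data.
pre_post = [(1, 4), (2, 2), (0, 3), (3, 5), (2, 1)]

def pct_increased(pairs):
    """Percent of participants whose post score exceeds their pre score."""
    increased = sum(1 for pre, post in pairs if post > pre)
    return 100 * increased / len(pairs)

print(f"{pct_increased(pre_post):.0f}% of youth increased")  # 3 of 5 -> 60%
```

The same shape works for the self-esteem indicator: swap interaction counts for pre/post Rosenberg scores.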

RECOMMENDED: a 3-step process to develop a MEASUREMENT PLAN…
1. Define SUCCESS/CHANGE/PROGRESS for those you serve (objectives/outcomes).
2. Establish ORGANIZED ACTIONS for the services you provide (activities with/for clients).
3. Describe HOW YOU'LL MEASURE your success (metrics/indicators of clients' success).

MEASUREMENT Plan (outcomes or operations plan)
List a "big hairy audacious" purpose for one program (goal or mission).
Success/Progress: Objective/Outcome #1, Objective/Outcome #2, …
Activities: Services to achieve success/progress. Align activities to an objective…but each activity does not require its own objective.
Measure Success: Determine indicators of progress for those served. Align to each objective…two indicators per objective. Then: collect, analyze, display.
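The plan's three columns map naturally onto a small data structure: each outcome carries its aligned activities and (ideally two) indicators. A sketch under the slide's structure only; every name and value below is illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    description: str
    target_pct: float  # e.g. 65.0 means "65% of participants..."

@dataclass
class Outcome:
    statement: str                                  # success/progress column
    activities: list = field(default_factory=list)  # activities column
    indicators: list = field(default_factory=list)  # measure-success column

# Illustrative outcome modeled on the AH-Ha self-esteem example.
outcome = Outcome(
    statement="Participants improve their self-esteem",
    activities=["Weekly mentoring sessions"],
    indicators=[Indicator("Rosenberg score increases pre-to-post", 65.0)],
)
```

Keeping activities and indicators attached to an outcome makes the alignment rule explicit: services and measures hang off objectives, not the other way around.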

The Camping Metaphor: WHY go? WHAT do you do when camping? HOW do you measure SUCCESS?

Program Objectives-Activities-Measures FORM (simplified outcome logic)
Agency: Program: Strategic Goal(s):
Outcomes: Success, progress, and changes for participants (during this program year).
Activities: Things you do; services for/with program participants.
Indicators/Measures: How will you know you are achieving your outcomes for clients?

A place to start from: your beginning point for measuring an objective. Example: 58% actual, based on collected data…not "85%" based on former guesstimates. (Counseling Service, Montgomery County)
Best thing going: identify what & how IT is done (3 questions).
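A baseline built from collected records is just the observed rate in the pre-program data. A minimal sketch under assumed data; the True/False records below are fabricated for illustration (7 of 12 happens to give roughly the 58% figure cited):

```python
# Hypothetical baseline records: did each client meet the outcome
# criterion before the program began? (Fabricated values.)
met_criterion = [True, False, True, True, False, False,
                 True, False, True, False, True, True]

def baseline_pct(records):
    """Observed rate in collected data -- the point you measure from."""
    return 100 * sum(records) / len(records)

print(f"Baseline: {baseline_pct(met_criterion):.0f}%")  # ~58%
```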

Evidence-Based Practices: The integration of best research evidence with clinical expertise and consumer values, or those clinical or administrative interventions or practices for which there is consistent scientific evidence showing that they improve consumer outcomes.
Best Practices: The best clinical or administrative practice or approach at the moment, given the situation, the consumer's or family's needs and desires, the evidence about what works for this situation/need/desire, and the resources available.

Websites: TOOLS
 National Registry of Evidence-Based Programs and Practices
 The Outcomes and Effective Practices Portal
 Child Trends website
 Program Development and Evaluation, University of Wisconsin - Extension
 "What Works Clearinghouse" (Youth Programs), US Department of Education, Institute of Education Sciences website
Example: working with teens, measure progress using a "Self-Esteem Scale"

Let's MEASURE…how'd we do today? Scale 1-5, with 5 = "Did Great" & 1 = "Did Poorly". HOW USEFUL were the following…?
 Discussion: 3 key questions to ask to begin measuring your program's success
 Application: measurement plan model; "played with" the model's three critical parts; considered an example, the AH-Ha Project LCCF
 Review: "Measuring Success TECH TOOLS" websites for evidence-based programming
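The closing 1-5 ratings can be summarized as a mean score per topic: collect, analyze, display, applied to the session itself. A short sketch with fabricated scores; the numbers (and the topic labels, which echo the agenda) are illustrative only:

```python
# Fabricated 1-5 audience ratings for each session topic.
ratings = {
    "Discussion: 3 key questions": [5, 4, 4, 3],
    "Application: measurement plan model": [4, 4, 5, 5],
    "Review: tech tools websites": [3, 4, 4, 4],
}

def mean_score(scores):
    """Average rating on the 1-5 usefulness scale."""
    return sum(scores) / len(scores)

for topic, scores in ratings.items():
    print(f"{topic}: {mean_score(scores):.2f} / 5")
```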