Evaluating Grandparent (And Other Relatives) Raising Grandchildren Programs Brookdale Foundation Web Chat, May 2007 Presenter: Kevin Brabazon

Management or Research? Both!
*From a management perspective, evaluation allows improvements in programming or strategic planning
*From a research perspective, it adds to the body of knowledge about what produces changes in participant behavior
*Informs theory and policy

Phases of Management Control

Goals and Objectives
*In strategy formulation, goals and objectives are established
*Goals and objectives lead to strategic planning [programming]
*Some performance measures [outcomes] can be derived directly from objectives
*Performance measures are important for control of operations [management view]

Example of Goal and Objectives – Grandparent Raising Grandchildren Program
Goal: Improve the quality of life for kinship families
Objectives:
*Improve the coping skills of grandparent caregivers
*Reduce depression in caregivers
*Improve the educational performance of children
*Reduce anxiety in children
*Improve the stability of placements
*Improve the health status of caregivers and children
*Improve parenting skills of caregivers
*Improve intergenerational communication and understanding

Most Intergenerational Programs Have Two Types of Goals
*Intergenerational: build communication or understanding between generations; reduce age-related stereotypes; etc.
*Social/human service: improve literacy of children; prevent drug abuse; reduce depression for isolated elders; etc.

Types of Performance Measures
*Process Measures: measures of “outputs” or “inputs” [how many people involved, their demographic characteristics, how many services they receive, etc.]
*Outcome [Results] Measures: what did the program accomplish? [improved reading skills, better attendance at school, reduced depression, improved intergenerational communication, etc.]

Measurement!
Process Measures: Count, count and count!
*How many clients enter the program
*How many drop out
*How many of each type of service each client receives
*How long each service intervention lasts
*How many clients each case worker handles
*etc.
Great for efficiency, not for effectiveness
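As a rough illustration, counts like these fall straight out of a simple service log. The record fields and values below are hypothetical, not taken from any particular case-management system:

```python
from collections import Counter, defaultdict

# Hypothetical service log: one record per service contact.
service_log = [
    {"client": "A-101", "service": "support group", "worker": "JM"},
    {"client": "A-101", "service": "respite care",  "worker": "JM"},
    {"client": "B-202", "service": "support group", "worker": "RS"},
]

enrolled = {rec["client"] for rec in service_log}                # clients entering
services_per_client = Counter(rec["client"] for rec in service_log)

caseload = defaultdict(set)                                      # clients per worker
for rec in service_log:
    caseload[rec["worker"]].add(rec["client"])

print("Clients enrolled:", len(enrolled))
print("Services per client:", dict(services_per_client))
print("Caseload per worker:", {w: len(c) for w, c in caseload.items()})
```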

More Measurement!
Outcome Measures: Only count the results!
*Test reading skills before the program begins, and after it has operated for a while
*Collect attendance data before the program begins and after it has been in place for a while
*Use a depression scale as part of the intake procedure, and test clients again when the program has had time to be effective
*Use a survey instrument that codes answers based on a numerical scale
Great for effectiveness, not for efficiency
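A minimal sketch of the pre/post comparison, assuming hypothetical scores from a depression scale given at intake and again after the program has had time to be effective. A paired t-test (here from scipy) is one common way to check whether the average change is more than chance:

```python
from statistics import mean
from scipy.stats import ttest_rel  # paired-samples t-test

# Hypothetical pre/post depression scores (lower = less depressed),
# one pair per caregiver, matched by intake ID.
pre  = [14, 18, 11, 20, 16, 13, 17, 15]
post = [10, 15, 11, 14, 12, 12, 13, 11]

change = [b - a for a, b in zip(pre, post)]
t_stat, p_value = ttest_rel(pre, post)

print(f"Mean change: {mean(change):+.1f} points")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```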

Outcome Measures continued…
*Pre- and post-tests are helpful, since they let you compare where a client is now with where the client started
*Be sensible about tests and build them into the operation of the program if possible
*A needs assessment or client intake form can be used as a pre-test
*Regular client surveys [a sample will do] - for example, every year - allow a greater amount of data to be collected over time
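On “a sample will do”: a small sketch of drawing a simple random sample of clients for the annual survey. The roster size and sample size here are made up for illustration:

```python
import random

# Hypothetical roster of active client IDs.
roster = [f"C-{n:03d}" for n in range(1, 121)]   # 120 active clients

random.seed(2007)                      # fixed seed so the draw is reproducible
sample = random.sample(roster, k=30)   # a 25% simple random sample

print(f"Survey {len(sample)} of {len(roster)} clients this year:")
print(sorted(sample))
```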

Then Forget the Measurement!
*Do one or two case studies that illustrate the changes that take place through the program. This may involve interviewing the client, the client’s case worker, the children’s teachers, or other interested parties.
*Or observe client behavior [ethnographic study]
*Paint a complete picture, and add photos if the client approves
*If you have more substantial resources, consider a qualitative evaluation
Great for adding meaning to your story

Different Types of Evaluation
*Outcome Evaluation – did the program accomplish its objectives, i.e., did it work, is it “effective”?
*Process Evaluation – how did the program accomplish its objectives?
*Implementation Analysis – was the program implemented the way it was designed?
*A qualitative study can dig more deeply into the program and what is really happening

Plan your Evaluation from the Beginning
*Establish performance measures when you plan the program
*Build data collection into the operation of the program
*Try to ensure that the data is useful to the operational manager [otherwise it may not be collected]

And the Final Questions…
*What if my program has already started?
*What is a proxy pre-test?
*Do I need a comparison [or control] group?
*Should I even think about random assignment?
*What is “good enough” for my evaluation?
*Are social work interns better than public policy students, if I need help?
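On the comparison-group question, one illustrative sketch (all numbers hypothetical): compare the change in the program group with the change in a comparison group. The gap between the two changes, a simple difference-in-differences, is the part of the improvement you can plausibly credit to the program rather than to time alone:

```python
# Hypothetical mean depression scores (lower = better); numbers are illustrative.
program_pre,    program_post    = 16.0, 12.0   # clients receiving services
comparison_pre, comparison_post = 15.5, 14.5   # similar families on a waitlist

program_change    = program_post - program_pre          # -4.0
comparison_change = comparison_post - comparison_pre    # -1.0

# The comparison group estimates what would have happened anyway;
# the difference between the two changes is attributed to the program.
effect = program_change - comparison_change
print(f"Estimated program effect: {effect:+.1f} points on the depression scale")
```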