Evaluation Itself is a Value: Combining Effectiveness Research and Epidemiology in a Naturalistic Realist Evaluation
Paper presented at the American Evaluation Association's 25th Annual Conference, Anaheim, California, November 2-5, 2011
Session Title: What Counts in Social Services Evaluation: Values and Valuing in Evaluation Practice
Sponsored by the Human Services Evaluation TIG, the Social Work TIG, and the Presidential Strand
Mansoor A. F. Kazi, Ph.D., Research Associate Professor, University at Buffalo (SUNY)
Lead Evaluator, Chautauqua Tapestry System of Care, Chautauqua County, NY
Tapestry of Chautauqua County, New York
Mansoor A. F. Kazi, PhD, Director, Program Evaluation Center, School of Social Work, University at Buffalo (mkazi@buffalo.edu)
Based on Kazi, M. A. F. (2003) Realist Evaluation in Practice. London: Sage
Realist Evaluation Partnerships: Lancashire Children's Fund, UK
NASW Code of Ethics (Revised 2008), 5.02 Evaluation and Research
(a) Social workers should monitor and evaluate policies, the implementation of programs, and practice interventions.
(b) Social workers should promote and facilitate evaluation and research to contribute to the development of knowledge.
(c) Social workers should critically examine and keep current with emerging knowledge relevant to social work and fully use evaluation and research evidence in their professional practice.
Evaluation Itself is a Value
It is unethical to provide a social service without evaluating its effectiveness.
Agencies collect enormous amounts of data but do not use it for evaluation.
This data can be de-identified and used in a 100%-sample, naturalistic, and unobtrusive evaluation, repeated at regular intervals in real time.
Integrating evaluation into practice enables practice decisions to be informed by evidence.
Combining Effectiveness Research and Epidemiology in a Naturalistic Realist Evaluation
Examine patterns in the data among demographics, interventions, and outcomes to investigate what works, for whom, and in what contexts.
The anonymity of service users is protected, and at the same time there is greater accountability from the agency.
The value of evaluation is to help develop more effective social services, and to provide evidence of their effectiveness on demand.
This evaluation is done in partnership with stakeholders, e.g. analyzing their own data with them, enhancing the valuing of evaluation in society.
What Interventions Work, and in What Circumstances
A combination of the efficacy research and epidemiology traditions.
Data already collected is typically not used for evaluation.
Investigate interrelationships between outcomes, client demographics, client circumstances, and services provided.
Methods such as binary logistic regression can predict the likelihood of effectiveness of an intervention in given circumstances.
Use findings at regular intervals to better target and develop services.
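The binary logistic regression approach named above can be sketched as follows. This is a minimal illustration, not the study's analysis: the variables (age, sessions, employed parent) and all data are synthetic stand-ins for de-identified agency records.

```python
# Sketch of binary logistic regression predicting the likelihood that an
# intervention is effective in given circumstances. All data is synthetic;
# the column names are hypothetical stand-ins for agency records.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
age = rng.integers(5, 18, n)             # client demographic
sessions = rng.integers(1, 30, n)        # intensity of service provided
employed_parent = rng.integers(0, 2, n)  # client circumstance

# Synthetic outcome: more sessions -> higher chance the goal is achieved
logit = -2.0 + 0.15 * sessions + 0.8 * employed_parent
outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = goal achieved

X = np.column_stack([age, sessions, employed_parent])
model = LogisticRegression(max_iter=1000).fit(X, outcome)

# Predicted likelihood of effectiveness for a given set of circumstances:
# a 12-year-old with 20 sessions and an employed parent
new_case = [[12, 20, 1]]
print(model.predict_proba(new_case)[0, 1])
```

Fitted at regular intervals on fresh agency data, a model like this supports the "what works, for whom, in what contexts" question by quantifying how each circumstance shifts the predicted probability of a successful outcome.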
Chautauqua County Department of Social Services (with Margaret Coombes, Jon Anderson & Scott Bromberg)
A database was created from data dumps of the existing MIS systems (e.g. Connections).
Child welfare outcomes of 97 children who were discharged from the county Department of Social Services foster care system in 2009.
Of the 97 youth, 62% were male and 71% white, and the largest age group was 14 to 17 years.
Children had a shorter length of stay in foster care if their caseworker was trained in Solution-Focused practice than children whose caseworkers were not trained.
Children whose caseworker was not trained in Solution-Focused practice stayed in care an average of 26.91 months; children served by caseworkers who received Solution-Focused training were in foster care approximately 13.96 months.
This difference was statistically significant (ANOVA).
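The ANOVA comparison described above can be run on de-identified records roughly as follows. The numbers here are synthetic, generated to resemble the two group means; they are not the study's actual data.

```python
# One-way ANOVA comparing length of stay (months) between children whose
# caseworkers were or were not trained in Solution-Focused practice.
# The samples below are synthetic illustrations, not the study's records.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
trained = rng.normal(14, 6, 40)    # caseworker trained in Solution-Focused
untrained = rng.normal(27, 8, 57)  # caseworker not trained

f_stat, p_value = f_oneway(trained, untrained)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

With two groups, this one-way ANOVA is equivalent to an independent-samples t-test; a small p-value indicates the difference in mean length of stay is unlikely to be due to chance.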
Jamestown Senior High School
Comparison group: behavior counseling.
Behavior specialist: management of emotional issues and anxiety.
An independent-samples t-test indicates those receiving the intervention improved by 5.90 (n = 40), compared with 2.08 for the rest (n = 1009).
The improvement was statistically significant: t(41.89) = -3.068, p = .004.
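A test of this shape can be sketched as below. The fractional degrees of freedom reported on the slide, t(41.89), suggest Welch's unequal-variance form of the t-test; the scores here are synthetic, with standard deviations assumed for illustration only.

```python
# Independent-samples t-test (Welch's form, equal_var=False) comparing
# improvement scores. The samples are synthetic, shaped like the slide's
# group sizes and means; the standard deviations are assumed.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
intervention = rng.normal(5.90, 7.0, 40)   # n = 40 received the counseling
comparison = rng.normal(2.08, 6.0, 1009)   # n = 1009, the rest of the school

t_stat, p_value = ttest_ind(comparison, intervention, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

Welch's test does not assume the two groups share a variance, which matters here given the very unequal group sizes (40 vs. 1009).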
Women's Christian Association Hospital, Jamestown: Chemical Dependency Program
Outpatient data for all 355 (100%) clients for the period June 2006-2009.
The goal of stopping drug use was achieved by 138 (38.9%), partially achieved by 82 (23.1%), and not achieved by 135 (38%).
Those who were employed were twice as likely to achieve the goal as those who were not employed.
Every additional treatment session attended was associated with an increased likelihood of achieving the goal of stopping drug use.
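A "twice as likely" finding of this kind is commonly reported as an odds ratio from a 2x2 cross-tabulation of employment status against goal achievement. The counts below are synthetic, chosen only to produce an odds ratio of 2; they are not the program's actual cross-tabulation.

```python
# Sketch of an odds ratio from a 2x2 table: employment status vs. whether
# the goal of stopping drug use was achieved. Counts are synthetic.
achieved_employed, not_achieved_employed = 60, 40
achieved_unemployed, not_achieved_unemployed = 78, 104

odds_employed = achieved_employed / not_achieved_employed        # 1.50
odds_unemployed = achieved_unemployed / not_achieved_unemployed  # 0.75
odds_ratio = odds_employed / odds_unemployed
print(f"odds ratio = {odds_ratio:.2f}")
```

An odds ratio of 2 means the odds of achieving the goal are twice as high for employed clients; in a binary logistic regression, the same quantity appears as the exponentiated coefficient on the employment variable.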
Conclusion
Evaluators should establish partnerships with social work/human service agencies.
Naturalistic: no interference with practice.
Culturally competent, user-driven services: what is the evidence?
Data generated by practice is analyzed with the agency: what works, and for whom.
Findings at regular intervals help develop and better target services.
Evaluation itself becomes a value, helping to provide effective services that meet the needs of service users.
Repeatedly investigate changes in outcomes, interventions, and contexts.
Living Proof. WE MAKE A DIFFERENCE IN PEOPLE’S LIVES.