Slide 1: Creating an evaluation framework to optimise practice returns: Process & progress within a large community sector organisation
Australasian Evaluation Society Conference 2011
Karen Bevan & Sally Cowling
UnitingCare Children, Young People and Families
Slide 2: Welcome to the complexity of UCCYPF
- A large organisation bringing four agencies together: Burnside, Unifam, Disability, and Children's Services
- The Social Justice Unit (SJU) is responsible for our research, policy and advocacy agendas
- The SJU shares responsibility for program development
Slide 3: One Strategic Plan
- All agencies are guided by a single strategic plan and shared purpose: working together to transform lives
- Organisational commitment to Results Based Accountability (RBA)
- Determining 'how much, how well and did we make a difference' required two foundation stones, developed internally by the SJU
Slide 4: MDS and Outcomes Framework
Our foundation stones:
- A UCCYPF Minimum Data Set (MDS), collected for all service users since July 2010
- The UCCYPF Outcomes Framework: Tier 1 outcomes have been collected for all service users aged 15+ since July 2010
Slide 5: Principles: MDS
- Data enables population comparisons and determination of relative need
- Relevant to our work
- Utilises data already collected (e.g. FRSP Online, HADS and SMART)
- Reducing data duplication is critical to selling the merits of the new system to programs
Slide 6: Principles: Outcomes Framework
- Few measures, collected well
- Use common tools for data collection
- Measures should be evidence informed and should enable population comparisons where possible
- Common measures for all service provision programs across the organisation, with specific measures developed for each program cluster
Slide 7: Outcomes: Measuring the Difference
Two tiers of measures:
- Tier 1 or 'Headline' measures apply to all service delivery programs in the Service Group: Social Support and Self Efficacy (common outcome domains for all programs)
- Tier 2 or 'Program Cluster' measures are under development and specific to the work of clusters of like programs; measures can be developed for specialist sub-groups
Slide 8: Working together to transform lives
Headline measures (all operational programs), two measures drawn from the HILDA (Household, Income and Labour Dynamics in Australia) survey:
- Social Support (HILDA)
- Self Efficacy (HILDA)
Program cluster measures (all operational programs are allocated to a program cluster), two types:
- One general measure
- One relating to the specific area of each subgroup
Program clusters: Family & Community Engagement; Family Dispute Resolution; Intensive Family Support; Out of Home Care; Youth Services; Family Relationships (FSP); Disability Services
Slide 9: Informing Evaluation
The next challenge was to consider how the Framework could inform evaluation and guide decisions on:
- Which UCCYPF programs to evaluate, how and why?
- Which programs are 'evaluation ready'?
Slide 10: Shared understanding of program logic
An online 'Program Development Survey' was completed for 65 Burnside programs to derive a basic program logic. It documented practitioner understanding of:
- What we do (activities) and for what ends (outcomes)
- Theory of change
- Evidence base
- Adaptations
- Data collected and assessment tools
Slide 11: Shared understanding of program logic (continued)
- A critical exercise in generating practice-informed understandings of program models
- High response rate, and acknowledgment of gaps in understanding of program theory and/or of the evidence which has guided choices about activities and ways of working
- The research team is now meeting with programs to 'close gaps' by sharing knowledge, which creates space for guided reflection
Slide 12: Constructing an evaluation platform
- The survey has informed our understanding of the assessment data and information collected (and not collected) at program level
- Understanding of program objectives is guiding the development of Tier 2 outcome measures
- We aim to develop outcome measures for children and accessible outcome measures for people with disability
Slide 13: Constructing an evaluation platform (continued)
- Understanding of theory and outcomes informs evaluation design and decisions
- Research resources can be directed to programs which are 'evaluation ready', or used to implement the measures and tools which can support meaningful evaluation at a later stage
Slide 14: Next steps
- Evaluation possibilities are emerging from investment in a Client Data Information System (CDIS)
- The CDIS will provide rich data on the type, duration and intensity of services received by clients
- Investment in quantitative research skills will be required to fully utilise data drawn from the MDS, outcomes measures and service profiles (a sketch of this kind of analysis follows this slide)
- Organisational roles and responsibilities for program development need to be defined
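As an illustration of the kind of quantitative analysis slide 14 anticipates, here is a minimal pandas sketch linking pre/post Tier 1 scores to CDIS service intensity. Nothing in the deck prescribes this approach: both tables, every column name (client_id, contact_hours, program_cluster) and all values are hypothetical stand-ins, since the MDS and CDIS schemas are not described.

```python
# Hypothetical sketch only: the deck does not describe the CDIS or MDS
# schemas, so every table, column name and value below is an invented
# stand-in for linking outcome change scores to service intensity.
import pandas as pd

# Hypothetical Tier 1 outcome records: one pre- and one post-service score
# per client for the two headline domains (Social Support, Self Efficacy).
outcomes = pd.DataFrame({
    "client_id":      [1, 1, 2, 2, 3, 3],
    "wave":           ["pre", "post"] * 3,
    "social_support": [2.1, 3.4, 1.8, 2.6, 3.0, 3.1],
    "self_efficacy":  [2.5, 3.0, 2.0, 2.9, 2.8, 3.3],
})

# Hypothetical CDIS service profile: cluster and intensity of service received.
services = pd.DataFrame({
    "client_id":       [1, 2, 3],
    "program_cluster": ["Youth Services", "Intensive Family Support",
                        "Youth Services"],
    "contact_hours":   [12.0, 40.0, 8.5],
})

# Reshape to one row per client with pre/post columns, then compute change scores.
wide = outcomes.pivot(index="client_id", columns="wave",
                      values=["social_support", "self_efficacy"])
change = pd.DataFrame({
    "social_support_change":
        wide[("social_support", "post")] - wide[("social_support", "pre")],
    "self_efficacy_change":
        wide[("self_efficacy", "post")] - wide[("self_efficacy", "pre")],
}).reset_index()

# Link outcome change to the service profile and summarise by program cluster.
linked = change.merge(services, on="client_id")
print(linked.groupby("program_cluster")[
    ["social_support_change", "self_efficacy_change", "contact_hours"]].mean())
```

In practice the pre/post scores and service profiles would come from MDS, outcomes and CDIS extracts rather than inline data, and any claim that services 'made a difference' would need a proper evaluation design rather than this kind of raw change-score summary.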
Slide 15: Conclusion
- It is important to acknowledge that implementing a Minimum Data Set and Outcomes Framework across a large and diverse organisation is an experiment
- It is a considered and sophisticated attempt to 'do' Results Based Accountability
Slide 16: Conclusion (continued)
We are still bedding down the process and measures. In 12 months' time it will be important to assess the framework's contribution to:
- Measuring whether we have 'made a difference'
- Providing an evidence base to support tenders and grant applications
- Informing evaluation design
- Efficient allocation of research resources
Slide 17: Questions or comments?
Thanks for listening!
Contacts:
- Karen Bevan, T: 9407 3221, kbevan@burnside.org.au
- Sally Cowling, T: 9407 3228, scowling@burnside.org.au