Evaluation workshop CLCQ Conference 9th May 2017

About us
Purpose-led consultancy, founded in 2015 by three professionals with substantial experience and expertise in social, environmental and economic impact measurement. We have worked with private-sector, not-for-profit and government organisations of all sizes on program design, impact measurement, capacity building, and monitoring and evaluation.
“We work with organisations to help measure what matters and convert best intentions into successful outcomes.”

What we’re covering today
- Background and aims of the project
- A quick look at ‘measurement’ theory and practice
- Work and progress to date
- Next steps

Background to the project
Commissioned by the Management Committee of CLCQ, overseen by a Steering Group, and facilitated by The Incus Group.
The project is important to the viability and sustainability of the Queensland CLC sector because:
- Funding and other resources are allocated on evidence of need;
- Increasing pressure on resources requires CLCs to make informed strategic decisions about service delivery; and
- Stakeholder communications and relationships at a CLC and sector level can be better informed.

Background to the project
The aim is to build the capacity of Queensland CLCs to better measure and understand the outcomes of their work in the community, within a shared sector outcomes framework:
- Build an evaluation framework and methodology that can be practically applied in the Queensland CLC context
- Establish and test a self-evaluation toolkit and resources to assist Queensland CLCs to undertake outcome evaluation on an ongoing basis
- Contribute to building a sector culture of measuring and reporting on outcomes and impacts
The main purposes of the framework and toolkit are to:
- Understand and measure client needs and how they are being addressed
- Capture outcomes for clients and other stakeholders more systematically
- Effectively capture and communicate what works to funders and the sector
- Build the measurement and evaluation capacity of CLCs and the sector

Why measure outcomes?
- Organisations are operating within an increasingly competitive and sophisticated philanthropic and corporate investment environment. Funders as well as service users are keen to better understand the value of the services they support or rely on.
- Commonwealth and State governments have embarked on a range of new funding and contracting strategies, centred on achieving better value from public investment in services and focused on ‘impacts’ rather than ‘outputs’: the National Disability Insurance Scheme, Payment by Outcomes, Social Impact Bonds.
- Companies are increasingly moving away from pure philanthropy towards strategic community investment models that better align corporate responsibility and community programs with business priorities: Social Return on Investment, Creating Shared Value.

Why measure outcomes?
Organisations undertake outcomes measurement for several reasons:
- To understand, measure and report the value they are creating through their work
- To use that understanding to improve the quality of the services they deliver to their stakeholders and clients
- To lobby or advocate for change in public policy
- To make the strongest case for continuing and increased support from their funders and partners

Outcomes measurement
It is looking at the bigger impact your organisation has on your clients, the broader community and other stakeholders:
- Significant positive and negative changes (outcomes)
- Short-, medium- and longer-term outcomes
- Direct and indirect (spillover) outcomes
- Target beneficiaries and other stakeholders
- Economic, social and environmental
There is no standardised or one-size-fits-all framework or approach to guide outcomes measurement or reporting, but there are principles.

Principles
Principle 1: Involve your stakeholders. Measure what matters from the perspective of your stakeholders – the people or organisations that have experienced change as a result of your activities.
Principle 2: Measure and understand the theory of change. Apply appropriate methods and resources to understand and then measure the change (positive and negative, intended and unintended).
Principle 3: Be accurate and credible by taking account of the following (a worked sketch follows below):
- What would have happened anyway? (Deadweight)
- How much did other organisations contribute to the change? (Attribution/Contribution)
- Have the issues moved somewhere else? (Displacement)
Two further principles could be added: strike a balance between numbers and stories, and be transparent about both the investment and the value created.
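To make the Principle 3 adjustments concrete, here is a minimal worked sketch in Python of the standard SROI-style arithmetic; every figure is illustrative, not drawn from the project.

```python
# Illustrative SROI-style adjustment of a gross outcome figure.
# All numbers below are hypothetical, for demonstration only.

gross_outcome = 100.0   # e.g. 100 clients reporting an improved outcome

deadweight   = 0.25     # share of change that would have happened anyway
displacement = 0.10     # share of the issue that simply moved elsewhere
attribution  = 0.20     # share of the change attributable to others

net_outcome = (gross_outcome
               * (1 - deadweight)
               * (1 - displacement)
               * (1 - attribution))

print(f"Net outcome attributable to the service: {net_outcome:.1f}")
# 100 x 0.75 x 0.90 x 0.80 = 54.0
```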

The theory of change
- Inputs: the resources invested – cash, volunteer time, in-kind contributions
- Activities: what the investment enables to happen – e.g. legal advice, casework, community education, client representation, partnership initiatives
- Outputs: usually quantitative – e.g. number of clients supported, sessions delivered, cases closed, submissions made
- Outcomes/Impacts: the change that occurs as a result of an activity (e.g. improved client personal/financial wellbeing, more efficient legal services, fairer legal outcomes, change in public policy)
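Purely as an illustration (the categories and examples come from the slide above; this is not a prescribed schema), the four links of the chain can be written down as a simple structure:

```python
# A minimal, illustrative logic model for a CLC, using the slide's categories.
logic_model = {
    "inputs":     ["cash", "volunteer time", "in-kind contributions"],
    "activities": ["legal advice", "casework", "community education",
                   "client representation", "partnership initiatives"],
    "outputs":    ["clients supported", "sessions delivered",
                   "cases closed", "submissions made"],   # usually counts
    "outcomes":   ["improved client wellbeing", "more efficient legal services",
                   "fairer legal outcomes", "change in public policy"],
}
```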

Progress to date
- Steering group established to oversee the project
- Terms of reference, working arrangements, methodology and implementation plan agreed
- Trial ‘sites’ agreed: Townsville, BRQ and Nundah
- Data collection and literature review: internal, external and international

Progress to date
- Consultative workshop for CLCs: discussed different operational models, stakeholder mapping, and a review of current and future data collection
- External stakeholder consultation: eight interviews, including Legal Aid, Appeals Tribunals, the Department of Justice, community partners and volunteer lawyers
- Draft Theory of Change developed in consultation with staff, community partners and collaborators in justice services, as well as drawing on local and international examples

Key evaluation questions
Clients
- To what extent have we made a difference to those we have serviced?
- To what extent have we serviced those most in need?
Community
- How effectively have we worked with community partners and other services?
- To what extent have we increased community understanding of legal rights and responsibilities?
Justice system and services
- To what extent have we contributed to improved efficiencies in the justice system?
- To what extent has our casework contributed to effective advocacy?
Pro bono and volunteer lawyers
- To what extent are we increasing knowledge and capacity in the legal sector?

Theory of Change
The CLC Theory of Change is also built on a number of assumptions:
- Client-focused work has both immediate and longer-term impacts
- Clients want results, and also to be empowered
- Providing information empowers people and communities
- Rights will be valued and defended
- Other agencies want to form alliances and advance common agendas
- Other organisations have the capacity, skills, ability and willingness to collaborate
- Justice services are expecting measurable outcomes from their engagement with the sector
- Increased understanding of social justice issues leads to fairer outcomes

Theory of Change [diagram slide; the diagram is not reproduced in the transcript]

Evaluation toolkit components
- CLASS data: ongoing collection at all CLCs
- Client / external stakeholder survey: developed during the project; periodic data collection
- Case studies: CLC-specific projects; internal evaluations

Monitoring – key points to note
Its principal function is to advise stakeholders of program performance and identify deviations from targets. To perform this function well, monitoring must be:
- Ongoing
- Referenced to agreed performance criteria
Monitoring typically focuses on:
- Activities and processes (provision of training, court appearances, information provision)
- Outputs (number of advices provided, number of CLE trainings)
What are the most relevant pieces of data in CLASS that can inform evaluation of CLCs?
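As a minimal sketch of what “referenced to agreed performance criteria” can look like in practice (indicator names and target figures here are hypothetical, not the sector’s actual criteria):

```python
# Hypothetical monitoring check: outputs for one period vs. agreed targets.
targets = {"advices_provided": 500, "cle_trainings": 12}
actuals = {"advices_provided": 430, "cle_trainings": 14}

for indicator, target in targets.items():
    actual = actuals[indicator]
    status = "on track" if actual >= target else "below target"
    print(f"{indicator}: {actual} of {target} ({status})")
```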

Evaluation – key points to note
Its principal function is to inform policy and program development and implementation, and to enable sound decision making. Evaluation forms judgments on the state of affairs, merit and worth of a program:
- State of affairs: what is occurring
- Merit: the intrinsic quality of something (in absolute or relative terms)
- Worth: the value of something in a particular context (contributing to the sector and for stakeholders)
The word ‘judgment’ can be challenging. In program evaluation:
- We form judgments about programs, not people
- Judgments can be positive, negative or mixed
- We are using our judgment, not passing judgment, based on consolidated sources of information (not just what we see)

Evaluation – key points to note
Evaluation conclusions have both a factual and a normative aspect:
- Factual: identifying that something is the case
- Normative: reaching a view as to merit and/or worth
To perform these functions well, evaluation must:
- Be planned and systematic
- Form a sophisticated understanding of its subject matter
- Develop explanations for the state of affairs
This means that evaluation:
- Often involves more intensive effort than monitoring, in both data collection and analysis
- Is usually conducted periodically or episodically, not continuously

Next steps
Finalise the draft toolkit:
- Guidance notes
- Templates for client surveys, partner interviews, and pro bono and volunteer lawyer surveys
- An appropriate data collection protocol (collection methods and frequency)
Work with pilot CLCs to test it, providing guidance and support on:
- Extracting appropriate data from CLSIS/CLASS to inform the evaluation questions (what can we realistically pull from CLASS reporting as evidence?)
- Stakeholder consultation and engagement (case studies, non-CLASS reporting, etc.)
- Collating, analysing and presenting results
Review pilot findings with the Steering Committee:
- Present findings and feedback from the pilot
- Clarify specific issues in relation to toolkit development and future rollout
Work with CLCQ to:
- Produce training material and a ‘user guide’ based on pilot feedback
- Agree the roadmap and timetable for rollout
Survey Monkey output will be raw data in Excel, which Incus will collate and present as a snapshot of results for each CLC in the pilot.
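One possible shape for that collation step, as a minimal sketch (the file name and column names such as "clc" and "q1_rating" are hypothetical placeholders, not the project's actual export format):

```python
import pandas as pd

# Collate a raw Survey Monkey export (Excel) into a per-CLC snapshot.
# File and column names are hypothetical placeholders.
responses = pd.read_excel("survey_export.xlsx")

snapshot = responses.groupby("clc").agg(
    responses=("q1_rating", "count"),
    mean_q1_rating=("q1_rating", "mean"),
)
print(snapshot)
```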