Gathering a credible evidence base


Jane Gibbon, Newcastle University Business School

Useful and credible evidence within a social enterprise setting
- Context for appropriate models and tools
- Using elements of models and tools
- Gathering evidence
- Characteristics of useful evidence
- Presenting evidence

Evidence, impact and learning
[Diagram linking mission, activities and outcomes in a cycle of evidence, impact and learning]

Social Accounting
1. Deciding and managing the scope: focusing on certain aspects; building completeness over time; doing what is possible
2. Agreeing indicators
3. Collecting the quantitative and qualitative data: facts, figures and narrative; stakeholder consultation; getting the questions right; consultation methods; consulting the community; choosing methods; other tools (a sketch of such records follows below)
4. Reporting on the social, environmental and economic impacts
5. Drawing up the Social Accounting Plan and the Resources Plan
6. Implementing the Social Accounting Plan: collecting the data
Assessment: a Social Accounting Plan that works, together with the consultation and data results
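To make step 3 concrete, here is a minimal Python sketch of how quantitative figures and qualitative narrative might sit together in one evidence record. The field names and example entries are invented for illustration, not taken from the presentation.

```python
# Minimal sketch (field names and entries are invented) of step 3:
# one record per piece of evidence, mixing quantitative figures with
# qualitative narrative gathered through stakeholder consultation.
evidence = [
    {"stakeholder": "trainees", "indicator": "attendance rate",
     "figure": 0.85, "narrative": "Most sessions were well attended."},
    {"stakeholder": "community", "indicator": "perceived benefit",
     "figure": None, "narrative": "Residents describe the cafe as a local hub."},
]

# Quantitative figures feed the accounts; narrative supplies the story.
figures = [(e["indicator"], e["figure"]) for e in evidence if e["figure"] is not None]
stories = [e["narrative"] for e in evidence]
print(figures)
print(stories)
```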

Mapping Impact
- Inputs: staff & funding
- Activities: improve knowledge management
- Outputs: strengthen policy, research & campaigning work
- Outcomes: influence government policy
- Desired impact: deliver social change
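Because an impact map of this kind is essentially a series of "if-then" links, it can be written down as an ordered chain. The sketch below is illustrative only: the stage contents come from the slide above, while the class and variable names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    """One stage of an impact map, linked 'if-then' to the next stage."""
    name: str
    description: str

# The chain from the slide above, encoded as an ordered list of stages.
impact_map = [
    Stage("Inputs", "staff & funding"),
    Stage("Activities", "improve knowledge management"),
    Stage("Outputs", "strengthen policy, research & campaigning work"),
    Stage("Outcomes", "influence government policy"),
    Stage("Desired impact", "deliver social change"),
]

# Read the map as a series of if-then relationships.
for earlier, later in zip(impact_map, impact_map[1:]):
    print(f"If {earlier.name} ({earlier.description}), "
          f"then {later.name} ({later.description}).")
```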

Issues with impact mapping: how do we measure beyond outputs, and how do we understand outcomes?

Evidence within a specific setting
- Does the evidence improve decision making?
- Does the evidence improve outcomes?
- Does the evidence improve clinical quality or patient safety?
- Does the evidence improve the care experience?
- Does the evidence improve efficiency and lower costs?
- Does the evidence improve our services?
- Does the evidence empower patients and families to improve their health?

Awareness of expectations regarding accepted metrics

Issues with acceptable evidence?

Soft indicators and outcomes
- Indicators and outcomes interact: indicators are the means by which we measure whether outcomes have been achieved.
- A soft indicator refers to achievements that demonstrate progress towards an outcome.
- Example: a project could explore whether an individual's motivation has increased during the scheme. This is a subjective judgement, but indicators such as improved attendance, improved timekeeping or changes in communication skills could suggest that motivation has increased.
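As a thought experiment, the motivation example above could be scored along these lines. This is a hedged sketch: the indicators, the 1-5 rating scale and the majority-improvement heuristic are all assumptions made for illustration.

```python
# Invented indicators, a 1-5 rating scale and a simple majority
# heuristic for judging progress towards a soft outcome (motivation).
ratings_start = {"attendance": 2, "timekeeping": 2, "communication": 3}
ratings_end = {"attendance": 4, "timekeeping": 3, "communication": 4}

changes = {k: ratings_end[k] - ratings_start[k] for k in ratings_start}
improved = sum(1 for delta in changes.values() if delta > 0)

# If most soft indicators improved, infer progress towards the outcome.
if improved > len(changes) / 2:
    print("Indicators suggest motivation has increased:", changes)
else:
    print("No clear evidence of increased motivation:", changes)
```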

Soft outcomes
Outcomes from training, support or guidance interventions which cannot be measured directly or tangibly. They could include achievements relating to:
- Interpersonal skills, such as social skills and coping with authority
- Organisational skills, such as personal organisation and the ability to order or prioritise
- Analytical skills, such as the ability to exercise judgement, manage time or solve problems
- Personal skills, such as insight, motivation, confidence, reliability and health awareness

Measuring progress?
- Two or three good indicators for each outcome make it more measurable.
- Templates completed over time show the progress made; this can be done for each individual ("distance travelled").
- The reasons given for ratings identify soft indicators and imply soft outcomes.
- Templates repeated over time can show achievement.
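One way to picture such a template is as a set of dated ratings per individual. The sketch below is an assumption-laden illustration: the dates, indicators and 1-5 ratings are invented for the example.

```python
from statistics import mean

# Invented data: the same template completed three times for one
# individual, with 1-5 ratings against two soft indicators.
templates = {
    "2015-01": {"confidence": 2, "reliability": 3},
    "2015-06": {"confidence": 3, "reliability": 3},
    "2016-01": {"confidence": 4, "reliability": 4},
}

dates = sorted(templates)
first, last = templates[dates[0]], templates[dates[-1]]

# "Distance travelled": change in each rating from first to last template.
distance = {indicator: last[indicator] - first[indicator] for indicator in first}
print("Distance travelled per indicator:", distance)
print("Average distance travelled:", mean(distance.values()))
```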

Example of a template…
[Template slide: "Outcomes from …" followed by placeholder entries]

Social Audit Study at HMP Kirklevington Grange

Credible evidence…
- Includes many forms of evidence
- Includes both hard and soft indicators and outcomes
- Is longitudinal in nature
- Should be regularly reviewed for relevance
- Is clearly linked to strategic aims
- Has a structured framework for managing data
- Is relatively easy to capture
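These characteristics can double as a review checklist. As a closing sketch, the criteria list, the function and the example review below are hypothetical, not part of the presentation.

```python
# Hypothetical checklist: the credibility criteria above as a review aid.
CRITERIA = [
    "many forms of evidence",
    "hard and soft indicators and outcomes",
    "longitudinal in nature",
    "regularly reviewed for relevance",
    "clearly linked to strategic aims",
    "structured framework for managing data",
    "relatively easy to capture",
]

def unmet_criteria(assessment: dict) -> list:
    """Return the criteria an evidence base does not yet satisfy.

    `assessment` maps each criterion to True/False, e.g. as judged
    at a periodic review.
    """
    return [c for c in CRITERIA if not assessment.get(c, False)]

# Example review: everything satisfied except longitudinal coverage.
review = {c: True for c in CRITERIA}
review["longitudinal in nature"] = False
print(unmet_criteria(review))  # ['longitudinal in nature']
```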