Articulating the Impact of Our Activities
LKSL Staff Development Group



Possible impacts
- Improvements in practice which benefit our customers
- Encourage innovation, service development and sharing of best practice
- Cost effectiveness of organising nationally
- Support LETBs to understand training needs
- Targeted to minimise scrap learning
- Support the strategic direction of LKSL
- Opportunities for all NHS library staff

Who is our audience? LKSL or wider?

Requirements for evaluation
- Fit for purpose, i.e. enables us to demonstrate impact
- Suitable for a variety of events and activities: face-to-face, Webex, e-learning, sponsorship
- Easy to administer
- Cheap to administer
- Easy to analyse
- Speed to insight: easy to understand
- Generates a reasonable response rate
- Generates sufficiently robust data, both qualitative and quantitative

A little bit of theory (1)
Dimensions of evaluation:
- Process
- Outcome: direct effect on participants
- Impact: longer-term effects

A little bit of theory (2)
Kirkpatrick model
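For reference, the Kirkpatrick model distinguishes four levels of training evaluation: Reaction, Learning, Behaviour and Results. Below is a minimal sketch of how evaluation questions could be mapped onto those levels; the example questions are illustrative only and are not drawn from the LKSL programme.

```python
# Minimal sketch: mapping evaluation questions onto Kirkpatrick levels.
# The four level names are the standard ones; the example questions are
# illustrative placeholders, not the LKSL evaluation questions.
KIRKPATRICK_LEVELS = ["Reaction", "Learning", "Behaviour", "Results"]

example_questions = {
    "Reaction":  "How satisfied were you with today's session?",
    "Learning":  "What new knowledge or skills did you gain?",
    "Behaviour": "Three months on, what are you doing differently in your service?",
    "Results":   "What difference has this made for your library's customers?",
}

for level in KIRKPATRICK_LEVELS:
    print(f"{level}: {example_questions[level]}")
```

Tagging each question with a level in this way makes it easier to see whether an evaluation form only captures Reaction, or also reaches the Behaviour and Results levels where longer-term impact sits.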

Questions to think about
- Post-event evaluation, later follow-up, or both?
- Standard template with variations?
- Engage with delegates: set the expectation that they will complete evaluation(s), and say why
- Attempt metrics (methodologies exist)?
- Ask delegates for a narrative/story?
- Whose responsibility: the organiser, or a group role?
- Electronic, e.g. Survey Monkey? (see the sketch below)
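If the group does opt for an electronic tool and for attempting metrics, the quick analysis involved might look like the sketch below. This assumes responses have been exported to CSV from a tool such as Survey Monkey; the file name and column headings are hypothetical and would need to match the real export.

```python
# Minimal sketch, assuming evaluation responses exported to CSV from an
# online survey tool such as Survey Monkey. The file name and column
# headings are hypothetical.
import csv
from collections import Counter

RATING_COLUMN = "How satisfied were you with this event?"     # hypothetical
COMMENT_COLUMN = "What will you do differently as a result?"  # hypothetical

ratings = Counter()
stories = []

with open("evaluation_responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        ratings[row[RATING_COLUMN]] += 1            # quantitative tally
        if row[COMMENT_COLUMN].strip():
            stories.append(row[COMMENT_COLUMN])     # qualitative narrative

print("Rating breakdown:", dict(ratings))
print(f"{len(stories)} narrative responses collected for impact stories")
```

A simple tally like this gives the quantitative summary, while the free-text responses supply the narrative/story material needed to articulate longer-term impact.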