DELIVERING BETTER SERVICES: USER-CENTRED SERVICE DESIGN AND EVALUATION

Presentation transcript:

IPAA EVENT SERIES
DELIVERING BETTER SERVICES: USER-CENTRED SERVICE DESIGN AND EVALUATION
THURSDAY 18 MAY 2017

WELCOME
Kathy Kostyrko, Director, Public Sector, HAYS

SPEAKER
Peter Alexander, Project, Procurement and Assurance, Digital Transformation Agency

Simple, clear, fast government services for everyone

- Policy advice
- Project assurance
- Strategic oversight
- Product delivery

User-centred design: service design and interaction design

Understand the users:
- Who are the users? Which of their motivations, triggers and contexts are significant for your service?
- How can you find them and invite them to participate in user research? You must include users with varying needs (such as needs arising from disability, cultural diversity, low literacy and remoteness).
- Consider all the users of the service: end users, users in government who deliver the service, and key intermediaries (professional and personal networks).

Understand the task:
- What is the real task people are trying to achieve when they encounter your service? What is the 'job' people are trying to get done that your service is part of? Describe this in the words real end users would use, not in government terminology.
- How are users currently doing the task your service aims to help them with, and what are the key touch points? Show this, for example, through journey maps.
- What other relevant government and non-government services are in use at the same time?
- Where are the pain points in the current experience, and what are the user needs?
- What are the opportunities to remove or reduce the pain points? How might you better meet the user needs? Demonstrate this through research, and by testing and validating possible solutions with prototypes.

Are you designing the right thing?
- How have your insights from user research helped you define your minimum viable product (MVP)?
- How does the MVP create value for users and government by better meeting user needs?

During the Beta stage your understanding of what your users value will have matured through testing design prototypes with them. By the end of Beta you should be able to show greater depth and diversity of knowledge on all of the points above, as well as:
- How your service has been shaped by user needs. Show how you have changed the service and interaction design in response to user research and usability testing; you can evidence this by showing how the design has changed over time and the research findings that drove each change.
- How you tested the system in the users' context with a full range of users (including users with varying needs). You can evidence this with research artefacts, for example video clips and outcomes from research analysis.
- That you are prepared for ongoing user research. Show how you plan to keep testing the system with users and the resources for this, for example an ongoing research plan and budget.
- What you have not solved yet. What are the significant design challenges (for example, captured as key insights), how have you approached them, and how do you plan to keep tackling them?
- How you will know if your design is working. Make sure research has fed into the metrics you use to confirm you continue to meet user needs.

By the time you are ready to go live you should:
- Be able to show greater depth of knowledge for all the points above.
- Show how you are using data from real use to understand which parts of the task users find difficult, and how you are designing experiments to reduce friction and increase success for users (a simple sketch of this kind of friction analysis follows this slide).
- Know how you will measure and monitor your service to ensure it is serving its users well.
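The "data from real use" point above is the kind of thing a simple funnel analysis over service event logs can support. The sketch below is illustrative only: the step names, the event format and the sample sessions are assumptions for the example, not something from the presentation or any DTA tool.

```python
# Illustrative funnel analysis: find where users drop out of a task.
# Step names, event format and sample sessions are hypothetical.

# Each event is (session_id, step_reached); in practice these would
# come from your service's analytics or server logs.
events = [
    ("s1", "start"), ("s1", "identity"), ("s1", "details"), ("s1", "submit"),
    ("s2", "start"), ("s2", "identity"),
    ("s3", "start"), ("s3", "identity"), ("s3", "details"),
    ("s4", "start"),
]

STEPS = ["start", "identity", "details", "submit"]  # ordered task steps

def funnel(events, steps):
    """Count distinct sessions reaching each step and report the
    drop-off between consecutive steps (the likely pain points)."""
    reached = {step: set() for step in steps}
    for session, step in events:
        if step in reached:
            reached[step].add(session)
    counts = [len(reached[s]) for s in steps]
    for i in range(len(steps) - 1):
        prev, nxt = counts[i], counts[i + 1]
        drop = 1 - nxt / prev if prev else 0.0
        print(f"{steps[i]} -> {steps[i + 1]}: "
              f"{nxt}/{prev} sessions continued ({drop:.0%} drop-off)")

funnel(events, STEPS)
```

On the sample data this reports the largest drop-off at the "details" to "submit" step, which is where you would focus usability research and design experiments first.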

Measure performance, evaluate, be transparent

Key performance indicators
All services must, at a minimum, measure four KPIs:
- User satisfaction: to help continually improve the user experience of your service
- Digital take-up: to show how many people are using the service and to help encourage users to choose the digital channel
- Completion rate: to show which parts of the service you need to fix
- Cost per transaction: to make your service more cost efficient
A sketch of how these four figures relate to raw service counts follows this slide.

There will be other metrics your service needs to measure and monitor to understand how it is performing, such as:
- error rates
- time to completion
- costs, benefits and return on investment
- content metrics (readability, length)

Dashboard
The Performance Dashboard collects service data and presents it in a consistent, structured format. This is important so that you can:
- make quick data-driven decisions about how to improve your service
- compare data across multiple government services
- be open and transparent with the public about your service's performance.
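As a rough illustration of how the four mandatory KPIs relate to raw service figures, here is a minimal sketch. The field names, sample numbers and the dashboard record layout are assumptions made for the example, not the Performance Dashboard's actual schema.

```python
import json

def kpis(started, completed, digital, all_channels,
         total_cost, satisfaction_scores):
    """Compute the four mandatory KPIs from basic service counts.
    All inputs are illustrative; real figures would come from the
    service's own analytics and cost accounting."""
    return {
        # mean of per-user ratings, e.g. on a 1-5 survey scale
        "user_satisfaction": sum(satisfaction_scores) / len(satisfaction_scores),
        # share of transactions done through the digital channel
        "digital_take_up": digital / all_channels,
        # share of started transactions that were finished
        "completion_rate": completed / started,
        # total channel cost spread over completed transactions
        "cost_per_transaction": total_cost / completed,
    }

# A consistent, structured record of the kind a performance dashboard
# could collect across services (layout is hypothetical).
record = {
    "service": "example-service",
    "period": "2017-05",
    **kpis(started=1200, completed=900, digital=700, all_channels=1000,
           total_cost=45_000.0, satisfaction_scores=[4, 5, 3, 4, 4]),
}
print(json.dumps(record, indent=2))
```

Publishing every service's figures in one shared record shape is what makes the dashboard's cross-service comparisons possible.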

Upskilling the public service
- Change culture, build skills and improve service delivery
- Empower agencies to build technology and digital capability
- A partnership program with other government agencies

INTRODUCING PANEL
Kathy Kostyrko, Director, Public Sector, HAYS

IPAA EVENT SERIES
PANEL DISCUSSION
THURSDAY 18 MAY 2017

IPAA EVENT SERIES
QUESTIONS AND ANSWERS
THURSDAY 18 MAY 2017

THANK YOU
Kathy Kostyrko, Director, Public Sector, HAYS

CEF UPDATE
Pierre Skorich, CEF Facilitation Group Chair

IPAA EVENT SERIES
DELIVERING BETTER SERVICES: USER-CENTRED SERVICE DESIGN AND EVALUATION
THURSDAY 18 MAY 2017