DELIVERING BETTER SERVICES: USER-CENTRED SERVICE DESIGN AND EVALUATION


1 IPAA EVENT SERIES
DELIVERING BETTER SERVICES: USER-CENTRED SERVICE DESIGN AND EVALUATION
THURSDAY 18 MAY 2017

2 WELCOME
Kathy Kostyrko, Director, Public Sector, HAYS

3 SPEAKER
Peter Alexander, Project, Procurement and Assurance, Digital Transformation Agency

4 Simple, clear, fast government services for everyone

5 Policy advice
Project assurance
Strategic oversight
Product delivery

6

7 User-centred design | Service design | Interaction design
Who are the users? What about their motivations, triggers and contexts is significant for your service? How can you find them and invite them to participate in user research? You must include users with varying needs (such as needs arising from disability, cultural diversity, literacy and remoteness). Consider all the users of the service, including end users, users in government who deliver the service, and key intermediaries (professional and personal networks).
What are the real tasks people are trying to achieve when they encounter your service? What is the 'job' people are trying to get done that your service is part of? (Describe this in words real end users would use, not in government terminology.)
How are users currently doing the task your service aims to help them do, and what are the key touch points (for example, shown through journey maps)? What other relevant government and non-government services are in use at the same time? Where are the pain points in the current experience?
What are the user needs? What are the opportunities to remove or reduce the pain points? How might we better meet the user needs? (Demonstrate this through research, and by testing and validating possible solutions with prototypes.) Are you designing the right thing?
How have your insights from user research helped you define your minimum viable product (MVP)? How does the MVP create value for users and government by better meeting user needs?
During the Beta stage your understanding of what your users value will have matured through testing design prototypes with them. By the end of the Beta stage you should be able to show greater depth and diversity of knowledge on all of the points above from Alpha/Beta, as well as:
How has your service been shaped by user needs? Show how you have changed the service and interaction design in response to user research and usability testing. You can evidence this by showing how the design has changed over time and the research findings that drove each change.
How you tested the system in the users' context with a full range of users (including users with varying needs). You can evidence this with research artefacts, for example video clips and outcomes from research analysis.
Are you prepared for ongoing user research? Show how you plan to continue testing the system with users, and the resources for this, for example through an ongoing research plan and budget.
What have you not solved yet? What are the significant design challenges (for example, through key insights), how have you approached them, and how do you plan to continue tackling them?
How will you know if your design is working? Make sure research has fed into the metrics you have developed, so you know you continue to meet your user needs.
By the time you are ready to go live you should:
Be able to show greater depth of knowledge for all the points above, as well as
Show how you are using data from real use to understand which parts of the task users find difficult, and how you are designing experiments to reduce friction and increase success for users
Know how you will measure and monitor your service to ensure it is serving its users well

8 Measure performance, evaluate, be transparent: key performance indicators
All services must, at a minimum, measure four KPIs:
User satisfaction: to help continually improve the user experience of your service
Digital take-up: to show how many people are using the service and to help encourage users to choose the digital channel
Completion rate: to show which parts of the service you need to fix
Cost per transaction: to make your service more cost efficient
There will be other metrics your service needs to measure and monitor to understand how it is performing, such as: error rates; time to completion; costs, benefits and return on investment; content metrics (readability, length).
Dashboard: The Performance Dashboard collects service data and presents it in a consistent and structured format. This is important so that you can make quick data-driven decisions about how to improve your service, compare data across multiple government services, and be open and transparent to the public about your service's performance.
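The four mandatory KPIs above can each be computed directly from raw service records. The sketch below is illustrative only: the record fields, survey scale and function name are assumptions for the example, not the actual schema of the DTA Performance Dashboard.

```python
from dataclasses import dataclass

# Hypothetical transaction record; field names are illustrative,
# not drawn from the Performance Dashboard's real data model.
@dataclass
class Transaction:
    channel: str      # "digital" or "offline"
    completed: bool   # whether the user finished the transaction

def service_kpis(transactions, satisfaction_scores, total_cost):
    """Compute the four mandatory KPIs from raw service data.

    satisfaction_scores: post-transaction survey results (e.g. a 1-5 scale).
    total_cost: total operating cost for the period, in dollars.
    """
    digital = [t for t in transactions if t.channel == "digital"]
    completed = [t for t in digital if t.completed]
    return {
        # Mean survey score: continually improve the user experience
        "user_satisfaction": sum(satisfaction_scores) / len(satisfaction_scores),
        # Share of all transactions made through the digital channel
        "digital_take_up": len(digital) / len(transactions),
        # Share of started digital transactions that were finished
        "completion_rate": len(completed) / len(digital),
        # Operating cost spread over completed transactions
        "cost_per_transaction": total_cost / len(completed),
    }
```

A low completion rate points at the parts of the service to fix first, while a rising digital take-up shows users choosing the digital channel over offline alternatives.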

9

10 Upskilling the public service
Change culture, build skills and improve service delivery
Empower agencies to build technology and digital capability
A partnership program with other government agencies

11 INTRODUCING PANEL
Kathy Kostyrko, Director, Public Sector, HAYS

12 IPAA EVENT SERIES PANEL DISCUSSION THURSDAY 18 MAY 2017

13 IPAA EVENT SERIES QUESTIONS AND ANSWERS THURSDAY 18 MAY 2017

14 THANK YOU
Kathy Kostyrko, Director, Public Sector, HAYS

15 CEF UPDATE
Pierre Skorich, CEF, Facilitation Group Chair

16 IPAA EVENT SERIES
DELIVERING BETTER SERVICES: USER-CENTRED SERVICE DESIGN AND EVALUATION
THURSDAY 18 MAY 2017

