1
Assessing & Improving Caller Experience Greg Simsar, VP Speech Services Eduardo Olvera, VUI Architect
2
Why We Are Here To empower you with the knowledge, confidence & inspiration to improve your company’s caller experience
3
How We’re Going to Empower You Sharing what we know Learning by doing Getting you fired up!
4
Agenda Caller experience & why it matters Benchmarking, best practices & standards Methods & tools for assessing & improving caller experience “Hands-on” exercises “Selling” caller experience Takeaways
5
Caller Experience Defined Caller experience – everything the caller experiences ‘over the phone’ when contacting a company for any reason Focus for this seminar will be (but not limited to) the caller experience with call center automation – call routing (ACD), self-service (IVR), and computer telephony integration (CTI)
6
Why Caller Experience Matters
7
What caller’s think about caller experience What caller’s think about speech recognition Why it matters Good caller experience is good business (have your cake and eat it too)
8
What Caller’s Think
9
Which method for doing business with a company is described by the phrase? Speech systems are perceived to deliver a higher level of benefits than keypad entry systems (differences statistically significant). What Callers Think
10
If speech were to replace keypad entry, how would that affect your overall experience? 70% feel their overall experience would be improved if speech technology were used instead of touch-tone menus. What Callers Think
11
How influential is customer service on your perception of the company? 88% reported that the quality of customer service is “very influential” or “influential” on their perception of a company (very influential 62%, influential 26%, somewhat influential 10%, not very influential 1%). Why it Matters
12
76% of customers say they have stopped doing business with a company because of poor customer service. Why it Matters
13
In general, how satisfied are you with your interactions with customer service? (Response scale: completely satisfied, somewhat satisfied, neither satisfied nor unsatisfied, somewhat unsatisfied, completely unsatisfied.) Satisfaction is split: satisfied 49%, dissatisfied 39%. Why it Matters
14
Methods used for contacting customer service most often: 86% choose the phone for contacting companies. The phone remains the dominant method of contacting customer service. Why it Matters
15
Caller ‘choice’ and automation usage are not mutually exclusive. Customers prefer automated interactions if their hold time for agents exceeds 2 minutes. “60 percent of customers favor an automated option for many types of simple interactions; the rest said they didn’t mind being presented with an automated option as long as they could connect with a live agent if they wanted one.” [McKinsey] Have some cake and eat it too!
16
Agenda Caller experience & why it matters Benchmarking, best practices & standards Methods & tools for assessing & improving caller experience “Hands-on” exercises “Selling” caller experience Takeaways
17
Benchmarking, Best Practices & Standards Available caller experience benchmarking Caller experience best practices Caller experience standards The gethuman™ standard None of these substitute for assessing and improving your company’s caller experience
18
Caller Experience Benchmarking No ‘industry’ benchmark Commercial benchmarks exist and are useful: –Sterling Audits, –BenchmarkPortal, –EIG Customer experience vs. Voice User Interface (VUI) design benchmarking
19
Caller Experience Best Practices Agreed upon customer experience ‘best practices’ –Widely accepted by customer interaction design community –Basic, common sense Best practices vs. ‘conventional wisdom’ Customer experience (interaction) best practices vs. VUI design best practices
20
Caller Experience Standards Best practices vs. standards No true customer experience ‘standards’ exist No true VUI design standards exist The gethuman™ standard
21
What is gethuman™? “The gethuman project is a consumer movement to improve the quality of phone support in the US.” [gethuman.com] Originally “The IVR Cheat Sheet” started by Paul English Hosts a website gethuman.com “One million consumers” [gethuman.com] The gethuman 500 database - lists companies, grades them, and tells you how to reach a human
22
What is gethuman?
23
What is the gethuman standard? “A specification for how customer service phone systems and support should work.” [gethuman.com] Version 1.0 published 18 October 2006 Not really a “standard” – not approved by any formal standards body or industry organization “Contributing” organizations: Microsoft, IBM, Avaya, Genesys, Nortel, Syntellect… 84.4% of the 500 received an F, and fewer than 2% of the 500 received an A score [and they don’t have any automation]
24
Should you care? After all… –“It’s not really a standard” –“Just how big is this movement anyway?” –“I’ve got plenty of company” The real answer is yes - if you care about customer experience –The “standard” is just “common sense” when it comes to customer experience –Agrees with “everyone’s” best practices for customer interaction design And don’t forget good caller experience is good business (eating cake)
25
The gethuman standard
1. The caller must always be able to dial 0 or to say "operator" to queue for a human.
2. An accurate estimated wait-time, based on call traffic statistics at the time of the call, should always be given when the caller arrives in the queue. A revised update should be provided periodically during hold time.
3. Callers should never be asked to repeat any information (name, full account number, description of issue, etc.) provided to a human or an automated system during a call.
4. When a human is not available, callers should be offered the option to be called back. If 24-hour service is not available, the caller should be able to leave a message, including a request for a call back the following business day.
5. Speech applications should provide touch-tone (DTMF) fall-back.
6. Callers should not be forced to listen to long/verbose prompts.
26
The gethuman standard
7. Callers should be able to interrupt prompts (via dial-through for DTMF applications and/or via barge-in for speech applications) whenever doing so will enable the user to complete his task more efficiently.
8. Do not disconnect for user errors, including when there are no perceived key presses (as the caller might be on a rotary phone); instead queue for a human operator and/or offer the choice for call-back.
9. Default language should be based on consumer demographics for each organization. Primary language should be assumed with the option for the caller to change language (i.e. English should generally be assumed for the US, with a specified key for Spanish).
10. All operators/representatives of the organization should be able to communicate clearly with the caller (i.e. accents should not hinder communication; representatives should have excellent diction and enunciation).
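A minimal sketch of how items 1, 5 and 8 above might translate into IVR turn-handling logic, written as plain Python pseudologic rather than any particular IVR platform's API. The function names, menu options and retry budget (handle_turn, transfer_to_agent, MAX_ERRORS) are illustrative assumptions, not gethuman's or any vendor's code.

```python
# Illustrative IVR turn handler reflecting gethuman items 1, 5 and 8:
# callers can always reach a human, speech commands have a DTMF fallback,
# and error conditions queue for an operator instead of disconnecting.

MAX_ERRORS = 3  # assumed retry budget before escalating to a human


def handle_turn(caller_input, error_count):
    """Return (action, new_error_count) for one menu turn."""
    # Item 1: "0" or "operator" always queues for a human.
    if caller_input in ("0", "operator"):
        return "transfer_to_agent", error_count

    # Item 5: accept equivalent DTMF keys as a fallback for speech commands.
    dtmf_fallback = {"1": "billing", "2": "payments", "3": "tech_support"}
    if caller_input in dtmf_fallback:
        caller_input = dtmf_fallback[caller_input]

    if caller_input in ("billing", "payments", "tech_support"):
        return f"route:{caller_input}", 0

    # Item 8: on no-input/no-match, retry a few times, then queue for a
    # human or offer a callback -- never hang up on the caller.
    error_count += 1
    if error_count >= MAX_ERRORS:
        return "offer_agent_or_callback", error_count
    return "reprompt", error_count


if __name__ == "__main__":
    # A caller who says nothing twice, then presses 2.
    state = 0
    for heard in (None, None, "2"):
        action, state = handle_turn(heard, state)
        print(heard, "->", action)
```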
27
Pop quiz Name that gethuman standard How’s this…
28
The gethuman “gold” standard
1. While holding, allow callers to disable hold music; remember their selection for future calls.
2. If ads or promotions are played, allow users to disable them.
3. Allow callers, where appropriate, to identify themselves via caller ID and a securely defined PIN, instead of being required to enter long account numbers each call.
4. Default to preferred language based on caller ID.
5. Support and publicize individual toll-free numbers for individual languages.
6. Allow callers to access audio transcriptions of their calls via the organization's website.
7. Call back the caller at the time that he/she specified.
29
Exercise: How do you measure up? You: rate your company Group: compare ratings and see how you measure up
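One way to make the exercise concrete is a simple checklist tally against the ten standard items. The sketch below is illustrative only; the percentage score is not the gethuman project's official grading methodology (which assigned letter grades), and the item wording is paraphrased.

```python
# Simple checklist tally for the "rate your company" exercise.
# The percentage score is an illustrative measure, not gethuman's grading.

STANDARD_ITEMS = [
    "0/operator always reaches a human",
    "accurate estimated wait time, updated on hold",
    "caller never asked to repeat information",
    "callback offered when no human is available",
    "DTMF fallback for speech applications",
    "no long/verbose prompts",
    "prompts can be interrupted (dial-through / barge-in)",
    "never disconnect on user errors",
    "default language matches caller demographics",
    "representatives communicate clearly",
]


def rate_company(items_met):
    """Return (met, total, percent) given the set of items your IVR meets."""
    met = sum(1 for item in STANDARD_ITEMS if item in items_met)
    return met, len(STANDARD_ITEMS), 100.0 * met / len(STANDARD_ITEMS)


if __name__ == "__main__":
    ours = {
        "0/operator always reaches a human",
        "DTMF fallback for speech applications",
        "prompts can be interrupted (dial-through / barge-in)",
    }
    met, total, pct = rate_company(ours)
    print(f"Met {met} of {total} items ({pct:.0f}%)")
```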
30
Agenda Caller experience & why it matters Benchmarking, best practices & standards Methods & tools for assessing & improving caller experience “Hands-on” exercises “Selling” caller experience Takeaways
31
Methods & Tools for Assessing & Improving Caller Experience Assessment & Improvement lifecycle “Be the caller” Call Monitoring & Recordings Evaluative Usability Studies Surveys Statistics & Reports Speech vs. Touchtone
32
Assess, Improve & Do It Some More Assessment & Improvement Lifecycle: from Production, Assess → Improve → Implement → back into Production, and repeat
33
“Be The Caller” Critical shift from focusing on ‘business’ requirements to ‘caller’ requirements The simplest way to do this is just pick up the phone and call your company – “be the caller” Good first step: –It’s “free” –Shifts your perspective –Some immediate improvements may surface –Can get you fired up!
34
“Be The Caller” Self-Survey –What do you think of the experience? –Did you accomplish your task? –Was it easy to do? –If not, why not? –What did you like? –What didn’t you like? –What would you do to improve the experience? –What impression of your company was created by the experience?
35
Call Monitoring & Recordings Easy to do if… –IVR ports are configured on your call recording platform –IVR ports are configured on your ACD And if not…it’s critical to enable one or both capabilities Allows you to hear ‘both sides’ of the conversation If you monitor quality of your CSRs why wouldn’t you want to monitor the quality of your IVRs?
36
Call Monitoring & Recordings Excellent next step: –In the Pareto (80/20) sweet spot – low cost with very high reward –Completes your perspective shift –Will definitely surface areas for improvement –Will definitely get you fired up! As with CSRs, consider monitoring on a periodic basis
37
Call Monitoring & Recordings How to go about it –Typically one day of monitoring will do –100 calls is a good “rule of thumb” sample size – adjust for application complexity and number of user profiles –If you can monitor you can record too – tools of the trade revealed –Take notes - highlighting trouble spots and observations (organize and tabulate later) –Organize and tabulate calls based on notes & recordings - goal is a prioritized list of issues and opportunities for improvement
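The tabulation step above can be as simple as tagging each monitored call with the trouble spots you heard and counting the tags. A minimal sketch, assuming the notes are captured as (call id, issue tag) pairs; the tag names and sample data are invented for illustration.

```python
# Tabulate call-monitoring notes into a prioritized list of issues.
# The (call_id, issue_tag) format and the tags are assumed for illustration;
# in practice the tags come from your own monitoring notes.
from collections import Counter

notes = [
    ("call-001", "misrecognized account number"),
    ("call-002", "caller zeroed out at main menu"),
    ("call-003", "misrecognized account number"),
    ("call-004", "long hold with no wait-time estimate"),
    ("call-005", "caller zeroed out at main menu"),
    ("call-006", "caller zeroed out at main menu"),
]


def prioritize(notes, calls_monitored):
    """Rank issues by how many monitored calls they affected."""
    counts = Counter(tag for _, tag in notes)
    for tag, n in counts.most_common():
        print(f"{tag}: {n} calls ({100.0 * n / calls_monitored:.0f}% of sample)")


if __name__ == "__main__":
    prioritize(notes, calls_monitored=6)
```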
38
Evaluative Usability Studies Getting inside the caller’s head –Interview callers on their usage patterns and characteristics –Observe callers using the system –Interview callers on their experience with attempted tasks –Survey callers on their overall experience using the system
39
Evaluative Usability Studies Goes beyond observation to investigation and understanding the caller and what they are thinking Excellent next step to call monitoring: –There is an engagement cost –Generally good value – especially if caller experience issues were identified by call monitoring –Will not only identify areas for improvement but yield insight into the best way to fix them
40
Evaluative Usability Studies What’s involved –Define key user profiles, demographics & tasks –Develop participant ‘screening script’ –Recruit and schedule usability test participants – 10 to 12 per user profile –Develop usability interview scripts –Conduct usability sessions –Compile findings and recommendations Options: –Remote ‘over the phone’ or –Lab with video for ‘non-verbal’ feedback
41
Evaluative Usability Studies In addition to usability sessions: –Management & staff interviews –Review of available statistics & reports –IVR call monitoring (or recordings) –CSR service observation & interviews (optional) How to go about it – involve an experienced professional
42
Caller Surveys Captures direct caller feedback Complement to other assessment methods & tools –Cost, value & benefit based on type of caller survey –Best way to measure customer satisfaction –Broader but shallower feedback vs. usability study Caller Survey Types –Brief post-call survey –Broad outbound surveys
43
Brief Post-Call Surveys Brief caller survey performed by CSRs about IVR/automation experience Complement to other assessment methods & tools: –Low engagement cost if any – indirect costs need consideration –Good value Measures general caller satisfaction with experience Identifies trouble spots and areas for improvement
44
Brief Post-Call Surveys
45
Broad Out-bound Surveys More in-depth caller survey conducted by independent party Broadly conducted - random customer sample Complement to other assessment methods & tools: –Engagement cost can be significant –If you can afford it – do it –Best measure of general satisfaction with caller experience –Limited trouble spot & improvement insight
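For the random customer sample mentioned above, the sample size drives how precisely the survey measures overall satisfaction. A small sketch of drawing the sample and putting a standard normal-approximation 95% confidence interval around a satisfaction rate; the customer list, sample size and 62% figure are placeholders, not results from any real survey.

```python
# Draw a random customer sample for a broad outbound survey and estimate
# how precise the resulting satisfaction rate is (normal-approx. 95% CI).
# Customer IDs, sample size, and the 62% response are placeholders.
import math
import random

customers = [f"cust-{i:05d}" for i in range(50_000)]
sample = random.sample(customers, k=400)  # the independent party calls these

# Suppose 62% of sampled customers report being satisfied.
satisfied = int(0.62 * len(sample))
p = satisfied / len(sample)
margin = 1.96 * math.sqrt(p * (1 - p) / len(sample))  # 95% margin of error

print(f"Satisfaction estimate: {p:.1%} +/- {margin:.1%} (n={len(sample)})")
```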
46
Broad Out-bound Surveys
47
Costs & Benefits [Chart: assessment methods plotted by cost vs. benefit – Make a call, Call Monitoring, Brief Post-call Survey, Broad Outbound Survey, Usability Studies]
48
Statistics & Reports Useful report types –Task completion –Hot spots Use reports to identify potential trouble spots and areas for improvement…but not to assess caller experience and define improvements Won’t shift perspective or fire you up Complement but not a substitute for the other methods we’ve covered here
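A minimal sketch of the two report types named above, computed from a simplified per-call log; the log schema (call id, last node reached, task-completed flag) is an assumption made for illustration, since every IVR platform reports this data in its own format.

```python
# Task-completion and hot-spot reports from a simplified IVR call log.
# The log schema (call_id, exit_node, completed) is assumed for illustration.
from collections import Counter

call_log = [
    {"call_id": "c1", "exit_node": "payment_confirmed", "completed": True},
    {"call_id": "c2", "exit_node": "account_entry", "completed": False},
    {"call_id": "c3", "exit_node": "account_entry", "completed": False},
    {"call_id": "c4", "exit_node": "balance_played", "completed": True},
    {"call_id": "c5", "exit_node": "main_menu", "completed": False},
]

completed = sum(1 for c in call_log if c["completed"])
print(f"Task completion: {completed}/{len(call_log)} "
      f"({100.0 * completed / len(call_log):.0f}%)")

# "Hot spots": nodes where callers most often give up or get transferred.
hot_spots = Counter(c["exit_node"] for c in call_log if not c["completed"])
for node, n in hot_spots.most_common():
    print(f"Hot spot: {node} ({n} abandoned/transferred calls)")
```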
49
Statistics & Reports “High-end” caller behavior mapping tools –Clearly map aggregate caller behavior –Map caller behavior across multiple interactions and multiple channels (e.g. phone, web) –Focus on caller behavior vs. caller experience –Useful – but relatively high cost –Complement but not a substitute for the other methods we’ve covered here
50
Recognition Analysis & Tuning Required step in optimizing speech recognition performance Focus on tuning grammars and recognition parameters vs. caller experience and usability Can identify caller experience trouble spots and potential areas for improvement Complement but not a substitute for the other methods we’ve covered here
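Recognition tuning typically starts from transcribed utterances scored against what the recognizer returned. A small sketch of the usual headline metrics (in-grammar rate, correct accepts, false rejects, false accepts); the utterance records are invented for illustration and the metric definitions follow common usage rather than any specific vendor's tuning tool.

```python
# Headline recognition-tuning metrics from transcribed utterances.
# Each record: what the caller actually said, whether that phrase is covered
# by the grammar, and what the recognizer returned (None = rejected).
# Records are illustrative only.
utterances = [
    {"said": "billing", "in_grammar": True, "recognized": "billing"},
    {"said": "billing", "in_grammar": True, "recognized": None},
    {"said": "payments", "in_grammar": True, "recognized": "billing"},
    {"said": "uh what", "in_grammar": False, "recognized": None},
    {"said": "agent", "in_grammar": False, "recognized": "payments"},
]

in_grammar = [u for u in utterances if u["in_grammar"]]
out_of_grammar = [u for u in utterances if not u["in_grammar"]]

correct = sum(1 for u in in_grammar if u["recognized"] == u["said"])
false_reject = sum(1 for u in in_grammar if u["recognized"] is None)
false_accept = sum(1 for u in out_of_grammar if u["recognized"] is not None)

print(f"In-grammar rate: {len(in_grammar)}/{len(utterances)} utterances")
print(f"Correct accepts: {correct}/{len(in_grammar)} of in-grammar utterances")
print(f"False rejects:   {false_reject}/{len(in_grammar)}")
print(f"False accepts:   {false_accept}/{len(out_of_grammar)} of out-of-grammar")
```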
51
Agenda Caller experience & why it matters Benchmarking, best practices & standards Methods & tools for assessing & improving caller experience “Hands-on” exercises “Selling” caller experience Takeaways
52
“Hands-on” Exercises The ‘lucky’ company Exercise 1: Pick up the phone and dial Exercise 2: “Listening in” Exercise 3: “Live” usability session
53
Exercise 1: Pick up the Phone Group: make calls into customer call center – attempt 2-3 self-service tasks & transfer to CSR You: take ‘self-survey’ Group: compare and discuss self-survey feedback
54
Exercise 2: “Listening In” Group: listen to call recordings of actual customer calls You: take notes and identify trouble spots and areas for improvement Greg & Eduardo: share tabulation spreadsheet for full call sample Group: compare and discuss identified trouble spots and areas for improvement
55
Exercise 3: “Live” Usability Session Group: observe “live” usability session Small teams: identify and define caller experience improvements based on usability session Group: compare and discuss caller experience improvements
56
Agenda Caller experience & why it matters Benchmarking, best practices & standards Methods & tools for assessing & improving caller experience “Hands-on” exercises “Selling” caller experience Takeaways
57
“Selling” Caller Experience “Getting it” is not enough; you have to “sell it” Share what you’ve learned here: –Why caller experience matters –Good caller experience is good business (eating cake) Make calls & listen to calls and share what you find - ‘shift’ focus of call center management & staff
58
Takeaways Caller experience matters What’s good for the caller is good for business (eating cake) Take the first step: make calls & listen to calls…the rest will follow Once you’ve started – don’t stop Don’t just get it, sell it!