Slide 1: Measuring the Future: MIT and Stanford Benchmark the Help Desk
EDUCAUSE Enterprise Technology 2006, May 25th, 2006
© Copyright William Clebsch, Greg Anderson, Chris Lundin, Oliver Thomas, 2006. This work is the intellectual property of the authors. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the authors. To disseminate otherwise or to republish requires written permission from the authors.
Slide 2: Deep Benchmarking
Project began summer 2002
Article published in EDUCAUSE Quarterly, Jan/Feb 2004
Details, charts, metrics, and the article available at http://web.mit.edu/ist/about/benchmarking
Slide 3: We had to go deep, then be strategic
Establish partnership, scope
General understanding
Processes, definitions
Site visits
Raw data
Metrics
Corrections
Sharing, teaching
Another tool for management, cultural change
Intense, deep, iterative project work
Slide 4: Deep Benchmarking: 5 Iterative Phases
Prepare: define scope, set goals, form team, develop plan, teach team, launch
Study & Define: research literature, visit sites, map processes, define data, collect data, define metrics, calculate metrics
Operationalize: draft a dashboard, populate with available data, test and revise, make easily repeatable, use in operations
Leverage: use dashboard, publicize, replace existing reports, explore next area for benchmarking
Interpret & Test: interpret at a high level, identify areas to explore, identify potential improvements, test initial changes
Continuously test and revise
Slide 5: Metrics lead to action on many fronts
Organizational
–Increased budget; automated support
–Measure staff
–Stanford re-org: consolidation of all client-facing work
–Put Help Desk staff on solid funding, not one-time $
Managerial
–Sharing metrics with staff, teams
–Refining metrics: measure the right things
–Specific responses, e.g., create a "swat" team for the Oracle rollout
–Everyone tracks his/her time
Cultural
–Understanding data; recognizing similarities; push for action
–Collaboration across schools
–Apply metrics to all projects
–Value people with technical and behavioral skills
Slide 6: We linked goals to specific metrics and then created a dashboard
Invest Appropriately: % of budget; clients served per FTE
Be Cost Effective: cost per case by topic; total costs by topic; cases by media, including self-help
Be Responsive: elapsed time per case (days); call abandonment; hold time; time to answer; % of cases resolved on first contact
Support Customer Needs with High Quality Service: annual customer survey; spot-surveys on selected transactions
Develop Effective, Mature Processes: # of contacts vs. # of days to resolve; origin of Help Desk cases
Develop High Performing, Competent Teams: employee satisfaction survey; individual performance metrics; team performance metrics; training $ per FTE; % Help Desk certification; case volume compared to staff skills mix
Support Rollout of New Systems: case volume by topic 3 months before and after launch; minutes per case
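These metric definitions map directly onto ticket data. The presentation does not show an implementation, so the following is only a minimal Python sketch of how a few of the dashboard numbers could be computed; the Ticket fields and sample records are assumptions, not MIT's or Stanford's actual schema.

```python
# Hedged sketch: computing a few slide-6 dashboard metrics from ticket
# records. All field names and sample data below are hypothetical.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Ticket:
    topic: str
    contacts: int           # contacts needed before resolution
    days_to_resolve: float  # elapsed calendar days per case
    cost: float             # cost attributed to the case

def dashboard_metrics(tickets: list[Ticket]) -> dict:
    by_topic = defaultdict(list)
    for t in tickets:
        by_topic[t.topic].append(t)
    # "Cost per case by topic": average attributed cost within each topic.
    cost_per_case = {topic: sum(t.cost for t in ts) / len(ts)
                     for topic, ts in by_topic.items()}
    # "% of cases resolved on first contact": one contact, done.
    first_contact = sum(1 for t in tickets if t.contacts == 1)
    return {
        "cost_per_case_by_topic": cost_per_case,
        "pct_resolved_on_first_contact": 100 * first_contact / len(tickets),
        "avg_days_to_resolve": sum(t.days_to_resolve for t in tickets) / len(tickets),
    }

tickets = [Ticket("email", 1, 0.5, 12.0), Ticket("email", 3, 4.0, 40.0),
           Ticket("network", 1, 1.0, 25.0)]
print(dashboard_metrics(tickets))
```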
Slide 9: Planning a deep benchmarking project
Choose a good partner
Invest in site visits
Plan for many iterations
Have a dedicated project manager
Metrics must be actionable and tell a story
Allow time for others to internalize
Ensure management wants to use metrics in a meaningful way
Slide 10: Cultural Change: Management by Fact
Use facts to dispel the myth of the anecdote
http://web.mit.edu/ist/about/benchmarking
Questions?
Slide 11: Improving Help Desk Services Through Measurement
Slide 12: Improving Help Desk Services through Measurement
Measurement and metrics are not the goal; service improvement is
Where we started
What we experienced:
–Individual performance
–Team performance
What we learned and how we're moving forward
Slide 13: Measuring Individuals
Few are measured as closely as Help Desk staff: dealing with staff concerns
The "Easy-To-Measure"
–Hours in Ready Mode
–Tickets resolved
–Calls-into-tickets ratio
–Direct customer feedback
The "Hard-To-Measure"
–Phone service quality
–Process improvement suggestions
–Getting to root cause and systemic fixes
–Sharing knowledge in the group
–Contribution to project work
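The "easy-to-measure" individual numbers reduce to simple counting over call and ticket logs. The presentation describes no implementation, so the sketch below is purely illustrative; the data shapes and names are assumptions.

```python
# Hypothetical per-agent scorecard over assumed call and ticket logs.
from collections import Counter

def agent_scorecard(calls, tickets):
    """calls: list of (agent, created_ticket: bool);
    tickets: list of (agent, resolved: bool)."""
    calls_taken = Counter(agent for agent, _ in calls)
    calls_to_tickets = Counter(agent for agent, made_ticket in calls if made_ticket)
    resolved = Counter(agent for agent, done in tickets if done)
    return {
        agent: {
            "tickets_resolved": resolved[agent],
            # Share of an agent's calls that became tracked tickets.
            "calls_into_tickets_ratio": calls_to_tickets[agent] / calls_taken[agent],
        }
        for agent in calls_taken
    }

calls = [("ana", True), ("ana", False), ("ben", True)]
tickets = [("ana", True), ("ben", True), ("ben", False)]
print(agent_scorecard(calls, tickets))
```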
Slide 14: Daily Team Performance
Slide 15: Simplistic Measures Can Trigger Non-Productive Work (and not improve service)
Focus on "time spent" can cause staff to inflate time
Focus on "tickets handled" can cause "ticket stealing"
Focus on the individual can damage the team
–Aim for "Fair Share"
–Reward Fair Share contributions with extra assignments
Slide 16: Measuring Teams
The "Easy-To-Measure"
–Calls handled / abandon rate
–First contact resolution
–Time to resolution
–"Fair Share" contributions
The "Hard-To-Measure"
–Overall strength and quality of the team
–Knowledge development within the team
–Participation in service deployments
–Project and knowledge development work
Teams: compete or support?
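A minimal sketch, assuming a simple call tally and per-team resolution records (none of this reflects the actual HelpSU data model), of how the team-level "easy-to-measure" items above could be computed, including a naive "fair share" index comparing a team's slice of the resolved work to its slice of the headcount.

```python
# Hedged sketch of slide-16 team metrics; all inputs are invented.
def team_metrics(calls_offered, calls_answered, resolutions, team_sizes):
    """resolutions: {team: [hours_to_resolution, ...]};
    team_sizes: {team: headcount}."""
    # Abandon rate: callers who gave up before an agent answered.
    abandon_rate = 100 * (calls_offered - calls_answered) / calls_offered
    avg_time = {team: sum(hrs) / len(hrs) for team, hrs in resolutions.items()}
    # "Fair share" index: 1.0 means a team resolves work in proportion
    # to its headcount; below 1.0 means it leans on other teams.
    total_cases = sum(len(hrs) for hrs in resolutions.values())
    total_staff = sum(team_sizes.values())
    fair_share = {team: (len(resolutions[team]) / total_cases)
                        / (team_sizes[team] / total_staff)
                  for team in resolutions}
    return abandon_rate, avg_time, fair_share

print(team_metrics(200, 188,
                   {"desktop": [2.0, 5.5, 1.0], "network": [8.0]},
                   {"desktop": 3, "network": 2}))
```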
Slide 17: Improving our Phone Handling
Increased hours in "Ready Mode"
Gave feedback to staff on their own "Ready Time"
Encouraged web submissions to the community
Slide 18: Encouraging Web Submissions
Slide 19: De-emphasizing Phone Calls
Slide 20: Managing IT Services by the Numbers
Senior management support is critical
–Sharing the "Daily Morning Report"
–Sharing the weekly "5-Day Unresolved" report
–Top 5 / Bottom 5 groups
»Sometimes group shame can go too far
Accountability embedded in director/manager/staff goals
–HelpSU handling included in goals
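The presentation does not show how the weekly "5-Day Unresolved" report was produced; the following is only a hypothetical sketch of such a report as a filter-and-group over open requests, with invented field names.

```python
# Illustrative "5-Day Unresolved" report: open requests older than
# five days, grouped by owning team. Schema is assumed, not HelpSU's.
from datetime import datetime, timedelta
from collections import defaultdict

def five_day_unresolved(requests, now=None):
    """requests: list of dicts with 'id', 'team', 'opened' (datetime),
    and 'resolved' (bool)."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=5)
    stale = defaultdict(list)
    for r in requests:
        if not r["resolved"] and r["opened"] < cutoff:
            stale[r["team"]].append(r["id"])
    # Lead the report with the teams holding the most stale requests.
    return sorted(stale.items(), key=lambda kv: len(kv[1]), reverse=True)

now = datetime(2006, 5, 25)
requests = [
    {"id": 101, "team": "desktop", "opened": datetime(2006, 5, 12), "resolved": False},
    {"id": 102, "team": "network", "opened": datetime(2006, 5, 24), "resolved": False},
]
print(five_day_unresolved(requests, now))
```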
Slide 21: Help Desk "Morning Report"
Slide 22: Unresolved Help Requests
The value of leadership and focus
Slide 23: Assuring Campus-Wide Service, Preserving Brand Reputation
A single university-wide request tracking system (HelpSU) has its pros and cons:
–Great economy of scale in our BMC/Remedy investment
–A single customer interface raises user satisfaction
–Behind the scenes, 250+ groups and 1,200+ consultants make quality control a challenge
–Weekly work with all service providers is needed
Slide 24: Med School Help Requests
Helping our Partners Focus
Slide 25: Tools Keep On Serving: Self-Service Password Resets
Slide 26: Our Current Improvement Targets
Improving phone service quality by auditing calls
Folding our ordering process (for all IT services) into an expanded Service Desk
Directing satisfaction surveys to a sampling of our customers each week
Remote control support through WebEx
Investigating a knowledge base for both support staff and customers
Slide 27: Absorbing the Benchmarking Project Into the Organization and Culture
Products, Methodology, and Theme
Slide 28: Absorbing the Products
Dashboard
–Valuable as an upward/outward reporting tool
–Continue to try to make dashboards work for operational managers
–Snapshot surveys provide a critical component of the dashboard
Slide 29: Absorbing the Products - Dashboard
Slide 30: Absorbing the Products - Dashboard
Slide 31: Absorbing the Products - Dashboard
Slide 32: Absorbing the Methodology
Applied the methodology to the Network Security team with great effect
Results:
–Significant process improvement
–Workload under control
–Even stopped doing some things entirely
–Getting customer feedback became a positive experience!
Slide 33: Security Dashboard 2004
Slide 34: Security Dashboard 2006
Slide 35: Absorbing the Methodology
Okay, maybe that was a little too fast.
Slide 36: Absorbing the Methodology
Slide 37: Absorbing the Theme
Using metrics to strengthen relationships
Measuring success
Getting staff to buy into it
"In God we trust; all others bring data." - W. Edwards Deming
Slide 38: Absorbing the Theme
Getting staff (and managers) to buy into it
–There must be real, direct value to them:
»More resources?
»Less effort?
»Better relationships?
–Give them control!
»Give them the tools to prove you wrong!
»Challenge them to turn lagging indicators into leading ones
–Get a Rob Smyser
–Use metrics to support your story, never instead of it
Slide 39: Cultural Change: Management by Fact
Use facts to dispel the myth of the anecdote
http://web.mit.edu/ist/about/benchmarking
Questions?