Desktop Support Metrics: A Case Study
HDI San Diego Chapter, May 2012
Mike Russell, V.P. Communications, SDHDI
About the Author
Over 26 years in I.T. in San Diego
Experienced in programming, operations, infrastructure, and support
12 years in Service Delivery Management
Specialized in Desktop Support management and device deployment
Member of the San Diego HDI chapter for 5 years
Member of the HDI Desktop Support Forums
Member of the HDI Desktop Support Advisory Board
The Challenges of Developing Desktop Support Metrics
Analysis and development of Desktop Support standards is not as mature as it is for the Service Desk
– Different responsibilities require different processes
– Different processes require different metrics, which are difficult to capture
– A history of using subjective rather than objective measures
Desktop Support staff have multiple inputs for service
– A mobile, dynamic, and dispersed workforce
– A closer relationship with the customer base
– Escalations and triage requests from different departments in I.T.
Common Perceptions of Desktop Support
Perceptions of Desktop Support vary widely
– Customers see the Desktop Support team as an extension of, or replacement for, the Service Desk
– I.T. partners feel the Desktop Support team can be engaged at will for immediate assistance, and may see it as an inexhaustible resource
– Project managers treat the Desktop Support team as a substitute for project labor, and may also see it as an inexhaustible resource
– Executive management does not fully understand the scope of work performed by Desktop Support
– Desktop Support analysts can feel misunderstood, underappreciated, and overutilized
The Problem
I needed a way to accurately measure, analyze, and market the services delivered by Desktop Support:
– Demonstrate staff productivity
– Measure staff effectiveness
– Measure performance quality
– Track customer satisfaction
– Illustrate the effects of bad changes
– Identify opportunities for service improvement
– Demonstrate improved performance
– Improve staff satisfaction
– Market the value of Desktop Support to I.T. and executive management
Must-Have Measurements
Staff Effectiveness
Staff Productivity
– The ability to track tickets by: Incidents, Service Requests, Problems, and Changes
Quality
Customer Satisfaction
Staff Effectiveness
The effective use of time by staff.
Actual time staff has available to work on issues
– Does not include meetings, breaks, non-ticketed project time, sick days, PTO, etc.
– May require process changes in time and attendance
Actual time spent on individual tickets
– Does not mean time from received to resolved
– May come in several chunks of time
– Will require the staff's best manual judgment
– May require modification to your ticketing system
Actual Ticket Time / Available Time = Effectiveness
Staff Effectiveness
Example:
Bobby works a standard 40-hour week (37.5 hours after breaks)
Bobby attends 4 hours of meetings (37.5 - 4 = 33.5)
Bobby is sick one day (33.5 - 8 = 25.5)
Bobby documents 22.3 hours spent on ticket resolution
Bobby's effectiveness is 22.3 / 25.5 ≈ 87%
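A minimal Python sketch of this calculation, using the figures from the example above; the function and parameter names are illustrative, not from any particular time-tracking system:

```python
# A minimal sketch of the effectiveness calculation from the Bobby
# example. Function and parameter names are illustrative assumptions.

def available_hours(scheduled: float, meetings: float, absence: float) -> float:
    """Hours actually available for ticket work after deductions."""
    return scheduled - meetings - absence

def effectiveness(ticket_hours: float, avail: float) -> float:
    """Actual ticket time divided by available time."""
    return ticket_hours / avail

avail = available_hours(scheduled=37.5, meetings=4.0, absence=8.0)  # 25.5 hours
print(f"Bobby's effectiveness: {effectiveness(22.3, avail):.0%}")   # 87%
```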
Staff Effectiveness
Expect to see initial rates between 30% and 200%
– Low numbers may indicate that staff are not estimating times correctly or are not reporting all issues
– High numbers may indicate duplicate tickets or a lack of understanding of what is being tracked
This can take 2–3 months to settle in with staff
– Review results monthly with staff to find the cause of out-of-range effectiveness
– Do not assume staff are purposely misleading you with their stats
– Tip: do NOT show total time spent on tickets to staff at first (or possibly at all)
Industry standard: an 80% effectiveness rating (48 minutes per hour)
Staff Productivity
You should already be tracking the number of tickets closed per day or month
– Decide which metrics relate to productivity and customer satisfaction (e.g., initial response time, days to resolve)
– In your ticketing system, automatically classify tickets as incidents or requests
– Have the analyst resolving the ticket verify the CLOSING classification (do NOT default to the opening classification!)
– The closing analyst should document whether the ticket is related to a change or problem, and if so, which one
– Try using the time captured for the effectiveness metric to calculate tickets closed per working hour, as in the sketch below
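A minimal sketch of the tickets-per-working-hour idea, assuming the hands-on time captured for the effectiveness metric is recorded per ticket; the ClosedTicket structure is an illustrative assumption, not any specific ticketing system's schema:

```python
# A minimal sketch: tickets closed per working hour, reusing the
# hands-on time captured for the effectiveness metric. The data
# structure is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class ClosedTicket:
    closing_type: str    # "incident", "request", etc., verified at close
    hours_worked: float  # actual hands-on time, summed across work chunks

def tickets_per_working_hour(tickets: list[ClosedTicket]) -> float:
    total_hours = sum(t.hours_worked for t in tickets)
    return len(tickets) / total_hours if total_hours else 0.0

closed = [
    ClosedTicket("incident", 0.75),
    ClosedTicket("request", 1.50),
    ClosedTicket("incident", 0.50),
]
print(f"{tickets_per_working_hour(closed):.2f} tickets per working hour")  # 1.09
```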
Quality
Use a monthly quality survey to track SLA adherence and other factors critical to delivering superior support (scored in the sketch below):
– Customer contact time (SLA: under 2 business hours)
– Resolution time (SLA: under 8 business hours)
– Work log information complete and correct
  – Document all customer contacts, including names
  – Entries should be clear enough that the CUSTOMER can understand them
– Appointments made and kept
  – If appointments are made, are they kept?
– Asset information correct
– Closing CTI appropriate for the issue
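A minimal sketch of scoring one sampled ticket against these criteria; the SLA thresholds come from the list above, while the field names and the equal weighting of checks are illustrative assumptions:

```python
# A minimal sketch of checking one sampled ticket against the quality
# criteria above. Thresholds mirror the slide; field names and the
# equal weighting are illustrative assumptions.

CONTACT_SLA_HOURS = 2.0     # customer contact within 2 business hours
RESOLUTION_SLA_HOURS = 8.0  # resolution within 8 business hours

def score_ticket(contact_hours: float, resolution_hours: float,
                 worklog_ok: bool, appointments_kept: bool,
                 asset_ok: bool, closing_cti_ok: bool) -> float:
    checks = [
        contact_hours <= CONTACT_SLA_HOURS,
        resolution_hours <= RESOLUTION_SLA_HOURS,
        worklog_ok,          # complete, correct, readable by the customer
        appointments_kept,
        asset_ok,
        closing_cti_ok,
    ]
    return sum(checks) / len(checks)

print(f"quality score: {score_ticket(1.5, 6.0, True, True, False, True):.0%}")  # 83%
```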
Quality
Quality contains both objective and subjective measurements
Measurement standards should be clear and documented
Scoring should not be performed by a single individual
The sample size needs to remain consistent
Because some subjective judgments must be made, staff members must have the ability to review and challenge the results
As a manager, you have the right and the responsibility to adjust results in order to remain fair
Customer Satisfaction
Send an automated survey to customers for each ticket
Expect a 5%–15% return rate
Very low or very high return rates are a red flag, especially for an individual analyst (see the sketch below)
Design reports so that customer satisfaction can be trended against other metrics (ticket volumes, time to respond, problems, projects, etc.)
Customer satisfaction transcends all levels of management and can be the most important factor in the perception, success, and survival of the desktop support team
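A minimal sketch of the return-rate red-flag check; the 5%–15% expected band comes from the slide, and the flagging logic is an illustrative assumption:

```python
# A minimal sketch of the survey return-rate check. The 5%-15%
# expected band comes from the slide; the rest is illustrative.

def return_rate(sent: int, returned: int) -> float:
    return returned / sent if sent else 0.0

def flag(rate: float, low: float = 0.05, high: float = 0.15) -> str:
    if rate < low:
        return "red flag: unusually low return rate"
    if rate > high:
        return "red flag: unusually high return rate"
    return "within the expected range"

rate = return_rate(sent=420, returned=38)
print(f"{rate:.1%} - {flag(rate)}")  # 9.0% - within the expected range
```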
Quick Review
Perceptions:
– An unlimited resource, unknown scope of services, always immediately available (not doing anything else), misunderstood, underappreciated, overutilized
Metrics driving the solution:
– Staff productivity
– Staff effectiveness
– Quality
– Customer satisfaction
Putting this all together, what does it look like?
The Solution
The Balanced Team Scorecard
– Productivity metrics for Incidents, Service Requests, Problems, and Changes (SLAs/OLAs)
– Average team effectiveness
– Average quality scores
– Average customer satisfaction
– A trending report covering the last 12 months of key performance indicators and SLAs
Subtext describes significant factors behind month-over-month changes
Distributed monthly to I.T. management and customers
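A minimal sketch of assembling one month's scorecard from the four metric families; the structure and all sample figures are illustrative assumptions, not the original report format:

```python
# A minimal sketch of assembling one month's balanced team scorecard.
# The structure and sample figures are illustrative assumptions.

from statistics import mean

def team_scorecard(month: str, effectiveness: list[float],
                   quality: list[float], csat: list[float],
                   closed_by_type: dict[str, int]) -> dict:
    return {
        "month": month,
        "avg_effectiveness": round(mean(effectiveness), 2),
        "avg_quality": round(mean(quality), 2),
        "avg_csat": round(mean(csat), 2),
        **{f"{t}_closed": n for t, n in closed_by_type.items()},
    }

card = team_scorecard(
    "2012-05",
    effectiveness=[0.84, 0.79, 0.88],
    quality=[0.92, 0.87],
    csat=[4.6, 4.4, 4.8],
    closed_by_type={"incident": 412, "request": 305, "problem": 9, "change": 27},
)
print(card)
```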
What does the end result look like?
The Results
It works!
– Received praise from executive levels on the metrics reported
– Adopted as the standard for metrics reporting across the I.T. operations teams
– Received praise from staff, who felt recognized and valued in the organization
– Captures data that can be used for further research (e.g., the cost of bad changes, the most costly services)
– Recognized as a best practice by HDI and presented at the 2012 national conference in Orlando
Thank you! Questions?
Mike Russell, I.T. Service Delivery Management
mrussel2@cox.net