1 FY02 ASA Presentation Operate Emergency Communication Center Presented by: G. Borden G. Elliott, G. Harris, L. Martinez, M. Sheelor Office of Research Services National Institutes of Health 18 November 2002
2 Table of Contents: Main Presentation
ASA Template – slide 4
Customer Perspective – slide 5
  Customer Segmentation – slide 6
  Customer Satisfaction – slide 7
  Unique Customer Measures – slide 8
Internal Business Process Perspective – slide 9
  Service Group Block Diagram – slide 10
  Conclusions from Discrete Services Deployment Flowcharts – slide 11
  Process Measures – slide 12
Learning and Growth Perspective – slide 13
  Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data – slide 14
  Analysis of Readiness Conclusions – slide 15
  Unique Learning and Growth Measures – slide 16
Financial Perspective – slide 17
  Unit Cost – slide 18
  Asset Utilization – slide 19
  Unique Financial Measures – slide 20
Conclusions and Recommendations – slide 21
  Conclusions from FY02 ASA – slide 22
  Recommendations – slide 23
3 Table of Contents: Appendices
Page 2 of your ASA Template
Customer segments graphs
Customer satisfaction graphs
Block diagram
Process maps
Process measures graphs
Learning and Growth graphs
Analysis of Readiness Information
4 ASA TEMPLATE
6 What are we about?
We handle over 4,800 telephone calls per month
Responsible for over 32 lines on the telephone system
Monitor 17 CCTVs with over 120 images of the NIH campus
Monitor 24 alarm devices and the fire alarm system
Make over 50,000 National Crime Information Center (FBI – NCIC) inquiries
Criminal and employment background checks
Analyze stressful situations – 911 calls; the only Federal 911 center in the DC area
Three shifts: 3 personnel on midnights, 3 personnel on the mid-shift, and 5 personnel (including a supervisor) on the day shift
TTY service for the hearing impaired
7 If we are not handling emergencies, what?
Types of non-emergency calls received:
Employee building access
Parking complaints
Ticket complaints
Directions to NIH
Events/employment information
Complaints about parking meters
Calls for officers, administrative staff, and ECC personnel
Lost & found
Disabled vehicles
Citizens locked out of their vehicles
8 Customer Perspective
9 Our Customers
ECC's prime customer is the Police Branch, of which we are an integral part.
NOTE: We concentrated on Police/Fire as customers, with the understanding that the NIH community is the ultimate customer.
10 CCTV Monitoring: DS2
From May through October 2002 – two incidents
CCTVs are providing crime deterrence
Increased security and CCTV coverage have decreased crime
12 FY02 ORS Customer Scorecard Data for the Annual Self Assessments Service Group 13: Operate Emergency Communication Center 16 October 2002 Summary Prepared by the Office of Quality Management (OQM)
13 Survey Distribution
Surveys distributed (Emergency Communication Center): 50
Surveys returned: 17
Response rate: 34%
14 Survey Respondents FY02 Respondents by IC
15 Radar Chart FY02 Product/Service Satisfaction Ratings Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
16 Radar Chart Interpretations: Comparison of ORS & ECC Product/Service Satisfaction Ratings
ECC's overall score is below the ORS index (ORS 8.27 vs. ECC 6.53), with reliability the lowest of the four categories (reliability, cost, timeliness, and quality)
Lack of formal training may be an issue in providing correct information or responses
Ratings may reflect continuing difficulties in communication between the Fire Department and the ECC
ECC's cost score is slightly below ORS (ORS 7.42 vs. ECC 7.33); however, cost is the highest-rated category for ECC
Respondents think ECC is a good value
17 Radar Chart: FY02 Customer Service Satisfaction Ratings
Note: The rating scale ranges from 1 - 10, where "1" represents Unsatisfactory and "10" represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
[Radar chart: Availability, Responsiveness, Convenience, Competence, Handling of Problems; category values 6.29, 7.06, 7.31, 5.82, 7.59; Service Group Index = 6.81; data based on 17 respondents]
18 Comparison of ORS & ECC Customer Service Satisfaction Ratings
Compared to the ORS customer service satisfaction ratings, ECC is below (ORS 8.55 vs. ECC 6.81), with competence the lowest of the five categories (reliability, handling of problems, responsiveness, convenience, competence)
The percentage of mistakes is minuscule, but the consequences are major
The only ways to reduce the number of mistakes further are (a) to provide additional training for employees and (b) to provide more staff – rushed call takers make mistakes
19 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings Note: The Importance rating scale ranges from 1 - 10 where “1” represents Unimportant and “10” represents Important. The Satisfaction rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding.
20 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings: A Closer Look Note: A smaller portion of the chart is shown so that the individual data points can be labeled.
21 Scatter Diagram Interpretation The Police & Fire Branches believe availability and convenience are very important. Cost does not appear to be a major factor for satisfaction. It is interesting to note that better handling of problems would contribute greatly to improved customer satisfaction. Again, we need to invest in more training and additional staff.
22 Survey comments on what was done particularly well:
One or two dispatchers are competent to handle the duties required. The rest need improvement.
Checking on building alarms in and out of service.
Keeping track of officers is usually done very well.
Good service.
All phases of dispatch and communication services are performed exceptionally well.
The new supervisor of ECC has improved the quality of ECC services and relationships with the customers.
23 Returned survey comments on what needs to be improved:
Need to place lead dispatchers on all shifts; all of them work the day shift.
Timing.
ECC needs to follow proper command structure; in other words, dispatchers need to stop placing themselves in the role of supervisor, which they are not.
How dispatch calls are keyed.
Better dispatchers – people that understand police/fire/rescue operations. Dispatchers that speak clear English, dispatchers that are competent.
Quality and competency of ECC staff, more staff, better scheduling so "NO" police officers have to fill in.
Repeaters for all 3 channels for better communications on and off campus.
The whole system.
Speed/accuracy of NCIC responses for officers from some dispatchers.
Nothing to complain about at this time.
Better equipment.
24 Other Comments
All dispatchers need official training before being assigned to the dispatch center. Formal training should be conducted by full-time trainers, off campus if need be. Keep training.
The day crew is very reliable and responsive, but nights and weekends are very shady and fly-by-night; feel we cannot rely on them.
ECC definitely needs to improve the quality and performance of dispatchers, since NIH police and fire respond to life-threatening situations!
Better dispatchers, better training, better quality assurance and review.
Send your staff to an ECC that works with real emergencies every day to get them real training.
The quality of dispatching changes between operations/shifts and by workload. I believe they all try very hard doing their best.
25 To Summarize: Increased customer satisfaction depends on
Quality – customer satisfaction training
Reliability – work toward "industry standards"
Timeliness – constant scenario drills
Competence – better hiring practices
Handling of problems – training issues
More staff – tired and rushed dispatchers make mistakes
26 Recommendations
Improve pay & benefits to attract high-quality personnel
Have a panel of ECC/Fire personnel conduct the interviews of prospective recruits to improve the selection process
Structured probationary period – 12-week training to screen out less-successful recruits
Dispatcher certification
Formal "in-service" program
Training in competence areas (e.g., customer service, proper dispatching procedures)
27 Internal Business Process Perspective
29 Relationship between Service Group and Discrete Services
NIH ECC is the only Federal ECC to establish an enhanced 911 Emergency Communication Center in the DC area
ECC is the focal point between the Police, Fire, and the NIH community
ECC answers emergency and non-emergency calls, analyzes stressful situations, monitors approx. 130 CCTVs, and monitors door alarm devices and fire alarms
ECC utilizes the Federal Bureau of Investigation's National Crime Information Center (NCIC) for investigative purposes
31 Conclusions from Discrete Services Deployment Flowcharts
Our Service Group completed 3 deployment flowcharts for 3 discrete services. We have learned from the deployment flowcharts:
Overall view of the total process, from requester to completed actions
Shows the sequence of events
Each chart shows a strong symbiotic relationship between ECC, Police & Fire (e.g., maintaining radio contact for personal safety)
Shows areas where training needs improvement (e.g., obtaining relevant information from the requester)
Shows no outstanding signals
32 Process Measures
Process measures for each discrete service:
DS1: Operate ECC – total calls received in ECC in 2002; average time to dispatch a call; actual average monthly overtime
DS2: Monitor CCTV equipment – relatively new DS measurement; established a baseline for future measures (2.8 hours 24/7)
DS3: Respond to NCIC requests – total number of requests per year; relatively new DS measurement; established a baseline for future measures
37 Average Time To Dispatch A Call in Minutes Control Chart Within limits: No major signals
38 911 Control Chart
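The two control-chart slides above report the dispatch-time and 911-call data as "within limits" with no major signals. The deck does not say how the control limits were derived; the sketch below assumes a standard individuals chart (mean plus or minus 2.66 times the average moving range) and uses hypothetical monthly values, so it illustrates the idea rather than reproducing the ORS charts.

```python
# Minimal sketch (not the ORS method): individuals control-chart limits for
# monthly average dispatch times. The sample data below are hypothetical.

monthly_avg_dispatch_min = [4.2, 4.8, 4.5, 5.1, 4.3, 4.9, 4.6, 4.4, 5.0, 4.7]

mean = sum(monthly_avg_dispatch_min) / len(monthly_avg_dispatch_min)

# Average moving range between consecutive months
moving_ranges = [abs(b - a) for a, b in zip(monthly_avg_dispatch_min,
                                            monthly_avg_dispatch_min[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard individuals-chart limits: mean +/- 2.66 * average moving range
ucl = mean + 2.66 * mr_bar
lcl = max(0.0, mean - 2.66 * mr_bar)

signals = [x for x in monthly_avg_dispatch_min if not lcl <= x <= ucl]
print(f"mean={mean:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  signals={signals}")
```

With no points outside the limits, the chart supports the slides' "within limits, no major signals" reading.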
39 Total Calls Received in the Emergency Communications Center, June 2001 – June 2002
Total: 84,693 calls – 76% Non-Emergency, 14% Blue Light Phone (11,868), 10% Emergency (8,535)
40 Process Measure Findings
In the beginning months, 40-60 hours of overtime were mandated because of heightened security – an outside influence
Emergency incoming calls (approx. 11% of total calls) are within the control limits
Average time to dispatch a call is between 4 and 5 minutes
Training issues exist with the updated version of the CAD system
41 FY02 Learning and Growth (L&G) Data for the Annual Self Assessments Service Group 13: Operate Emergency Communications Center 26 September 2002 Summary Prepared by the Office of Quality Management
42 Learning and Growth Data Table About 2 days sick leave per employee About 1 award for every 2 employees 14% employee turnover 1 ER case out of 7 employees
43 Interpreting ECC's Data
Group turnover rate: ECC is slightly below average
Average hours of sick leave used: ECC is below average
Average number of awards received: ECC is slightly below average
Average number of EEO complaints: none
Average number of ER cases: one case in 2001
44 Summary of ECC's Learning and Growth Data
Fourteen percent employee turnover
About 2 days of sick leave used per employee
About one award for every 2 employees
No EEO complaints for the year 2002
Fourteen percent Employee Relations cases (actually only one case, in 2001)
Average number of ADR cases – none
45 Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data
Group turnover rate: fourteen percent – out of 32 SGs, ECC ranks slightly below the 50th percentile, with 50% being the mean
Average hours of sick leave used: with 35 hours as the average, ECC is below average at 18 hours – an indication that sick leave is not being abused
Average number of awards received: ECC is slightly below average; improvement is needed (Note: NIH in general needs to improve its awards program)
46 Conclusions Continued
Average number of EEO complaints: none – this indicates adherence to the rules and policies concerning EEO practices
Average number of ER cases: one case in 2001, which gives 14%; compared with other SGs, ECC was slightly above the average of .12 (ECC .14). Note: ECC is in the process of forming a union – another means to address ER issues.
Average number of ADR cases: none – ECC management & employees are working to resolve issues
47 Financial Perspective
48 Financial Findings: Overtime Trend
Until ECC reaches "full" staffing, OT costs will remain a part of the "normal" operating procedure.
ECC is using approximately 20% OT, which equates to 2.82 FTEs:
Avg. salary $38,314 × 11 FTEs = $421,450 (straight time)
Avg. OT rate $27.63 × 3,904.50 OT hours used = $107,881 (overtime)
Straight-time salary $421,450 + OT salary $107,881 = $529,331 (total personnel cost)
$107,881 / $38,314 = 2.82; 2.82 × $38,314 = $107,881
Hiring 3 additional personnel could be a break-even point
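The arithmetic on this slide can be checked with a short script. The figures below are taken from the slide itself; the break-even framing is illustrative only, and small rounding differences from the slide's totals are possible.

```python
# Overtime cost arithmetic from the slide; values are the slide's own.
avg_salary = 38_314          # average straight-time salary per FTE
staff_ftes = 11              # current FTE count
ot_rate = 27.63              # average overtime hourly rate
ot_hours = 3_904.50          # total overtime hours used

straight_cost = avg_salary * staff_ftes        # ~$421,450
ot_cost = ot_rate * ot_hours                   # ~$107,881
total_personnel_cost = straight_cost + ot_cost # ~$529,331

# Overtime expressed as equivalent FTEs at the average salary
ot_ftes = ot_cost / avg_salary                 # ~2.82

print(f"OT cost ${ot_cost:,.0f} = {ot_ftes:.2f} FTEs "
      f"-> hiring ~3 additional staff is roughly break-even")
```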
49 Asset Utilization Measures
This activity is difficult to measure in terms of standard outputs – how do you measure standby?
Guesstimate, through observation: 10% non-productive
Max input: 11 × 1,840 hours = 20,240 hours
Non-productive input: 2,024 hours (10%)
Asset utilization: 18,216 / 20,240 = 90%
Note: the 10% can be used for additional training needs
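A quick sketch of the asset-utilization estimate above; the 10% non-productive share is the slide's own guesstimate from observation, not a measured value.

```python
# Asset-utilization estimate from the slide.
ftes = 11
hours_per_fte = 1_840                 # hours assumed per FTE per year
max_input = ftes * hours_per_fte      # 20,240 hours
non_productive = 0.10 * max_input     # ~2,024 hours of standby/slack
utilization = (max_input - non_productive) / max_input
print(f"Asset utilization ~= {utilization:.0%}")   # ~90%
```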
50 Analysis of Readiness Conclusions: What is Needed
The right mix of skills & abilities: interpersonal skills, communication abilities (i.e., verbal, written), decision-making abilities, technical know-how, analytical skill, ability to multi-task, physical ability
In the next three years ECC is expected to expand its digital CCTV coverage off campus
There is a need for operational training (e.g., Weapons of Mass Destruction, handling bomb threats, etc.)
The right tools needed to carry out the mission are technological updates (e.g., MAAARS-View – shows the location of an incoming 911 call; DIAPHONE – records the incoming call; ANDOVER – updates for building access; CCTV)
At this point, budget concerns come into play
51 Readiness Continued What are the anticipated implications of not obtaining the right mix? Poor service Inefficiencies Liability issues Possible loss of life and property
52 Conclusions and Recommendations
53 Conclusions from FY02 ASA
The ECC process is working, but it is a squeaky wheel – ECC needs grease because of the following:
Need for increased pay & benefits to attract qualified personnel
Demand for service is increasing
Need for additional personnel
Need for formal training of personnel
Need for technological updates
A substantial safety risk factor exists because of insufficient trained personnel
Insufficient staffing equals insufficient service
As an integral part of the Public Safety Branch – without ECC, security and safety are at risk
54 Recommendations
Increase pay & benefits to attract qualified personnel – 10% retention pay; update the job description
Invest in Computer-Aided Dispatch and other ECC technical enhancements and upgrades
Hire additional personnel to: meet increasing demands; reduce overtime and risk; meet the Congressional directive (FY2000) from the USATREX survey suggestion to increase staffing levels to 16 FTEs; offset "abnormal" use of OT for normal operations
Train ECC personnel to meet ECC industry standards and certifications (e.g., Maryland State)
Liaise with the Fire Department for a better understanding of their needs
Build a website to inform the NIH community of our services
55 We're Here to Serve You NIH 9-1-1 what is Your Emergency? 9 - The number of days we need in our workweek 1 - The number of times we have to get it right. 1 - NIH Emergency Communication Center THE ONE to call
56 Appendices
57 Appendices include the following:
Page 2 of ASA Template
Customer segments graphs
Customer satisfaction graphs
Block diagram
Process maps
Process measure graphs
Learning and Growth graphs
Analysis of Readiness Information
59 Survey Respondents FY02 Respondents by IC
60 Radar Chart FY02 Product/Service Satisfaction Ratings Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
61 Radar Chart: FY02 Customer Service Satisfaction Ratings
Note: The rating scale ranges from 1 - 10, where "1" represents Unsatisfactory and "10" represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.
[Radar chart: Availability, Responsiveness, Convenience, Competence, Handling of Problems; category values 6.29, 7.06, 7.31, 5.82, 7.59; Service Group Index = 6.81; data based on 17 respondents]
62 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings Note: The Importance rating scale ranges from 1 - 10 where “1” represents Unimportant and “10” represents Important. The Satisfaction rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding.
63 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings: A Closer Look Note: A smaller portion of the chart is shown so that the individual data points can be labeled.
70 Average Time To Dispatch A Call in Minutes Control Chart Within limits: No major signals
71 911 Control Chart
72 Analysis of Readiness Conclusions: What is Needed
The right mix of skills & abilities: interpersonal skills, communication abilities (i.e., verbal, written), decision-making abilities, technical know-how, analytical skill, ability to multi-task, physical ability
In the next three years ECC is expected to expand its digital CCTV coverage off campus
There is a need for operational training (e.g., Weapons of Mass Destruction, handling bomb threats, etc.)
The right tools needed to carry out the mission are technological updates (e.g., MAAARS-View – shows the location of an incoming 911 call; DIAPHONE – records the incoming call; ANDOVER – updates for building access; CCTV)
At this point, budget concerns come into play
73 Readiness Continued What are the anticipated implications of not obtaining the right mix? Poor service Inefficiencies Liability issues Possible loss of life and property
74 Methodology ASA Teams determined best methodology to assess customer satisfaction FY02 methodology reviewed by OQM Customer segments to be assessed Customization of ORS Customer Scorecard instrument Description of item to be assessed (e.g., Service Group, Discrete Service, specific product/service) Method of survey distribution (e.g., email, hard copy) Accompanying Memos/email messages Timeline for distribution and return Number of surveys to be distributed Upon gaining approval, ASA Teams distributed surveys to customers
75 Methodology (cont.) Completed surveys were returned to OQM or to ASA Consultant (SAIC) Preserve customers’ anonymity Ensure the integrity of the results Survey data were entered into a database and analyzed Results typically summarized at Service Group level If sufficient number of completed surveys were returned, may be able to generate analyses for specific products/services
76 FY02 Learning and Growth (L&G) Data for the Annual Self Assessments Service Group 13: Operate Emergency Communications Center 26 September 2002 Summary Prepared by the Office of Quality Management
77 Methodology
All data represent occurrences from Oct 2001 - June 2002
Data analyzed covered the period between October 1st and the end of June to provide time to analyze and present the data
ORS Human Resources (HR) provided data on: turnover, sick leave, awards
HR data are stored in NIH databases by Standard Administrative Codes (SACs)
Developed a cross-reference of ORS Service Groups to SACs; almost all SACs are assigned to Service Groups
Some Service Groups have identical SACs; in this case, two Service Groups will receive the same set of data
78 Methodology (cont.) Also obtained data from: Equal Employment Opportunity (EEO) Number of EEO complaints Employee Relations (ER) Number of ER cases Alternative Dispute Resolution (ADR) ADR cases
79 Interpreting Your Data FY02 is the first time L&G data were collected and analyzed Compare your Service Group relative to the other ORS Service Groups What are all the L&G indicators telling you? In the future your group should compare itself to its own Service Group data over time Interpret data in terms of other ASA data Customer satisfaction ratings Process measures Financial measures Does the L&G data, when compared to data in other perspectives, show potential relationship (could L&G be contributing to customer satisfaction results)? From reviewing your Service Group’s L&G data, what could be done to improve Quality of Work Life (QOWL)?
80 Service Group Turnover Rate Calculated as the number of separations for a Service Group / Population of Service Group Separations defined as: Retirements (separation codes 3010, 3020, 3022) Resignations (separation codes 3120, 3170) Removals (separation codes 3300) Terminations (separation codes 3520, 3550, 3570) Promotions to new organization (separation codes 7020) Reassignments (separation code 7210) Note that transfers/promotions within ORS Divisions/Offices are not captured by the NIH database
81 Service Group Turnover Rate (cont.) Calculation of Service Group population was needed since number of employees changes over time Population for Service Group was estimated based on average of employee count at three snapshots in time (Nov 2001, Feb 2002, June 2002)
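Slides 80 and 81 define the turnover rate as separations divided by an estimated Service Group population. A minimal sketch of that calculation follows, using the separation codes and snapshot dates listed on the slides; the personnel records and head counts in it are hypothetical.

```python
# Sketch of the turnover-rate calculation described on slides 80-81.
SEPARATION_CODES = {
    "3010", "3020", "3022",          # retirements
    "3120", "3170",                  # resignations
    "3300",                          # removals
    "3520", "3550", "3570",          # terminations
    "7020",                          # promotions to a new organization
    "7210",                          # reassignments
}

# Hypothetical personnel actions for one Service Group (Oct 2001 - June 2002)
actions = [("3120", "2002-01-15"), ("7210", "2002-04-02"), ("1234", "2002-05-20")]
separations = sum(1 for code, _ in actions if code in SEPARATION_CODES)

# Population estimated as the average head count at three snapshots
snapshots = {"Nov 2001": 14, "Feb 2002": 14, "Jun 2002": 15}   # hypothetical counts
population = sum(snapshots.values()) / len(snapshots)

turnover_rate = separations / population
print(f"Turnover rate = {separations}/{population:.1f} = {turnover_rate:.0%}")
```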
82 Average Hours of Sick Leave Used Calculated as the total number of sick leave hours used for a Service Group / Population of Service Group
83 Average Number of Awards Received Calculated as the total number of awards received / Population of Service Group Includes both monetary and non-monetary awards Cash awards QSIs Time-off Honorary Customer Service
84 Average Number of EEO Complaints Calculated as the total number of EEO complaints for a Service Group / Population of Service Group
85 Average Number of ER Cases Calculated as the total number of ER cases for a Service Group / Population of Service Group A case is defined as any contact with the ER Office where an action occurs (e.g., a letter is prepared)
86 Average Number of ADR Cases Calculated as the number of ADR cases for a Service Group / Population of Service Group A case is initiated when a person contacts ADR
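The sick leave, awards, EEO, ER, and ADR measures on slides 82-86 all share the same per-capita form: a total (count or hours) divided by the Service Group population. A small illustrative helper follows; the input values are hypothetical, not ECC's actual figures.

```python
# Generic per-capita metric used by the Learning & Growth measures.
def per_capita(total: float, population: float) -> float:
    """Average per employee for a Service Group."""
    return total / population

population = 14.3                       # hypothetical snapshot-average head count
print(per_capita(258, population))      # average sick-leave hours used
print(per_capita(7, population))        # average awards received
print(per_capita(0, population))        # average EEO complaints
print(per_capita(1, population))        # average ER cases
print(per_capita(0, population))        # average ADR cases
```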
87 Learning and Growth Data Table About 2 days sick leave per employee About 1 award for every 2 employees 14% employee turnover 1 ER case out of 7 employees
88 Service Group Turnover Rate (Oct 2001 - June 2002) [chart: turnover rate by Service Group number]
89 Average Hours of Sick Leave Used (Oct 2001 - June 2002) [chart: average hours by Service Group number]
90 Average Number of Awards Received (Oct 2001 - June 2002) [chart: average number by Service Group number]
91 Average Number of EEO Complaints (Oct 2001 - June 2002) [chart: average number by Service Group number]
92 Average Number of ER Cases (Oct 2001 - June 2002) [chart: average number by Service Group number]
93 Average Number of ADR Cases (Oct 2001 - June 2002) [chart: average number by Service Group number]