
1 Performance Management Presentation. Maintain Safe Working Environment, Radiation Safety. Team Leader: Nancy Newman. Team Members: Douglas Carter, Janet Thomson, Victor Voegtli. ORS, National Institutes of Health. Date: February 23, 2005

2 Table of Contents: PM Template; Customer Perspective; Internal Business Process Perspective; Learning and Growth Perspective; Financial Perspective; Conclusions and Recommendations; Customer Satisfaction Survey Results

3 Table of Contents: Survey Background; Satisfaction Ratings on Specific Service Aspects; Importance Ratings on Specific Service Aspects; Comments; Summary; Recommendations

4

5

6 Customer Perspective

7 Customer Perspective (cont.)

8

9 C2: Enhance Communications with Customers. Measures. C2a: Number of visits to DRS Portal (currently unable to discriminate between AU and DRS employees, but will be able to after the upgrade). C2b: Length of time on Portal (measure eliminated, since no important data was retrieved). C2c: Tasks performed via Portal.

10 C2: Enhance Communications with Customers. C2c: Tasks performed via Portal, ranked by frequency: 1. Material Disposals, 2. User Changes, 3. Monthly Memo Printing, 4. Waste Pickup Requests, 5. NIH 88-1 Submissions, 6. User Registrations, 7. Lab Changes

11 C2: Enhance Communications with Customers. C2d: Tasks performed via Portal

12 C2: Enhance Communications with Customers. Initiatives and measures for FY'05: increase auditing capabilities for Portal usage; improve usability of Portal functions; increase transactions for infrequent tasks such as 88-1 form submission.

13 Customer Perspective (cont.)

14 C3: Percentage of people training on-line

15 C3: Percentage of people training on-line. Goal: increase on-line training. FY'04 initiative: on-line refresher training for AUs. Data show a decrease in on-line training; cause: elimination of the on-line training module for nurses. FY'06 initiative: new on-line training module for nurses.

16 Relationship Among Performance Objectives: enhancing communication with our customers would maintain compliance with regulations and increase customer satisfaction.

17 Internal Business Process Perspective

18 Internal Business Process Perspective

19 IB1a: Number of Security Violations

20 IB1b: Number of Non-security Violations

21 IB1a and IB1b: Number of security and non-security violations

22 IB2: Improve effectiveness of radioactive waste pick-up scheduling

23 Internal Business Process Perspective (cont.)

24 Improve Effectiveness of Radioactive Waste Pick-up Scheduling. [chart] On-line scheduling of radioactive waste pickups (percentage): FY'03 0.9%, FY'04 2.4%.

25 IB2: Percentage of radioactive waste pickups scheduled on-line. Baseline: 0.9%; Target: 5%; Achieved: 2.7%.

26 Internal Business Process Perspective (cont.)

27 IB3: Ensure timely return of dosimeters

28 IB3: Ensure timely return of dosimeters

29 IB3: Ensure timely return of dosimeters

30 Internal Business Process Perspective (cont.). The Focus Group (FG) average absentee rate is within 1 sigma of the target rate when comparing FG absent dosimeters to FG dosimeters issued. The FG absentee rate compares favorably to other medical/research institutions with dosimetry programs of similar size and type. A primary concern is that the FG comprises only 11 of the 70 badge groups at NIH, yet accounts for 44% of the missing dosimeters. None of the corrective actions implemented to date have had a substantial impact on the problem.
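The 1-sigma comparison on the slide above can be sketched numerically. This is a minimal illustration only: it assumes a binomial model for absent dosimeters, and the counts used are hypothetical, since the slides do not give the raw numbers.

```python
import math

def within_one_sigma(absent, issued, target_rate):
    """Check whether the observed absentee rate is within one binomial
    standard deviation of the target rate."""
    rate = absent / issued
    # Standard deviation of a binomial proportion evaluated at the target rate.
    sigma = math.sqrt(target_rate * (1 - target_rate) / issued)
    return abs(rate - target_rate) <= sigma, rate, sigma

# Hypothetical Focus Group numbers -- not from the presentation.
ok, rate, sigma = within_one_sigma(absent=12, issued=200, target_rate=0.05)
print(f"rate={rate:.3f}, sigma={sigma:.3f}, within 1 sigma: {ok}")
```

With these made-up counts the observed rate (6%) sits within one standard deviation of the 5% target, which is the kind of conclusion the slide draws for the FG.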

31 Internal Business Process Perspective. Actions taken: reorganized badge groups by size and location to make them more manageable; offered to buy and install badge boards to aid distribution and collection of dosimeters; distributed informational handouts to Authorized Users and Dosimeter Custodians detailing the importance of timely collection of dosimeters and of individual roles within the program; implemented hand delivery and pick-up of dosimeters for all badge groups residing on the main campus.

32 Internal Business Process Perspective. Actions pending: develop and implement an on-line training program for Dosimeter Custodians. Actions to be considered: levy a per-dosimeter charge against the parent institutes to offset the missing-dosimeter fees imposed by our contractor (these consume ~5% of our annual dosimetry budget); revoke individual user privileges for program participants who persistently fail to comply with program requirements.

33 Internal Business Process Perspective (cont.)

34 IB4: Increase awareness of the requirement for DRS review of Animal Study Program (ASP) proposals. Increased awareness is intended to reduce the number of ASPs involving radioactive materials or radiation-producing equipment that have not been reviewed by DRS. A baseline study of the FY'03 ASP program found that 90% of ASPs involving radiation were reviewed by DRS. To be effective, this initiative relies heavily on cooperation from DRS, the ACUC coordinator, DOHS reps, and the PIs.

35 IB4: Increase awareness of the requirement for DRS review of Animal Study Program (ASP) proposals. Steps taken to increase awareness: added information to the DRS website as well as the Office of Animal Care and Use (OACU) website; audited each institute's ASP files and compared them to the DRS files; surveyed each ACUC coordinator to better understand their role in the ASP review process; created a pre-screening checklist to help ACUC coordinators determine whether DRS review is needed.

36 IB4: Increase awareness of the requirement for DRS review of Animal Study Program (ASP) proposals. Steps taken to increase awareness (cont.): created a list of "buzzwords" to help DOHS reps become more familiar with the terminology used in ASPs involving radiation; developing a database to track ASPs; conducting annual reviews of existing and new ASPs.

37 IB4: Increase awareness of the requirement for DRS review of Animal Study Program (ASP) proposals

38 IB4: Increase awareness of the requirement for DRS review of Animal Study Program (ASP) proposals. Overall, the level of awareness has increased by 3%. A higher level of awareness is expected once the ASP database comes online: ASPs will be tracked and reviewed annually. The annual review should also enhance communication between the PIs and DRS and become another mechanism for heightening awareness.

39 Internal Business Process Perspective (cont.)

40 IB5: Ensure HPs have critical data in a timely manner

41 IB5: Ensure HPs have critical data in a timely manner

42 IB5: Ensure HPs have critical data in a timely manner

43 Internal Business Process Perspective (cont.). The delinquent-analysis rate falls easily within 1 sigma and is only slightly above the current target rate of 5%. The current target should be attainable now that the process has been established and the personnel involved give meeting specific timing goals appropriate priority.

44 Internal Business Process Perspective (cont.). Once the current target rate is achieved consistently, our long-range goal is to lower the target incrementally until it falls below 1%.

45 Learning and Growth Perspective

46 Learning and Growth Perspective

47 LG1: Determine and Maintain Effective Staffing Levels

48 LG1: Determine and Maintain Effective Staffing Levels. Reduced FTEs by 2, saving approximately $180,000. 3 employees now work elsewhere at NIH (career transitions and/or promotions). Conducted workshops to enhance teamwork. Recruited 2 employees. Developed questions for QuickHire.

49 LG1: Maintain Effective Staffing Levels. [chart] Turnover rate in the Division of Radiation Safety (number of departing employees), FY'03 vs. FY'04.

50 Learning and Growth Perspective (cont.)

51 LG2: Number of Awards and Dollars per Award. Unable to collect meaningful data: there is no centralized tracking system, and it is difficult to determine the value of different types of awards. Recommendation: discontinue this objective and measure.

52 Learning and Growth Perspective (cont.)

53 LG3a: Number of training hours per HP. The data collected were incomplete: seminars, workshops, etc., not funded by DRS were not tracked. Implemented a new tracking mechanism to capture total training hours for each HP.

54 Financial Perspective

55 Financial Perspective (cont.)

56 F1: Minimize cost at a defined service level for radiation safety

57 Financial Perspective. 36% increase in unit cost. Cause: incorporation of the cost of acquiring and distributing radionuclides, formerly covered under Fee-for-Service; DRS is now a 100% Membership Service.

58 Process Maps

59 IB4: Increase awareness of the requirement for DRS review of Animal Study Program (ASP) proposals

60 Conclusions

61 Conclusions from the PMP: our customers are highly satisfied with the services we provide; upgrade the tracking system for Portal usage; develop an on-line training module for nurses by FY'06; work to decrease the number of security violations by developing an on-line security training module; we were successful in reducing non-security violations, perhaps due to AU refresher training.

62 Conclusions (cont.): decrease missing dosimeters by training Dosimeter Custodians; benchmark dosimeter-return issues; create a database to track Animal Study Proposals; expedite sample preparation to reduce turnaround time for analysis.

63 Conclusions (cont.): develop a mechanism for tracking all training hours for HPs; continue to re-evaluate the necessary staffing level and adjust as needed; continue to look for cost-cutting opportunities.

64 Division of Radiation Safety (DRS) Dosimetry Survey. Joe Wolski, Office of Quality Management, Office of Research Services; Janice Rouiller, Ph.D., and Laura Stouffer, SAIC. January 6, 2005

65 Survey Background

66 Survey Background: Purpose

67 Survey Background: Methodology

68 Survey Background: Distribution. Number of surveys distributed: __; Number of respondents: 11; Response rate: __%.

69 Satisfaction Ratings on Specific Service Aspects

70 FY04 Satisfaction Ratings on Specific Service Aspects. [chart] Mean response, on a scale from (1) Unsatisfactory to (10) Outstanding. N = 11

71 FY04 Satisfaction Ratings: Available Services. [chart] N = 11, Mean = 9.30, Median = 10. Responses: 100% outstanding (8-10), 0% unsatisfactory (1-3)

72 FY04 Satisfaction Ratings: Quality. [chart] N = 11, Mean = 9.22, Median = 9. Responses: 100% outstanding, 0% unsatisfactory

73 FY04 Satisfaction Ratings: Timeliness. [chart] N = 11, Mean = 9.10, Median = 10. Responses: 100% outstanding, 0% unsatisfactory

74 FY04 Satisfaction Ratings: Reliability. [chart] N = 11, Mean = 9.20, Median = 10. Responses: 100% outstanding, 0% unsatisfactory

75 FY04 Satisfaction Ratings: Staff Availability. [chart] N = 11, Mean = 8.71, Median = 9. Responses: 86% outstanding, 14% mid-range, 0% unsatisfactory

76 FY04 Satisfaction Ratings: Responsiveness. [chart] N = 11, Mean = 9.00, Median = 9. Responses: 89% outstanding, 11% mid-range, 0% unsatisfactory

77 FY04 Satisfaction Ratings: Convenience. [chart] N = 11, Mean = 9.18, Median = 10. Responses: 91% outstanding, 9% mid-range, 0% unsatisfactory

78 FY04 Satisfaction Ratings: Competence. [chart] N = 11, Mean = 9.20, Median = 10. Responses: 90% outstanding, 10% mid-range, 0% unsatisfactory

79 FY04 Satisfaction Ratings: Handling of Problems. [chart] N = 11, Mean = 8.38, Median = 9. Responses: 75% outstanding, 25% mid-range, 0% unsatisfactory

80 Importance Ratings on Specific Service Aspects

81 FY04 Importance Ratings on Specific Service Aspects. [chart] Mean response, on a scale from (1) Unsatisfactory to (10) Outstanding. N = 10

82 FY04 Importance Ratings: Available Services. [chart] N = 10, Mean = 8.89, Median = 9. Responses: 89% highest importance (8-10), 11% mid-range, 0% unimportant (1-3)

83 FY04 Importance Ratings: Quality. [chart] N = 10, Mean = 8.88, Median = 9. Responses: 88% highest importance, 12% mid-range, 0% unimportant

84 FY04 Importance Ratings: Timeliness. [chart] N = 10, Mean = 9.00, Median = 9. Responses: 89% highest importance, 11% mid-range, 0% unimportant

85 FY04 Importance Ratings: Reliability. [chart] N = 10, Mean = 9.00, Median = 9. Responses: 89% highest importance, 11% mid-range, 0% unimportant

86 FY04 Importance Ratings: Staff Availability. [chart] N = 10, Mean = 8.57, Median = 9. Responses: 71% highest importance, 29% mid-range, 0% unimportant

87 FY04 Importance Ratings: Responsiveness. [chart] N = 10, Mean = 8.89, Median = 9. Responses: 78% highest importance, 22% mid-range, 0% unimportant

88 FY04 Importance Ratings: Convenience. [chart] N = 10, Mean = 8.90, Median = 10. Responses: 80% highest importance, 20% mid-range, 0% unimportant

89 FY04 Importance Ratings: Competence. [chart] N = 10, Mean = 8.89, Median = 9. Responses: 78% highest importance, 22% mid-range, 0% unimportant

90 FY04 Importance Ratings: Handling of Problems. [chart] N = 10, Mean = 8.57, Median = 9. Responses: 71% highest importance, 29% mid-range, 0% unimportant

91 Comments

92 Survey Comments. A total of 6 respondents (55%) provided at least one comment; 8 comments in all were made in response to 3 general questions: What was done particularly well? What needs to be added or improved? Other comments. Keep in mind that comments are qualitative data: they provide a different type of information from your customers regarding their satisfaction; they are NOT representative of the perceptions of all your customers; review them, but don't overreact to an individual comment; and they are a great source of ideas for improvement.

93 Survey Comments. What was done particularly well? (N = 5) As long as things work, I am happy. I get a report every month. Somebody delivers it to me. Availability of Radiation Safety officers. Doing excellent job! Thank you. Have only had to handle the exchange of the monitoring badge. The items are packaged well. When items are missing, they have been promptly replaced.

94 Survey Comments. What needs to be added or improved? (N = 2) Returning badges through the NIH internal mail is somewhat risky. The loss of a badge is such a headache, perhaps a more secure return system could be developed. It is all fine and well to get reports on my exposure, but my main concern is how much exposure I'm getting and what that means. The reports I get are not very informative. They use symbols that are not included in any key, so they are basically meaningless to me. Since I work with a high-energy gamma emitter, I would like to really know I'm safe. I don't really get that from the reports.

95 Survey Comments. Other Comments (N = 1) My RSO, John Jacohus, goes out of his way to help me with any problems/issues. Very competent, responsive and reliable.

96 Summary

97 Summary. Respondent Characteristics: __% of recipients responded to the survey. Satisfaction Ratings on Specific Service Aspects: respondents were asked to rate their satisfaction with the following aspects of Dosimetry services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, and Handling of Problems. The scale ranged from (1) Unsatisfactory to (10) Outstanding. Satisfaction mean ratings ranged from a high of 9.30 (Available Services) to a low of 8.38 (Handling of Problems). Note that the lowest mean rating (8.38) is still well above the midpoint of the 10-point scale; in general, respondent perceptions are quite positive.

98 Summary (cont.). Satisfaction Ratings on Specific Service Aspects (cont.): response frequencies were computed for each service aspect, with responses of 8, 9, and 10 grouped as indicating outstanding performance. For each service aspect, at least 75% of respondents perceived that aspect to be outstanding; for 4 aspects (Available Services, Quality, Timeliness, and Reliability), all respondents rated the service outstanding. None of the respondents found service to be unsatisfactory (responses of 1, 2, or 3) in any aspect.
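The grouping described above (responses of 8-10 counted as outstanding, 1-3 as unsatisfactory) is easy to reproduce. A small sketch using made-up ratings, not the actual survey responses:

```python
from statistics import mean, median

def summarize(ratings):
    """Summarize 1-10 ratings: mean, median, and the share of responses
    grouped as outstanding (8-10) and unsatisfactory (1-3)."""
    n = len(ratings)
    outstanding = sum(1 for r in ratings if r >= 8) / n
    unsatisfactory = sum(1 for r in ratings if r <= 3) / n
    return {
        "n": n,
        "mean": round(mean(ratings), 2),
        "median": median(ratings),
        "pct_outstanding": round(100 * outstanding),
        "pct_unsatisfactory": round(100 * unsatisfactory),
    }

# Hypothetical ratings for one service aspect (N = 11, matching the survey size).
print(summarize([10, 9, 10, 8, 9, 10, 7, 9, 10, 9, 8]))
```

The same function applied per service aspect yields exactly the statistics reported on the rating slides: N, mean, median, and the percentage of responses in each bin.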

99 Summary (cont.). Importance Ratings on Specific Service Aspects: respondents were asked to rate the importance of the following aspects of Dosimetry services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, and Handling of Problems. The scale ranged from (1) Unsatisfactory to (10) Outstanding. Importance mean ratings ranged from a high of 9.00 (Timeliness and Reliability) to a low of 8.57 (Staff Availability and Handling of Problems). Note that the lowest mean rating (8.57) is still well above the midpoint of the 10-point scale; in general, respondents find all service aspects very important. Note: in future surveys, the scale anchors for importance ratings should be changed to (1) Unimportant to (10) Very Important.

100 Summary (cont.). Importance Ratings on Specific Service Aspects (cont.): response frequencies were computed for each service aspect, with responses of 8, 9, and 10 grouped as indicating highest importance. For each service aspect, at least 71% of respondents perceived that aspect to be of the highest importance. None of the respondents found any service aspect unimportant (responses of 1, 2, or 3).

101 Recommendations

102 Recommendations. Interpret ORS Customer Scorecard data in terms of the other PM data gathered: does the customer-satisfaction data, when compared with data in the other perspectives, show potential relationships? Review the comments in this presentation for specific issues that can be tackled; take the time to read through all comments. If appropriate, generate potential actions based on what you have learned from the data: can you make changes to address the issues raised, and how might you implement those actions? Communicate the survey results (and intended actions) to important stakeholder groups: ORS Senior Management, Radiation Safety staff, and the survey respondent pool. Conduct a follow-up survey (within the next 2 years) to check on improvements.

103 Division of Radiation Safety (DRS) Analytical Laboratory Survey. Joe Wolski, Office of Quality Management, Office of Research Services; Janice Rouiller, Ph.D., and Laura Stouffer, SAIC. January 6, 2005

104 Table of Contents: Survey Background; Satisfaction Ratings on Specific Service Aspects; Importance Ratings on Specific Service Aspects; Comments; Summary; Recommendations

105 Survey Background

106 Survey Background: Purpose

107 Survey Background: Methodology

108 Survey Background: Distribution. Number of surveys distributed: __; Number of respondents: 8; Response rate: __%.

109 Satisfaction Ratings on Specific Service Aspects

110 FY04 Satisfaction Ratings on Specific Service Aspects. [chart] Mean response, on a scale from (1) Unsatisfactory to (10) Outstanding. N = 8

111 FY04 Satisfaction Ratings: Available Services. [chart] N = 8, Mean = 8.75, Median = 9. Responses: 100% outstanding (8-10), 0% unsatisfactory (1-3)

112 FY04 Satisfaction Ratings: Quality. [chart] N = 8, Mean = 8.63, Median = 9. Responses: 100% outstanding, 0% unsatisfactory

113 FY04 Satisfaction Ratings: Timeliness. [chart] N = 8, Mean = 8.13, Median = 8. Responses: 50% outstanding, 50% mid-range, 0% unsatisfactory

114 FY04 Satisfaction Ratings: Reliability. [chart] N = 8, Mean = 8.75, Median = 9. Responses: 88% outstanding, 12% mid-range, 0% unsatisfactory

115 FY04 Satisfaction Ratings: Staff Availability. [chart] N = 8, Mean = 8.25, Median = 8. Responses: 88% outstanding, 12% mid-range, 0% unsatisfactory

116 FY04 Satisfaction Ratings: Responsiveness. [chart] N = 8, Mean = 8.50, Median = 9. Responses: 75% outstanding, 25% mid-range, 0% unsatisfactory

117 FY04 Satisfaction Ratings: Convenience. [chart] N = 8, Mean = 8.75, Median = 9. Responses: 88% outstanding, 12% mid-range, 0% unsatisfactory

118 FY04 Satisfaction Ratings: Competence. [chart] N = 8, Mean = 8.75, Median = 9. Responses: 100% outstanding, 0% unsatisfactory

119 FY04 Satisfaction Ratings: Handling of Problems. [chart] N = 8, Mean = 8.43, Median = 8. Responses: 86% outstanding, 14% mid-range, 0% unsatisfactory

120 Importance Ratings on Specific Service Aspects

121 FY04 Importance Ratings on Specific Service Aspects. [chart] Mean response, on a scale from (1) Unsatisfactory to (10) Outstanding. N = 8

122 FY04 Importance Ratings: Available Services. [chart] N = 8, Mean = 9.00, Median = 9. Responses: 88% highest importance (8-10), 12% mid-range, 0% unimportant (1-3)

123 FY04 Importance Ratings: Quality. [chart] N = 8, Mean = 8.75, Median = 9. Responses: 88% highest importance, 12% mid-range, 0% unimportant

124 FY04 Importance Ratings: Timeliness. [chart] N = 8, Mean = 8.88, Median = 9. Responses: 88% highest importance, 12% mid-range, 0% unimportant

125 FY04 Importance Ratings: Reliability. [chart] N = 8, Mean = 8.88, Median = 9. Responses: 100% highest importance, 0% unimportant

126 FY04 Importance Ratings: Staff Availability. [chart] N = 8, Mean = 8.63, Median = 9. Responses: 75% highest importance, 25% mid-range, 0% unimportant

127 FY04 Importance Ratings: Responsiveness. [chart] N = 8, Mean = 8.63, Median = 9. Responses: 75% highest importance, 25% mid-range, 0% unimportant

128 FY04 Importance Ratings: Convenience. [chart] N = 8, Mean = 7.75, Median = 8. Responses: 50% highest importance, 50% mid-range, 0% unimportant

129 FY04 Importance Ratings: Competence. [chart] N = 8, Mean = 9.00, Median = 10. Responses: 88% highest importance, 12% mid-range, 0% unimportant

130 FY04 Importance Ratings: Handling of Problems. [chart] N = 8, Mean = 8.38, Median = 9. Responses: 75% highest importance, 25% mid-range, 0% unimportant

131 Comments

132 Survey Comments. A total of 7 respondents (88%) provided at least one comment; 18 comments in all were made in response to 3 general questions: What was done particularly well? What needs to be added or improved? Other comments. Keep in mind that comments are qualitative data: they provide a different type of information from your customers regarding their satisfaction; they are NOT representative of the perceptions of all your customers; review them, but don't overreact to an individual comment; and they are a great source of ideas for improvement.

133 Survey Comments. What was done particularly well? (N = 6) Never any confusion on the results. Following SOP for counting requests, providing consistent results. The lab manager is widely available and willing to talk about issues HPs have that require their (TSB's) services. I have always received excellent service and quick response to questions and problems. Everything. Doug really takes pride in running the lab well. Vince is always a smiling face in the lab.

134 Survey Comments. What needs to be added or improved? (N = 6) Turn-around time and transition from one lab worker to the next could be improved. Direct communication with dosimetry custodians regarding missing dosimetry. (I do understand that efforts are underway to improve this.) Closer tracking of situations where missing dosimetry requires a close estimation. Explaining why users are receiving their annual exposure report and what it means. Ability to perform whole-body scanning on someone with highly contaminated hands. Contractor prep of samples could be more timely on occasion. Nothing. I would recommend taking the Analytical Lab services back from the contractor and just do the function in-house. We have the staff already.

135 Survey Comments. Other Comments (N = 6) Overall, pretty good job! Timeliness of HP notification has improved greatly; takes pressure off HPs. Scale for importance not relevant in survey; a better form for evaluation needs to be developed. Overall analytical lab service has improved greatly since the hiring of a lab manager. Great work! Keep it up. Get those SOPs done!

136 Summary

137 Summary. Respondent Characteristics: __% of recipients responded to the survey. Satisfaction Ratings on Specific Service Aspects: respondents were asked to rate their satisfaction with the following aspects of Analytical Laboratory services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, and Handling of Problems. The scale ranged from (1) Unsatisfactory to (10) Outstanding. Satisfaction mean ratings ranged from a high of 8.75 (Available Services, Reliability, Convenience, and Competence) to a low of 8.13 (Timeliness). Note that the lowest mean rating (8.13) is still well above the midpoint of the 10-point scale; in general, respondent perceptions are quite positive.

138 Summary (cont.). Satisfaction Ratings on Specific Service Aspects (cont.): response frequencies were computed for each service aspect, with responses of 8, 9, and 10 grouped as indicating outstanding performance. For each service aspect, at least 50% of respondents perceived that aspect to be outstanding; for 3 aspects (Available Services, Quality, and Competence), all respondents rated the service outstanding. None of the respondents found service to be unsatisfactory (responses of 1, 2, or 3) in any aspect.

139 Summary (cont.). Importance Ratings on Specific Service Aspects: respondents were asked to rate the importance of the following aspects of Analytical Laboratory services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, and Handling of Problems. The scale ranged from (1) Unsatisfactory to (10) Outstanding. Importance mean ratings ranged from a high of 9.00 (Available Services and Competence) to a low of 7.75 (Convenience). Note that the lowest mean rating (7.75) is still well above the midpoint of the 10-point scale; in general, respondents find all services quite important. Note: in future surveys, the scale anchors for importance ratings should be changed to (1) Unimportant to (10) Very Important.

140 Summary (cont.). Importance Ratings on Specific Service Aspects (cont.): response frequencies were computed for each service aspect, with responses of 8, 9, and 10 grouped as indicating highest importance. For each service aspect, at least 50% of respondents perceived that aspect to be of the highest importance; for Reliability, all respondents rated the aspect of the highest importance. None of the respondents found any service aspect unimportant (responses of 1, 2, or 3).

141 Recommendations

142 Recommendations. Interpret ORS Customer Scorecard data in terms of the other PM data gathered: does the customer-satisfaction data, when compared with data in the other perspectives, show potential relationships? Review the comments in this presentation for specific issues that can be tackled; take the time to read through all comments. If appropriate, generate potential actions based on what you have learned from the data: can you make changes to address the issues raised, and how might you implement those actions? Communicate the survey results (and intended actions) to important stakeholder groups: ORS Senior Management, Radiation Safety staff, and the survey respondent pool. Conduct a follow-up survey (within the next 2 years) to check on improvements.

