1 Performance Management Presentation
Maintain Safe Working Environment: Radiation Safety
Team Leader: Nancy Newman
Team Members: Douglas Carter, Janet Thomson, Victor Voegtli
ORS, National Institutes of Health
Date: February 23, 2005

2 Table of Contents
PM Template
Customer Perspective
Internal Business Process Perspective
Learning and Growth Perspective
Financial Perspective
Conclusions and Recommendations
Customer Satisfaction Survey Results

3 Table of Contents
Survey Background (slide 3)
Satisfaction Ratings on Specific Service Aspects (slide 7)
Importance Ratings on Specific Service Aspects
Comments (slide 29)
Summary (slide 34)
Recommendations (slide 39)

4 (slide content not captured in transcript)

5 (slide content not captured in transcript)

6 Customer Perspective

7 Customer Perspective (cont.)

8 (slide content not captured in transcript)

9 C2 Enhance Communications with customers
Measures:
C2a: Number of visits to DRS Portal (cannot yet distinguish AU visits from DRS-employee visits; the upgrade will make this possible)
C2b: Length of time on Portal (measure eliminated; it yielded no useful data)
C2c: Tasks performed via Portal

10 C2 Enhance Communications with customers
C2c: Tasks performed via Portal, by frequency:
1. Material Disposals
2. User Changes
3. Monthly Memo Printing
4. Waste Pickup Requests
5. NIH 88-1 Submissions
6. User Registrations
7. Lab Changes

11 C2 Enhance Communications with customers C2d: Tasks performed via Portal

12 C2 Enhance Communications with customers
Initiatives and Measures for FY'05:
Increase auditing capabilities for Portal usage
Improve usability of Portal functions
Increase transactions for infrequent tasks such as NIH 88-1 form submission
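A minimal sketch of how the Portal-usage auditing could work, assuming a hypothetical event log of (user role, task) records; the log layout, role labels, and task names here are illustrative, not the actual DRS Portal schema:

    # Tally portal tasks overall and per user role, so AU traffic can be
    # separated from DRS-employee traffic once the upgrade makes roles visible.
    # The event log below is hypothetical; it is not real DRS Portal data.
    from collections import Counter

    events = [
        ("AU", "Material Disposal"),
        ("AU", "Waste Pickup Request"),
        ("DRS", "Monthly Memo Printing"),
        ("AU", "User Change"),
        ("AU", "Material Disposal"),
    ]

    task_totals = Counter(task for _, task in events)
    role_task_totals = Counter(events)

    print("Most frequent tasks:", task_totals.most_common())
    print("AU material disposals:", role_task_totals[("AU", "Material Disposal")])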

13 Customer Perspective (cont.)

14 C3: Percentage of people training on-line

15 C3: Percentage of people training on-line
Goal: increase on-line training
FY'04 Initiative: on-line refresher training for AUs
Data show decrease in on-line training
Cause: elimination of on-line training module for nurses
FY'06 Initiative: new on-line training module for nurses

16 Relationship Among Performance Objectives
Enhancing communication with our customers would:
Maintain compliance with regulations
Increase customer satisfaction

17 Internal Business Process Perspective

18 Internal Business Process Perspective

19 IB1a: Number of Security Violations

20 IB1b: Number of Non-security Violations

21 IB1a and b: Number of security and non-security violations

22 IB2: Improve effectiveness of radioactive waste pick-up scheduling

23 Internal Business Process Perspective (cont.)

24 Improve Effectiveness of Radioactive Waste Pick-up Scheduling (chart: percentage of on-line scheduling of radioactive waste pickups, FY'03 vs. FY'04)

25 IB2: Percentage of radioactive waste pickups scheduled on-line
Baseline: 0.9%
Target: 5%
Achieved: 2.7%

26 Internal Business Process Perspective (cont.)

27 IB3: Ensure timely return of dosimeters

28 IB3: Ensure timely return of dosimeters

29 IB3: Ensure timely return of dosimeters

30 Internal Business Process Perspective (cont.)
The Focus Group (FG) average absentee rate is within 1-sigma of the target rate when comparing FG absent dosimeters to FG dosimeters issued.
The FG absentee rate compares favorably to other medical/research institutions with dosimetry programs of similar size and type.
A primary concern is that the FG comprises only 11 of the 70 badge groups at NIH, yet it accounts for 44% of the missing dosimeters.
None of the corrective actions implemented to date have made a substantial impact on alleviating the problem.
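A worked check of the concentration figure above: the Focus Group is 11 of 70 badge groups, about 16% of the groups, yet holds 44% of the missing dosimeters, so its share of losses is roughly 2.8 times its share of groups:

    # Quantify how over-represented the Focus Group (FG) is among missing
    # dosimeters, using the figures from this slide (11 of 70 badge groups,
    # 44% of missing dosimeters).
    fg_groups, all_groups = 11, 70
    fg_share_of_missing = 0.44

    group_share = fg_groups / all_groups              # ~0.157
    over_representation = fg_share_of_missing / group_share
    print(f"FG is {group_share:.1%} of badge groups but holds "
          f"{fg_share_of_missing:.0%} of missing dosimeters "
          f"({over_representation:.1f}x over-represented)")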

31 Internal Business Process Perspective
Actions taken:
Reorganized badge groups by size and location to make them more manageable
Offered to buy and install badge boards to aid distribution and collection of dosimeters
Distributed informational handouts to Authorized Users and Dosimeter Custodians detailing the importance of timely dosimeter collection and of individual roles within the program
Implemented hand delivery and pick-up of dosimeters for all badge groups on the main campus

32 Internal Business Process Perspective
Actions pending:
Develop and implement an on-line training program for Dosimeter Custodians
Actions to be considered:
Levy a per-dosimeter charge against the parent institutes to offset the missing-dosimeter fees imposed on us by our contractor (these fees consume ~5% of our annual dosimetry budget)
Revoke individual user privileges for program participants who persistently fail to comply with program requirements

33 Internal Business Process Perspective (cont.)

34 IB4: Increase awareness of requirement for DRS review of Animal Study Program (ASP) proposals
Increased awareness is intended to reduce the number of ASPs involving radioactive materials or radiation-producing equipment that have not been reviewed by DRS.
A baseline study of the FY'03 ASP program found that 90% of ASPs involving radiation were reviewed by DRS.
To be effective, this initiative relies heavily on cooperation among DRS, the ACUC coordinators, DOHS reps, and the PIs.

35 IB4: Increase awareness of requirement for DRS review of Animal Study Program (ASP) proposals
Steps taken to increase awareness:
Added information to the DRS website and the Office of Animal Care and Use (OACU) website
Audited each institute's ASP files and compared them to DRS files
Surveyed each ACUC coordinator to better understand their role in the ASP review process
Created a pre-screening checklist to help ACUC coordinators determine whether DRS review is needed

36 IB4: Increase awareness of requirement for DRS review of Animal Study Program (ASP) proposals
Steps taken to increase awareness (cont.):
Created a list of “buzzwords” to help DOHS reps become more familiar with terminology used in ASPs involving radiation
Developing a database to track ASPs
Instituting annual reviews of existing and new ASPs

37 IB4: Increase awareness of requirement for DRS review of Animal Study Program (ASP) proposals

38 IB4: Increase awareness of requirement for DRS review of Animal Study Program (ASP) proposals
Overall, awareness has increased by 3%. We expect awareness to rise further once the ASP database comes online: ASPs will be tracked and reviewed annually, and the annual review should also enhance communication between the PIs and DRS, providing another mechanism to heighten awareness.

39 Internal Business Process Perspective (cont.)

40 IB5: Ensure HPs have critical data in a timely manner

41 IB5: Ensure HPs have critical data in a timely manner

42 IB5: Ensure HPs have critical data in a timely manner

43 Internal Business Process Perspective (cont.)
The Delinquent Analysis rate falls comfortably within 1-sigma and is only slightly above the current target rate of 5%. The target should now be attainable: the process has been established, and the personnel involved give meeting specific timing goals appropriate priority.
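A minimal sketch of the 1-sigma check described above, using made-up monthly delinquent-analysis rates (the underlying data are not in this transcript); the rate passes if it sits no more than one standard deviation above the 5% target:

    # Check whether the mean delinquent-analysis rate falls within one
    # standard deviation of the 5% target. The monthly rates are hypothetical.
    import statistics

    monthly_rates = [0.072, 0.061, 0.055, 0.049, 0.058, 0.053]
    target = 0.05

    mean_rate = statistics.mean(monthly_rates)
    sigma = statistics.stdev(monthly_rates)
    print(f"mean={mean_rate:.3f}, sigma={sigma:.3f}, "
          f"within 1-sigma of target: {abs(mean_rate - target) <= sigma}")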

44 Internal Business Process Perspective (cont.) After the current target rate is achieved consistently, our long range goal is to lower the target rate incrementally until it falls below 1%.

45 Learning and Growth Perspective

46 Learning and Growth Perspective

47 LG1: Determine and Maintain Effective Staffing Levels

48 LG1: Determine and Maintain Effective Staffing Levels
Reduced FTEs by 2
Saved approximately $180,000
3 employees now elsewhere at NIH (reasons: career transitions and/or promotions)
Conducted workshops to enhance teamwork
Recruited 2 employees
Developed questions for QuickHire

49 LG1: Maintain Effective Staffing Levels (chart: number of departing employees / turnover rate in the Division of Radiation Safety, FY'03 vs. FY'04)

50 Learning and Growth Perspective (cont.)

51 LG2: Number of Awards and Dollars per Award
Unable to collect meaningful data
No centralized tracking system
Difficult to determine value of different types of awards
Discontinue this objective and measure

52 Learning and Growth Perspective (cont.)

53 LG3a: Number of training hours per HP
Data collected were incomplete
Seminars, workshops, etc., not funded by DRS were not tracked
Implemented a new tracking mechanism to capture total training hours for each HP

54 Financial Perspective

55 Financial Perspective (cont.)

56 F1: Minimize cost at a defined service level for radiation safety

57 Financial Perspective
36% increase in unit cost
Cause: incorporation of the cost of acquiring and distributing radionuclides, formerly billed under Fee For Service
DRS is now 100% Membership Service
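Illustrative arithmetic for the 36% figure: folding a formerly Fee-For-Service cost pool into the membership cost base raises the unit cost even when the service level is unchanged. All dollar amounts below are made up for the example:

    # Show how absorbing the radionuclide acquisition/distribution costs into
    # the membership base yields a 36% unit-cost increase. Figures are invented.
    base_cost = 2_500_000         # hypothetical prior membership cost base
    radionuclide_cost = 900_000   # hypothetical formerly fee-for-service pool
    units_of_service = 1_000      # hypothetical defined service units

    old_unit_cost = base_cost / units_of_service
    new_unit_cost = (base_cost + radionuclide_cost) / units_of_service
    increase = new_unit_cost / old_unit_cost - 1
    print(f"unit cost: ${old_unit_cost:,.0f} -> ${new_unit_cost:,.0f} ({increase:.0%})")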

58 Process Maps

59 IB4: Increase awareness of requirement for DRS review of Animal Study Program (ASP) proposals

60 Conclusions

61 Conclusions from PMP
Our customers are highly satisfied with the services we provide
Upgrade tracking system for Portal usage
Develop on-line training module for nurses by FY'06
Work to decrease the number of security violations by developing an on-line security training module
Successful in reducing non-security violations, perhaps due to AU refresher training

62 Conclusions (cont.)
Decrease missing dosimeters by training Dosimeter Custodians
Benchmark dosimetry return issues
Create database to track Animal Study Proposals
Expedite sample preparation to reduce turnaround time for analysis

63 Conclusions (cont.)
Develop mechanism for tracking all training hours for HPs
Continue to re-evaluate staffing levels and adjust as necessary
Continue to look for cost-cutting opportunities

64 Division of Radiation Safety (DRS) Dosimetry Survey
Joe Wolski, Office of Quality Management, Office of Research Services
Janice Rouiller, Ph.D., and Laura Stouffer, SAIC
6 January 2005

65 Survey Background

66 Survey Background Purpose

67 Survey Background Methodology

68 Survey Background: Distribution
Number of surveys distributed: __
Number of respondents: 11
Response rate: __%

69 Satisfaction Ratings on Specific Service Aspects

70 FY04 Satisfaction Ratings on Specific Service Aspects (chart: mean response per aspect; scale 1 Unsatisfactory to 10 Outstanding; N = 11)

71 FY04 Satisfaction Ratings: Available Services (response histogram; N = 11, Mean = 9.30; ratings 8-10: 100%, ratings 1-3: 0%)

72 FY04 Satisfaction Ratings: Quality (response histogram; N = 11, Mean = 9.22, Median = 9; ratings 8-10: 100%, ratings 1-3: 0%)

73 FY04 Satisfaction Ratings: Timeliness (response histogram; N = 11, Mean = 9.10, Median = 10; ratings 8-10: 100%, ratings 1-3: 0%)

74 FY04 Satisfaction Ratings: Reliability (response histogram; N = 11, Mean = 9.20, Median = 10; ratings 8-10: 100%, ratings 1-3: 0%)

75 FY04 Satisfaction Ratings: Staff Availability (response histogram; N = 11, Mean = 8.71, Median = 9; ratings 8-10: 86%, mid-scale: 14%, ratings 1-3: 0%)

76 FY04 Satisfaction Ratings: Responsiveness (response histogram; N = 11, Mean = 9.00, Median = 9; ratings 8-10: 89%, mid-scale: 11%, ratings 1-3: 0%)

77 FY04 Satisfaction Ratings: Convenience (response histogram; N = 11, Mean = 9.18, Median = 10; ratings 8-10: 91%, mid-scale: 9%, ratings 1-3: 0%)

78 FY04 Satisfaction Ratings: Competence (response histogram; N = 11, Mean = 9.20, Median = 10; ratings 8-10: 90%, mid-scale: 10%, ratings 1-3: 0%)

79 FY04 Satisfaction Ratings: Handling of Problems (response histogram; N = 11, Mean = 8.38, Median = 9; ratings 8-10: 75%, mid-scale: 25%, ratings 1-3: 0%)

80 Importance Ratings on Specific Service Aspects

81 FY04 Importance Ratings on Specific Service Aspects (chart: mean response per aspect; scale 1 Unsatisfactory to 10 Outstanding; N = 10)

82 FY04 Importance Ratings: Available Services (response histogram; N = 10, Mean = 8.89, Median = 9; ratings 8-10: 89%, mid-scale: 11%, ratings 1-3: 0%)

83 FY04 Importance Ratings: Quality (response histogram; N = 10, Mean = 8.88, Median = 9; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

84 FY04 Importance Ratings: Timeliness (response histogram; N = 10, Mean = 9.00, Median = 9; ratings 8-10: 89%, mid-scale: 11%, ratings 1-3: 0%)

85 FY04 Importance Ratings: Reliability (response histogram; N = 10, Mean = 9.00, Median = 9; ratings 8-10: 89%, mid-scale: 11%, ratings 1-3: 0%)

86 FY04 Importance Ratings: Staff Availability (response histogram; N = 10, Mean = 8.57, Median = 9; ratings 8-10: 71%, mid-scale: 29%, ratings 1-3: 0%)

87 FY04 Importance Ratings: Responsiveness (response histogram; N = 10, Mean = 8.89, Median = 9; ratings 8-10: 78%, mid-scale: 22%, ratings 1-3: 0%)

88 FY04 Importance Ratings: Convenience (response histogram; N = 10, Mean = 8.90, Median = 10; ratings 8-10: 80%, mid-scale: 20%, ratings 1-3: 0%)

89 FY04 Importance Ratings: Competence (response histogram; N = 10, Mean = 8.89, Median = 9; ratings 8-10: 78%, mid-scale: 22%, ratings 1-3: 0%)

90 FY04 Importance Ratings: Handling of Problems (response histogram; N = 10, Mean = 8.57, Median = 9; ratings 8-10: 71%, mid-scale: 29%, ratings 1-3: 0%)

91 Comments

92 Survey Comments
A total of 6 respondents (55% of respondents) provided at least one comment.
A total of 8 comments were made on 3 general questions:
What was done particularly well?
What needs to be added or improved?
Other comments
Keep in mind that comments are qualitative data:
They provide a different type of information from your customers regarding their satisfaction
They are NOT representative of the perceptions of all your customers
Review them, but don't overreact to an individual comment
They are a great source of ideas on how to improve

93 Survey Comments
What was done particularly well? (N = 5)
As long as things work, I am happy.
I get a report every month. Somebody delivers it to me.
Availability of Radiation Safety officers.
Doing excellent job! Thank you.
Have only had to handle the exchange of the monitoring badge. The items are packaged well. When items are missing, they have been promptly replaced.

94 Survey Comments
What needs to be added or improved? (N = 2)
Returning badges through the NIH internal mail is somewhat risky. The loss of a badge is such a headache, perhaps a more secure return system could be developed.
It is all fine and well to get reports on my exposure, but my main concern is how much exposure I'm getting and what that means. The reports I get are not very informative. They use symbols that are not included in any key, so they are basically meaningless to me. Since I work with a high energy Gamma emitter, I would like to really know I'm safe. I don't really get that from the reports.

95 Survey Comments
Other Comments (N = 1)
My RSO, John Jacohus goes out of his way to help me any problems/issues. Very competent, responsive and reliable.

96 Summary

97 Summary
Respondent Characteristics:
__% of recipients responded to the survey.
Satisfaction Ratings on Specific Service Aspects:
Respondents were asked to rate their satisfaction with the following aspects of Dosimetry services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, Handling of Problems.
The scale ranged from (1) Unsatisfactory to (10) Outstanding.
Satisfaction mean ratings range from a high of 9.30 on Available Services to a low of 8.38 on Handling of Problems.
Notice that the lowest mean rating (8.38) is still well above the midpoint of a 10-point scale. In general, respondent perceptions are quite positive.

98 Summary (cont.)
Satisfaction Ratings on Specific Service Aspects (cont.):
Response frequencies for each service aspect were computed, and responses of 8, 9, and 10 were grouped as indicating outstanding performance.
For each service aspect, at least 75% of respondents perceived that aspect to be outstanding.
For 4 service aspects (Available Services, Quality, Timeliness, and Reliability), all respondents indicated that the service was outstanding.
None of the respondents found service to be unsatisfactory (responses of 1, 2, or 3) in any of the service aspects.
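A minimal sketch of the tabulation used throughout these slides: compute the mean and median of the 1-10 ratings, then group responses of 8-10 as outstanding and 1-3 as unsatisfactory. The ratings list is illustrative, not the survey data:

    # Reproduce the slide's summary statistics for one service aspect.
    # The ratings below are hypothetical stand-ins for the N = 11 responses.
    import statistics

    ratings = [10, 9, 9, 10, 8, 9, 10, 9, 10, 9, 10]

    mean = statistics.mean(ratings)
    median = statistics.median(ratings)
    pct_outstanding = sum(r >= 8 for r in ratings) / len(ratings)
    pct_unsatisfactory = sum(r <= 3 for r in ratings) / len(ratings)
    print(f"mean={mean:.2f}, median={median}, "
          f"outstanding={pct_outstanding:.0%}, unsatisfactory={pct_unsatisfactory:.0%}")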

99 Summary (cont.)
Importance Ratings on Specific Service Aspects:
Respondents were asked to rate the importance of the following aspects of Dosimetry services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, Handling of Problems.
The scale ranged from (1) Unsatisfactory to (10) Outstanding.
Importance mean ratings range from a high of 9.00 on Timeliness and Reliability to a low of 8.57 on Staff Availability and Handling of Problems.
Notice that the lowest mean rating (8.57) is still well above the midpoint of a 10-point scale. In general, respondents find all service aspects to be very important.
Note: In future surveys, the scale anchors for importance ratings should be changed to (1) Unimportant and (10) Very Important.

100 Summary (cont.) Importance Ratings on Specific Service Aspects (cont.) Response frequencies for each service aspect were computed and responses of 8, 9, and 10 grouped as indicating highest importance. For each service aspect, at least 71% of respondents perceived that aspect to be of the highest importance. None of the respondents find any of the service aspects to be unimportant (responses of 1, 2, or 3).

101 Recommendations

102 Recommendations
Interpret ORS Customer Scorecard data in terms of other PM data gathered:
Does the customer satisfaction data, when compared to data in other perspectives, show potential relationships?
Review comments in this presentation for specific issues that can be tackled:
Take the time to read through all comments.
If appropriate, generate potential actions based on what you have learned from the data:
Can you make changes to address the issues raised? How might you implement those actions?
Communicate the survey results (and intended actions) to important stakeholder groups:
ORS Senior Management, Radiation Safety staff, survey respondent pool.
Conduct a follow-up survey (within the next 2 years) to check on improvements.

103 Division of Radiation Safety (DRS) Analytical Laboratory Survey
Joe Wolski, Office of Quality Management, Office of Research Services
Janice Rouiller, Ph.D., and Laura Stouffer, SAIC
6 January 2005

104 Table of Contents
Survey Background (slide 3)
Satisfaction Ratings on Specific Service Aspects (slide 7)
Importance Ratings on Specific Service Aspects
Comments (slide 29)
Summary (slide 34)
Recommendations (slide 39)

105 Survey Background

106 Survey Background Purpose

107 Survey Background Methodology

108 Survey Background: Distribution
Number of surveys distributed: __
Number of respondents: 8
Response rate: __%

109 Satisfaction Ratings on Specific Service Aspects

110 FY04 Satisfaction Ratings on Specific Service Aspects (chart: mean response per aspect; scale 1 Unsatisfactory to 10 Outstanding; N = 8)

111 FY04 Satisfaction Ratings: Available Services (response histogram; N = 8, Mean = 8.75, Median = 9; ratings 8-10: 100%, ratings 1-3: 0%)

112 FY04 Satisfaction Ratings: Quality (response histogram; N = 8, Mean = 8.63, Median = 9; ratings 8-10: 100%, ratings 1-3: 0%)

113 FY04 Satisfaction Ratings: Timeliness (response histogram; N = 8, Mean = 8.13, Median = 8; ratings 8-10: 50%, mid-scale: 50%, ratings 1-3: 0%)

114 FY04 Satisfaction Ratings: Reliability (response histogram; N = 8, Mean = 8.75, Median = 9; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

115 FY04 Satisfaction Ratings: Staff Availability (response histogram; N = 8, Mean = 8.25, Median = 8; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

116 FY04 Satisfaction Ratings: Responsiveness (response histogram; N = 8, Mean = 8.50, Median = 9; ratings 8-10: 75%, mid-scale: 25%, ratings 1-3: 0%)

117 FY04 Satisfaction Ratings: Convenience (response histogram; N = 8, Mean = 8.75, Median = 9; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

118 FY04 Satisfaction Ratings: Competence (response histogram; N = 8, Mean = 8.75, Median = 9; ratings 8-10: 100%, ratings 1-3: 0%)

119 FY04 Satisfaction Ratings: Handling of Problems (response histogram; N = 8, Mean = 8.43, Median = 8; ratings 8-10: 86%, mid-scale: 14%, ratings 1-3: 0%)

120 Importance Ratings on Specific Service Aspects

121 FY04 Importance Ratings on Specific Service Aspects (chart: mean response per aspect; scale 1 Unsatisfactory to 10 Outstanding; N = 8)

122 FY04 Importance Ratings: Available Services (response histogram; N = 8, Mean = 9.00, Median = 9; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

123 FY04 Importance Ratings: Quality (response histogram; N = 8, Mean = 8.75, Median = 9; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

124 FY04 Importance Ratings: Timeliness (response histogram; N = 8, Mean = 8.88, Median = 9; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

125 FY04 Importance Ratings: Reliability (response histogram; N = 8, Mean = 8.88, Median = 9; ratings 8-10: 100%, ratings 1-3: 0%)

126 FY04 Importance Ratings: Staff Availability (response histogram; N = 8, Mean = 8.63, Median = 9; ratings 8-10: 75%, mid-scale: 25%, ratings 1-3: 0%)

127 FY04 Importance Ratings: Responsiveness (response histogram; N = 8, Mean = 8.63, Median = 9; ratings 8-10: 75%, mid-scale: 25%, ratings 1-3: 0%)

128 FY04 Importance Ratings: Convenience (response histogram; N = 8, Mean = 7.75, Median = 8; ratings 8-10: 50%, mid-scale: 50%, ratings 1-3: 0%)

129 FY04 Importance Ratings: Competence (response histogram; N = 8, Mean = 9.00, Median = 10; ratings 8-10: 88%, mid-scale: 12%, ratings 1-3: 0%)

130 FY04 Importance Ratings: Handling of Problems (response histogram; N = 8, Mean = 8.38, Median = 9; ratings 8-10: 75%, mid-scale: 25%, ratings 1-3: 0%)

131 Comments

132 Survey Comments
A total of 7 respondents (88% of respondents) provided at least one comment.
A total of 18 comments were made on 3 general questions:
What was done particularly well?
What needs to be added or improved?
Other comments
Keep in mind that comments are qualitative data:
They provide a different type of information from your customers regarding their satisfaction
They are NOT representative of the perceptions of all your customers
Review them, but don't overreact to an individual comment
They are a great source of ideas on how to improve

133 Survey Comments
What was done particularly well? (N = 6)
Never any confusion on the results. Following SOP for counting requests, providing consistent results.
Lab manager is widely available and willing to talk about issue HP's would have that require their (TSB's) services.
I have always received excellent service and quick response to questions and problems.
Everything.
Doug really takes pride in running the lab well.
Vince is always a smiling face in the lab.

134 Survey Comments
What needs to be added or improved? (N = 6)
Turn-around time and transition from one lab worker to the next could be improved.
Direct communication with dosimetry custodians regarding missing dosimetry. (I do understand that efforts are underway to improve this.) Closer tracking of situations where missing dosimetry requires a close estimation. Explaining why users are receiving their annual exposure report and what it means.
Ability to perform whole body scanning on someone with highly contaminated hands.
Contractor prep of samples could be more timely on occasion.
Nothing.
I would recommend taking the Analytical Lab services back from the contractor and just do the function in-house. We have the staff already.

135 Survey Comments
Other Comments (N = 6)
Overall, pretty good job!
Timeliness of HP notification has improved greatly, takes pressure off HPs.
Scale for importance not relevant in survey. Better form for evaluation needs to be developed.
Overall analytical lab service has improved greatly since the hiring of a lab manager.
Great work! Keep it up.
Get those SOPs done!

136 Summary

137 Summary
Respondent Characteristics:
__% of recipients responded to the survey.
Satisfaction Ratings on Specific Service Aspects:
Respondents were asked to rate their satisfaction with the following aspects of Analytical Laboratory services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, Handling of Problems.
The scale ranged from (1) Unsatisfactory to (10) Outstanding.
Satisfaction mean ratings range from a high of 8.75 on Available Services, Reliability, Convenience, and Competence to a low of 8.13 on Timeliness.
Notice that the lowest mean rating (8.13) is still well above the midpoint of a 10-point scale. In general, respondent perceptions are quite positive.

138 Summary (cont.)
Satisfaction Ratings on Specific Service Aspects (cont.):
Response frequencies for each service aspect were computed, and responses of 8, 9, and 10 were grouped as indicating outstanding performance.
For each service aspect, at least 50% of respondents perceived that aspect to be outstanding.
For 3 service aspects (Available Services, Quality, and Competence), all respondents rated the service as outstanding.
None of the respondents found service to be unsatisfactory (responses of 1, 2, or 3) in any of the service aspects.

139 Summary (cont.)
Importance Ratings on Specific Service Aspects:
Respondents were asked to rate the importance of the following aspects of Analytical Laboratory services: Available Services, Quality, Timeliness, Reliability, Staff Availability, Responsiveness, Convenience, Competence, Handling of Problems.
The scale ranged from (1) Unsatisfactory to (10) Outstanding.
Importance mean ratings range from a high of 9.00 on Available Services and Competence to a low of 7.75 on Convenience.
Notice that the lowest mean rating (7.75) is still well above the midpoint of a 10-point scale. In general, respondents find all services to be quite important.
Note: In future surveys, the scale anchors for importance ratings should be changed to (1) Unimportant and (10) Very Important.

140 Summary (cont.) Importance Ratings on Specific Service Aspects (cont.) Response frequencies for each service aspect were computed and responses of 8, 9, and 10 grouped as indicating highest importance. For each service aspect, at least 50% of respondents perceived that aspect to be of the highest importance. For Reliability, all respondents indicated this service aspect to be of the highest importance. None of the respondents find any service aspect to be unimportant (responses of 1, 2, or 3).

141 Recommendations

142 Recommendations
Interpret ORS Customer Scorecard data in terms of other PM data gathered:
Does the customer satisfaction data, when compared to data in other perspectives, show potential relationships?
Review comments in this presentation for specific issues that can be tackled:
Take the time to read through all comments.
If appropriate, generate potential actions based on what you have learned from the data:
Can you make changes to address the issues raised? How might you implement those actions?
Communicate the survey results (and intended actions) to important stakeholder groups:
ORS Senior Management, Radiation Safety staff, survey respondent pool.
Conduct a follow-up survey (within the next 2 years) to check on improvements.