

1 FY02 ASA Presentation Ensure Integrity of NIH Facilities Presented by: Mehryar Ebrahimi Office of Research Services National Institutes of Health 18 November 2002

2 Table of Contents
Main Presentation
ASA Template ..... 4
Customer Perspective ..... 6
  Customer Segmentation ..... 8
  Customer Satisfaction ..... 10
Internal Business Process Perspective ..... 12
  Service Group Block Diagram ..... 13
  Conclusions from Discrete Services Deployment Flowcharts ..... 14
  Process Measures ..... 15
Learning and Growth Perspective
  Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data ..... 23
  Analysis of Readiness Conclusions ..... 24
Financial Perspective
  Unit Cost ..... 26
  Asset Utilization ..... 28
Conclusions and Recommendations ..... 34
  Conclusions from FY02 ASA ..... 35
  Recommendations ..... 37

3 Table of Contents
Appendices
  Page 2 of your ASA Template
  Customer survey results and graphs
  Process maps
  Learning and Growth graphs

4

5 Discrete Services
We propose slight changes in the wording of the Discrete Services, as follows:
Old DS1: Ensure campus infrastructure integrity
New DS1: Ensure Integrity of Campus Utility Infrastructure
Old DS2: Ensure building and building system integrity
New DS2: Ensure Integrity of Building Utility Infrastructure

6 Customer Perspective

7 WHO IS THE CUSTOMER? The team concluded that our primary customer for both of these discrete services is PWB. PWB's primary customers, in turn, are the ICs, who are the ultimate recipients of the utility services.

8 Customer Segmentation DS1: Ensure Integrity of Campus Utility Infrastructure
Our customers are the majority of the supervisors, managers, and employees working within the Central Utilities Section of PWB.
Total Population = 64; Total Customers = 15

9 Customer Segmentation DS2: Ensure Integrity of Campus Building Utility Infrastructure
Our customers are a portion of the supervisors, managers, and other employees within the Building Maintenance Sections of PWB.
Total Population = 222; Total Customers = 12

10 Customer Satisfaction
We sent 27 customer surveys to our target population; however, the response rate was low. Because of the low response, we cannot draw any objective conclusions from the DS-2 survey on Building Utility Integrity. We have reviewed the comments and the Scatter Diagram for Discrete Service 1, Ensure Integrity of Campus Utility Infrastructure, and will focus on the areas rated high on "level of importance" and low on "satisfaction." We plan to monitor customer satisfaction and participation rates throughout FY03.
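The prioritization rule on this slide (focus on items rated high on importance but low on satisfaction) can be sketched as a simple filter. This is an illustrative sketch only: the item names, ratings, and cutoff values below are hypothetical, not actual FY02 survey results.

```python
# Hypothetical mean ratings per survey item: (importance, satisfaction), 1-10 scale.
ratings = {
    "Timeliness": (9.0, 4.5),
    "Quality": (8.5, 8.0),
    "Convenience": (3.0, 5.0),
}

def improvement_targets(ratings, importance_cutoff=7.0, satisfaction_cutoff=6.0):
    """Return items rated high on importance but low on satisfaction."""
    return sorted(
        item
        for item, (importance, satisfaction) in ratings.items()
        if importance >= importance_cutoff and satisfaction < satisfaction_cutoff
    )

print(improvement_targets(ratings))  # → ['Timeliness']
```

On a scatter diagram this corresponds to one quadrant: importance above the cutoff, satisfaction below it.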

11 Scatter Diagram for DS-1: Ensure Campus Integrity. FY02 Customer Importance and Satisfaction Ratings: A Closer Look. Note: A smaller portion of the chart is shown so that the individual data points can be labeled.

12 Internal Business Process Perspective

13 Service Group Block Diagram
"Ensure Integrity of Campus Utilities Infrastructure" and "Ensure Integrity of Campus Building Infrastructure" together consume approximately 25% of the project officers' time in DCAB. Our primary interface points are with "Perform Facilities Maintenance and Operations" and "Provide Facilities Management Services."

14 Conclusions from Discrete Services Deployment Flowcharts
Our Service Group completed deployment flowcharts for its two discrete services. Based on the flowcharts we created and analyzed, we believe we need to increase the involvement of our customers (PWB) and ensure that we receive and use their comments during Planning, Design, Construction, and Closeout.

15 Process Measures
Process measures for each discrete service:
DS1: Percent of projects impacting Campus Utilities vs. the number submitted to PWB for review at different stages.
DS1: Percent of projects submitted to PWB for review vs. the number of reviews returned to DCAB, and their timeliness.
DS2: Percent of projects impacting Building Utilities vs. the number submitted to PWB for review at different stages.
DS2: Percent of projects submitted to PWB for review vs. the number of reviews returned to DCAB, and their timeliness.
DS2: The Building 10 utility infrastructure team (DCAB Team 7) has established process measures that include tracking of the activities performed by the team. It is important to note that this team takes the lead in coordinating projects with PWB and incorporating their comments, especially during pre-design and design activities.
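One reading of the first two measure pairs above is a pair of simple ratios: projects submitted for review out of those with utility impact, and reviews returned out of those submitted. A sketch with hypothetical counts (the numbers below are placeholders, not FY02 data):

```python
def percent(numerator, denominator):
    """Percentage, guarding against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Hypothetical DS1 counts for one fiscal year.
projects_impacting_utilities = 40
submitted_to_pwb = 30
reviews_returned_to_dcab = 24

print(percent(submitted_to_pwb, projects_impacting_utilities))  # → 75.0
print(percent(reviews_returned_to_dcab, submitted_to_pwb))      # → 80.0
```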

16 Process Map for Ensuring Building Utility Integrity

17 Process Map for Ensuring Building Utility Integrity (cont.) Note: there is little to no involvement from PWB after completion of the design.

18 Process Measures
Other than the Building 10 utility infrastructure tracking done by DCAB Team 7, we did not implement any of these proposed process measures in FY02; however, we will implement the above measures in FY03 in cooperation with PWB.

19 Process Measures Building 10 Utilities Infrastructure Group (DCAB Team 7)

20 Process Measures Building 10 Utilities Infrastructure Group (DCAB Team 7)

21 Learning and Growth Perspective

22 Summary of L&G Data for Service Group 34
The services provided by this service group are part of a larger function performed by DCAB; therefore, L&G data for all of DCAB are presented here.
3% employee turnover
Over 1 award per employee
About 4 days of sick leave per employee
0 EEO complaints, 2 ER cases, and 5 ADR cases out of 111 employees

23 Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data
Based on the Learning and Growth data provided, our turnover, sick leave, and EEO/ER/ADR rates were below average compared with other discrete services, and the number of awards received by our employees was slightly above average. Since this is the first year we have examined the data this way, we will have better comparative data at the end of FY03. In general, the data indicate that DCAB is a good place to work.

24 Analysis of Readiness Conclusions
We feel that our staff is fully qualified to perform the work related to our discrete services now and for the foreseeable future. We do, however, need to monitor the early planning process and forecasts for new construction and alterations: a sudden increase in workload in these areas would immediately affect our ability to spend sufficient time investigating the condition of the existing utility infrastructure and planning for its future.

25 Financial Perspective

26 Unit Cost Measures
Unit cost for DS1: Ensure Integrity of Campus Utility Infrastructure. The per-square-foot cost of providing this discrete service was calculated by dividing the budget included in the rent model by the total gross square feet of the campus buildings.

Fiscal Year | Total Campus Gross SF | Total Budget | Cost of DS-1/GSF
2001        | 9,405,937             | $713,…       | …
2002        | 9,405,937             | $743,…       | …

27 Unit Cost Measures (cont.)
Unit cost for DS2: Ensure Integrity of Building Utility Infrastructure. The per-square-foot cost of providing this discrete service was calculated by dividing the budget included in the rent model by the total gross square feet of the campus buildings.

Fiscal Year | Total Campus Gross SF | Total Budget | Cost of DS-2/GSF
2001        | 9,405,937             | $3,124,…     | …
2002        | 9,405,937             | $3,200,000   | $0.34
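The unit-cost arithmetic on these two slides is a single division: the rent-model budget over total campus gross square feet. A sketch using the FY02 DS2 figures as they appear on the slide (treating $3,200,000 as the budget; the rounding to two decimals is an assumption):

```python
def cost_per_gsf(total_budget, gross_sq_ft):
    """Unit cost: rent-model budget divided by total campus gross square feet."""
    return total_budget / gross_sq_ft

# FY02 DS2 figures from the slide: $3,200,000 budget over 9,405,937 GSF.
print(round(cost_per_gsf(3_200_000, 9_405_937), 2))  # → 0.34
```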

28 Asset Utilization Measures
There are two separate ORS accounts for the Campus and Building Integrity services:
Campus Utility Infrastructure (HQF10017)
Building Utility Infrastructure (HQF10000)
Asset utilization was evaluated by comparing DCAB time-card charges with the allotted budget for each discrete service.
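The asset-utilization evaluation described above can be sketched as a roll-up of time-card charges per ORS account against each account's budget. The account codes are from the slide; the budget figures and charge records below are hypothetical.

```python
# Account codes from the slide; all dollar figures below are hypothetical.
budgets = {"HQF10017": 100_000.0, "HQF10000": 400_000.0}

# Hypothetical time-card records: (project officer, account, dollars charged).
charges = [
    ("PO-1", "HQF10017", 30_000.0),
    ("PO-2", "HQF10017", 12_000.0),
    ("PO-3", "HQF10000", 250_000.0),
]

def utilization(budgets, charges):
    """Fraction of each account's allotted budget actually charged on time cards."""
    totals = {account: 0.0 for account in budgets}
    for _officer, account, dollars in charges:
        totals[account] += dollars
    return {account: totals[account] / budgets[account] for account in budgets}

print(utilization(budgets, charges))  # → {'HQF10017': 0.42, 'HQF10000': 0.625}
```

POs who provide the service but never charge the account, the pattern flagged on a later slide, would show up here as accounts running well under budget.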

29 Budget increase of 4% in 02 vs. 01 DCAB charges increased by 40% in 02 vs. 01

30 Budget increase of 2.5% in 02 vs. 01 DCAB charges increased by 7% in 02 vs. 01

31

32 Over 30% of POs are not charging time to the ORS Building Integrity account even though they provide the service.

33 Conclusion for Asset Utilization
Based on the study of actual time charged against the ORS accounts for Ensure Integrity of Campus and Building Utilities, additional training on the use of these accounts is clearly needed: many POs provide the service but do not charge any time to the accounts. Given the deficient utility infrastructure of Building 10, we anticipated a higher-than-average share of PO time charged against the ORS account. Evaluating the time charged by DCAB Teams 1, 6, and 7 revealed that approximately 20% of the total time on the ORS building infrastructure account was charged for Building 10, whereas this building constitutes 30% of the campus total square feet. Paul Hawver, who leads the Building 10 infrastructure group, indicated that there are services provided by consultants that have not been captured; these are being considered for inclusion in FY03 and will be charged against the Building Integrity account.

34 Conclusions and Recommendations

35 Conclusions from FY02 ASA
1. The following are observations from the Radar Charts and the survey comments (PWB) for the "Ensure Campus Integrity" discrete service:
a. Quality, Cost, Competence, Reliability, and Availability were rated above average, with Quality and Competence ranked highest.
b. Timeliness, Responsiveness, Handling of Problems, and Convenience were rated below average, with Timeliness ranked lowest.

36 Conclusions from FY02 ASA
2. The following are conclusions from the comments provided in the surveys:
a. What was done particularly well:
i. Communication
ii. Construction of the tunnel (reliability, safety, maintenance)
b. What needs to be improved:
i. Timeliness
ii. Training
iii. Coordination
iv. Closeout
v. More involvement during inspection, commissioning, and acceptance
vi. Involvement in all levels of discussions affecting the utilities
vii. PWB is the customer
viii. Beneficial occupancy dates should be agreed to by the customers (PWB)

37 Recommendations
1. Identify PWB as a customer in DCAB's ISO 9000 procedure manual. This would require PWB's involvement in all aspects of a project, specifically a sign-off requirement at 35% Design and at Construction Closeout. (Action by DCAB)

38 Recommendations (cont.)
2. Establish a tracking system to monitor projects that impact the campus or building utility infrastructure, including the following at a minimum (Action by DCAB and PWB management):
a. For each project, identify during planning whether there is an impact on the campus or building utilities (existing system: PIN).
b. Provide automatic notification to the PWB central point of contact of the utility impacts and the design/construction schedule.
c. Establish a PWB and/or DCAB monitoring system to ensure that projects identified for utility impact are submitted for PWB review at planning, design, construction, and closeout, and that PWB's reviews are completed in a timely manner and returned to DCAB.

39 Recommendations (cont.)
3. DCAB Project Officers need to track more accurately, on their time cards, the time spent on utility infrastructure in both the campus and building categories. Additional training is needed. (Action by DCAB management)

40 Recommendations (cont.)
4. The Building 10 complex utility infrastructure team (DCAB Team 7), led by Paul Hawver, currently provides review and coordination services to DCAB project officers on all projects with utility impact in Building 10. Continue this service and ensure that all associated costs are charged against the ORS Building Utility Infrastructure account. We recommend adding the same type of service for all other buildings on the campus, provided by either PWB or DCAB. (Decision by DES management as to which branch should provide this service, since it will require dedicated manpower.)
5. The two discrete services studied under this service group category should become part of the larger DCAB service group. (Decision by DCAB management)

41 Appendices

42 Appendices
Page 2 of ASA Template
Customer segments graphs
Customer satisfaction graphs
Block diagram
Process maps
Process measure graphs
Learning and Growth graphs
Analysis of Readiness Information
Unit cost graphs
Asset utilization graphs

43 Page 2 of Revised Template

44 Customer Survey Results DS1: Ensure Integrity of Campus Utility Infrastructure

45 Radar Chart FY02 Product/Service Satisfaction Ratings. Note: The rating scale ranges from 1 to 10, where "1" represents Unsatisfactory and "10" represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

46 Radar Chart FY02 Customer Service Satisfaction Ratings. Note: The rating scale ranges from 1 to 10, where "1" represents Unsatisfactory and "10" represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

47 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings. Note: The Importance rating scale ranges from 1 to 10, where "1" represents Unimportant and "10" represents Important. The Satisfaction rating scale ranges from 1 to 10, where "1" represents Unsatisfactory and "10" represents Outstanding.

48 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings: A Closer Look Note: A smaller portion of the chart is shown so that the individual data points can be labeled.

49 Reviewing Comments
Realize that comments are qualitative data and are not meant to be counted and tallied.
Comments provide a different type of information from your customers regarding their satisfaction.
Comments are NOT representative of the perceptions of all your customers.
Review them, but don't overreact to an individual comment.
Comments are a great source of ideas on how to improve.

50 What was done particularly well?
Nothing comes to mind.
Nothing was done particularly well the quality of the material was not the best they do not make the repairs to the equipment in a timely manner.
Communication.
Can say very little.
Can't think of anything.
The Construction of the Utility Tunnel Expansion Project improves the reliability of the steam/condensate pump return, chilled water and domestic water distribution systems. It also provides our operating personnel safe working conditions, adequate space and accessibility to perform maintenance and repairs of the systems.

51 What needs to be improved?
Operator and maintenance training, fixing deficiencies, follow up on warranty work, customer concurrence on change orders/deletions/additions to contract.
They need to fix the things that are leaking and finish the punch list that they started on but stopped it seams that they are just waiting to see if we will do the work ourselves.
Attention to detail, alternative solutions, fiscal responsibility, timeliness, bring project to closure.
Working relationship between DCAB and PWB. PWB is also DCAB's customer (internal).
All of the above.
Training, O&M Manuals, dust control, monitoring contractor activities, close out of contract, warranty items, final acceptance.
Improvements are needed in the areas of coordination, inspection, commissioning and acceptance of the project involving new construction or repair of utility distribution systems. Personnel from Central Utilities Section (CUS) should be involved and included in all level of discussion affecting the Utilities.

52 Other Comments
DCAB needs to get costumer concurrence on changes to contract. Beneficial occupancy date should be agreed to by costumer.
I THINK YOU NEED TO GET MORE INPUT FROM THE PEOPLE IN CHARGE OF THE PLANT TO SEE WHAT THEY NEED IN HERE NOT JUST WHAT THE DESIGNERS WANT TO PUT IN HERE THE MEN THAT RUN THE PLANT KNOWS WHAT THEY NEED TO DO THE JOB
Please give me a call if this survey is for real then would give comments. Jim Powers Asst Chief Cup
Central Utilities Section should have the signatory authority in the commissioning and final acceptance on all the projects relating to Utility Distribution Systems.

53 Customer Survey Results DS2: Ensure Integrity of Campus Building Utility Infrastructure

54 Radar Chart FY02 Product/Service Satisfaction Ratings. Note: The rating scale ranges from 1 to 10, where "1" represents Unsatisfactory and "10" represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

55 Radar Chart FY02 Customer Service Satisfaction Ratings. Note: The rating scale ranges from 1 to 10, where "1" represents Unsatisfactory and "10" represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

56 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings. Note: The Importance rating scale ranges from 1 to 10, where "1" represents Unimportant and "10" represents Important. The Satisfaction rating scale ranges from 1 to 10, where "1" represents Unsatisfactory and "10" represents Outstanding.

57 Scatter Diagram FY02 Customer Importance and Satisfaction Ratings: A Closer Look Note: A smaller portion of the chart is shown so that the individual data points can be labeled.

58 What needs to be improved?
Differential pressure of around 10 to 12 psig should be maintained all the time. There are still many bldg. Groups that don't have the tertiary pumping system installed are dependent on the differential pressure being maintained.
The fee for service associated with the planning needs to be reduced and the time it takes to complete projects needs to be shortened.

59 Other Comments I don't know that the high pressure air has been included in this project but there are some existing problems with this utility. Most buildings do not receive the 100 psig air from the plant as they did in the past. A minimum of 100 psig air is needed for lab and cage wash equipment. Bldg 37 is having a compressor installed just for this equipment. Bldg 36 researchers have also inquired about the lower pressure being supplied to the bldg.

60 Process Map for DS1 (page 1 of 2)

61 Process Map for DS1 (page 2 of 2)

62 Process Map for DS2 (Page 1 of 2)

63 Process Map for DS2 (Page 2 of 2)

64 FY02 Learning and Growth (L&G) Data for the Annual Self Assessments
Service Group 34: Ensure Integrity of NIH Facilities
10 October 2002
Summary prepared by the Office of Quality Management

65 Methodology
All data represent occurrences from October 2001 through June 2002. The analysis covered the period between October 1st and the end of June to provide time to analyze and present the data.
ORS Human Resources (HR) provided data on: turnover, sick leave, and awards.
HR data are stored in NIH databases by Standard Administrative Codes (SACs). We developed a cross-reference of ORS Service Groups to SACs; almost all SACs were assigned to Service Groups. Some Service Groups have identical SACs; in that case, the two Service Groups receive the same set of data.

66 Methodology (cont.)
Also obtained data from:
Equal Employment Opportunity (EEO): number of EEO complaints
Employee Relations (ER): number of ER cases
Alternative Dispute Resolution (ADR): number of ADR cases

67 Interpreting Your Data
FY02 is the first time L&G data were collected and analyzed, so compare your Service Group to the other ORS Service Groups. What are all the L&G indicators telling you? In the future, your group should compare itself to its own Service Group data over time.
Interpret the data in terms of other ASA data: customer satisfaction ratings, process measures, and financial measures. Do the L&G data, when compared with data in the other perspectives, show a potential relationship (could L&G be contributing to the customer satisfaction results)?
From reviewing your Service Group's L&G data, what could be done to improve Quality of Work Life (QOWL)?

68 Service Group Turnover Rate
Calculated as the number of separations for a Service Group divided by the population of the Service Group.
Separations are defined as:
Retirements (separation codes 3010, 3020, 3022)
Resignations (separation codes 3120, 3170)
Removals (separation code 3300)
Terminations (separation codes 3520, 3550, 3570)
Promotions to a new organization (separation code 7020)
Reassignments (separation code 7210)
Note that transfers/promotions within ORS Divisions/Offices are not captured by the NIH database.

69 Service Group Turnover Rate (cont.)
A calculation of each Service Group's population was needed since the number of employees changes over time. The population was estimated as the average of the employee counts at three snapshots in time (Nov 2001, Feb 2002, June 2002).
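The turnover-rate definition from these two slides can be sketched directly: count separations whose codes qualify, and divide by the average head count across the three snapshots. The separation codes are the ones listed on the slide; the head counts and the sample separation list are hypothetical.

```python
# Qualifying separation codes listed on the slide.
SEPARATION_CODES = {3010, 3020, 3022, 3120, 3170, 3300, 3520, 3550, 3570, 7020, 7210}

def turnover_rate(separation_codes_seen, snapshot_headcounts):
    """Qualifying separations divided by the average head count across snapshots."""
    separations = sum(1 for code in separation_codes_seen if code in SEPARATION_CODES)
    population = sum(snapshot_headcounts) / len(snapshot_headcounts)
    return separations / population

# Hypothetical: three qualifying separations; head counts at Nov 2001, Feb 2002, June 2002.
print(turnover_rate([3120, 3020, 7210], [100, 102, 98]))  # → 0.03
```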

70 Service Group Turnover Rate (Oct 2001 – June 2002) [Chart: turnover rate by Service Group number]

71 Average Hours of Sick Leave Used
Calculated as the total number of sick leave hours used for a Service Group divided by the population of the Service Group.

72 Average Hours of Sick Leave Used (Oct 2001 – June 2002) [Chart: average hours by Service Group number]

73 Average Number of Awards Received
Calculated as the total number of awards received divided by the population of the Service Group.
Includes both monetary and non-monetary awards: cash awards, QSIs, time-off, honorary, and customer service awards.

74 Average Number of Awards Received (Oct 2001 – June 2002) [Chart: average number by Service Group number]

75 Average Number of EEO Complaints
Calculated as the total number of EEO complaints for a Service Group divided by the population of the Service Group.

76 Average Number of EEO Complaints (Oct 2001 – June 2002) [Chart: average number by Service Group number]

77 Average Number of ER Cases
Calculated as the total number of ER cases for a Service Group divided by the population of the Service Group.
A case is defined as any contact with the ER Office where an action occurs (e.g., a letter is prepared).

78 Average Number of ER Cases (Oct 2001 – June 2002) [Chart: average number by Service Group number]

79 Average Number of ADR Cases
Calculated as the number of ADR cases for a Service Group divided by the population of the Service Group.
A case is initiated when a person contacts ADR.

80 Average Number of ADR Cases (Oct 2001 – June 2002) [Chart: average number by Service Group number]

81 Learning and Growth Data Table
3% employee turnover
About 4 days of sick leave per employee
Over 1 award per employee
0 EEO complaints, 2 ER cases, and 5 ADR cases out of 111 employees

82 Summary of Service Group 34 Learning and Growth Data
3% employee turnover
Over 1 award per employee
About 4 days of sick leave per employee
0 EEO complaints, 2 ER cases, and 5 ADR cases out of 111 employees