Performance Management


1 Performance Management
ORS Annual Self Assessment Conference (ORS/ORF), January 21, 2004

2 Welcome
- Customers
- ORS Advisory Committee members
- ORS/ORF PM teams
- ORS/ORF personnel
- NIH sponsors/colleagues
- Visitors

3 Mr. Steve Ficca, Associate Director for Research Services

4 Purpose of Performance Management (PM)
- Institutionalize a process to continually improve the performance of the services delivered to the NIH
- Demonstrate the degree to which the ORS/ORF service providers are adding value
- Identify new ways to improve service performance even further
- Support the President's Management Agenda: Budget and Performance Integration, and Competitive Sourcing

5 Context: Some FY03-04 Business Realities
- Competitive sourcing (A-76)
- New organization (ORFDO)
- Reorganization within both ORS and ORF
- Security
- Hiring freeze

6 Number of Improvements by Type
OQM developed a systematic method to gather and quantify improvements during FY03. 24 quantifiable improvements have been achieved to date, for example:
- Provide Print and Digital Media Services reduced its space by several thousand square feet, for an estimated cost avoidance of $200K
- Provide Library Services maintained the unit cost of an "information unit" while the number of units rose by 35%
- Conduct Collaborative Research had a 16% increase in the number of professional presentations delivered
- Provide Basic Animal Life Support had a 113% increase in animal holding capacity through building renovations and the purchase of new cages

7 Number of Improvements by Type (cont.)
- Provide Animal Research Services had an 18% increase in ICU utilization through improved record keeping
- Provide Administrative Services reduced FOIA cycle time by 35 days by enhancing staff competencies

8 Number of Published Service Group Success Stories

9 Number of Published Service Group Success Stories

10 Mr. Antonio R. Rodriguez, Assistant Director for Quality Management, Office of Research Services

11 ORS Service Hierarchy
The service hierarchy has three levels: Program Areas (PA) contain Service Groups (SG), which in turn contain Discrete Services (DS).
[Diagram: Program Area X contains SG X1 (DS X11-X14) and SG X2 (DS X21-X23); Program Area Y contains SG Y1 (DS Y11-Y13) and SG Y2 (DS Y21-Y22).]
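For readers who want a concrete model, here is a minimal Python sketch of the three-level hierarchy as a nested data structure; the code and the sample names in it are illustrative assumptions drawn from the slide's generic diagram, not part of the original deck.

    # A minimal sketch (Python 3.9+) of the three-level ORS service hierarchy:
    # Program Area -> Service Group -> Discrete Service.
    # All names below are placeholders from the slide's generic diagram.
    from dataclasses import dataclass, field

    @dataclass
    class DiscreteService:
        name: str

    @dataclass
    class ServiceGroup:
        name: str
        discrete_services: list[DiscreteService] = field(default_factory=list)

    @dataclass
    class ProgramArea:
        name: str
        service_groups: list[ServiceGroup] = field(default_factory=list)

    # Program Area X from the diagram: SG X1 holds DS X11-X14, SG X2 holds DS X21-X23.
    pa_x = ProgramArea("Program Area X", [
        ServiceGroup("SG X1", [DiscreteService(f"DS X1{i}") for i in range(1, 5)]),
        ServiceGroup("SG X2", [DiscreteService(f"DS X2{i}") for i in range(1, 4)]),
    ])

    for sg in pa_x.service_groups:
        print(sg.name, "->", [ds.name for ds in sg.discrete_services])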

12 Today's Agencies Are Expected to Be:
Why do we need to do this? Today's agencies are expected to be:
- Competitive (sourcing)
- Accountable
- Customer-friendly
- Fiscally responsible
The external world is highly unstable, so planning systems must deal with uncertainty:
1. Strategy is a hypothesis
2. Strategy is a dynamic process
3. Strategy is everyone's job
Organizations are systems that must sense, experiment, learn, and adapt. Strategy is an articulation of the direction we want to take an organization.
*Adapted from "Planning Is Dead, Long Live Planning," Jos. Fuller, Across the Board, March 1998

13 PM Overall Process
[Diagram: the day-to-day business function (PM Framework) generates on-going findings & recommendations, which feed business plans (Budget/ABM), which drive on-going implementation and improvement back into the business function.]

14

15

16 Performance Management Roles and Responsibilities
- ORS Executive Board (ORSEB)
- ORS Management Council (ORSMC)
- PM Team Leader
- PM Team Members
- Office of Quality Management (OQM)
- OQM-Sponsored Consultants

17 What Do Our Customers Really Want? (Outcome or "End-State")
- What are we really trying to accomplish as a Service Group?
- What outcomes are we offering customers with our Service Group/Discrete Service offerings?

18 Internal Business Process: Generic Value Chain Model (Example)
[Diagram: from "customer needs identified" to "customer needs satisfied" through three cycles:
- Innovation Cycle ("Innovation"): identify customers/create service offering; develop services
- Operations Cycle ("Operational Excellence"): communicate the services; deliver the products/services
- Service Cycle ("Service Quality"): service the customer]

19 What Is Your Value Chain?
- How do we get new work? Where does it come from?
- What processes do we need to perform really well?
- How do we complete the work? How do we deliver it to our clients/customers?
- How can we improve our processes to meet the attributes identified in the VP?

20

21 Pathology: Case Completion Time
Note: Average turnaround time has significantly improved in FY02
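To make the measure behind this chart concrete, here is a minimal sketch of how average case turnaround could be computed from case records; the dates below are hypothetical, not actual pathology data.

    # A minimal sketch: average case completion (turnaround) time in days.
    # The case records are hypothetical examples.
    from datetime import date

    # (case_id, date_received, date_completed)
    cases = [
        ("P-001", date(2002, 1, 7), date(2002, 1, 21)),
        ("P-002", date(2002, 1, 9), date(2002, 1, 17)),
        ("P-003", date(2002, 1, 14), date(2002, 2, 1)),
    ]

    turnaround_days = [(done - received).days for _, received, done in cases]
    print(f"Average turnaround: {sum(turnaround_days) / len(turnaround_days):.1f} days")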

22 What Improvements Can Be Made in Our Internal Processes?
- What do we need to do better to make our clients/customers happy?
- Can we be more efficient or more effective at what we do?

23 Typical Learning & Growth Objectives
- Skills & Competencies: strategic skills, training levels, core competencies
- Knowledge & Technology Assets: strategic technologies, strategic databases, experience capture, best practices, patents, copyrights
- Climate for Action: leadership, accountability/empowerment, alignment, results-oriented teaming

24 What Do Our Employees Need to Help Us Achieve Our Goals?
- What skills need to be addressed?
- What will the knowledge and skill needs be over the next 5 years?
- Do we need to train, recruit, or contract out?

25 Unit Cost
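To make this measure concrete: unit cost is total cost divided by units delivered. The small worked example below uses hypothetical figures to show how unit cost can hold flat while volume grows 35%, as in the Library Services result on slide 6.

    # Hypothetical figures only: unit cost = total cost / units delivered.
    fy02_cost, fy02_units = 1_000_000, 50_000
    fy03_cost, fy03_units = 1_350_000, 67_500   # volume up 35%, cost up in step

    print(f"FY02 unit cost: ${fy02_cost / fy02_units:.2f}")  # $20.00
    print(f"FY03 unit cost: ${fy03_cost / fy03_units:.2f}")  # $20.00, held flat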

26 What Is in It for the NIH and ORS?
- Systematic improvement of performance
- Information for decision making
- Evidence of performance levels
- Justification for resource needs
- Greater personnel involvement (decision making, ownership, pursuing improvement)

27 Performance Management and A-76
Better performance data leads to a better PWS. A better PWS means:
- A better competition (e.g., less room for confusion)
- A better QASP and better overall management of the contract

28 Performance Management and A-76 (cont.)
Every function, before facing an A-76 competition, should:
- Identify the boundaries of its "span of contribution/value chain" and manage it cross-functionally
- Continually find and implement better ways to do the work
- Evaluate every position before filling it, and determine whether the work should be contracted out
- Be as efficient and effective as possible, so future MEOs do not have to be drastically different to win

29 Data / Results

30 Percent of Service Groups with completed PMPs (as of Quarter 1 FY04)

31 Percent of Teams with PMPs Reviewed by Program Management (as of Quarter 1 FY04)

              Total teams   PMPs reviewed by management   Percent
    ORS            33                   24                  73%
    ORF            13                    9                  69%
    Overall        46                   33                  72%
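As a sanity check, the percentages above follow directly from the raw counts; a minimal sketch:

    # Reproduce the review-rate percentages from the slide's raw counts
    # (rounded to whole percents, matching the table above).
    counts = {
        "ORS":     (33, 24),   # (total teams, PMPs reviewed by management)
        "ORF":     (13, 9),
        "Overall": (46, 33),
    }
    for org, (total, reviewed) in counts.items():
        print(f"{org}: {reviewed}/{total} = {reviewed / total:.0%}")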

32 Percent of Service Groups with Ongoing Data Collection (as of Quarter 1 FY04)

              Total teams   Teams with ongoing data collection   Percent
    ORS            33                      26                      79%
    ORF            13                       6                      46%
    Overall        46                      32                      70%

33 Relationship Between Total Hours Spent with Service Groups and PMP Implementation Progress (as of Quarter 1 FY04)
Note 1: Total hours include hours spent by both OQM and consultants, across all effort codes. Team progress is defined as percent completion of the 14 key elements of the 2003 PM process.
Note 2: Total hours do not include hours spent on OQM's own PMP (Provide Quality, Performance, and Organizational Improvement Services).
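One plausible way to quantify the hours-versus-progress relationship this slide charts is a correlation coefficient; the sketch below uses hypothetical data points, since the underlying figures are not in the transcript.

    # A minimal sketch: Pearson correlation between support hours and team
    # progress. Data points are hypothetical (Python 3.10+ for correlation).
    from statistics import correlation

    hours    = [40, 65, 80, 120, 150]   # total OQM/consultant hours per team
    progress = [30, 45, 55, 70, 85]     # % of 14 key PM elements completed

    print(f"Pearson r = {correlation(hours, progress):.2f}")  # near 1.0 = strong link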

34 Number of OQM Data Analysis Services Provided (as of Quarter 1 FY04)
33 total types of data analyses provided, including:
- Assisted the Division of Facilities Planning (DFP) in developing a benchmark study
- Analyzed cycle time data for the Department of Veterinary Resources (DVR) nutrition program
- Analyzed cycle time data for the DVR pharmacy program
- Analyzed process data for DVR histopathologies
- With DFP, developed a large-scale project scheduling system to track project status

35 Types of Data Analyses Done (cont.)
- Analyzed processing time data for DVR comprehensive full panels
- Analyzed processing time data for DVR routine and rush PCRs
- Analyzed data on the number of design & submittal reviews for the Department of the Fire Marshal (DFM) new permit process
- Analyzed five years of emergency response time data for the Fire Department
- Documented existing processes to maintain JCAHO accreditation at the Clinical Center and AAALAC accreditation in NIH animal facilities
- Developed and administered customer surveys for 22 ORS/ORF service providers

36 Number of OQM Data Analysis Services Provided (as of Quarter 1 FY04) (cont.)
Customer satisfaction surveys developed and administered in FY03:
- Perform master and facilities planning
- Maintain roads, parking areas, and landscaping
- Manage and administer worksite enrichment programs – vendor survey
- Manage and administer worksite enrichment programs – climate assessment
- Provide security guard services
- Manage solid waste streams
- Maintain safe working environment – radiation safety
- Maintain safe working environment – biological safety cabinets
- Maintain safe working environment – occupational medical services
- Maintain safe working environment – pest management
- Procure and deliver animal products – animal procurement
- Procure and deliver animal products – animal transportation
- Provide animal research services – phenotyping
- Provide library services – translations

37 Conclusion and What Should Happen Next?

38 Conclusion and Next Steps
- Significant progress was achieved in FY03 in spite of "additional," resource-intensive challenges
- We need to raise the performance bar: every service group should contribute at least one verifiable "bottom-line" improvement, such as evidence of:
  - Higher customer satisfaction
  - Savings
  - Cost avoidance
  - Shortened cycle times
  - Improved QOWL
- ORS Program Areas should have their own PMPs and scorecards (cross-functional/service groups)
- ORF should continue to refine higher-level PMPs and scorecards
- ORS and ORF should report organization-wide performance results on an annual basis

39

40

41 Mr. Leonard Taylor, Acting Director, Office of Research Facilities, Development and Operations

42 Break
Concurrent sessions start at 9:15: ORS in rooms E1 and E2; ORF in room D.

