1 Performance Metrics – What Are They And How Do I Write Good Ones? Chris Hamm, Operations Director, General Services Administration. GSA Training Conference and Expo 2010

2 This session is sponsored by the Federal Acquisition Institute, the primary organization providing knowledge and support to the federal civilian acquisition workforce. For more information about FAI, please visit our website at www.fai.gov

3 Topics: Introduction to Performance Metrics; Step-by-Step Process for Creating Metrics; Post-Award Management Process; Pitfalls

4 “If you can’t measure it, you can’t manage it” – Peter Drucker

5 Stickk.com & The Biggest Loser: Stickk was founded by Dean Karlan (Economics professor at Yale University), Ian Ayres (Law professor at Yale University), and Jordan Goldberg (a student from the Yale School of Management). The mere act of writing down your goals increases your chances of success; tying a financial incentive to them (e.g., making a bet or a Commitment Contract) increases your odds even more! Proof: years of rigorous academic research using two of the hardest goals to achieve, losing weight and quitting smoking. The Biggest Loser: incredible results are possible with regular measurement and quality assurance surveillance.

6 What are Performance Metrics? They are the shorthand for telling where we are: the number on your speedometer, the grades your children bring home, the performance results on your service contract, customer satisfaction level, uptime, usage. A performance metric is a means of measuring that indicates how well our performance objectives are being met, and an integral part of your requirements development and performance management strategy. Also called SLAs or KPIs.

7 Why are we talking about Metrics? FY08 goal: 50% of all eligible services acquisitions should be performance-based (PBA). FAR Subpart 37.6: (b) Performance-based contracts for services shall include— (1) A Performance Work Statement (PWS); (2) Measurable performance standards (i.e., in terms of quality, timeliness, quantity, etc.) and the method of assessing contractor performance against performance standards; and (3) Performance incentives where appropriate. When used, the performance incentives shall correspond to the performance standards set forth in the contract.

8 Cost Reimbursable to Fixed Price Continuum: The continuum runs from Research and Development (higher risk, less-defined requirements, where the Government assumes more cost risk) through Development and Enhancements/Production to Operations & Maintenance and Sustainment (lower risk, well-defined requirements, where the contractor assumes more cost risk). Contract types follow the same progression: CPFF, then CPFF/CPAF and CPIF/CPAF for R&D and development work, then FPAF/FPIF/FFP and FPIF/FFP for enhancements, production, and operations & maintenance. As you move to the right, the more detailed and meaningful the performance measures become.

9 Performance Metrics are.... A communications tool - the value of an agreement is not just in the final product; the very process of establishing metrics helps to open up communications. A conflict-prevention tool - an agreement helps to avoid or alleviate disputes by providing a shared understanding of needs and priorities, and if conflicts do occur, they tend to be resolved more readily and with less gnashing of teeth. A living document - this is one of its most important benefits: the agreement isn't a dead-end document consigned to the Forget Forever file; on a predetermined frequency, the parties review the metric to assess service adequacy and negotiate adjustments. An objective basis for gauging service effectiveness - a metric ensures that all parties use the same criteria to evaluate service quality.

10 The Seven Steps to Performance-Based Acquisition guide: http://www.acquisition.gov/comp/seven_steps/index.html

11 Establish the Team: Create an Integrated Project Team! The IPT should include your technical expert, stakeholders, a financial person, and acquisition. Be sure to include contracting so that you start off on the right strategy and discuss the requirement. Invite the technical staff that currently have access to any performance data. There is data somewhere: find it, get copies of it, ask the current contractor for it.

12 Describe the Problem that Needs Solving: Interview all internal and external parties to determine the needs, desired outcomes, potential solutions, and obstacles. Get BUY-IN! Take notes during the meetings and send them out to everybody; documentation takes time, but it eliminates complaints and issues that may arise. You cannot over-communicate... Finalize definitions of the terms you will be using for measuring performance (e.g., Service Level Agreement [SLA], Key Performance Indicator [KPI]).

13 Conduct Market Research: Research the public sector to see what has worked for your counterparts in other agencies. Locate sources for performance measures; type what you are seeking into any search engine and you will find many examples, and there are dozens of companies that track performance and create standards. Meet with vendors one-on-one to learn about your industry, and ask them what metrics they use.

14 Due Diligence / Industry Days / RFIs: What if you don’t know what you want to measure? Industry Day: the Government presents its current environment and requirement to interested vendors. Due Diligence: the Government releases a draft description of the requirement and opens up the doors to interested parties, helping each side clarify the requirement. Request for Information: can include draft requirements, but is geared mostly toward soliciting ideas and options from the vendors. Industry Day is good; Due Diligence and RFIs are better, because you can ask for specific information on industry-standard measures, metrics, and approaches. Both will increase the quality of your RFP and the vendor proposals.

15 How to Include Metrics in the Solicitation: Draft the Incentive/Award Fee Plan and SLA format and include them in Section J. Option 1: specify the metrics and have all vendors bid the same. Option 2: let vendors choose the metrics and targets. Review the requirements with the client before release and see if tasks are sufficiently defined to create notional metrics to include as recommendations. Remember that the Government’s strongest negotiation position is pre-award; fee plans are unilateral before the period begins and bilateral during the period (if the CO allows any changes at all). Option 2 is harder to evaluate but better post-award: better measures, more leverage after award.

16 PBA Matrix Template

17 Map the Matrix to the Solicitation: The PBA Matrix (Objective, Standards, AQL, Inspection, Incentive) maps to the rest of the solicitation: the Performance Work Statement or SOO, the Quality Assurance Surveillance Plan, the evaluation factor weightings for Section M of the RFP, and the Contract Line Item structure for the RFP.

18 Decide how to Measure & Manage Performance: People get really wrapped up in performance measures and SLAs, but this can be made simple if you want it to be. Think about outcomes rather than processes or org charts. Use the systems that you already have in place. Only measure things that are meaningful, not things simply because you can. Bad example: cost savings over the IGCE.

19 Contract Types for Measuring. Good contract types: “incentive” types (CPIF / FPIF); Firm Fixed Price (FFP) and Fixed Unit Price (FUP); award fee (CPAF / FPAF). Less good contract types: Time & Materials and Labor Hour can have metrics, but cannot have metrics with financial incentives. Regardless of contract type, the Government must still perform quality assurance; the level of effort on surveillance is higher on the good contract types.

20 Source Selection Process: Include the vendor’s proposed metrics in the technical evaluation. Review the feasibility of each measure and whether it is appropriate, meaningful, and end-to-end. Is the measure entirely within the control of the vendor? Most vendor-proposed metrics will require some changes after award.

21 Manage Performance (immediately after award): Ensure the contract award includes any contractor-proposed metrics and/or changes from discussions and negotiations. Set up a meeting with key stakeholders, separate from the Kickoff Meeting, to review the metrics: ensure that all stakeholders agree to the metrics; set the “weight” of each metric; make sure each metric is entirely within the control of the contractor (no mixed responsibility); and add further definition of the metric and all exclusions.

22 Manage Performance (immediately after award): Develop the methodology for how to measure performance. Set the periods (monthly, quarterly, bi-annually). Set the levels (stretch, primary, secondary, tertiary). Set the measurement scale with the incentive/disincentive, for example: Stretch: 100% of available fee + rollover; Primary: 100% of available fee; Secondary: 90%; Tertiary: 75%; Failure: 0% fee. Not all levels are required; some metrics have only one target. It depends on the requirements.
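
To make the scale concrete, here is a minimal Python sketch of how a measured result for one metric in one period could be translated into a share of the available incentive fee. The level names follow the slide; the function name and the threshold values are illustrative assumptions, not figures from the presentation.

```python
# Sketch of a measurement scale mapping achieved performance for one metric in
# one period to a share of the available incentive fee. The level names follow
# the slide; the thresholds are illustrative assumptions. (Stretch performance,
# which also earns rollover, is treated here simply as meeting the primary fee.)

def fee_share(achieved: float, targets: dict) -> float:
    """Return the fraction of available fee earned for one metric in one period."""
    if achieved >= targets["primary"]:
        return 1.00   # Primary (or Stretch): 100% of available fee
    if achieved >= targets["secondary"]:
        return 0.90   # Secondary: 90%
    if achieved >= targets["tertiary"]:
        return 0.75   # Tertiary: 75%
    return 0.00       # Failure: 0% fee


# Example: an availability metric measured monthly (targets are assumptions)
targets = {"primary": 99.9, "secondary": 99.5, "tertiary": 99.0}
print(fee_share(99.6, targets))  # -> 0.9
```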

23 Manage Performance - Measurement Scale. Award Fee Plan scale: Outstanding (95-100), Excellent (90-95), Good (80-90), Average (70-80), Poor (below 70). SLA measurement scale: Outstanding (99.99+), Excellent (99.9 - 99.99), Good (99 - 99.9), Average (98 - 99), Poor (below 98).
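
A small sketch (assumed helper functions, not from the slides) showing how an award fee score and an SLA availability result map onto the rating bands above:

```python
# Map an award fee score (0-100) and an SLA availability percentage onto the
# rating bands listed on this slide. Function names are illustrative.

def award_fee_rating(score: float) -> str:
    """Score is on the 0-100 award fee scale."""
    if score >= 95: return "Outstanding"
    if score >= 90: return "Excellent"
    if score >= 80: return "Good"
    if score >= 70: return "Average"
    return "Poor"

def sla_rating(availability: float) -> str:
    """Availability is a percentage, e.g. 99.95."""
    if availability >= 99.99: return "Outstanding"
    if availability >= 99.9:  return "Excellent"
    if availability >= 99.0:  return "Good"
    if availability >= 98.0:  return "Average"
    return "Poor"

print(award_fee_rating(92))   # Excellent
print(sla_rating(99.95))      # Excellent
```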

24 Manage Performance - Examples of Measurements: percentages (see the chart below), hours (indicate normal business hours for response!), days (always clarify business vs. calendar days!), and Yes/No measures.

Measure     Downtime in one year
99%         3.7 days
99.90%      8.8 hours
99.99%      53 minutes
99.999%     5.3 minutes

Most projects do not need five 9s and could not afford it if they wanted it; the cost is prohibitive! NEW EXCEPTION - cloud availability.
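
The downtime column is simple arithmetic: downtime = (1 - availability) multiplied by one year. A quick check of the slide's figures:

```python
# Verify the downtime-per-year figures implied by each availability percentage.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime_min = (1 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.3%} uptime -> "
          f"{downtime_min / 1440:.1f} days / {downtime_min / 60:.1f} hours / "
          f"{downtime_min:.1f} minutes of downtime per year")

# 99% -> ~3.7 days, 99.9% -> ~8.8 hours, 99.99% -> ~53 minutes,
# 99.999% -> ~5.3 minutes
```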

25 Manage Performance – a few months post-award: Hold a regular review of the metrics to ensure that the levels are set correctly. Set up a quality assurance function to check the contractor’s self-reported metrics. Insist on access to the raw data from the system tools used to generate the metrics reports. Review all trends and focus on failures; root cause analysis can identify process issues.

26 Manage Performance – Build a Dashboard Summary Scorecard: one row per contract or supplier (e.g., CLIN 0001, Contractor #2, Dining Facility Supplier, A&AS Supplier), each rated against the areas measured: Cost, Customer Satisfaction, Performance, and Small Business Goals.
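
A minimal sketch of such a scorecard in Python. The row labels come from the slide; the ratings are placeholder values, not data from the presentation.

```python
# Print a dashboard summary scorecard: one row per contract/supplier, one
# column per evaluation area. Ratings are illustrative placeholders.

columns = ["Cost", "Customer Satisfaction", "Performance", "Small Business Goals"]

scorecard = {
    "CLIN 0001":                ["Green",  "Yellow", "Green",  "Green"],
    "Contractor #2":            ["Yellow", "Green",  "Green",  "Red"],
    "Dining Facility Supplier": ["Green",  "Green",  "Yellow", "Green"],
    "A&AS Supplier":            ["Green",  "Green",  "Green",  "Yellow"],
}

print(f"{'Supplier / CLIN':<26}" + "".join(f"{c:<24}" for c in columns))
for supplier, ratings in scorecard.items():
    print(f"{supplier:<26}" + "".join(f"{r:<24}" for r in ratings))
```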

27 Manage Performance – a few months post-award: Set a structure for gradual improvement of the metrics over time. For metrics that are particularly difficult to negotiate, consider benchmarking them for a period without making them SLAs, to obtain buy-in and set the appropriate target. Gradually increase the overall weight of metrics in the fee plan over time, with the goal of eliminating subjectivity as much as possible.

28 Manage Performance - Exceptions: scheduled downtime, Government intervention in the contractor's response, other contractor intervention, and Acts of God / force majeure. Make sure someone from the Government is aware of and approves all scheduled downtime or deviations from SLAs. No fox in the hen house! Another benefit of outsourcing your infrastructure to the “cloud”... no more scheduled downtime!

29 Manage Performance – Rollover: Good metrics are designed to challenge the vendor to deliver good performance. When the vendor does not achieve the performance goal, the Government has the option of “Rollover”*, where the financial incentive is not lost but moved to the next period. If mitigating circumstances exist (e.g., the Government redirected performance), rollover to the next period may be appropriate. If the vendor simply failed to meet the goal, rollover is not appropriate. * Interim Rule Section 814 – Reporting on Rollover

30 Manage Performance – Pitfalls: Metrics should align with the client and project objectives AND motivate good contractor behavior. Do not design a metric that, when missed, motivates the contractor to give up on that measure for the rest of the period. Example: taking an average over a six-month period; if the contractor fails in the first month, there is no incentive to maintain service levels or improve over time (illustrated in the sketch below). Just because something can be measured doesn’t mean it should be a metric. Example: it is easy to measure how fast someone picks up a phone, but a better measure is how long it takes to actually solve the problem.
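
A small numeric illustration of the averaging pitfall. The 95% target and the monthly figures are illustrative assumptions, not values from the presentation.

```python
# With a six-month average and a 95% target, one badly missed month can make
# the target mathematically unreachable, so the incentive to recover disappears.
# All numbers are illustrative assumptions.

target = 95.0
first_month = 60.0                 # badly missed first month
perfect_rest = [100.0] * 5         # best possible result for months 2-6

best_achievable_average = (first_month + sum(perfect_rest)) / 6
print(best_achievable_average)     # 93.3... -> still below the 95% target

# Measuring each period separately (or using a rolling window) keeps the
# incentive alive for the remaining months.
```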

31 Manage Performance – Pitfalls: Having too many measures. Example: too many metrics, some of which were redundant and worth less than 1% of the fee pool. Example: aligning metrics to Government staff, with at least one metric (and sometimes more) for each of 25 tasks just to appease stakeholders. Targets: just like performance reviews, if you set a target, the contractor will plan and staff to meet that target. If the target is an average, be aware that they will plan to make the average, not to always beat the target.

32 Manage Performance – Pitfalls: Funding: if a metric has an incentive, that amount must be fully funded to prevent any anti-deficiency situation; this is what makes really good incentives difficult. Oversight: the COR and the contractor may initially balk at the level of effort required to stand up a metric management process.

33 Manage Performance – Pitfalls: Over-reliance on industry standards: benchmarks are great, but that does not mean they are right for your client’s infrastructure; just because Gartner says five 9s is normal for a call center doesn’t mean that your requirement needs that. Customer satisfaction: this one is difficult. It will take time to reach agreement on what scale to use (1-5, Y/N, etc.), what survey to use, and how to count the no-response pool. At the end of the day, customer satisfaction may be the most important measure, but it is the hardest to get right.

34 Learn more about the Federal Acquisition Institute and the resources it offers at www.fai.gov

35 Questions

