Government Scoring Plans and Rating Systems: How Agencies Score Proposals
Breakout Session #A03
Presenter: Marge Rumbaugh, CPCM, Fellow
Date: Monday, July 30, 2012
Time: 11:15 am–12:30 pm

Overview of Topics Presented
– Review proposal evaluation principles
– Describe evaluation criteria and weighting
– Explain proposal scoring plan methods
– Illustrate how agencies weight and score proposals
– Provide lessons learned from case studies
– Summarize important points about weighting and scoring proposals

Proposal Evaluation Principles: The Best Value Continuum
Lowest Price Technically Acceptable
– Pass/fail evaluation: acceptable or not
– Lowest price wins
Trade-off process (best value)
– Evaluates "shades of gray"
– Scoring system needs to convey more complex information

Evaluation Criteria: What Gets Evaluated?
– Price or cost
– Quality (typically technical/management)
– Past performance: acquisitions over the simplified acquisition threshold ($150,000)
– Small disadvantaged business concern participation: acquisitions expected to exceed $650,000 ($1,500,000 for construction)

Evaluation Criteria: State Relative Order of Importance
The solicitation must state whether all evaluation factors other than cost or price, when combined, are
– Significantly more important than cost or price,
– Approximately equal to cost or price, or
– Significantly less important than cost or price.
(FAR 15.304)

Evaluation Criteria Weighting Example (Source Selection Decision)
– Approximately equal to cost or price: Non-Cost 55%, Cost 45%
– Significantly more important than cost or price: Non-Cost 80%, Cost 20%
– Significantly less important than cost or price: Non-Cost 20%, Cost 80%
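The arithmetic behind such a split can be sketched as a simple weighted sum. In the sketch below, the three weighting schemes come from the table above, but the 0-100 non-cost and cost scores are invented purely for illustration.

```python
# Illustrative sketch only: combines notional non-cost and cost scores (0-100)
# using the weighting splits from the example table. The scores themselves are
# assumed values, not data from the presentation.

WEIGHTING_SCHEMES = {
    "approximately equal":          {"non_cost": 0.55, "cost": 0.45},
    "significantly more important": {"non_cost": 0.80, "cost": 0.20},
    "significantly less important": {"non_cost": 0.20, "cost": 0.80},
}

def combined_score(non_cost_score: float, cost_score: float, scheme: str) -> float:
    """Weighted sum of non-cost and cost scores under the chosen scheme."""
    w = WEIGHTING_SCHEMES[scheme]
    return w["non_cost"] * non_cost_score + w["cost"] * cost_score

# Same raw scores, different relative-importance statements
print(combined_score(90, 70, "approximately equal"))           # 81.0
print(combined_score(90, 70, "significantly more important"))  # 86.0
```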

Evaluation Criteria Structure Example
Basis of Award
– Cost
– Non-Cost
  – Technical
  – Management
  – Past Performance

Non-Cost Evaluation Criteria Organization
1.0 Area: Management (broad categories that define non-cost criteria at the highest level)
1.1 Factor: Personnel (factors divide the area into more detailed parts)
1.1.1 Sub-Factor: Personnel Qualifications
1.1.2 Sub-Factor: Personnel Retention
Standard: the minimum level of compliance for a specific factor or sub-factor
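As a purely hypothetical sketch, this area/factor/sub-factor organization, with a standard attached where one applies, could be represented as a small tree; the standard texts below are illustrative assumptions, not wording from the slide.

```python
# Hypothetical representation of the area -> factor -> sub-factor hierarchy,
# with a standard (minimum level of compliance) attached where one is defined.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Criterion:
    number: str                      # e.g. "1.1.1"
    name: str                        # e.g. "Personnel Qualifications"
    standard: Optional[str] = None   # minimum level of compliance, if defined
    children: List["Criterion"] = field(default_factory=list)

management_area = Criterion("1.0", "Management", children=[
    Criterion("1.1", "Personnel", children=[
        Criterion("1.1.1", "Personnel Qualifications",
                  standard="Key personnel meet the stated qualification requirements"),
        Criterion("1.1.2", "Personnel Retention",
                  standard="Retention approach keeps annual turnover at an acceptable level"),
    ]),
])
```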

Proposal Scoring Method: What Is It?
A scoring method or rating system uses a scale of words, colors, numbers, or other indicators to denote the degree to which proposals meet the standards for the non-cost evaluation factors.
Some commonly used scoring methods or rating systems are
– Adjectival
– Color coding
– Numerical
– Combination

Adjectival Scoring Method (1)
Outstanding: A proposal that satisfies all of the agency's requirements, with extensive detail indicating a feasible approach & a thorough understanding of the problems. The proposal has numerous significant strengths that are not offset by weaknesses. The proposal has an overall low degree of risk.
Good: A proposal that satisfies all of the agency's requirements, with adequate detail of a feasible approach & an understanding of the problems. The proposal has some significant strengths or numerous minor strengths that are not offset by weaknesses. The proposal has an overall low to moderate degree of risk.

Adjectival Scoring Method (2)
Acceptable: A proposal that satisfies all of the agency's requirements, with minimal detail indicating a feasible approach and a minimal understanding of the problems. The proposal has an overall moderate to high degree of risk.
Marginal: A proposal that satisfies all of the agency's requirements, with minimal detail indicating a feasible approach and a minimal understanding of the problems. The proposal has an overall high degree of risk.

Adjectival Scoring Method (3)
Unacceptable: A proposal that contains at least one major error, omission, or deficiency that indicates a lack of understanding of the problems. The approach cannot be expected to meet requirements or involves a very high risk. None of these conditions can be corrected without a major rewrite or proposal revision.

Color Scoring of Strengths & Weaknesses (1)
Blue
– Strengths: The proposal has numerous strengths.
– Weaknesses: Weaknesses are insignificant & have no apparent impact on the program.
Green
– Strengths: Some strengths exist & the strengths clearly offset weaknesses.
– Weaknesses: A few weaknesses exist; they are correctable with minimal government oversight or direction.

Color Scoring of Strengths & Weaknesses (2)
Yellow
– Strengths: Few strengths exist & the strengths do not offset the weaknesses.
– Weaknesses: Substantial weaknesses exist & they are correctable with some government oversight & direction.
Red
– Strengths: There are no beneficial strengths.
– Weaknesses: Numerous weaknesses exist that are so significant that a proposal rewrite is not feasible within a suitable timeframe.

Color & Adjective Combination Method
Blue / Exceptional: Offeror's proposal has an exceptional understanding of the acquisition's goals.
Teal / Very Good: Offeror's proposal has a very good understanding of the acquisition's goals.
Green / Acceptable: Offeror's proposal has an acceptable understanding of the acquisition's goals.
Yellow / Marginal: Offeror's proposal has a fair understanding of the acquisition's goals.
Red / Unacceptable: Offeror's proposal has an unacceptable understanding of the acquisition's goals.

Numerical Scoring Method (1)
0: The factor is not addressed, or is addressed in a way that is totally deficient and without merit.
1: There are deficiencies or weaknesses that can be corrected only by significant changes to the proposal. The factor is addressed so minimally or vaguely that there are widespread information gaps. The technical evaluation team has serious concerns about the offeror's ability to perform the required work.

Numerical Scoring Method (2)
2: Information is incomplete, unclear, or indicates an inadequate approach to, or understanding of, the factor. There is a question about the offeror's ability to perform satisfactorily.
3: Meets the specifications and requirements. The offeror could perform & meet the government's minimum requirements.

Numerical Scoring Method (3)
4: The proposal has some superior features. Information provided is generally clear, and demonstrates an acceptable ability to accomplish the technical requirements, with the possibility of more than adequate performance.
5: The response to the factor is superior in most features.

Numerical & Adjectival Method (Range / Rating / Description)
8–10 / Excellent: Offeror's proposal has an exceptional understanding of the acquisition's goals.
6–7.9 / Very Good: Offeror's proposal has a very good understanding of the acquisition's goals.
3–5.9 / Acceptable: Offeror's proposal demonstrates a good understanding of the acquisition's goals.
1–2.9 / Marginal: Offeror's proposal demonstrates a poor understanding of the acquisition's goals.
0–0.9 / Unacceptable: Offeror's proposal has very little understanding of the acquisition's goals.
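A minimal sketch of applying these range-to-rating bands to a numeric score; the cut-points restate the slide's ranges, while the function name and error handling are assumptions.

```python
# Sketch: map a 0-10 numeric score to the adjectival rating bands shown above.
def adjectival_rating(score: float) -> str:
    if not 0 <= score <= 10:
        raise ValueError("score must be between 0 and 10")
    if score >= 8:
        return "Excellent"
    if score >= 6:
        return "Very Good"
    if score >= 3:
        return "Acceptable"
    if score >= 1:
        return "Marginal"
    return "Unacceptable"

print(adjectival_rating(7.2))  # Very Good
print(adjectival_rating(0.5))  # Unacceptable
```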

Scoring With Evaluation Standards
– Standards are the baseline against which the agency evaluates offerors' proposals to determine their acceptability and value.
– A standard should specify a target performance level for the proposal.
– Standards require evaluators to evaluate proposals against uniform objectives rather than against other proposals.

Scoring With Evaluation Standards
Quantitative standards
– Miles per hour
– Dollars per pound
– Lines of code
– Years of experience
– Computer processing speed
Qualitative standards
– Provides an acceptable technical or management solution/approach, which meets the requirements of the …
– Provides adequate and appropriate personnel skills
– Provides an acceptable level of experience/knowledge

Scoring With Evaluation Standards: Illustration
+++ Significantly above standards/expectations
++  Clearly above standards/expectations
+   Slightly above standards/expectations
    MEETS THE STANDARD
-   Slightly below standards/expectations
--  Clearly below standards/expectations
--- Significantly below standards/expectations

Scoring: Risk Assessment Rating
Low: Offeror's proposed approach is likely to cause minimal schedule disruption, cost increase, or performance degradation. It will require a low level of contractor emphasis and government monitoring to overcome difficulties.
Medium-Low: Offeror's proposed approach is likely to cause minimal to moderate schedule disruption, cost increase, or performance degradation. It will require a low to medium level of contractor emphasis and government monitoring to overcome difficulties.
Medium-High: Offeror's proposed approach is likely to cause moderate to significant schedule disruption, cost increase, or performance degradation. It will require a medium to high level of contractor emphasis and government monitoring to overcome difficulties.
High: Offeror's proposed approach is likely to cause significant schedule disruption, cost increase, or performance degradation. It will require significant contractor emphasis and government monitoring to overcome difficulties.

Standards, Scoring, and Risk Rating
Step 1: Identify strengths & weaknesses.
Step 2: Determine a score based on strengths & weaknesses, using the standards scale (+++ significantly above standards through --- significantly below standards) or adjectival ratings (Exceptional, Very Good, Acceptable, Marginal, Unacceptable).
Step 3: Determine a risk rating (High, Medium, Low).

Scoring With Weighted Evaluation Factors
Diagram: the Non-Cost score is built from weighted factors, for example Technical 50%, Management 30%, and Past Performance 20%; each factor is further divided into weighted sub-factors and items (e.g., 75%/25%, 60%/40%, 50%/25%/25%) that roll up to the factor score.
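A rough sketch of the roll-up such a weighting tree implies: sub-factor scores are combined into factor scores, and factor scores into a non-cost score, each by weighted average. The 50/30/20 factor split follows the slide; the sub-factor weights and the 0-10 scores are invented for illustration.

```python
# Sketch of rolling weighted sub-factor scores up to a non-cost score.
# Factor weights (Technical 50%, Management 30%, Past Performance 20%) follow
# the slide; sub-factor weights and the 0-10 scores are assumed examples.

def weighted_average(items):
    """items: list of (weight, score) pairs whose weights sum to 1.0."""
    return sum(w * s for w, s in items)

technical = weighted_average([(0.75, 8.0), (0.25, 6.0)])   # 7.5
management = weighted_average([(0.60, 9.0), (0.40, 7.0)])  # 8.2
past_perf = 8.5                                            # scored directly

non_cost = weighted_average([(0.50, technical),
                             (0.30, management),
                             (0.20, past_perf)])
print(round(non_cost, 2))  # 7.91
```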

Scoring Proposals: Individual Evaluations
– Evaluators individually assess each proposal in accordance with the RFP's stated evaluation factors.
– Some agencies use standards to help with scoring.
– Evaluators must support adjectival, color, or numerical ratings with narrative statements that explain or justify the given score.

Scoring Proposals: Consensus (1)
– After individual evaluations are complete, evaluators meet as a group/team to determine a consensus score.
– The consensus score reflects the team's agreement on a proposal's strengths, weaknesses, and risks, and the resulting score for a specific evaluation factor.

Scoring Proposals: Consensus (2)
No simple process exists to help the evaluators reach a consensus rating. Evaluators must
– Assess the collective impact of evaluation sub-factors on each factor,
– Assess all of the evaluation factors as they relate to each other under the weighting methodology identified in the solicitation, and
– Document dissenting opinions.

Case Study: Using Different Adjectives
The RFP's adjectival rating scale:
– Exceptional
– Very Good
– Satisfactory
– Marginal
– Unsatisfactory
The rating system actually used had three adjectives with a corresponding point system:
– Exceptional (3)
– Average (2)
– Marginal (1)
Trajen, Inc., U.S. Government Accountability Office, B-296334; B-296334.2; B-296334.3; B-296334.4 (July 29, 2005).

Case Study: Scoring Methodology Protest (1)
– The RFP stated the agency intended to make award on a "best value" basis.
– Technical factors were to be given greater consideration than price.
– If the technical proposals were determined to be essentially equal, price would govern the agency's source selection decision.
Gap Solutions, Inc., B-310564 (January 4, 2008).

Case Study: Scoring Methodology Protest (2)
– The agency's proposal scoring scheme was flawed in that it essentially "negated" the technical distinctions among the proposals.
– The effect was to artificially narrow the range of possible total scores, making the numeric scores all appear technically equal.
– The award decision was therefore based on low price rather than on the technical considerations that should have received greater weight under the RFP's terms.
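To see why a compressed scale can "negate" technical distinctions, consider a hypothetical illustration; the scoring scheme, scores, and prices below are invented for this sketch and are not taken from the protest decision.

```python
# Hypothetical illustration (numbers invented, not from the protest decision):
# a scheme that awards most points just for acceptability compresses real
# technical differences into a narrow band, so price ends up deciding.

def compressed_score(raw_technical: float) -> float:
    """Assumed flawed scheme: 90 points for being acceptable, with only the
    remaining 10 points scaled by how much the proposal exceeds the minimum
    (raw_technical is a 0-100 evaluator assessment)."""
    return 90 + 10 * (raw_technical / 100)

offeror_a = {"raw": 95, "price": 10.5e6}   # clearly stronger technically
offeror_b = {"raw": 70, "price": 9.8e6}    # weaker, but cheaper

for name, o in (("A", offeror_a), ("B", offeror_b)):
    print(name, compressed_score(o["raw"]))  # A: 99.5, B: 97.0

# A 25-point raw difference shrinks to 2.5 points, so the proposals look
# "essentially equal" and the lower price drives the award, even though the
# RFP said technical factors were more important than price.
```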

Summary
– Proposal scoring is a way to summarize evaluation results to assist the Source Selection Authority's decision making.
– The particular scoring method an agency uses is less important than the consistency with which the evaluators apply the selected method to all competing proposals.

Questions?
Thank you for attending this session!
