1 Government Scoring Plans and Rating Systems: How Agencies Score Proposals
Breakout Session #: A03
Name: Marge Rumbaugh, CPCM, Fellow
Date: Monday, July 30, 2012
Time: 11:15 am-12:30 pm

2 Overview of Topics Presented
–Review proposal evaluation principles
–Describe evaluation criteria and weighting
–Explain proposal scoring plan methods
–Illustrate how agencies weight and score proposals
–Provide lessons learned from case studies
–Summarize important points about weighting and scoring proposals

3 Proposal Evaluation Principles: The Best Value Continuum
Lowest Price Technically Acceptable (LPTA)
–Pass/Fail evaluation: acceptable or not
–Lowest price wins
Trade-off process (Best Value)
–Evaluates “shades of gray”
–Scoring system needs to convey more complex information
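To make the contrast concrete, the sketch below shows the two selection rules side by side in Python. The offeror data, field names, and scores are invented for illustration; this is not an agency formula.

```python
# Illustrative sketch of the two ends of the best value continuum.
# Offeror data, field names, and scores are invented for illustration only.

proposals = [
    {"offeror": "A", "price": 950_000, "acceptable": True,  "technical_score": 72},
    {"offeror": "B", "price": 990_000, "acceptable": True,  "technical_score": 90},
    {"offeror": "C", "price": 900_000, "acceptable": False, "technical_score": 55},
]

def lpta_award(proposals):
    """Lowest Price Technically Acceptable: screen pass/fail, then take the lowest price."""
    acceptable = [p for p in proposals if p["acceptable"]]
    return min(acceptable, key=lambda p: p["price"])

def tradeoff_ranking(proposals):
    """Trade-off (best value): rank by technical merit so the decision maker can weigh
    whether a higher-rated proposal is worth its price premium."""
    return sorted(proposals, key=lambda p: (-p["technical_score"], p["price"]))

print(lpta_award(proposals)["offeror"])                      # A: lowest-priced acceptable offer
print([p["offeror"] for p in tradeoff_ranking(proposals)])   # ['B', 'A', 'C']: B leads on merit
```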

4 Evaluation Criteria: What Gets Evaluated?
Price or Cost
Quality
–Typically Technical/Management
Past Performance: acquisitions over the Simplified Acquisition Threshold ($150,000)
Small disadvantaged business concern participation: acquisitions expected to exceed $650,000, or $1,500,000 for construction
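As a rough sketch of the dollar thresholds quoted above, a hypothetical helper might decide which criteria apply to a given acquisition. This is illustrative only, not a FAR implementation.

```python
# Hypothetical helper illustrating the dollar thresholds quoted on this slide;
# not a FAR implementation.

def required_evaluation_criteria(estimated_value, construction=False):
    criteria = ["Price or Cost", "Quality (Technical/Management)"]
    if estimated_value > 150_000:            # Simplified Acquisition Threshold cited above
        criteria.append("Past Performance")
    sdb_threshold = 1_500_000 if construction else 650_000
    if estimated_value > sdb_threshold:
        criteria.append("Small Disadvantaged Business participation")
    return criteria

print(required_evaluation_criteria(2_000_000))   # all four criteria
print(required_evaluation_criteria(100_000))     # Price or Cost and Quality only
```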

5 Evaluation Criteria: State Relative Order of Importance
The non-cost factors, when combined, are stated as:
–Significantly more important than cost or price,
–Approximately equal to cost or price, or
–Significantly less important than cost or price.
FAR

6 Evaluation Criteria Weighting Example
Source Selection Decision                        | Non-Cost | Cost
Approximately equal to cost or price             | 55%      | 45%
Significantly more important than cost or price  | 80%      | 20%
Significantly less important than cost or price  | 20%      | 80%
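A worked example of how such a weight split changes the combined result. The offeror scores (90 non-cost, 60 cost) are invented, and combining cost into a single numeric total is just one possible approach.

```python
# Invented scores showing how the weight split above shifts the combined result.

def total_score(non_cost_score, cost_score, non_cost_weight, cost_weight):
    """Combine normalized (0-100) non-cost and cost scores using the stated weights."""
    return non_cost_score * non_cost_weight + cost_score * cost_weight

# Non-cost significantly more important than cost or price (80% / 20%)
print(total_score(90, 60, non_cost_weight=0.80, cost_weight=0.20))  # 84.0
# Non-cost significantly less important than cost or price (20% / 80%)
print(total_score(90, 60, non_cost_weight=0.20, cost_weight=0.80))  # 66.0
```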

7 Evaluation Criteria Structure Example
Basis of Award
–Cost
–Non-Cost
  –Technical
  –Management
  –Past Performance

8 Non-Cost Evaluation Criteria Organization
1.0 Area: Management (broad categories that define the non-cost criteria at the highest level)
1.1 Factor: Personnel (divides the area into more detailed, specific parts)
  –Sub-Factor: Personnel Qualifications
  –Sub-Factor: Personnel Retention
Standard: the minimum level of compliance for specific factors and sub-factors

9 Proposal Scoring Method: What Is It?
A scoring method or rating system uses a scale of words, colors, numbers, or other indicators to denote the degree to which proposals meet the standards for the non-cost evaluation factors. Some commonly used scoring methods or rating systems are:
–Adjectival
–Color coding
–Numerical
–Combination

10 Adjectival Scoring Method (1)
Outstanding: A proposal that satisfies all of the agency’s requirements, with extensive detail indicating a feasible approach & a thorough understanding of the problems. The proposal has numerous significant strengths that are not offset by weaknesses. The proposal has an overall low degree of risk.
Good: A proposal that satisfies all of the agency’s requirements, with adequate detail of a feasible approach & an understanding of the problems. The proposal has some significant strengths or numerous minor strengths that are not offset by weaknesses. The proposal has an overall low to moderate degree of risk.

11 Adjectival Scoring Method (2)
Acceptable: A proposal that satisfies all of the agency’s requirements, with minimal detail indicating a feasible approach and a minimal understanding of the problems. The proposal has an overall moderate to high degree of risk.
Marginal: A proposal that satisfies all of the agency’s requirements, with minimal detail indicating a feasible approach and a minimal understanding of the problem. The proposal has an overall high degree of risk.

12 Adjectival Scoring Method (3)
Unacceptable: A proposal that contains at least one major error, omission, or deficiency that indicates a lack of understanding of the problems. The approach cannot be expected to meet requirements or involves a very high risk. None of these conditions can be corrected without a major rewrite or proposal revision.

13 Color Scoring of Strengths & Weaknesses
Color | Strengths | Weaknesses
Blue  | The proposal has numerous strengths. | Weaknesses are insignificant & have no apparent impact on the program.
Green | Some strengths exist & the strengths clearly offset weaknesses. | A few weaknesses exist; they are correctable with minimal government oversight or direction.

14 Color Scoring of Strengths & Weaknesses (continued)
Color  | Strengths | Weaknesses
Yellow | Few strengths exist & the strengths do not offset the weaknesses. | Substantial weaknesses exist & they are correctable with some government oversight & direction.
Red    | There are no beneficial strengths. | Numerous weaknesses exist that are so significant that a proposal rewrite is not feasible within a suitable timeframe.

15 Color & Adjective Combination Method
Blue, Exceptional: Offeror’s proposal has an exceptional understanding of the acquisition’s goals.
Teal, Very Good: Offeror’s proposal has a very good understanding of the acquisition’s goals.
Green, Acceptable: Offeror’s proposal has an acceptable understanding of the acquisition’s goals.
Yellow, Marginal: Offeror’s proposal has a fair understanding of the acquisition’s goals.
Red, Unacceptable: Offeror’s proposal has an unacceptable understanding of the acquisition’s goals.

16 Numerical Scoring Method
Score 0: The factor is not addressed, or is addressed in a way that is totally deficient and without merit.
Score 1: There are deficiencies or weaknesses that can be corrected only by significant changes to the proposal. The factor is addressed so minimally or vaguely that there are widespread information gaps. The technical evaluation team has serious concerns about the offeror’s ability to perform the required work.

17 Numerical Scoring Method
Score 2: Information is incomplete, unclear, or indicates an inadequate approach to, or understanding of, the factor. There is a question about the offeror’s ability to perform satisfactorily.
Score 3: Meets the specifications and requirements. The offeror could perform & meet the government’s minimum requirements.

18 Numerical Scoring Method
Score 4: The proposal has some superior features. Information provided is generally clear, and demonstrates an acceptable ability to accomplish the technical requirements, with the possibility of more than adequate performance.
Score 5: The response to the factor is superior in most features.

19 Numerical & Adjectival Method
Each numeric score range on the slide corresponds to an adjectival rating:
Excellent: Offeror’s proposal has an exceptional understanding of the acquisition’s goals.
Very Good: Offeror’s proposal has a very good understanding of the acquisition’s goals.
Acceptable: Offeror’s proposal demonstrates a good understanding of the acquisition’s goals.
Marginal: Offeror’s proposal has very little understanding of the acquisition’s goals.
Unacceptable: Offeror’s proposal demonstrates a poor understanding of the acquisition’s goals.
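The numeric score ranges from the original slide are not recoverable here, so the bands below are hypothetical; the sketch only shows the general idea of converting a numeric score into an adjectival rating.

```python
# Hypothetical score bands (the actual ranges on the slide are not recoverable);
# illustrates mapping a numeric score to an adjectival rating.

RATING_BANDS = [
    (90, "Excellent"),
    (80, "Very Good"),
    (70, "Acceptable"),
    (60, "Marginal"),
    (0,  "Unacceptable"),
]

def adjectival_rating(numeric_score):
    for floor, rating in RATING_BANDS:
        if numeric_score >= floor:
            return rating
    return "Unacceptable"

print(adjectival_rating(85))   # Very Good
print(adjectival_rating(40))   # Unacceptable
```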

20 Scoring With Evaluation Standards
Standards are the baseline against which the agency evaluates offerors’ proposals to determine their acceptability and value.
A standard should specify a target performance level for the proposal.
Standards require evaluators to evaluate proposals against uniform objectives rather than against other proposals.

21 Scoring With Evaluation Standards
Quantitative Standard
–Miles per hour
–Dollars per pound
–Lines of code
–Years of experience
–Computer processing speed
Qualitative Standard
–Provides an acceptable technical or management solution/approach, which meets requirements of the …
–Provides adequate and appropriate personnel skills
–Provides an acceptable level of experience/knowledge

22 Scoring With Evaluation Standards: Illustration
+ + + Significantly above standards/expectations
+ +   Clearly above standards/expectations
+     Slightly above standards/expectations
MEETS THE STANDARD
-     Slightly below standards/expectations
- -   Clearly below standards/expectations
- - - Significantly below standards/expectations
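A sketch of applying this scale to a quantitative standard. The percentage bands that separate “slightly,” “clearly,” and “significantly” are assumptions, since the slide does not define them.

```python
# Illustrative only: the percentage bands separating "slightly", "clearly", and
# "significantly" are assumptions; the slide does not define them.

def rate_against_standard(proposed, standard, higher_is_better=True):
    """Return a +/- rating for a quantitative standard (e.g., processing speed)."""
    deviation = (proposed - standard) / standard
    if not higher_is_better:
        deviation = -deviation
    if deviation >= 0.30:
        return "+ + +"            # significantly above the standard
    if deviation >= 0.15:
        return "+ +"              # clearly above the standard
    if deviation > 0.0:
        return "+"                # slightly above the standard
    if deviation == 0.0:
        return "MEETS THE STANDARD"
    if deviation > -0.15:
        return "-"                # slightly below the standard
    if deviation > -0.30:
        return "- -"              # clearly below the standard
    return "- - -"                # significantly below the standard

print(rate_against_standard(proposed=120, standard=100))  # + +  (20% above)
print(rate_against_standard(proposed=100, standard=100))  # MEETS THE STANDARD
```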

23 Scoring: Risk Assessment Rating
Low: Offeror’s proposed approach is likely to cause minimal schedule disruption, cost increase, or performance degradation. It will require a low level of contractor emphasis & government monitoring to overcome difficulties.
Medium-Low: Offeror’s proposed approach is likely to cause minimal to moderate schedule disruption, cost increase, or performance degradation. It will require a low to medium level of contractor emphasis & government monitoring to overcome difficulties.
Medium-High: Offeror’s proposed approach is likely to cause moderate to significant schedule disruption, cost increase, or performance degradation. It will require a medium to high level of contractor emphasis & government monitoring to overcome difficulties.
High: Offeror’s proposed approach is likely to cause significant schedule disruption, cost increase, or performance degradation. It will require significant contractor emphasis & government monitoring to overcome difficulties.

24 Standards, Scoring, and Risk Rating
Process: identify strengths & weaknesses; determine a score based on strengths & weaknesses; determine a risk rating.
Score against standards:
+ + + Significantly above standards
+ +   Clearly above standards
+     Slightly above standards
MEETS STANDARD
-     Slightly below standards
- -   Clearly below standards
- - - Significantly below standards
Adjectival ratings: Exceptional, Very Good, Acceptable, Marginal, Unacceptable
Risk ratings: High, Medium, Low

25 Scoring With Weighted Evaluation Factors
[Slide graphic: a weighting tree in which the Non-Cost criterion is divided into Technical, Management, and Past Performance factors, and each factor is further divided into weighted factors, sub-factors, and items (weights such as 50%, 30%, 20%, 75%/25%, and 60%/40%).]
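A minimal sketch of how scores roll up through such a weighting tree. The factor names, weights, and scores below are examples, not the exact values from the slide graphic.

```python
# Example weighting tree roll-up; weights and scores are illustrative, not the
# slide's exact values. Each sub-factor holds (weight, score).

tree = {
    "Technical":        {"weight": 0.50, "subs": {"Approach": (0.75, 88), "Staffing": (0.25, 70)}},
    "Management":       {"weight": 0.30, "subs": {"Plan": (0.60, 92), "Key Personnel": (0.40, 80)}},
    "Past Performance": {"weight": 0.20, "subs": {"Relevance": (0.50, 85), "Quality": (0.50, 90)}},
}

def rolled_up_score(tree):
    """Weight each sub-factor score into its factor, then each factor into the non-cost total."""
    total = 0.0
    for factor in tree.values():
        factor_score = sum(weight * score for weight, score in factor["subs"].values())
        total += factor["weight"] * factor_score
    return total

print(round(rolled_up_score(tree), 1))   # 85.4
```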

26 Scoring Proposals: Individual Evaluations
Evaluators individually assess each proposal in accordance with the RFP’s stated evaluation factors.
Some agencies use standards to help with scoring.
Evaluators must support adjectival, color, or numerical ratings with narrative statements that explain or justify the given score.

27 Scoring Proposals: Consensus
After individual evaluations are complete, evaluators meet as a group/team to determine a consensus score.
The consensus score represents the team’s agreement on the strengths, weaknesses, and risks, and the resulting score for a specific evaluation factor.

28 Scoring Proposals: Consensus
No simple process exists to help the evaluators reach a consensus rating. Evaluators must:
–Assess the collective impact of evaluation subfactors on each factor,
–Assess all of the evaluation factors as they relate to each other under the weighting methodology identified in the solicitation, and
–Document dissenting opinions.

29 Case Study: Using Different Adjectives
The RFP’s adjectival rating scale:
–Exceptional
–Very good
–Satisfactory
–Marginal
–Unsatisfactory
The actual rating system used 3 adjectives with a corresponding point system:
–Exceptional (3)
–Average (2)
–Marginal (1)
Trajen, Inc., U.S. Government Accountability Office, B ; B ; B ; B (July 29, 2005).

30 Case Study: Scoring Methodology Protest
The RFP stated the agency intended to make award on a “best value” basis.
Technical factors would be given greater consideration than price.
If the technical proposals were determined to be essentially equal, price would govern the agency’s source selection decision.
Gap Solutions, Inc., B , January 4, 2008.

31 Case Study: Scoring Methodology Protest
The agency’s proposal scoring scheme was flawed in that it essentially “negated” the technical distinctions among the proposals.
The effect was to artificially narrow the range of possible total scores, making the proposals all appear essentially equal on technical merit.
The award decision was therefore based on low price rather than on the technical considerations that should have received more weight under the RFP’s terms.
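An invented numerical illustration of the effect described in the protest: a compressed technical scale erases the distinction the RFP said should dominate, leaving price to decide.

```python
# Invented numbers, for illustration of the protest finding only.

offerors = {"X": {"tech_merit": 95, "price": 1_050_000},
            "Y": {"tech_merit": 78, "price": 1_000_000}}

def compressed(tech_merit):
    """A coarse 3-point scale collapses 78 and 95 into the same band."""
    return 3 if tech_merit >= 70 else (2 if tech_merit >= 50 else 1)

for name, data in offerors.items():
    print(name, "full-scale:", data["tech_merit"], "compressed:", compressed(data["tech_merit"]))
# Both offerors score 3 on the compressed scale, so they look "technically equal"
# and the lower-priced Y wins, despite X's clear technical advantage.
```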

32 Summary
Proposal scoring is a way to summarize evaluation results to assist the Source Selection Authority’s decision making.
The particular scoring method an agency uses is less important than the consistency with which the evaluators apply the selected method to all competing proposals.

33 Questions? Thank you for attending this session!