PROPOSED CHANGES TO RATING METHODS


PROPOSED CHANGES TO RATING METHODS Compatibility, Workload, Past Performance, and Category Weighting February 11, 2016

Purpose
- Provide the consultant community and DOTD practitioners with an update on the progress of the DOTD/ACEC initiative
- Obtain feedback concerning the specific recommendations for implementation

Background
- In June 2015, the DOTD Chief Engineer and the ACEC Board of Directors signed a document approving recommendations for implementation
- 35 of 57 recommendations in 6 categories were approved for implementation (see next chart)
- Seven recommendations were pending and not yet ready
- Subcommittee A (Procurement) continued to work, holding many additional meetings (June–December 2015)
- Presented to the steering committee on January 28, 2016 and approved to move forward with this briefing

DOTD/ACEC Initiative Status

Category (values as given: Total, Approved, Pending, Tabled, Not Approved):
- A. Procurement: 22, 6, 7, 2
- B. Contracting: 5, 3, 1
- C. Contract Administration
- D. Quality Assurance/Control
- E. Claims
- F. Contract Provisions: 18
- Overall: 57, 35, 10, 4, 8

DOTD-ACEC/L Round Table, Group A: Procurement
- Committee of DOTD and consultant representatives
- Focus on the consultant selection process
- Actively working for over 3 years

Committee members (DOTD and ACEC/L): Ed Wedge, Simone Ardoin, Bob Boagni, Charles Nickel, Ray Mumphrey, Bob Basinger, Debbie Guest, Masood Rasoulian, Gordon Nelson, Paul Vaught, Lawrence Hamm, Buddy Porta, Heather Huval, Connie Porter, Nick Ferlito, Jesse Rauser, Suzanne McCain, Hunter Lancaster

Goals
Evaluate the existing consultant selection process and look for opportunities for improvement by:
- Ensuring that the process appropriately balances the desire to provide as many firms as possible with the opportunity to perform the work with the need to ensure that the most technically experienced firms are selected
- Increasing transparency by simplifying the process and making it less subjective and more consistent

Key Issues
- The Compatibility and Workload scoring methodologies are unnecessarily complex
- Determine an alternate approach to evaluate those categories
- Create and define new input variables: Project Magnitude and Firm Size Designation
- Better address how past performance scores are assigned to new firms

Key Issues
- Rating categories can be divided into “experience” and “non-experience” ratings
- “Experience” categories: Staff Experience, Firm Experience, Past Performance
- “Non-experience” categories: Compatibility, Workload, Location
- Greater variation between consultants in the “non-experience” categories leads to their over-weighting in the selection process

PROJECT MAGNITUDE

Project Magnitude
Project Magnitude is proposed to be determined by:
1. Complexity
2. Contract Time or Schedule (current solicitation)
3. Amount of the Contract (projected for all phases/stages)
4. Route Classification

Project Magnitude Criteria

Criteria | Value 1 | Value 2 | Value 3
Complexity | Simple | Medium | Complex
Contract Time | Typical | Compressed | Critical
Contract Amount | ≤ $250,000 | $250,000 to $2,500,000 | ≥ $2,500,000
Route Classification | Non-NHS Local | Non-NHS State | NHS

Project Magnitude

Project Magnitude Designation | Combined Criteria Values
Micro | 4 to 5
Small | 5 to 7
Medium | 7 to 9
Large | 9 to 11
Mega | 11 to 12
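The designation rule above can be sketched in code. This is an illustrative sketch, not DOTD's implementation: the function and variable names are invented, and since the published bands overlap at their endpoints (e.g. "4 to 5" and "5 to 7"), boundary sums are assumed to resolve to the smaller designation.

```python
# Illustrative sketch of the proposed Project Magnitude determination.
# Assumption: the four criteria values (1-3) are summed and the sum is
# mapped to a designation; boundary sums (5, 7, 9, 11) are assigned to
# the smaller band, since the published bands overlap at their endpoints.

COMPLEXITY = {"Simple": 1, "Medium": 2, "Complex": 3}
CONTRACT_TIME = {"Typical": 1, "Compressed": 2, "Critical": 3}
ROUTE = {"Non-NHS Local": 1, "Non-NHS State": 2, "NHS": 3}

def contract_amount_value(amount):
    """Value 1-3 from the Contract Amount row of the criteria table."""
    if amount <= 250_000:
        return 1
    if amount < 2_500_000:
        return 2
    return 3

def magnitude_designation(complexity, contract_time, amount, route):
    total = (COMPLEXITY[complexity] + CONTRACT_TIME[contract_time]
             + contract_amount_value(amount) + ROUTE[route])
    if total <= 5:
        return "Micro"
    if total <= 7:
        return "Small"
    if total <= 9:
        return "Medium"
    if total <= 11:
        return "Large"
    return "Mega"
```

For example, a complex, critical-schedule, $3M NHS project sums to 12 and lands in the Mega band.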

FIRM SIZE

Firm Size
The Firm Size designation is determined using the table below:

Firm Size Designation | Number of Transportation Personnel
Micro | ≤ 13
Small | 10 to 22
Medium | 20 to 33
Large | 30 to 55
Mega | ≥ 50
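As a sketch (function name is illustrative, not DOTD's), the lookup amounts to checking the personnel count against the bands smallest-first; because the published bands overlap (e.g. Micro ≤ 13 vs. Small 10 to 22), a count in an overlap region is assumed here to resolve to the smaller designation.

```python
def firm_size_designation(transportation_personnel):
    # Check bands smallest-first; overlapping counts (e.g. 10-13) resolve
    # to the smaller designation, which is an assumption of this sketch.
    n = transportation_personnel
    if n <= 13:
        return "Micro"
    if n <= 22:
        return "Small"
    if n <= 33:
        return "Medium"
    if n <= 55:
        return "Large"
    return "Mega"
```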

COMPATIBILITY RATING

Compatibility Rating
Once the project magnitude and firm size have been determined, the proposed compatibility rating* can be read from the following table.

*When a team is considered, the compatibility rating will be based on the prime consultant only.

Compatibility Rating

(Matrix of Firm Size Designation, rows: Micro, Small, Medium, Large, Mega, against Project Magnitude Designation, columns: Micro, Small, Medium, Large, Mega; cell ratings take the values 5.0, 4.0, 3.0, 2.0, and 1.0.)
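The full matrix values did not survive in this transcript, so the sketch below rests on an explicit assumption: the rating is 5.0 when the firm size designation matches the project magnitude designation and drops by one point per step of mismatch, which reproduces the listed values 5.0 through 1.0 but is not confirmed by the source.

```python
# Assumed compatibility model (NOT the published table): 5.0 on the
# diagonal, minus 1.0 per step of mismatch between the two designations.
DESIGNATION_ORDER = ["Micro", "Small", "Medium", "Large", "Mega"]

def compatibility_rating(firm_size, project_magnitude):
    distance = abs(DESIGNATION_ORDER.index(firm_size)
                   - DESIGNATION_ORDER.index(project_magnitude))
    return 5.0 - distance
```

Under this assumption a Medium firm on a Medium project rates 5.0, while a Micro firm on a Mega project rates 1.0.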

WORKLOAD RATING

Workload Rating
Once the firm size and the remaining DOTD work amount have been determined, the proposed workload rating can be read from the following table.

Workload Rating (by Remaining DOTD Work)

Firm Size Designation | Rating 5.0 | Rating 5.0 → 1.0 | Rating 1.0
Micro | ≤ $20,000 | $20,000 to $1,625,000 | ≥ $1,625,000
Small | ≤ $110,000 | $110,000 to $2,750,000 | ≥ $2,750,000
Medium | ≤ $210,000 | $210,000 to $4,125,000 | ≥ $4,125,000
Large | ≤ $320,000 | $320,000 to $6,875,000 | ≥ $6,875,000
Mega | ≤ $530,000 | $530,000 to $12,500,000 | ≥ $12,500,000
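The thresholds above can be sketched as follows. The names are illustrative, and the table's "5.0 → 1.0" middle band is interpreted here as linear interpolation between the two limits, which is an assumption the source does not spell out.

```python
# (rating-5.0 ceiling, rating-1.0 floor) per Firm Size Designation,
# taken from the Workload Rating table.
WORKLOAD_LIMITS = {
    "Micro": (20_000, 1_625_000),
    "Small": (110_000, 2_750_000),
    "Medium": (210_000, 4_125_000),
    "Large": (320_000, 6_875_000),
    "Mega": (530_000, 12_500_000),
}

def workload_rating(firm_size, remaining_dotd_work):
    low, high = WORKLOAD_LIMITS[firm_size]
    if remaining_dotd_work <= low:
        return 5.0
    if remaining_dotd_work >= high:
        return 1.0
    # "5.0 -> 1.0" between the limits; linear interpolation is assumed.
    fraction = (remaining_dotd_work - low) / (high - low)
    return 5.0 - 4.0 * fraction
```

Under the linearity assumption, a Micro firm at the midpoint of its middle band ($822,500 remaining) would rate 3.0.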

PAST PERFORMANCE RATING

Past Performance
Existing: Firms that have not received a rating for a work category will be assigned a rating equal to the lowest of the following:
- The average rating of the firms submitting
- The statewide average rating for that category as of the date the advertisement was posted

Past Performance
Proposed: Firms that have not received a rating for a work category will be assigned a rating equal to the lowest of the following:
- The average rating of the firms submitting
- The statewide average rating for that category as of the date the advertisement was posted
- The satisfactory rating of 3
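The proposed rule reduces to a minimum over three values. A minimal sketch (function and parameter names are illustrative):

```python
def default_past_performance(submitting_firm_ratings, statewide_average):
    """Proposed default rating for a firm with no rating in the work
    category: the lowest of (a) the average rating of the submitting
    firms, (b) the statewide average for the category as of the
    advertisement date, and (c) the satisfactory rating of 3."""
    avg_submitting = sum(submitting_firm_ratings) / len(submitting_firm_ratings)
    return min(avg_submitting, statewide_average, 3.0)
```

The added cap of 3 only bites when both averages exceed the satisfactory level; if the submitting firms average below 3, the new firm still inherits that lower value.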

CATEGORY WEIGHTING

Category Weighting
The score is based on six categories:
- Firm Experience
- Staff Experience
- Past Performance
- Compatibility
- Workload
- Location

Category Weighting
Three of the six categories can be thought of as “experience” based and are generally associated with technical ability to perform the work:
- Firm Experience
- Staff Experience
- Past Performance

Category Weighting
The other three categories can be thought of as “non-experience” based and are generally associated with the distribution of opportunity for work:
- Compatibility
- Workload
- Location

Category Weighting
The “experience” ratings are currently weighted as follows:
- Firm Experience: 3
- Staff Experience: 4
- Past Performance: 6
Total: 13

Category Weighting
The “non-experience” ratings are currently weighted as follows:
- Compatibility: 3
- Workload: 5
- Location: 4
Total: 12

Category Weighting
Although the “experience” categories carry slightly more total weight, the “non-experience” scores dominate the ranking, simply because they vary more from firm to firm.

Category Weighting
This is a known consequence of the current system. In some cases, such as complex projects where experience-based selection is critically important, current practice is to neutralize some of the “non-experience” scores to help alleviate this issue.

Category Weighting Based on a sample set of contracts, the “experience” scores need to be weighted more to fully compensate for the larger variations typically observed in the “non-experience” scores.

Category Weighting
A sensitivity analysis was performed on a sample set of contracts, which were scored using the proposed category weightings. The results:
- 50% of the contract selections were governed by the “experience” factors
- 75% of the time, non-experience factors did not move a firm’s ranking by more than one-third (comparing experience-only scores with the overall final scores)

Category Weighting
The category weightings are as follows:

Category | Existing | Proposed Typical | Proposed Specialty
Firm Experience | 3 | 3 | 4
Staff Experience | 4 | 4 | 5
Past Performance | 6 | 6 | 7
Compatibility | 3 | 2 | 1
Workload | 5 | 3 | 2
Location | 4 | 2 | 1
Total | 25 | 20 | 20
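The overall score under each weighting scheme is a weighted sum of the six category ratings. A sketch, assuming each category rating is on the 1.0-5.0 scale used elsewhere in the deck (the key names are illustrative, not DOTD's):

```python
# Category weights from the table above, per weighting scheme.
WEIGHTS = {
    "existing":           {"firm_exp": 3, "staff_exp": 4, "past_perf": 6,
                           "compatibility": 3, "workload": 5, "location": 4},
    "proposed_typical":   {"firm_exp": 3, "staff_exp": 4, "past_perf": 6,
                           "compatibility": 2, "workload": 3, "location": 2},
    "proposed_specialty": {"firm_exp": 4, "staff_exp": 5, "past_perf": 7,
                           "compatibility": 1, "workload": 2, "location": 1},
}

def weighted_score(category_ratings, scheme="proposed_typical"):
    """Weighted sum of the six category ratings under one scheme."""
    weights = WEIGHTS[scheme]
    return sum(weights[c] * category_ratings[c] for c in weights)
```

Note how the proposed schemes shrink the non-experience weights (from 12 of 25 to 7 of 20 typical, 4 of 20 specialty), which is the mechanism for damping their outsized influence on rankings.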

Category Weighting
It has been noted that the “experience” scores may still need to be weighted even more heavily to achieve our goal. However, because this finding is based on a relatively small sample of contracts, the proposed weightings should be used first; if after one year additional weighting is determined to be required, they can be adjusted at that time.

One-Year Measurables
- Percentage of number-1-ranked firms controlled by the “experience” categories (at least 50%)
- Firms’ final rankings not influenced by more than one-third by the non-experience score (at least 75%)

Action Plan
- Provide feedback from today’s meeting by email to Consultant Contracts, heather.huval@la.gov, by February 19, 2016
- Group A evaluates the feedback and makes final recommendations
- Submit to the Chief Engineer and Steering Committee for approval
- Goal: approve the process by the 2016 Louisiana Transportation Conference and implement by July 1, 2016

Consultant Contract Services Contact Information
- Masood Rasoulian, Contracts Administration: 225-379-1433, masood.rasoulian@la.gov
- Kathy Ward, Contracts Manager: 225-379-1893, kathy.ward@la.gov
- Wanda Crawford, Agreements/Invoicing Manager: 225-379-1406, wanda.crawford@la.gov
- Heather Huval, Advertisement Specialist: 225-379-1733, heather.huval@la.gov
- Larry Hamm, Engineer 5 - DCL: 225-379-1457, lawrence.hamm@la.gov

QUESTIONS? The latest information will be posted on the Consultant Contracts Services website under Latest News.