ASTM E2936 Standard Guide for Contractor Self Assessment for U.S. Government Property Management Systems
Brandon Kriner, CPPM CF, Harris Corporation
Background
- FAR requires contractors to self-assess, but no specific guidance is provided in the clause
- There is much confusion and debate about Contractor Self Assessment (CSA):
  - What constitutes an "acceptable" CSA?
  - What should be the role of CSA in a Government audit or PMSA?
- The guide is intended to compile best practices from both a contractor and a Government point of view. It is not a step-by-step manual, because every environment is different (hence a standard "guide" and not a "practice")
Task Group
- Brandon Kriner, Harris Corporation (Lead)
- Alex Barenblitt, BSL Company
- Cinda Brockman, A2B Tracking
- Eric Fassett, Northrop Grumman Corporation
- Bill Franklin, Noblis, Inc.
- Doug Goetz, GP Consultants
- Mike Showers, NASA
- Rick Shultz, JHU Applied Physics Laboratory
- Tamra Zahn, The Boeing Company
Scope
- The standard is intentionally narrow in focus: it specifically addresses the requirements of FAR
- Many of the concepts could be applied to other types of asset management self-assessments, audits, surveillances, reviews, etc.
- The FAR Government Property focus is intended to directly address the widespread challenges associated with CSA implementation and acceptance since 2007
Significance and Use
- The guide provides a foundation for a minimum effective internal assessment of a contractor's Government property management system
- Your CSA may go farther or deeper; it is probably already aligned with the standard!
- The guide is a menu, not a recipe: a contractor may use all or parts of the guide in accordance with its procedures and operating environment
- Self assessment should be used to identify and correct potential deficiencies or areas of risk independent of a Government audit
- It also provides insight into continuous improvement opportunities
Significance and Use
- Self assessments are not a replacement for PMSAs or other external audits; they are distinct requirements
- A contractor is required to self-assess in order to proactively address issues
- The Government has a fiduciary responsibility to ensure that its property is adequately managed by contractors
- The results of CSAs should be made available to Government auditors for potential inclusion in their audits or reports
- Self disclosure of potential issues (and evidence of corrective action) may preempt CARs (C. Williams letter, 9/13)
- Self disclosure of areas of strength may minimize the time spent on a PMSA

In a September 5, 2013 letter to AIA VP Acquisition Policy William Greenwalt, DCMA Director Charlie Williams stated: "We recognize that if contractors self-identify non-compliances and take timely and appropriate action to correct the non-compliances, DCMA CARs are normally not needed. In our view, that should be the ultimate goal."
Independence and Objectivity
- To the extent possible, a CSA should provide a level of objectivity similar to that of a PMSA
- Individuals performing the CSA should be independent when possible
- The contractor's procedures should address independence
- People should not be assessing their own work
  - Example: the calibration lab performs calibration; Property Management assesses the percentage of assets calibrated on schedule
Procedures
- Contractors should clearly describe their CSA in their procedures
- The following concepts should be addressed:
  - Methodology used (examples):
    - Government PMSA criteria
    - ASTM E2452
    - EMPM
    - ILPs/Customary Commercial Practices
    - Balanced Scorecard/CMMI
- In general, the closer you are to the Government's methodology, the greater the chance of your CSA results being incorporated
Procedures
- The processes/outcomes to be reviewed
  - These may be the "10 outcomes/15 functions" or other processes/contractual requirements
- The organizational scope of the CSA
  - Business units, sites, and other sub-divisions to which the assessment applies
Procedures
- A "defect" should be defined
- The differences between minor, major, and critical defects should be defined in the context of the business environment
- Corrective action requirements for defects should be established
Defects: E2936 Definitions
- Defect, n. – a condition in which a functional segment, a sample item, or a sample item element of a property control system contains one or more deficiencies (E2315)
- Definitions are loosely aligned with DFARS Contractor Business System Deficiencies (aka the BSR)
Defects: E2936 Definitions
- Minor defect – a defect that is administrative in nature, non-systemic, and would have no material outcome for the control of Government property
- Major defect – a significant, but not systemic, defect that may affect the control of Government property, possibly increasing the risk to the Government
- Critical defect – a significant and systemic defect that would have a material effect on contract performance or cause concern for the reliability of the information provided by the property management system
- Materiality – the magnitude of an omission or misstatement of accounting data
Defects: Examples
- Minor defect – characters transposed in a model number; an asset location one room off (no material impact on contract performance or management information)
- Major defect – a $500,000 test set has been improperly stored, leading to damage (significant, but not systemic)
- Critical defect – the contractor automatically disposes of all Government property as scrap at its discretion, without gaining Government approval or reporting to the Government (significant and systemic)
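The three severity levels above can be sketched as a simple decision rule. This is a minimal illustration only, not part of E2936: it reduces the definitions to two yes/no attributes (systemic, material impact on control of Government property) and ignores the judgment a real classification requires.

```python
def classify_defect(systemic: bool, material_impact: bool) -> str:
    """Illustrative-only mapping of the E2936 severity levels.

    A real determination involves significance and materiality judgment;
    this sketch captures only the systemic/material distinction.
    """
    if systemic and material_impact:
        return "critical"   # significant and systemic
    if material_impact:
        return "major"      # significant but not systemic
    return "minor"          # administrative, no material outcome

print(classify_defect(False, False))  # transposed model number -> minor
print(classify_defect(False, True))   # damaged $500,000 test set -> major
print(classify_defect(True, True))    # unauthorized disposal -> critical
```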
Risk Assessment
- Contractors should perform a risk assessment in planning a CSA
- The frequency of a CSA, or parts of a CSA, should be based upon the risk assessment
- Direct resources to areas that present risk; don't overdo low-risk areas
Process Tests
- Contractors should establish process tests that provide sufficient evidence to credibly evaluate the effectiveness and risk level of the property system
- Process tests may be quantitative (metrics, statistical sampling) or qualitative
- Supporting documentation is critical
- The standard suggests processes to test
Population Selection
- Sample data are observed in order to estimate attributes of the entire population
- Transaction-based population: driven by actions over a set period of time, e.g., all receipts of GP over the past year
- Attribute-based population: based on common characteristics, e.g., the condition of storage areas
Population Selection
- Processes may have more than one population:
  - Acquisition of GFP vs. CAP
  - Sensitive property
  - ABC or low-risk property (ASTM E2811)
Population Selection
- Some populations may be used to test multiple processes:
  - Records and Utilization
  - Acquisition and Receiving
Sampling
- Statistical sampling: the use of random statistical tests to estimate the characteristics of a complete population
- Judgment sampling: non-random and non-probability; the auditor selects items based on knowledge and experience
- Purposive sampling: selecting specific items based on knowledge of a situation
Sampling
- Contractors must identify the statistical sampling plan to be used:
  - DCMA 90%, 95%, 97% double sampling plans
  - ASTM E2234 AQL 6.5 single or double sampling plans
  - Others
- The statistical sampling plan used should be determined by risk and other factors
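Whichever plan is chosen, a single sampling plan reduces to one decision: accept the population if the defects found in the sample do not exceed the plan's acceptance number. A minimal sketch, with an illustrative acceptance number (the real n and c values come from the chosen table, e.g., ASTM E2234 at the selected AQL):

```python
def accept_population(defects_found: int, acceptance_number: int) -> bool:
    """Single sampling plan decision: accept if defects <= c.

    The acceptance number c and sample size are taken from the
    contractor's chosen sampling table; the value used below is
    illustrative only.
    """
    return defects_found <= acceptance_number

# Hypothetical example: acceptance number c = 7 for the drawn sample
print(accept_population(5, 7))  # True  -> sample passes quantitatively
print(accept_population(9, 7))  # False -> sample fails quantitatively
```

A double sampling plan simply adds a second draw when the first result falls between its accept and reject numbers.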
Sampling
- Statistical samples should be randomly generated from the population using automated tools
  - Example: Excel's Sampling tool under the Data Analysis add-in
- The standard contains several statistical sampling tables in the appendix
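Any tool that draws a uniform random sample without replacement works for this step. A minimal sketch in Python (the record IDs are hypothetical; a fixed seed is used so the draw can be reproduced for the audit file):

```python
import random

def draw_sample(population_ids, sample_size, seed=None):
    """Draw a random sample (without replacement) from a population.

    Seeding the generator lets the same sample be regenerated later,
    which supports documentation of the CSA.
    """
    rng = random.Random(seed)
    return rng.sample(list(population_ids), sample_size)

# Hypothetical population: one year of Government property receipts
receipts = [f"REC-{i:04d}" for i in range(1, 501)]
sample = draw_sample(receipts, 30, seed=42)
print(len(sample))  # 30
```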
Evaluation of Samples
- Defects should be analyzed from both a quantitative and a qualitative perspective
  - Quantitative: based on the acceptance rates in the statistical sampling program
  - Qualitative: the relative significance and materiality of the defects
- Even quantitative "failures" may be immaterial! For example, $2.48 out of a population of $500,000
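The $2.48-out-of-$500,000 point is just a rate calculation; what threshold counts as "material" is a judgment call outside this sketch:

```python
def materiality_rate(defect_value: float, population_value: float) -> float:
    """Dollar value of the defect as a fraction of the population value."""
    return defect_value / population_value

# The slide's example: a quantitative "failure" that is immaterial in dollars
rate = materiality_rate(2.48, 500_000)
print(f"{rate:.6%}")  # 0.000496% of the population value
```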
Significance
- Defined in GAGAS as the relative importance of a matter within the context in which it is being considered:
  - Magnitude in relation to the overall system
  - Nature and effect of the defect
  - Relevance and impact on overall contract performance
- GAGAS = Generally Accepted Government Auditing Standards
Significance
- Business Systems Rule definition, from DFARS: "Significant deficiency" means a shortcoming in the system that materially affects the ability of officials of the Department of Defense to rely upon information produced by the system that is needed for management purposes
- This is a high bar that should not be hastily applied
Corrective Actions and Plans
- Contractors should take corrective action to resolve issues and mitigate risk as they become known during the CSA
- The cost and burden of corrective actions should be commensurate with the significance and risk presented to the Government
Corrective Actions and Plans
- Significant risks (critical or major defects) should be addressed through a formal written CAP
  - Identify the steps taken to identify and analyze the root cause, the risk mitigation, the resources required, and the specific timeline for implementation
- Minor defects should be corrected at the lowest effective responsible level of contractor personnel
- CAP = Corrective Action Plan
Get the Standard
- Join ASTM: http://www.astm.org
- $75 for individual membership, which provides access to all 27 active standards under the E53 Committee on Asset Management
- Your organization may already have an organizational membership