1 Has the Common Criteria Delivered?
Anthony Apted / James Arnold, 26 September 2007

2 Synopsis
• Back to Basics – Why Evaluate?
• Evaluation Scheme Goals
  – US Scheme Evolution – a Case Study
  – A Sample of Other Schemes
• Evaluation Criteria Goals
  – Historical Perspective and Previous Approaches
  – Common Criteria and Mutual Recognition
• CC Metrics
  – Does activity equate to achievement?
• New Approaches
• Conclusions and Recommendations

3 Why Evaluate?
• Establish an appropriate level of assurance (“trust”) in COTS products
  – The product provides claimed/required capabilities (security functionality)
  – The product is not vulnerable to attack (relative to the level of assurance)
• Why commercial evaluation?
  – Governments unable to adequately resource an evaluation capability

4 US Scheme Evolution
• Trusted Product Evaluation Program (TPEP)
  – Established to perform evaluations against the Trusted Computer System Evaluation Criteria (TCSEC, aka the “Orange Book”)
  – Operated by NSA, using government resources
  – Over 100 products evaluated
  – TCSEC, TNI, TDI
    · Operating systems
    · Network components
    · Database systems
    · Subsystems

5 US Scheme Evolution
• Trusted Technology Assessment Program (TTAP)
  – First foray into a commercial evaluation process
  – 14 products evaluated
  – TCSEC, CC, TDI
    · Firewalls
    · Database systems
    · Operating system
    · Hardware switch

6 US Scheme Evolution
• Common Criteria Evaluation & Validation Scheme (CCEVS)
  – Objectives:
    · To meet the needs of government and industry for cost-effective evaluation of IT products
    · To encourage the formation of commercial security testing laboratories and the development of a private-sector security testing industry
    · To ensure that security evaluations of IT products are performed to consistent standards
    · To improve the availability of evaluated IT products

7 CCEVS
• CCEVS (2001–current)
  – CC & numerous validated protection profiles
    · 217 products currently evaluated in CCEVS, with 29 maintenance updates and 10 retired/archived evaluations
    · 43 PPs currently validated and 8 retired/archived PPs
    · 119 products and PPs in evaluation
[Chart: CC Evaluations in the U.S. (TTAP, CCEVS)]

8 Other Schemes – A Sample
• UK IT Security Evaluation and Certification Scheme
  – Established 1989
  – Objectives:
    · Evaluate and certify the trustworthiness of security features in IT products and systems
    · Provide a framework for international mutual recognition of such certificates
  – Commercial evaluation scheme since inception
  – ITSEC, CC
  – Over 100 products evaluated

9 Other Schemes – A Sample
• Australasian Information Security Evaluation Program (AISEP)
  – Established 1994
  – Objective:
    · Ensure the ready availability of a comprehensive list of independently assured IT products that meets the needs of Australian and New Zealand government agencies in securing their official resources
  – Commercial evaluation scheme
  – ITSEC, CC

10 Historical Criteria
• TCSEC
  – Provide users with a yardstick with which to assess the degree of trust that can be placed in computer systems for the secure processing of classified or other sensitive information
  – Provide guidance to manufacturers as to what to build into their new, widely-available trusted commercial products in order to satisfy trust requirements for sensitive applications
  – Provide a basis for specifying security requirements in acquisition specifications
• ITSEC (harmonised criteria of France, Germany, the Netherlands, and the UK)
• Canadian Criteria
• Federal Criteria

11 Common Criteria
• CC Objectives
  – Permit comparability between results of evaluations
  – Establish a level of confidence that the product meets the claimed requirements
  – Provide a useful guide for development, evaluation and/or procurement
  – Address confidentiality, integrity, and availability
• As presented in Part 1 of CCv2.1 and CCv2.3, the overall objectives remain essentially unchanged. (Ref: CCv2.X Part 1 Scope; CCv3.1 Part 1 Introduction)

12 International Common Criteria Community
• Common Criteria Recognition Arrangement
  – Objectives:
    · To ensure that evaluations of Information Technology (IT) products and protection profiles are performed to high and consistent standards and are seen to contribute significantly to confidence in the security of those products and profiles
    · To improve the availability of evaluated, security-enhanced IT products and protection profiles
    · To eliminate the burden of duplicating evaluations of IT products and protection profiles
    · To continuously improve the efficiency and cost-effectiveness of the evaluation and certification/validation process for IT products and protection profiles

13 International Common Criteria Community
• Certificate Authorizing Members
  – Australia and New Zealand
  – Canada
  – France
  – Germany
  – Japan
  – Republic of Korea
  – Netherlands
  – Norway
  – Spain
  – United Kingdom
  – United States
• Certificate Consuming Members
  – Austria
  – Czech Republic
  – Denmark
  – Finland
  – Greece
  – Hungary
  – India
  – Israel
  – Italy
  – Sweden
  – Turkey
[Figure: Common Criteria Recognition Arrangement Participants]

14 Analysis of Scheme Objectives
• Primary objective of evaluation schemes essentially the same
  – Ensure availability of evaluated IT products
• National schemes may have specific national objectives
  – Support national government agencies
  – Support national industry
• How well does the CC meet these objectives?
  – Availability of evaluated IT products increased by mutual recognition between schemes
  – CC and CEM support consistent and comparable evaluations

15 CC Metrics
• Raw numbers indicate the CC is a success
  – Increasing numbers of evaluated products and PPs
  – Growth in the number of CCRA participants
  – Growth in the number of EAL4 evaluations
  – Evaluations outstripping validation resources in some schemes
[Chart: CC Evaluations in the World]

16 Is activity the same as achievement?
• However, the usual complaints are still made:
  – Evaluations take too long
  – Evaluations cost too much
  – Evaluation results are not meaningful because the evaluated configuration is unrealistic
• Are these CC issues or scheme issues?
  – Do all CC assurance requirements deliver value?
  – Do scheme requirements impose unnecessary overhead?
  – Have product developers and consumers been properly educated?

17 CC v3.1
• Appears to have “missed the boat”
• No significant changes to Part 2 (so previous problems remain)
• Part 3 changes have not improved the value of many requirements

18 CC v4.0
• “Issues for CC v4.0” is being presented at the 8th ICCC
  – Not clear (at time of writing) whether this is a report on current developments or a proposal for future development

19 Other New Approaches
• COTS Development Assurance (CDA)
  – CC Issues
    · By many measures, the CC is a big success, but...
    · Product consumers and vendors see significant issues
    · Product consumers want more relevant results
      – Not seen as predicting real-world assurance in operation
      – Difficult specialist vocabulary
    · Doesn’t reflect current commercial development practice
      – Focus on design-level issues and a waterfall development model
      – Focus on security features, not assurance technology
      – Tied to a static product definition
  – Goals
    · Assess assurance in operation
      – “Confidence that an IT product will operate as intended, throughout its reasonably anticipated life cycle, even in the presence of adversarial activity”
    · Provide meaningful assurance information to certifiers, accreditors, ISSEs, and system architects/designers
    · Evaluate real products as they are delivered and used in the marketplace
    · Evaluate in a predictable and cost-effective manner
    · Enable qualitative product assurance comparisons
    · Provide a framework equivalent to current Common Criteria activities

20 Conclusions and Recommendations
• The CC and CEM are tools to assist evaluation schemes in achieving their objectives
• If those objectives are not being adequately met, is it because:
  – The objectives are not clearly defined and articulated?
  – The tools are not being used correctly?
  – The tools are not the correct tools to use?

21 Contacts
• Anthony Apted, Senior Evaluator, SAIC Accredited Testing & Evaluation Labs
• James Arnold, AVP/Technical Director, SAIC Accredited Testing & Evaluation Labs