Dependability benchmarking for transactional and web systems. Henrique Madeira, University of Coimbra, DEI-CISUC, Coimbra, Portugal.

Presentation transcript:

Slide 1: Dependability benchmarking for transactional and web systems. Henrique Madeira, University of Coimbra, DEI-CISUC, Coimbra, Portugal

Slide 2: Ingredients of a recipe to "bake" a dependability benchmark
- Measures
- Workload
- Faultload
- Procedure and rules (how to cook the thing)
Together these make up the dependability benchmark specification, which can be document-based only, or a document plus programs and tools.
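To make the ingredients concrete, here is a minimal sketch in Python; the class and field names are purely illustrative, not a standard schema from the talk.

```python
# Minimal illustrative sketch of the "ingredients" listed above.
# The class and field names are hypothetical, not a standard schema.
from dataclasses import dataclass

@dataclass
class DependabilityBenchmark:
    measures: list            # what is reported, e.g. ["tpmC", "Tf", "AvtS"]
    workload: str             # what the system does during the run
    faultload: list           # what is injected while the workload runs
    procedure_and_rules: str  # how to "cook the thing" and report results
    distribution: str         # "document only" or "document + programs/tools"

if __name__ == "__main__":
    spec = DependabilityBenchmark(
        measures=["tpmC", "Tf", "AvtS", "AvtC", "Ne"],
        workload="TPC-C transactions",
        faultload=["operator faults", "software faults", "high-level hardware failures"],
        procedure_and_rules="two phases with injection slots; reporting rules",
        distribution="document structured in clauses",
    )
    print(spec)
```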

Slide 3: Benchmark properties
- Representativeness
- Portability
- Repeatability
- Scalability
- Non-intrusiveness
- Easy to use
- Easy to understand

Slide 4: Benchmark properties (continued)
- Representativeness, portability, repeatability, scalability, non-intrusiveness, ease of use, ease of understanding
- A benchmark is always an abstraction of the real world! It is an imperfect and incomplete view of the world.
- In practice, what matters is usefulness (a benchmark is useful if it helps improve things) and agreement.

Slide 5: The very nature of a benchmark
- Compare components, systems, architectures, configurations, etc.
- Highly specific: applicable/valid for a very well-defined domain.
- Contributes to improving computer systems because alternative solutions can be compared.
- A real benchmark represents an agreement.

Slide 6: Three examples of dependability benchmarks for transactional systems
1. DBench-OLTP [DSN VLDB 2003]
   - Dependability benchmark for OLTP systems (database centric)
   - Provided as a document structured in clauses (like TPC benchmarks)
2. Web-DB [SAFECOMP 2004]
   - Dependability benchmark for web servers
   - Provided as a set of ready-to-run programs and document-based rules
3. Security benchmark (first step) [DSN 2005]
   - Security benchmark for database management systems
   - A set of tests for database security mechanisms

Slide 7: The DBench-OLTP dependability benchmark
(Architecture diagram) The System Under Benchmarking (SUB) comprises the DBMS (the benchmark target) running on top of the OS. The Benchmark Management System (BMS) hosts the BM, RTE, and FLE modules, submits the workload and faultload to the SUB, and exchanges control data and results with it.
- Workload and setup adopted from the TPC-C performance benchmark

Slide 8: Benchmarking procedure
- Phase 1: baseline performance measures (the TPC-C measures).
- Phase 2: performance measures in the presence of the faultload, plus dependability measures. Phase 2 is divided into testing slots (Slot 1, Slot 2, Slot 3, ..., Slot N).
- (Timeline diagram) Each testing slot runs from start to end over the measurement interval: steady state time until the steady state condition is reached, fault activation at the injection time, detection time until recovery start, recovery time until recovery end, keep time after recovery, and data integrity testing at the end of the slot.
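As a hedged illustration of the Phase 2 slot structure, the sketch below walks through one testing slot. It is not the official DBench-OLTP implementation: the workload, fault-injection, detection, recovery, and integrity-check hooks are placeholders the tester would supply for the concrete SUB.

```python
# Hedged sketch of one Phase 2 testing slot. This is NOT the official
# DBench-OLTP implementation: the workload, fault-injection, detection,
# recovery and integrity-check hooks are placeholders supplied by the tester.
import time

def run_slot(run_workload, inject_fault, wait_for_detection,
             recover_target, check_integrity,
             steady_state_s=2.0, keep_s=2.0):
    """Reach steady state, inject one fault, measure detection and
    recovery times, keep running, then test data integrity."""
    run_workload(steady_state_s)            # steady state time / condition
    t_inject = time.time()                  # fault activation (injection time)
    inject_fault()
    wait_for_detection()                    # blocks until the failure is detected
    t_detect = time.time()
    recover_target()                        # e.g. restart the DBMS, recover the DB
    t_recover = time.time()
    run_workload(keep_s)                    # keep time after recovery
    errors = check_integrity()              # data integrity testing
    return {"detection_s": t_detect - t_inject,
            "recovery_s": t_recover - t_detect,
            "integrity_errors": errors}

if __name__ == "__main__":
    # Dummy hooks so the sketch runs end-to-end without a real SUB.
    print(run_slot(run_workload=time.sleep,
                   inject_fault=lambda: None,
                   wait_for_detection=lambda: time.sleep(0.5),
                   recover_target=lambda: time.sleep(0.5),
                   check_integrity=lambda: 0))
```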

Slide 9: Measures
- Baseline performance measures
  - tpmC: number of transactions executed per minute
  - $/tpmC: price per transaction
- Performance measures in the presence of the faultload
  - Tf: number of transactions executed per minute (with faults)
  - $/Tf: price per transaction (with faults)
- Dependability measures
  - AvtS: availability from the server point of view
  - AvtC: availability from the clients' point of view
  - Ne: number of data integrity errors
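Informally (the precise definitions are given in the benchmark specification), the availability measures can be read as the fraction of the Phase 2 measurement interval during which the expected service is delivered:

\[
\mathit{Avt} \;=\; \frac{T_{\text{expected service delivered}}}{T_{\text{Phase 2 measurement interval}}} \times 100\%
\]

with AvtS evaluating "service delivered" from the server side and AvtC from the clients' side.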

Slide 10: Faultload
- Operator faults: emulate database administrator mistakes
- Software faults: emulate software bugs in the operating system
- High-level hardware failures: emulate hardware component failures
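As one hedged example of what an operator fault can look like in practice (a simplified sketch, not the DBench-OLTP faultload definition; sqlite3 stands in for a real DBMS so the snippet is self-contained), the fragment below emulates a DBA mistakenly dropping a table the workload depends on:

```python
# Hedged sketch of an operator fault: the DBA "accidentally" drops a table
# that the workload depends on. sqlite3 is used only to keep the example
# self-contained; DBench-OLTP targets a full OLTP DBMS and a TPC-C schema.
import sqlite3

def build_toy_schema(conn):
    conn.execute("CREATE TABLE warehouse (w_id INTEGER PRIMARY KEY, w_name TEXT)")
    conn.execute("INSERT INTO warehouse VALUES (1, 'W1')")
    conn.commit()

def inject_operator_fault(conn, table="warehouse"):
    """Emulate a DBA mistake by dropping a table used by the workload."""
    conn.execute(f"DROP TABLE {table}")
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    build_toy_schema(conn)
    inject_operator_fault(conn)
    try:
        conn.execute("SELECT * FROM warehouse")  # the workload now fails
    except sqlite3.OperationalError as exc:
        print("Workload failure after operator fault:", exc)
```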

Slide 11: Examples of systems benchmarked
Hardware for systems A to I: Intel Pentium III 800 MHz, 256 MB memory, four 20 GB/7200 rpm hard disks, Fast Ethernet.
Hardware for systems J and K: Intel Pentium IV 2 GHz, 512 MB memory, four 20 GB/7200 rpm hard disks, Fast Ethernet.
- A: Win2k Prof. SP3, Oracle 8i (8.1.7), Conf. 1
- B: Win2k Prof. SP3, Oracle 9i (9.0.2), Conf. 1
- C: WinXP Prof. SP1, Oracle 8i (8.1.7), Conf. 1
- D: WinXP Prof. SP1, Oracle 9i (9.0.2), Conf. 1
- E: Win2k Prof. SP3, Oracle 8i (8.1.7), Conf. 2
- F: Win2k Prof. SP3, Oracle 9i (9.0.2), Conf. 2
- G: SuSE Linux 7.3, Oracle 8i (8.1.7), Conf. 1
- H: SuSE Linux 7.3, Oracle 9i (9.0.2), Conf. 1
- I: RedHat Linux 7.3, PostgreSQL 7.3
- J: Win2k Prof. SP3, Oracle 8i (8.1.7), Conf. 1
- K: Win2k Prof. SP3, Oracle 9i (9.0.2), Conf. 1

Slide 12: DBench-OLTP benchmarking results (performance chart)

Slide 13: DBench-OLTP benchmarking results (performance and availability charts)

Slide 14: DBench-OLTP benchmarking results (performance, availability, and price charts)

Slide 15: Using DBench-OLTP to obtain more specific results
- Availability variation during the benchmark run (chart).
- Corresponds to about 32 hours of operation during which the system was subjected to 97 faults. Each fault is injected in a 20-minute injection slot, and the system is rebooted between slots.

Slide 16: DBench-OLTP benchmarking effort (in days)
- TPC-C benchmark implementation: 10 days (with reuse of code)
- DBench-OLTP benchmark implementation: 10 days (first implementation)
- Benchmarking process execution: 3 days (average per system)

Slide 17: The WEB-DB dependability benchmark
(Architecture diagram) The System Under Benchmarking (SUB) comprises the web server (the benchmark target) running on top of the OS. The Benchmark Management System (BMS) side includes the SPECWeb client, the benchmark coordinator, and the availability tester; a fault injector applies the faultload to the SUB, and the BMS drives the workload and collects control data and results.
- Workload and setup adopted from the SPECWeb99 performance benchmark

Slide 18: WEB-DB measures
- Performance degradation measures:
  - SPECf: main SPEC measure in the presence of the faultload
  - THRf: throughput in the presence of the faultload (ops/s)
  - RTMf: response time in the presence of the faultload (ms)
- Dependability-related measures:
  - Availability: percentage of time the server provides the expected service
  - Autonomy: percentage of times the server recovered without human intervention (an estimator of the self-healing abilities of the server)
  - Accuracy: percentage of correct results yielded by the server
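A hedged sketch of how such measures could be aggregated from per-fault observations follows; the field names and measurement rules are illustrative, and the WEB-DB specification defines the actual ones.

```python
# Hedged sketch of aggregating WEB-DB-style dependability measures from
# per-fault observations. Field names and measurement rules are illustrative;
# the benchmark specification defines the actual ones.
from dataclasses import dataclass

@dataclass
class FaultResult:
    up_seconds: float        # time the server delivered the expected service
    slot_seconds: float      # total duration of the injection slot
    recovered_unaided: bool  # recovered without human intervention
    correct_responses: int
    total_responses: int

def summarize(results):
    availability = 100 * sum(r.up_seconds for r in results) / sum(r.slot_seconds for r in results)
    autonomy = 100 * sum(r.recovered_unaided for r in results) / len(results)
    accuracy = 100 * sum(r.correct_responses for r in results) / sum(r.total_responses for r in results)
    return {"availability_%": round(availability, 1),
            "autonomy_%": round(autonomy, 1),
            "accuracy_%": round(accuracy, 1)}

if __name__ == "__main__":
    demo = [FaultResult(570, 600, True, 4980, 5000),
            FaultResult(480, 600, False, 4700, 5000)]
    print(summarize(demo))
```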

Slide 19: WEB-DB faultloads
- Network and hardware faults:
  - Connection loss (server sockets are closed)
  - Network interface failures (disable + enable the interface)
- Operator faults:
  - Unscheduled system reboot
  - Abrupt server termination
- Software faults:
  - Emulation of common programming errors
  - Injected in the operating system (not in the web server)
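Below is an illustrative, Linux-only sketch of two of these faults: a network interface failure and an abrupt server termination. The interface and process names are examples, root privileges are required, and the commands are destructive, so treat this as a sketch of the idea rather than the WEB-DB fault injector itself.

```python
# Illustrative, Linux-only sketch of two WEB-DB-style faults: a network
# interface failure (disable + re-enable) and an abrupt server termination.
# The interface and process names are examples, root privileges are needed,
# and the commands are destructive: a sketch of the idea, not the WEB-DB injector.
import subprocess
import time

def network_interface_fault(interface="eth0", outage_s=10):
    """Take the interface down, wait, then bring it back up."""
    subprocess.run(["ip", "link", "set", interface, "down"], check=True)
    time.sleep(outage_s)
    subprocess.run(["ip", "link", "set", interface, "up"], check=True)

def abrupt_server_termination(process_name="httpd"):
    """Kill the web server without a clean shutdown (SIGKILL)."""
    subprocess.run(["pkill", "-9", process_name], check=False)

if __name__ == "__main__":
    abrupt_server_termination("httpd")          # e.g. Apache; Abyss uses its own name
    network_interface_fault("eth0", outage_s=5)
```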

Slide 20: WEB-DB procedure
Benchmark procedure in 2 steps:
- Step 1:
  - Determine baseline performance (SUB + benchmark tools running the workload without faults)
  - Tune the workload for a SPEC conformance of 100%
- Step 2:
  - 3 runs; each run comprises all faults specified in the faultload
  - Benchmark results: the average of the 3 runs
(Timeline diagram: SPECWeb ramp-up and ramp-down times; faults injected into the OS, network, and web server while the workload runs against the web server, the benchmark target.)

Slide 21: Examples of systems benchmarked
- Benchmark and compare the dependability of two common web servers:
  - Apache web server
  - Abyss web server
- When running on:
  - Win
  - Win XP
  - Win 2003

Slide 22: Dependability results (chart of availability, accuracy, and autonomy for Apache and Abyss)

Slide 23: Performance in the presence of faults (chart of SPECf, THRf, and RTMf for Apache and Abyss)
- Baseline performance: Apache: 31, 26, 30; Abyss: 28, 25, 24
- Performance degradation (%): Apache: 55.4, 30.7, ...; Abyss: 63.2, 45.2, 46.3

Slide 24: Security benchmark for database management systems
(Architecture diagram: client applications and web browsers reach the DBMS over the network, the browsers going through web and application servers; the key in the figure identifies the layers.)

Slide 25: Security attacks vs. system vulnerabilities
- Security attacks: intentional attempts to access or destroy data
- System vulnerabilities:
  - Hidden flaws in the system implementation
  - Features of the security mechanisms available
  - Configuration of the security mechanisms

Slide 26: Approach for the evaluation of security in DBMS
Characterization of DBMS security mechanisms. Our approach:
1) Identification of data criticality levels
2) Definition of database security classes
3) Identification of security requirements for each class
4) Definition of security tests for two scenarios:
   - Compare different DBMS
   - Help DBAs assess security in real installations

Slide 27: Database security classes (by data criticality level and required security mechanisms)
- Class 0: criticality level: none; no mechanisms required
- Class 1: level 1; user authentication (internal or external)
- Class 2: level 2; user authentication, user privileges (system and object privileges)
- Class 3: level 3; user authentication, user privileges, encryption in the data communication
- Class 4: level 4; user authentication, user privileges, encryption in the data communication, encryption in the data storage
- Class 5: level 5; user authentication, user privileges, encryption in the data communication, encryption in the data storage, auditing
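A hedged sketch of the class assignment this table implies: an installation is placed in the highest class whose required mechanisms are all available. The mechanism names are shorthand for the slide's wording, not an official vocabulary.

```python
# Hedged sketch of the class assignment implied by the table above: an
# installation is placed in the highest class whose required mechanisms are
# all available. Mechanism names are shorthand for the slide's wording.
CLASS_REQUIREMENTS = {
    1: {"user authentication"},
    2: {"user authentication", "user privileges"},
    3: {"user authentication", "user privileges", "communication encryption"},
    4: {"user authentication", "user privileges", "communication encryption",
        "storage encryption"},
    5: {"user authentication", "user privileges", "communication encryption",
        "storage encryption", "auditing"},
}

def security_class(available_mechanisms):
    best = 0  # Class 0: no mechanisms required
    for cls, required in sorted(CLASS_REQUIREMENTS.items()):
        if required <= set(available_mechanisms):
            best = cls
    return best

if __name__ == "__main__":
    print(security_class({"user authentication"}))  # -> 1
    print(security_class({"user authentication", "user privileges",
                          "communication encryption", "storage encryption",
                          "auditing"}))             # -> 5
```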

Slide 28: Requirements for DBMS security mechanisms
Internal user authentication (username/password); each requirement has a weight (%) and a reference number:
- The system must provide internal user authentication by using usernames and passwords
- Req. 1.2 (weight 6%): The system must guarantee that, besides the DBA users, no other users can read/write to/from the table/file where the usernames and passwords are stored
- Req. 1.3 (weight 6%): The password must be encrypted during the communication between the client and the server during authentication
- Req. 1.4 (weight 4%): The passwords must be encrypted in the table/file where they are stored

Slide 29: Measures and scenarios
- Measures provided:
  - Security Class (SCL)
  - Security Requirements Fulfillment (SRF)
- Potential scenarios:
  - Compare different DBMS products
  - Help DBAs assess security in real installations
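A hedged sketch of how the SRF metric could be computed as a weighted percentage of fulfilled requirements: the weights reuse requirements 1.2 to 1.4 from slide 28, while the formula itself is an illustrative assumption rather than the benchmark's official definition.

```python
# Hedged sketch of an SRF computation: each requirement has a weight and SRF
# is the percentage of the total weight covered by the fulfilled requirements.
# The weights reuse requirements 1.2-1.4 from slide 28; the formula itself is
# an illustrative assumption, not the benchmark's official definition.
def srf(weights, fulfilled):
    """weights: {req_id: weight_percent}; fulfilled: set of req_ids met."""
    total = sum(weights.values())
    met = sum(w for req, w in weights.items() if req in fulfilled)
    return 100.0 * met / total

if __name__ == "__main__":
    weights = {"1.2": 6, "1.3": 6, "1.4": 4}
    print(round(srf(weights, fulfilled={"1.2", "1.3"}), 1))  # -> 75.0
```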

Slide 30: Comparing DBMS security
- A set of tests to verify whether the DBMS fulfills the security requirements
- Development of a database scenario:
  - Database model (tables)
  - Data criticality levels for each table
  - Database accounts corresponding to the several DB user profiles
  - System and object privileges for each account
(Diagram: the system under development connects over the network to a candidate DBMS: Oracle? DB2? PostgreSQL?)

Slide 31: Database scenario (diagram of the database model, with tables annotated with data criticality levels 1 through 5)

Slide 32: Example: comparative analysis of two DBMS (Oracle 9i vs. PostgreSQL 7.3), by security mechanism and requirement
- Internal user authentication (all requirements): OK
- External user authentication (all requirements): OK
- User privileges:
  - 3.1: Oracle OK; PostgreSQL Not OK
  - 3.2: OK
  - 3.3: OK
  - 3.4: OK
  - 3.5: Oracle OK; PostgreSQL Not OK
- Encryption in the data communication:
  - 4.1: Oracle OK; PostgreSQL Not OK
  - 4.2: Oracle: depends on the method; PostgreSQL Not OK
- Encryption in the data storage:
  - 5.1: Oracle OK; PostgreSQL Not OK
  - 5.2: Not OK
  - 5.3: Not OK
- Auditing (6.1): Oracle OK; PostgreSQL Not OK

Slide 33: Results summary
- Oracle 9i (encryption with RC4, AES, and DES): Security Class 5, SRF metric 96%
- Oracle 9i (encryption with 3DES): Security Class 5, SRF metric 92%
- PostgreSQL 7.3: Security Class 1, SRF metric 66%
- Oracle 9i does not fulfill all encryption requirements
  - 400% < performance degradation < 2700%
- PostgreSQL 7.3:
  - Some manual configuration is required to achieve Class 1
  - High SRF for a Class 1 DBMS
  - Fulfills some Class 2 requirements

Slide 34: Conclusions
- Comparing components, systems, architectures, and configurations is essential to improve computer systems: benchmarks are needed!
- Comparisons can be misleading: benchmarks must be carefully validated!
- Two ways of having real benchmarks:
  - Industry agreement
  - User community (tacit agreement)