Figures – Chapter 12

Figure 12.1 Risk-driven specification

Figure 12.2 The risk triangle

Figure 12.3 Risk classification for the insulin pump

Identified hazard | Hazard probability | Accident severity | Estimated risk | Acceptability
1. Insulin overdose computation | Medium | High | | Intolerable
2. Insulin underdose computation | | Low | | Acceptable
3. Failure of hardware monitoring system | | | | ALARP
4. Power failure | | | |
5. Machine incorrectly fitted | | | |
6. Machine breaks in patient | | | |
7. Machine causes infection | | | |
8. Electrical interference | | | |
9. Allergic reaction | | | |
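The risk triangle and the classification table combine hazard probability and accident severity into an acceptability class. A minimal sketch of that combination, assuming illustrative level orderings and mappings that are not taken from the figure itself:

```python
# Illustrative sketch: deriving a coarse risk level and an acceptability
# class from hazard probability and accident severity. The orderings and
# mappings below are assumptions for illustration, not from Figure 12.3.
LEVELS = {"low": 0, "medium": 1, "high": 2}

def estimated_risk(probability: str, severity: str) -> str:
    """Combine probability and severity into a coarse risk level."""
    score = LEVELS[probability] + LEVELS[severity]
    return ["low", "low", "medium", "high", "high"][score]

def acceptability(risk: str) -> str:
    """Map a risk level onto the risk-triangle regions."""
    return {"low": "acceptable", "medium": "ALARP", "high": "intolerable"}[risk]

# Hazard 1 (insulin overdose computation): medium probability, high severity.
print(acceptability(estimated_risk("medium", "high")))  # intolerable
```

This reproduces the classification of hazard 1 in the table; a real assessment would calibrate the scales per system and domain.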

Figure 12.4 An example of a fault tree

Figure 12.5 Examples of safety requirements

SR1: The system shall not deliver a single dose of insulin that is greater than a specified maximum dose for a system user.
SR2: The system shall not deliver a daily cumulative dose of insulin that is greater than a specified maximum daily dose for a system user.
SR3: The system shall include a hardware diagnostic facility that shall be executed at least four times per hour.
SR4: The system shall include an exception handler for all of the exceptions that are identified in Table 3.
SR5: The audible alarm shall be sounded when any hardware or software anomaly is discovered and a diagnostic message, as defined in Table 4, shall be displayed.
SR6: In the event of an alarm, insulin delivery shall be suspended until the user has reset the system and cleared the alarm.
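SR1 and SR2 are the kind of requirement that translates directly into a guard in the control software. A minimal sketch, with hypothetical function names and parameters (the real pump controller would embed such a check in its control loop):

```python
# Hedged sketch of the dose-limit guards behind SR1 and SR2.
# All names and limits here are hypothetical illustrations.
def dose_permitted(requested: float, daily_total: float,
                   max_single: float, max_daily: float) -> bool:
    """Allow a dose only if both the single-dose ceiling (SR1) and the
    cumulative daily ceiling (SR2) are respected."""
    if requested > max_single:               # SR1: single-dose limit
        return False
    if daily_total + requested > max_daily:  # SR2: daily cumulative limit
        return False
    return True

print(dose_permitted(4.0, 20.0, max_single=5.0, max_daily=25.0))  # True
print(dose_permitted(6.0, 20.0, max_single=5.0, max_daily=25.0))  # False
```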

Figure 12.6 Types of system failure

Failure type | Description
Loss of service | The system is unavailable and cannot deliver its services to users. You may separate this into loss of critical services and loss of non-critical services, where the consequences of a failure in non-critical services are less than the consequences of critical service failure.
Incorrect service delivery | The system does not deliver a service correctly to users. Again, this may be specified in terms of minor and major errors or errors in the delivery of critical and non-critical services.
System/data corruption | The failure of the system causes damage to the system itself or its data. This usually, but not necessarily, occurs in conjunction with other types of failure.

Figure 12.7 Availability specification

Availability | Explanation
0.9 | The system is available for 90% of the time. This means that, in a 24-hour period (1,440 minutes), the system will be unavailable for 144 minutes.
0.99 | In a 24-hour period, the system is unavailable for 14.4 minutes.
0.999 | The system is unavailable for 86.4 seconds in a 24-hour period.
0.9999 | The system is unavailable for 8.64 seconds in a 24-hour period. Roughly, one minute per week.
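The downtime figures in the table follow directly from the availability number. A small sketch of the arithmetic:

```python
# Sketch of the downtime arithmetic behind the availability table:
# unavailable time = (1 - availability) x period length.
def downtime_per_day(availability: float) -> float:
    """Unavailable seconds in a 24-hour period (86,400 s)."""
    return (1.0 - availability) * 24 * 60 * 60

print(round(downtime_per_day(0.99) / 60, 1))  # 14.4 (minutes)
print(round(downtime_per_day(0.999), 1))      # 86.4 (seconds)
print(round(downtime_per_day(0.9999), 2))     # 8.64 (seconds)
```

Note that 8.64 seconds per day is about 60.5 seconds per week, hence the "roughly one minute per week" gloss for 0.9999.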

Figure 12.8 Examples of functional reliability requirements

RR1: A pre-defined range for all operator inputs shall be defined and the system shall check that all operator inputs fall within this pre-defined range. (Checking)
RR2: Copies of the patient database shall be maintained on two separate servers that are not housed in the same building. (Recovery, redundancy)
RR3: N-version programming shall be used to implement the braking control system. (Redundancy)
RR4: The system must be implemented in a safe subset of Ada and checked using static analysis. (Process)
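RR1, the checking requirement, amounts to validating every operator input against a pre-defined range table. A minimal sketch, where the input names and bounds are hypothetical examples rather than values from the figure:

```python
# Minimal sketch of the input-range checking required by RR1.
# The range table below is a hypothetical illustration.
OPERATOR_INPUT_RANGES = {
    "target_blood_sugar": (4.0, 10.0),  # mmol/L, assumed bounds
    "max_single_dose":    (0.5, 5.0),   # units, assumed bounds
}

def check_operator_input(name: str, value: float) -> bool:
    """Reject any operator input outside its pre-defined range (RR1)."""
    low, high = OPERATOR_INPUT_RANGES[name]
    return low <= value <= high

print(check_operator_input("target_blood_sugar", 6.0))  # True
print(check_operator_input("max_single_dose", 12.0))    # False
```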

Figure 12.9 The preliminary risk assessment process for security requirements

Figure 12.10 Asset analysis in a preliminary risk assessment report for the MHC-PMS

Asset | Value | Exposure
The information system | High. Required to support all clinical consultations. Potentially safety-critical. | High. Financial loss as clinics may have to be canceled. Costs of restoring system. Possible patient harm if treatment cannot be prescribed.
The patient database | |
An individual patient record | Normally low although may be high for specific high-profile patients. | Low direct losses but possible loss of reputation.

Figure 12.11 Threat and control analysis in a preliminary risk assessment report

Threat | Probability | Control | Feasibility
Unauthorized user gains access as system manager and makes system unavailable | Low | Only allow system management from specific locations that are physically secure. | Low cost of implementation but care must be taken with key distribution and to ensure that keys are available in the event of an emergency.
Unauthorized user gains access as system user and accesses confidential information | High | Require all users to authenticate themselves using a biometric mechanism. | Technically feasible but high-cost solution. Possible user resistance.
Unauthorized user gains access as system user and accesses confidential information | High | Log all changes to patient information to track system usage. | Simple and transparent to implement and also supports recovery.
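The second control in the table, logging all changes to patient information, is in essence an append-only audit trail. A hedged sketch of the idea, with hypothetical names throughout (a real implementation would need tamper-evident, access-controlled storage):

```python
# Hedged sketch of the "log all changes to patient information" control:
# an append-only audit trail recording who changed what, and when.
from datetime import datetime, timezone

audit_log = []  # in practice: tamper-evident, access-controlled storage

def record_change(user: str, record_id: str, field: str, new_value: str) -> None:
    """Append an audit entry for a change to patient information."""
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "record": record_id,
        "field": field,
        "new_value": new_value,
    })

record_change("nurse01", "P-1234", "medication", "insulin 5U")
print(len(audit_log))  # 1
```

Because every entry names the user and the record, the same log supports both the tracking purpose in the table and recovery after an incident.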

Figure 12.12 Formal specification in a plan-based software process