Open Platform for EvolutioNary Certification Of Safety-critical Systems Large-scale integrating project (IP) Nuanced Term-Matching to Assist in Compositional Safety Assurance ASSURE 19/05/13 Katrina Attwood and Philippa Conmy (presenting)

Introduction
Introduction to the problem
Aims of the work
Related work
Nuanced term matching – linguistic theory
Example
Conclusions

The Problem…
Safety-critical systems are becoming increasingly complex:
– OTS components
– Multiple suppliers
– Cross-domain
We need more agile processes for certification:
– Reduce the amount of re-analysis required
– Reduce the time to develop systems
– Reduce cost
How can we do this in an acceptably safe way?

OPENCOSS project
Aims to produce methods and tools to support compositional certification, by means of:
– A Common Certification Language used to model different standards
– Orthogonal mappings between the different standards/domains to support cross-domain understanding
This paper looks at how to compare terms and certification artefacts produced to different standards or developed independently.

Compositional Certification
There are numerous safety standards which guide the development process for software-intensive systems. These are frequently based on the concept of an “integrity level”, and suggest analysis techniques and processes dependent on that integrity:
– Attributes such as depth and rigour
– Can apply to different types of component in the architectural decomposition
– E.g. software testing with different levels of coverage on different types of “software unit”

Compositional Certification – Principles of this work
Validation:
– Does the component do the task we need it to?
– Fully assessing this is only possible in a system context
Verification:
– Does the component meet a specification?
– Possible to do this independently
Picking an integrity level:
– Whether the evidence is compelling enough can only be judged in a system context
Functional composition:
– Do the contexts match?
– Do the integrated components behave as intended?

Principles

Compositional Arguments
Using compositional arguments, we can capture details about evidence/specification sufficiency. Context and assumptions about the evidence are provided and can be compared.
Difficulties:
– Claims are made in natural language
– Terms may not match
– Similar concepts may simply be expressed in different ways

Related Research and Practice
Integrated Modular Avionics:
– Heterogeneous computer networks on aircraft
– Common design principles and software interfaces
– Designed to maximise the ability to maintain and incrementally certify
– The IAWG defence project in the UK developed principles for safety arguments to integrate data, but gives limited detail on how to capture the required dependencies
– DO-297 avionics guidance requires capture of safety assumptions, and an incremental development approach to match these

Related Research and Practice
IEC – automotive standard:
– “Safety Element out of Context”: develop to a set of assumed requirements, then validate these during actual system development
Various contract languages for capturing properties:
– E.g. non-functional properties such as timing
– Failure propagations
These tend to tackle small parts of the problem, and focus on specifying rather than justifying or qualifying the supporting evidence.

Structural Linguistics

Ideal - synonymy

Mismatches
Homonymy:
– The same term may be used for different concepts
– E.g. “safety”: freedom from accidents in some cases; in others, it is acknowledged that safety is not absolute
– “Function”: software, or something more conceptual?
Partial synonymy:
– One party uses a term to capture some aspect of the concept
– The hearer's interpretation uses a term that also captures some aspect, but not necessarily the same one
– Neither has complete coverage of the concept
– E.g. “fault” and “failure”

Mismatches
No match:
– A signifier for a concept in one language has no signifier in another
– An attempt can then be made to use a super-concept
– E.g. Schadenfreude? “Mishap” in Mil-Std-882

Super-concept

Use of a Thesaurus
Exact match – the relationship between a term from a standard and the vocabulary's term for the core concept is a synonymy.
Partial or nuanced match – some aspect of the core concept in the vocabulary is covered by a standard-specific term, but the relationship is not a synonymy.
No match – a standard-specific term cannot be matched exactly to the vocabulary term for the core concept, but a match via a more abstract superconcept might be possible.
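These three outcomes can be sketched as a small lookup. This is a minimal illustration only, assuming a hypothetical thesaurus structure and example terms; it is not the OPENCOSS tooling or vocabulary:

```python
from enum import Enum

class Match(Enum):
    EXACT = "exact"      # synonymy with the core concept
    PARTIAL = "partial"  # nuanced overlap, not full synonymy
    NONE = "none"        # only a superconcept match may be possible

# Hypothetical thesaurus: core concept -> synonym and partial-match term sets
THESAURUS = {
    "hazard": {
        "synonyms": {"hazard", "hazardous condition"},
        "partial": {"mishap", "accident"},
    },
}

def classify(term, concept):
    """Classify a standard-specific term against a core-vocabulary concept."""
    entry = THESAURUS.get(concept, {})
    t = term.lower()
    if t in entry.get("synonyms", set()):
        return Match.EXACT
    if t in entry.get("partial", set()):
        return Match.PARTIAL
    return Match.NONE
```

For instance, `classify("Mishap", "hazard")` would report a partial match, prompting the reviewer to examine the nuance rather than assume equivalence.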

Checking across GSN arguments
Horizontal checks:
– Pairwise comparisons of nouns, adjectives and verbs in claims and contexts
– Compare the similarity of subjects in claims: similar level of abstraction, or a super-concept?
– Identify potential mismatches of detail
– Similarity of claims: is “fault free” the same as “absence of faults”?
Vertical checks:
– Expectations of the system argument: shortcomings and qualifiers on the evidence
The result is not a “yes/no” answer on compatibility – rather, it informs the argument developer.
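A horizontal check of the kind described above can be sketched as a pairwise comparison of content words in two claims. The tokenisation and stopword list here are deliberately simplistic assumptions for illustration; a real tool would use part-of-speech tagging to isolate nouns, adjectives and verbs:

```python
# Words ignored when comparing claims (illustrative, not exhaustive)
STOPWORDS = {"the", "of", "is", "are", "a", "an", "from", "in"}

def content_words(claim):
    """Crude extraction of content words from a natural-language claim."""
    return {w.strip(".,").lower() for w in claim.split()} - STOPWORDS

def horizontal_check(claim_a, claim_b):
    """Report shared terms and terms appearing in only one claim.
    The output flags candidates for the argument developer to inspect;
    it is not a yes/no compatibility verdict."""
    a, b = content_words(claim_a), content_words(claim_b)
    return {"shared": a & b, "only_a": a - b, "only_b": b - a}
```

Running this on “Software is fault free” and “Absence of faults in software” would show “software” as shared, while “fault”/“faults” and “free”/“absence” surface as unmatched terms that a human (or a thesaurus step) must judge as nuanced matches.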

Example
Horizontal comparisons:
– Application software, with software HAZOP
– Supporting infrastructure argument, re-usable in multiple scenarios
– We are considering whether particular failure-mode management can be guaranteed
Vertical comparisons:
– Typical system-level data which needs to be considered

Modular GSN

Application Software

Backplane

Example matching…
GTiming, GComms:
– WCET/Communications – matched noun
– 5ms/6ms – different values, but within tolerances
– Missing information: is the communications {type} used by {Component A} the correct one?
ConHWInfo, ConOSHWInfo:
– Potential exact match – processor
– Missing information – backplane information. Is this vital information? Does it weaken the argument or our assurance?
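The timing comparison above can be sketched as a check between a claim and a context assumption. The field names (`noun`, `wcet_ms`, `budget_ms`, `comms_type`) are hypothetical, chosen only to mirror the GTiming/GComms example; the point is that differing values can still be compatible, and that missing context is flagged rather than silently ignored:

```python
def compare_timing(claim, context):
    """Compare a timing claim against a context assumption from another
    argument module. Values may differ yet remain compatible if the
    claimed worst case fits the assumed budget; absent fields are
    reported as missing information for the argument developer."""
    report = []
    if claim.get("noun") == context.get("noun"):
        report.append("matched noun: " + claim["noun"])
    wcet, budget = claim.get("wcet_ms"), context.get("budget_ms")
    if wcet is not None and budget is not None:
        report.append("within tolerance" if wcet <= budget
                      else "BUDGET EXCEEDED")
    for field in ("comms_type",):  # expected but possibly absent context
        if field not in claim:
            report.append("missing information: " + field)
    return report
```

Given a claimed WCET of 5 ms against an assumed budget of 6 ms, the report would note the matched noun and the tolerance, while still flagging the unanswered question about the communications type.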

System Level Goal

Example matching…
SysSIL, CompAHAZOP, PlatTimingAnl:
– Superconcept – suppose {SIL Y} in {Std Z} requires a Failure Modes and Effects Analysis. Software HAZOP is (arguably) a specific type of FMEA: it cannot be matched exactly, but can be matched via a superconcept
– Similarly: timing analysis, or schedulability analysis, or general performance characteristics?
Other types of matching to consider:
– Evidence characteristics – are there keywords to consider, or to flag potential issues?
– Different levels of abstraction (of the evidence, and of the components the analyses were applied to)
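The superconcept matching described here amounts to walking a concept hierarchy. The hierarchy below is a hypothetical sketch (the placement of software HAZOP under a shared “failure analysis” superconcept with FMEA is the arguable relationship the slide mentions, not a settled taxonomy):

```python
# Hypothetical concept hierarchy: each technique maps to its parent
# superconcept (None marks a root). "software hazop" and "fmea" are not
# synonyms, but share the "failure analysis" superconcept.
PARENT = {
    "software hazop": "failure analysis",
    "fmea": "failure analysis",
    "schedulability analysis": "timing analysis",
    "timing analysis": "performance analysis",
    "performance analysis": None,
    "failure analysis": None,
}

def ancestors(term):
    """All superconcepts of a term, nearest first."""
    chain, parent = [], PARENT.get(term)
    while parent:
        chain.append(parent)
        parent = PARENT.get(parent)
    return chain

def superconcept_match(a, b):
    """Return the nearest concept relating two techniques, or None."""
    chain_b = [b] + ancestors(b)
    for concept in [a] + ancestors(a):
        if concept in chain_b:
            return concept
    return None
```

Under these assumptions, a software HAZOP could satisfy a standard's FMEA requirement at the “failure analysis” level, while a schedulability analysis matches a timing-analysis requirement directly; the match level itself tells the assessor how much justification remains to be argued.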

Conclusions
Compositional certification:
– Requires matching and composition of assurance data gathered from different sources and domains
– Using safety arguments, we can capture the claims being made about a component, the trustworthiness of the evidence data, and so on
Matching is difficult:
– Natural language expresses the same thing in different ways
– Principles of translation from linguistics can be used to understand and guide a matching process
Future work:
– Development of a vocabulary using standards and general principles
– Further automation where possible