
1 Open Platform for EvolutioNary Certification Of Safety-critical Systems – Large-scale integrating project (IP). Nuanced Term-Matching to Assist in Compositional Safety Assurance. ASSURE, 19/05/13. Katrina Attwood and Philippa Conmy (presenting)

2 Introduction – Introduction to the problem – Aims of the work – Related work – Nuanced term matching: linguistic theory – Example – Conclusions

3 The Problem… Safety-critical systems are becoming increasingly complex – OTS components – Multiple suppliers – Cross-domain. Need more agile processes for certification – Reduce the amount of re-analysis required – Reduce time to develop systems – Reduce cost. How can we do this in an acceptably safe way?

4 OPENCOSS project Aims to produce methods and tools to support compositional certification – By means of a Common Certification Language, used to model different standards – Orthogonal mappings between the different standards/domains to support cross-domain understanding. This paper looks at how to compare terms/certification artefacts produced to different standards or developed independently

5 Compositional Certification Numerous safety standards guide the development process for software-intensive systems – Frequently based on the concept of an “integrity level” – Suggest analysis techniques and processes dependent on the integrity level – Attributes such as depth and rigour – Can apply to different types of component – architectural decomposition – E.g. software testing with different levels of coverage on different types of “software unit”

6 Compositional Certification Principles of this work… Validation – Does the component do the task we need it to? – Fully assessing this is only possible in a system context. Verification – Does the component meet a specification? – Possible to do this independently. Picking an integrity level – It is only possible to judge whether the evidence is compelling enough in a system context. Functional composition – Do the contexts match? – Do the integrated components behave as intended?

7 Principles

8 Compositional Arguments Compositional arguments can capture details about evidence/specification sufficiency – Context and assumptions about the evidence are provided and can be compared. Difficulties – Claims are made in natural language – Terms may not match – Similar concepts may simply be expressed in different ways

9 Related Research and Practice Integrated Modular Avionics – Heterogeneous computer networks on aircraft – Common design principles and software interfaces – Designed to maximise the ability to maintain and incrementally certify – The IAWG defence project in the UK developed principles for safety arguments to integrate data, but gives limited detail on how to capture the required dependencies – DO-297 avionics guidance requires the capture of safety assumptions, with an incremental development approach to match these

10 Related Research and Practice ISO 26262 – Automotive standard – “Safety Element out of Context” – Develop to a set of assumed requirements – Validate these during actual system development. Various contract languages for capturing properties – E.g. non-functional properties such as timing – Failure propagations. These tend to tackle small parts of the problem, focusing on specifying rather than justifying or qualifying the supporting evidence

11 Structural Linguistics

12 Ideal - synonymy

13 Mismatches Homonymy – The same term may be used, but for different concepts – E.g. “safety”: in some cases freedom from accidents; in others acknowledged not to be absolute – “Function”: software, or something more conceptual? Partial synonymy – One party uses a term to capture some aspect of the concept – The hearer’s interpretation uses a term that also captures some aspect, but not necessarily the same one – Neither has complete coverage of the concept – E.g. fault and failure

14 Mismatches No match – A signifier for a concept in one language has no signifier in another – One can then attempt to use a super-concept – E.g. Schadenfreude? “Mishap” in MIL-STD-882

15 Super-concept

16 Use of a Thesaurus Exact match – the relationship between a term from a standard and the vocabulary’s term for the core concept is a synonymy. Partial or nuanced match – some aspect of the core concept in the vocabulary is covered by a standard-specific term, but the relationship is not a synonymy. No match – a standard-specific term cannot be matched exactly to the vocabulary term for the core concept, but a match via a more abstract superconcept might be possible.
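The three match categories above can be sketched as a simple classifier against a core vocabulary. This is an illustrative sketch only, not the OPENCOSS tooling; the vocabulary entries and terms are hypothetical examples.

```python
# Hypothetical core vocabulary: each core concept lists terms that are full
# synonyms and terms that cover only part of the concept (nuanced matches).
CORE_VOCAB = {
    "hazard": {"synonyms": {"hazard"}, "partial": {"mishap", "accident"}},
    "failure": {"synonyms": {"failure"}, "partial": {"fault", "error"}},
}

def classify_match(standard_term, core_concept, vocab=CORE_VOCAB):
    """Return 'exact', 'partial', or 'none' for a standard-specific term
    matched against the vocabulary's entry for a core concept."""
    entry = vocab.get(core_concept)
    if entry is None:
        return "none"
    term = standard_term.lower()
    if term in entry["synonyms"]:
        return "exact"
    if term in entry["partial"]:
        return "partial"
    return "none"  # a super-concept lookup could be attempted here
```

For example, `classify_match("fault", "failure")` reports a partial (nuanced) match, mirroring the fault/failure example on slide 13.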

17 Checking across GSN arguments Horizontal checks – Pairwise comparisons of nouns, adjectives and verbs in claims and contexts – Compare the similarity of subjects in claims: similar level of abstraction, or a super-concept? – Identify potential mismatches of detail – Similarity of claims: is “fault free” the same as “absence of faults”? Vertical checks – Expectations of the system argument – Shortcomings and qualifiers on the evidence. Not a “yes/no” answer on compatibility – rather, the aim is to inform the argument developer
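A horizontal check of the kind described above can be illustrated with a naive word-overlap comparison between two claims. Real tooling would use part-of-speech tagging, stemming, and the project thesaurus; this simplified sketch (with an assumed stop-word list) only shows the shape of the output handed to the argument developer.

```python
# Naive "horizontal check": compare content words of two claim texts and
# report shared and unmatched terms, rather than a yes/no verdict.
STOP_WORDS = {"the", "a", "an", "of", "is", "are", "in", "on", "and", "to"}

def content_words(claim):
    """Lowercase the claim, strip basic punctuation, drop stop words."""
    return {w.strip(".,").lower() for w in claim.split()} - STOP_WORDS

def horizontal_check(claim_a, claim_b):
    a, b = content_words(claim_a), content_words(claim_b)
    return {
        "shared": sorted(a & b),
        "only_in_a": sorted(a - b),
        "only_in_b": sorted(b - a),
    }

report = horizontal_check(
    "Component A is fault free",
    "Absence of faults in Component A",
)
```

Note that without stemming, "fault" and "faults" land in the unmatched lists: the check flags the "fault free" vs "absence of faults" pairing for human judgement instead of deciding it.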

18 Example Horizontal comparisons – Application software, with software HAZOP – Supporting infrastructure argument Re-usable in multiple scenarios – We are considering whether particular failure mode management can be guaranteed Vertical comparisons – Typical system level data which needs to be considered

19 Modular GSN

20 Application Software

21 Backplane

22 Example matching… GTiming, GComms – WCET/Communications: matched noun – 5ms/6ms: different values, but within tolerances – Missing information: is the communications {type} used by {Component A} the correct one? ConHWInfo, ConOSHWInfo – Potential exact match: processor – Missing information: backplane details. Is this vital information? Does it weaken the argument or our assurance?
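The 5ms/6ms case above can be reduced to a small numeric check: once the nouns are matched, differing values need not be a mismatch if the claimed figure sits within the budget the other goal requires. The function name and units below are illustrative, not from the example argument.

```python
# Sketch: a matched timing pair is acceptable when the claimed worst-case
# execution time does not exceed the budget required by the other goal.
def within_tolerance(claimed_ms, required_ms):
    """True if a claimed WCET (e.g. 5 ms) fits a required budget (e.g. 6 ms)."""
    return claimed_ms <= required_ms
```

So a 5 ms claim against a 6 ms budget passes, while a 7 ms claim would be flagged back to the argument developer.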

23 System Level Goal

24 Example matching… SysSIL, CompAHAZOP, PlatTimingAnl – Super-concept: suppose {SIL Y} in {Std Z} requires a Failure Modes and Effects Analysis. Software HAZOP is (arguably) a specific type of FMEA, so it cannot be matched exactly, but it can be matched via the super-concept principle – Similarly, timing analysis vs schedulability analysis, or general performance characteristics? Other types of matching to consider – Evidence characteristics: are there keywords to consider, or to flag potential issues? – Different levels of abstraction (of the evidence, and of the components the analyses were applied to)
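The super-concept fallback in the HAZOP/FMEA case can be sketched as walking up a concept hierarchy until a term the target vocabulary recognises is found. The hierarchy below is a hypothetical illustration, not a claimed taxonomy from the standards.

```python
# Hypothetical concept hierarchy: child concept -> its super-concept.
PARENT = {
    "software hazop": "fmea",
    "fmea": "safety analysis",
    "mishap": "accident",
}

def match_via_superconcept(term, known_terms, parent=PARENT):
    """Return the nearest ancestor of `term` (including `term` itself)
    present in `known_terms`, or None if no level matches."""
    current = term
    while current is not None:
        if current in known_terms:
            return current
        current = parent.get(current)  # climb to the super-concept
    return None
```

Here a "software hazop" obligation matches a vocabulary that only knows "fmea", echoing the argument that a Software HAZOP can discharge an FMEA requirement via the super-concept, while a term with no ancestor in the vocabulary correctly yields no match.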

25 Conclusions Compositional Certification – Requires matching and composition of assurance data gathered from different sources and domains – Using safety arguments we can capture the claims being made about a component, the trustworthiness of the evidence data, and so on. Difficult to match – Natural language expresses the same thing in different ways – Principles of translation from linguistics can be used to understand and guide a matching process. Future work – Development of a vocabulary using standards and general principles – Further automation where possible

