
1 Information Security of Embedded Systems
10.2.2010: Logics and Proof Methods, Wrap-Up
Prof. Dr. Holger Schlingloff, Institut für Informatik and Fraunhofer FIRST

2 Structure
17.2.2010 · Embedded Security © Prof. Dr. H. Schlingloff 2010
1. Introductory example
2. Embedded systems engineering
   1. definitions and terms
   2. design principles
3. Foundations of security
   1. threats, attacks, measures
   2. construction of safe systems
4. Design of secure systems
   1. design challenges
   2. safety modelling and assessment
   3. cryptographic algorithms
5. Communication of embedded systems
   1. remote access
   2. sensor networks
6. Algorithms and measures
   1. digital signatures
   2. key management
   3. authentication
   4. authorization
7. Formal methods for security
   1. protocol verification
   2. logics and proof methods
8. Wrap-Up

3 Integrated Security Design
Systematic consideration of security goals
 requirements, specifications, models, code, test suites
Model-based design
 system model refined to an implementation model
 assertions as logical formulas
 automated code generation
 particularly well suited for embedded systems
Modelling formalisms
 UML
 Matlab/Simulink/Stateflow
 automata

4 Example: Security Policies
Lit.: M. McDougall, R. Alur, C. Gunter: A Model-Based Approach to Integrating Security Policies for Embedded Devices
http://seclab.uiuc.edu/cgunter/dist/McDougallAG04.pdf
 security policies
 programmable payment cards
 policy automata, defeasible logic
 model checking and code generation

5 Security Policies
Goal: restrict permission to certain transactions
 e.g. money withdrawal only if the account balance is positive
Stateful: the result of a request may depend on previous decisions
 e.g. on the amount already withdrawn during the current day
Various stakeholders may impose different policies
 e.g. employer: withdrawal only on business trips
 e.g. parent: withdrawal only up to a certain amount
Non-monotonic: a new policy may override a previous policy
 e.g. withdrawal always possible for "good customers"
Problem: how to detect conflicting policies?
Problem: how to implement a policy correctly?

6 Programmable Payment Cards
 authorization of transactions according to a policy
 ROM and EEPROM on the card; programmable (Java)
 after being issued, the card allows policies to be added but not removed
 each policy returns a boolean result; a transaction is allowed iff it is approved by every policy
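The all-approve rule on the card can be sketched in a few lines (the helper names and the toy policies are illustrative, not from the paper):

```python
def card_allows(policies, transaction):
    """A transaction is allowed iff every installed policy approves it."""
    return all(policy(transaction) for policy in policies)

# Two toy boolean policies (hypothetical): no alcohol, and a $500 cap
# on the amount of a single transaction.
no_alcohol = lambda t: t.get("category") != "alcohol"
small_amount = lambda t: t.get("amount", 0) <= 500

policies = [no_alcohol, small_amount]
print(card_allows(policies, {"category": "food", "amount": 20}))     # True
print(card_allows(policies, {"category": "alcohol", "amount": 20}))  # False
```

Note that these toy policies are stateless; the stateful and non-monotonic cases are what the policy automata below are for.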

7 Defeasible Logic
Non-monotonic logic
 new axioms do not necessarily enlarge the theory
 efficient proof/disproof method
-> Strict rule
 always (necessarily) valid (cf. modal logic!)
 (penguin -> ¬fly): "penguins don't fly"
=> Defeasible rule
 usually valid, but can be preempted by other information
 (bird => fly): "birds can fly unless we have some reason to think otherwise"
~> Defeater rule
 blocks the tentative conclusions of defeasible rules
 (injured ~> ¬fly) will block the rule above
Decision: ⊢ yes and ⊬ ¬yes  approval; ⊬ yes  disapproval; ⊢ yes and ⊢ ¬yes  conflict
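The three rule types can be sketched as a tiny evaluator for a single literal. This is a simplified check covering the bird/penguin example, not a full defeasible-logic proof engine; all names are illustrative:

```python
# Rules are triples (kind, antecedents, consequent), with the consequent
# written as "+fly" (fly) or "-fly" (¬fly).
STRICT, DEFEASIBLE, DEFEATER = "->", "=>", "~>"

def conclude(literal, facts, rules):
    """Simplified defeasible check for one literal, e.g. '+fly'.

    1. A strict rule whose antecedents hold proves its consequent outright.
    2. A defeasible rule fires only if no strict or defeater rule for the
       opposite literal also fires.
    """
    opposite = ("-" if literal[0] == "+" else "+") + literal[1:]

    def fires(kinds, lit):
        return any(k in kinds and c == lit and facts >= set(a)
                   for (k, a, c) in rules)

    if fires({STRICT}, literal):
        return True
    if fires({STRICT, DEFEATER}, opposite):
        return False
    return fires({DEFEASIBLE}, literal)

rules = [
    (STRICT,     ["penguin"], "-fly"),  # penguins don't fly
    (DEFEASIBLE, ["bird"],    "+fly"),  # birds usually fly
    (DEFEATER,   ["injured"], "-fly"),  # injury blocks the defeasible rule
]
print(conclude("+fly", {"bird"}, rules))             # True
print(conclude("+fly", {"bird", "penguin"}, rules))  # False
print(conclude("+fly", {"bird", "injured"}, rules))  # False
```

The last line of the slide then maps such conclusions to card decisions: provable "yes" without a provable "¬yes" means approval, no provable "yes" means disapproval, and both provable means a conflict.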

8 Policy Automata
Given T: transactions and D: votes (rules of defeasible logic), a policy automaton is A = (M, X, q0, R, δ)
 M: modes, X: variables, Q: states; q0: initial state
 R: rules; R: Q × T → D
 δ: transitions; δ: Q × T × {yes, no} → Q
Policy model = set of policy automata
 the automata proceed simultaneously
 depending on the approval outcome, the "yes" or "no" transition is traversed
 in case of conflict, an error state is assumed
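A minimal sketch of this structure, with one simplification: votes are plain "yes"/"no" instead of defeasible-logic rules, and any "no" vetoes the transaction (the real model combines the votes with the defeasible engine). All names are illustrative:

```python
class PolicyAutomaton:
    """One automaton of the policy model: current state, rule R, transitions δ."""
    def __init__(self, q0, rule, delta):
        self.state = q0      # current state q ∈ Q
        self.rule = rule     # R: Q x T -> vote in {"yes", "no"}
        self.delta = delta   # delta: Q x T x {"yes", "no"} -> Q

    def vote(self, txn):
        return self.rule(self.state, txn)

    def step(self, txn, outcome):
        self.state = self.delta(self.state, txn, outcome)

def run(model, txn):
    """One step of the whole model: collect all votes, then move every
    automaton simultaneously along the agreed outcome."""
    outcome = "yes" if all(a.vote(txn) == "yes" for a in model) else "no"
    for a in model:
        a.step(txn, outcome)
    return outcome

# Usage: a stateless "no alcohol" automaton (single state).
pn = PolicyAutomaton(
    q0="idle",
    rule=lambda q, t: "no" if t.get("category") == "alcohol" else "yes",
    delta=lambda q, t, out: q,   # the state never changes
)
print(run([pn], {"category": "food"}))     # 'yes'
print(run([pn], {"category": "alcohol"}))  # 'no'
```

Stateful policies differ only in a nontrivial δ, as the P3 example below illustrates.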

9 Example: a Payment Card Policy
P3: Allow up to 3 purchases per day
PE: Guarantee payment to emergency services twice
Pcc: A cash card: spend no more than $500 total
PN: No alcohol can be purchased
Pt: Prevent purchases of prescription drugs which conflict with the anti-depressant Tofranil

10 Example: a Payment Card Policy (1)
P3: Allow up to 3 purchases per day
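P3 is the simplest stateful policy: its state is the number of approved purchases today, and it votes "yes" while that count is below 3. A hypothetical rendering (the day boundary is signalled explicitly here, for illustration):

```python
def make_p3(limit=3):
    """P3 as a counter automaton: state = approved purchases today."""
    state = {"count": 0}
    def p3(txn, new_day=False):
        if new_day:
            state["count"] = 0       # reset at the day boundary
        if state["count"] < limit:   # vote "yes", take the "yes" transition
            state["count"] += 1
            return "yes"
        return "no"                  # vote "no"; the state is unchanged
    return p3

p3 = make_p3()
print([p3({"amount": 5}) for _ in range(4)])  # ['yes', 'yes', 'yes', 'no']
print(p3({"amount": 5}, new_day=True))        # 'yes' again after the reset
```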

11 Example: a Payment Card Policy (2)
PE: Guarantee payment to emergency services twice

12 Example: a Payment Card Policy (3)
Pcc: A cash card: spend no more than $500 total

13 Example: a Payment Card Policy (4)
PN: No alcohol can be purchased

14 Example: a Payment Card Policy (5)
Pt: Prevent purchases of prescription drugs which conflict with the anti-depressant Tofranil

15 Complete Model
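In the complete model all five automata run side by side and a transaction goes through only if every one approves. A toy composition of three of them (a hypothetical encoding, collapsing the automata into one closure for brevity):

```python
def complete_model():
    """Joint decision of P3 (<= 3 purchases/day), PN (no alcohol),
    and Pcc (<= $500 spent in total)."""
    count, total = 0, 0
    def decide(txn):
        nonlocal count, total
        votes = [
            count < 3,                          # P3
            txn.get("category") != "alcohol",   # PN
            total + txn["amount"] <= 500,       # Pcc
        ]
        if all(votes):                          # allowed iff all approve
            count += 1
            total += txn["amount"]
            return "yes"
        return "no"
    return decide

card = complete_model()
print(card({"category": "food", "amount": 400}))    # 'yes'
print(card({"category": "alcohol", "amount": 10}))  # 'no'  (PN vetoes)
print(card({"category": "food", "amount": 200}))    # 'no'  (Pcc: 400+200 > 500)
```

Note the non-monotonic interplay: adding PN or Pcc to the card can only remove permissions, never grant new ones.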

16 Analysis of Policy Models
The following properties can be verified:
 reachability
 conflict-freeness
 redundancy of a policy
Code generation is possible:
 translation of the automata into Java Card applets
 addition of the applets to a pre-configured card
 defeasible-logic engine in the runtime environment
Tool "Polaris"
 graphical editor, analysis engine, code generator
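Since the joint state space of a policy model is finite, reachability can be checked by exhaustive exploration. A toy sketch in that spirit (not the Polaris tool; the example automata and their state encoding are hypothetical):

```python
def reachable_states(q0, txns, step):
    """Explore all joint states reachable from q0 under the given
    transaction alphabet; step(state, txn) -> next state."""
    seen, frontier = {q0}, [q0]
    while frontier:
        q = frontier.pop()
        for t in txns:
            nq = step(q, t)
            if nq not in seen:
                seen.add(nq)
                frontier.append(nq)
    return seen

# Joint state of a P3-style counter (0..3 purchases) and a Pcc-style
# running total in $100 units (cap 5); a vetoed transaction leaves the
# state unchanged.
def step(q, t):
    count, total = q
    ok = count < 3 and total + t <= 5
    return (count + 1, total + t) if ok else q

states = reachable_states((0, 0), txns=[1, 2], step=step)
print(sorted(states))
```

A property such as "the total never exceeds $500" then amounts to checking every reachable state; conflict-freeness is the unreachability of the error state.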

17 Wrap-Up: What We Have Learned
Systematics of security
 throughout the system's functionality
 throughout the system's design
 throughout the system's operation
Assess security goals, take measures
 measures must be adapted to the possibilities
 measures may change over time
Security for embedded systems needs special care
 processing power, energy, design challenges
 mass market, price, non-revocability, …
Social processes and consequences

18 Further Topics
Attacks on different levels
 HW tampering, modifications
 side channels
Security testing
 verification on all levels is impossible
 test-case selection is intrinsically hard
Future: intentious systems
 autonomous agents
 self-organizing systems
Many things to research!

19 Organizational Matters
Oral examinations => Frau Heene
 please take your exam while still in 2010!
Certificates etc. => bring the completed form directly to me
Further courses
 SS 2010: Grundlagen der Programmierung (Foundations of Programming)
 WS 2010/11: Modellbasierte Entwicklung (Model-Based Development)
Enjoy the semester break!

