Secure Information Flow for Reactive Programming Paradigm. Zhengqin Luo, SAFA workshop 2009.

What is this talk about?
- The secure information flow problem
  - Programs can access confidential information
  - Some of a program's behaviors are publicly observable
- The reactive programming paradigm
  - Deterministic concurrency, cooperative scheduling
  - Synchronized threads, signals, suspensions, preemptions
- Controlling information flow for reactive programs
  - How can information be deliberately leaked?
  - How can insecure flows be prevented?
  - Dynamic and static enforcement

Secure information flow problem
- Programs interact with confidential information
- Non-interference property
  - Inputs/outputs are classified as secret or public
  - Runs that differ only in secret inputs SHOULD NOT produce different public outputs
- A simplified lattice model: L (public), H (secret)
- [Diagram: a program P legitimately sends the credit card number and price to Auction.com, but may also leak other data, e.g. cookies, to Steal-your-information.com; inputs and outputs are each split into secret and public parts]
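The non-interference property above can be checked operationally on tiny examples. A minimal sketch (hypothetical Python helpers, not from the talk): run the program on inputs that agree on the public part and differ on the secret part, and compare the public outputs.

```python
# Brute-force non-interference check: a program is non-interfering on
# the given inputs if every choice of secret input yields the same
# public output.

def noninterfering(program, secret_inputs, public_input):
    outputs = {program(h, public_input) for h in secret_inputs}
    return len(outputs) == 1   # all secret choices agree on public output

# `ok` ignores its secret; `bad` leaks one bit of it into the output
ok = lambda h, l: l + 1
bad = lambda h, l: l + (1 if h else 0)

print(noninterfering(ok, [True, False], 0))    # True
print(noninterfering(bad, [True, False], 0))   # False
```

This brute-force view only works for finite input spaces, which is why the talk develops monitoring and typing instead.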

Typical insecure programs
- Higher-order imperative ML notation
- u_L is public; v_H is confidential
- Direct flow: u_L := !v_H
- Indirect flow: if !v_H then u_L := 0 else u_L := 1
- Indirect flow with thread creation: if !v_H then (thread (u_L := 0)) else (thread (u_L := 1))
- Caution! Carefully check your programs!
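The first two flows can be seen concretely in a small sketch (plain Python standing in for the ML notation; the names and one-bit secret are illustrative):

```python
# u_L is public, v_H is secret. Both functions return the public
# location's final value, and in both cases it reveals the secret.

def direct_flow(v_H):
    u_L = v_H                # direct flow: u_L := !v_H
    return u_L

def indirect_flow(v_H):
    if v_H:                  # indirect flow: the branch taken
        u_L = 0              # depends on the secret, so u_L
    else:                    # encodes it even without a copy
        u_L = 1
    return u_L

# Two runs differing only in the secret give different public outputs,
# so both programs violate non-interference.
print(direct_flow(True), direct_flow(False))       # True False
print(indirect_flow(True), indirect_flow(False))   # 0 1
```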

Reactive programming paradigm (informal)
- Based on higher-order imperative ML
- A reactive machine contains
  - an environment ξ: the set of emitted signals
  - a thread pool
- Reactive constructs
  - (emit s)
  - (when s do N): suspension
  - (watch s do N): preemption
- Cooperative, deterministic scheduling (for example, round-robin)
- Computation is divided into instants

Reactive programming paradigm (informal)
- [Diagram: a cooperative round-robin scheduler running threads across two instants]
- Suspension: a thread t1' = (when s do M) waits for a signal s that is not in ξ1
- When all threads are suspended, an instant transition occurs:
  - perform preemptions: t = (watch s do M) becomes t = () if s is in ξ2
  - reset the signal environment
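The machine can be sketched in Python (a hypothetical encoding, with generators standing in for the ML threads): a thread yields ('emit', s) to emit a signal and ('await', s) to suspend until s is present; when no live thread can move, the instant ends and the signal environment is reset.

```python
# Round-robin cooperative scheduler with a signal environment.

def run(threads, max_instants=10):
    env = set()                      # xi: signals emitted this instant
    waiting = {}                     # thread -> signal it is awaiting
    live = list(threads)
    for _ in range(max_instants):
        moved = True
        while moved:                 # one instant: cycle round-robin
            moved = False
            for t in list(live):
                if t in waiting:
                    if waiting[t] in env:
                        del waiting[t]   # signal present: resume
                    else:
                        continue         # still suspended
                try:
                    op, s = next(t)
                    moved = True
                    if op == 'emit':
                        env.add(s)
                    elif op == 'await':
                        waiting[t] = s
                except StopIteration:
                    live.remove(t)
        if not live:
            break
        env = set()                  # instant transition: reset signals

# Example: T1 suspends on s; T2 writes l and then emits s.
mem = {'l': 0}
def T1():
    yield ('await', 's')             # when s do ...
    mem['l'] = 1
def T2():
    mem['l'] = 2
    yield ('emit', 's')
run([T1(), T2()])
print(mem['l'])                      # 1: T1's write happens after T2's
```

Determinism comes from the fixed round-robin order: rerunning the same threads always yields the same interleaving.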

Insecure examples (1/2): suspension
- T1: if !h then (emit s) else (); when s do (l:=1)
- T2: l:=2 ; emit s
- h = true in the initial memory:

  T1 || T2, Ø
  → when s do (l:=1) || l:=2 ; emit s, {s}
  → when s do () || l:=2 ; emit s, {s}      (l:=1)
  → () || l:=2 ; emit s, {s}
  → () || emit s, {s}                       (l:=2)
  → () || (), {s}

Insecure examples (1/2): suspension
- T1: if !h then (emit s) else (); when s do (l:=1)
- T2: l:=2 ; emit s
- h = false in the initial memory:

  T1 || T2, Ø
  → when s do (l:=1) || l:=2 ; emit s, Ø
  → when s do (l:=1) || emit s, Ø           (l:=2)
  → when s do (l:=1) || (), {s}
  → when s do () || (), {s}                 (l:=1)
  → () || (), {s}

Insecure examples (1/2): suspension
- T1: if !h then (emit s) else (); when s do (l:=1)
- T2: l:=2 ; emit s
- T1 || T2 is not secure: it violates non-interference (the final value of l depends on h)
- Key observations
  - Signals carry information
  - Testing a signal acquires information
  - Suspension may change the order of observable actions
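The two traces above can be replayed in a self-contained sketch (hypothetical Python encoding of T1 || T2 under a round-robin scheduler; a bare `yield` models suspension): two runs that differ only in the secret h end with different values in the public location l.

```python
def leak(h):
    mem = {'l': 0}
    env = set()

    def T1():
        if h:
            env.add('s')             # if !h then (emit s) else ()
        while 's' not in env:        # when s do ...: suspend until s
            yield
        mem['l'] = 1

    def T2():
        mem['l'] = 2
        env.add('s')                 # emit s
        yield

    # round-robin cooperative scheduler: a thread yields to suspend;
    # stop once every thread has terminated
    threads = [T1(), T2()]
    while threads:
        for t in list(threads):
            try:
                next(t)
            except StopIteration:
                threads.remove(t)
    return mem['l']

print(leak(True), leak(False))       # 2 1: the final l reveals h
```

This matches the slides: with h = true, l:=1 runs before l:=2 (final l = 2); with h = false, the suspension delays l:=1 past l:=2 (final l = 1).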

Insecure programs (2/2): preemption
- T1: (watch s do (when t do (l:=1)))
- T2: l:=2; if !h then (emit s) else (); pause; emit t
- h = false in the initial memory:

  T1 || T2, Ø
  → T1 || if !h then (emit s) else (); pause; emit t, Ø     (l:=2)
  → (watch s do (when t do (l:=1))) || pause; emit t, Ø
  → (watch s do (when t do (l:=1))) || () ; emit t, Ø
  → (watch s do (when t do (l:=1))) || (), {t}
  → (watch s do (when t do ())) || (), {t}                  (l:=1)
  → () || (), {t}

Insecure programs (2/2): preemption
- T1: (watch s do (when t do (l:=1)))
- T2: l:=2; if !h then (emit s) else (); pause; emit t
- h = true in the initial memory:

  T1 || T2, Ø
  → T1 || if !h then (emit s) else (); pause; emit t, Ø     (l:=2)
  → (watch s do (when t do (l:=1))) || pause; emit t, {s}
  → () || () ; emit t, Ø
  → () || (), {t}

Insecure programs (2/2): preemption
- T1: (watch s do (when t do (l:=1)))
- T2: l:=2; if !h then (emit s) else (); pause; emit t
- T1 || T2 is not secure by non-interference
- Key observation
  - Preemption may change whether an observable action is executed at all
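This leak, too, can be replayed in a self-contained sketch (hypothetical Python encoding; the scheduler below runs instants, applies the `watch s` preemption on T1 at each instant boundary, and resets the signal environment). Whether the write l := 1 ever happens depends on the secret h.

```python
def leak(h):
    mem = {'l': 0}
    env = set()

    def T1():                        # watch s do (when t do l := 1)
        while 't' not in env:
            yield 'suspend'
        mem['l'] = 1

    def T2():
        mem['l'] = 2
        if h:
            env.add('s')             # if !h then (emit s) else ()
        yield 'pause'                # pause: wait for the next instant
        env.add('t')                 # emit t

    t1, t2 = T1(), T2()
    live, paused = [t1, t2], set()
    for _ in range(3):               # run up to three instants
        while True:                  # one instant, round-robin
            active = [t for t in live if t not in paused]
            if not active:
                break
            before = (set(env), len(live), len(paused))
            for t in active:
                try:
                    if next(t) == 'pause':
                        paused.add(t)
                except StopIteration:
                    live.remove(t)
            if (set(env), len(live), len(paused)) == before:
                break                # quiescent: the instant is over
        if 's' in env and t1 in live:
            live.remove(t1)          # preemption: watch s fires on T1
        env.clear()                  # instant transition: reset signals
        paused.clear()
        if not live:
            break
    return mem['l']

print(leak(True), leak(False))       # 2 1: the secret decides
```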

Our solution
- Consider the signal environment as part of the inputs/outputs (like memory)
  - s = true means s is emitted
- Classify signals with security levels: s_L, s_H, t_L, t_H, …
- Then some programs should be rejected:
  - if !x_H then (emit s_L) else (emit t_L)
  - when s_H do x_L := 1
  - watch s_H do (… (x_L := 1) …)
- IS THAT ALL?
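A quick sketch of why the first rejected program above is insecure once signals are treated as observable outputs: which public signal ends up in the environment encodes the secret bit.

```python
# if !x_H then (emit s_L) else (emit t_L): the public signal
# environment itself leaks the secret guard.

def run(x_H):
    env = set()
    if x_H:
        env.add('s_L')               # emit s_L
    else:
        env.add('t_L')               # emit t_L
    return env                       # publicly observable

print(run(True), run(False))         # {'s_L'} vs {'t_L'}: x_H leaks
```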

Our solution: a more subtle case
- Recall the insecure suspension example
  - T1: if !h then (emit s) else (); when s do (l:=1)
  - T2: l:=2 ; emit s
- A slightly modified version
  - T1: if !h then (emit s) else (); when s do (); l:=1
  - T2: l:=2 ; emit s
- Observation: a suspension construct can reorder not only its own body but also all of the computation that follows it
- So we should also reject
  - (when s_H do …); …; x_L := 1

Secure information flow as a safety property
- Non-interference is a bisimulation: it compares two executions of a program, a non-standard property
- Intuitive notion of secure information flow: "one should not put in a public location a value elaborated using confidential information" [DD77]
- A monitoring semantics
  - keeps track of the level of information gained along the computation
  - run-time error: trying to assign to a public location when the recorded level is more confidential
- Secure information flow as a safety property [Bou08]: no such error occurs!

A monitoring semantics
- We extend the standard semantics with dynamic checks for insecure information flow
- To keep track of information, each thread is extended with two variables (independent for each thread): t = (pc, cur, M), where pc, cur ∈ {L, H}
- pc is the level of information that affects the functional and imperative behavior of the thread
- cur is the level of information that affects the reactive behavior of the thread

How pc is manipulated: example
- The same as in [Bou08]
- Example:

  (L, L, (if !x_H then y_H := 1 else ()); z_L := 1)
  → (L, L, (if !x_H then y_H := 1 else ()); (z_L := 1)_L)    [pc is transmitted to the continuation]
  → (H, L, (if tt then y_H := 1 else ()); (z_L := 1)_L)      [pc is raised by the high guard]
  → (H, L, (y_H := 1); (z_L := 1)_L)                         [security check here: if this were (y_L := 1), the check would fail]
  → (H, L, (); (z_L := 1)_L)
  → (L, L, z_L := 1)                                         [pc is forgotten at the join]

- Similar cases apply to the other constructs
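The pc discipline above can be sketched as runnable Python (a hypothetical encoding, not the talk's formal rules): entering a branch on a level-l guard raises pc to pc ∨ l, assigning to a location below pc is a security error, and pc is restored when the conditional ends.

```python
L, H = 0, 1

class SecurityError(Exception):
    pass

class Monitor:
    def __init__(self):
        self.pc = L

    def enter_branch(self, guard_level):
        saved = self.pc
        self.pc = max(self.pc, guard_level)   # pc is increased
        return saved

    def leave_branch(self, saved):
        self.pc = saved                       # pc is forgotten

    def check_assign(self, var_level):
        if self.pc > var_level:               # low write under high pc
            raise SecurityError("assignment below pc")

m = Monitor()
mem = {'x_H': True, 'y_H': 0, 'z_L': 0}

# (if !x_H then y_H := 1 else ()); z_L := 1   -- accepted
saved = m.enter_branch(H)
if mem['x_H']:
    m.check_assign(H)
    mem['y_H'] = 1
m.leave_branch(saved)
m.check_assign(L)                             # pc is back to L: passes
mem['z_L'] = 1

# the same branch writing y_L instead would be rejected:
saved = m.enter_branch(H)
try:
    m.check_assign(L)
    rejected = False
except SecurityError:
    rejected = True
m.leave_branch(saved)
print(rejected)                               # True
```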

How cur is manipulated
- Testing a signal with when (the watch case is similar):

  (pc, cur, when s_l do N) → (pc, pc ∨ cur ∨ l, when do N)

- Why is pc added to cur? Example:

  (L, L, if !x_H then (when s_L do N) else (when t_L do N))
  → (H, L, if tt then (when s_L do N) else (when t_L do N))
  → (H, L, (when s_L do N))
  → (H, H, (when do N))

- But cur is never forgotten!
  - Compare (when s do N) and (when s do ()); N
  - They have essentially the same effect of reordering instructions in case of suspension
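The cur rule can be sketched the same way (hypothetical encoding): testing a signal of level l sets cur := pc ∨ cur ∨ l, and unlike pc, cur is never restored, so writes after a high-influenced suspension stay tainted.

```python
L, H = 0, 1

class Monitor:
    def __init__(self):
        self.pc, self.cur = L, L

    def when(self, sig_level):
        # (pc, cur, when s^l do N) -> (pc, pc ∨ cur ∨ l, ...)
        self.cur = max(self.pc, self.cur, sig_level)

    def assign_ok(self, var_level):
        return max(self.pc, self.cur) <= var_level

m = Monitor()
m.pc = H            # inside a conditional on a secret
m.when(L)           # ... when s_L do ...
m.pc = L            # the conditional ends: pc is restored
# cur stays H: (when s do ()); l := 1 is rejected like (when s do l := 1)
print(m.cur == H, m.assign_ok(L), m.assign_ok(H))   # True False True
```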

Is the monitoring secure enough?
- Consider this example:

  if u_H then (when s_H do ()) else (); !x_L ()  ||  x_L := λ_.(); emit s_L

  where x_L = λ_.(y_L := 1) and y_L = 0 initially
- It is accepted as safe by the monitoring semantics, yet it leaks: depending on u_H, the call !x_L () runs before or after the assignment x_L := λ_.(), so y_L ends as 0 or 1
- Why does the monitor miss it? It only learns the information in the branch that is actually executed, never in the branch that is skipped

A type-directed translation
- A special programming construct (P M) in the target language
- In the standard semantics:
  - M → M' implies (P M) → (P M')
  - (P V) → V
- In the monitoring semantics:
  - (pc, cur, (P V)) → (pc, pc ∨ cur, V)

A type-directed translation
- How does the translation work?
- Suppose we have (if !h then M else N), where M may suspend while N cannot
- The translation is (if !h then M else (P N))
- The subtle example again, remembering (pc, cur, (P V)) → (pc, pc ∨ cur, V):

  if u_H then (when s_H do ()) else (P ()); !x_L ()  ||  x_L := λ_.(); emit s_L

A particular reactive programming language
- A higher-order imperative language with reactive constructs
- A reactive machine is M = [μ, ξ, t, T]
  - μ is the store, ξ is the signal environment
  - t = (i, M) is the currently running thread
  - T is the thread pool
- Syntax:

  M, N ::= V | (if M then N else N') | (M N) | (M ; N) | (ref M) | (!M) | (M := N) | (thread M) | (sig) | (emit M) | (when M do N) | (watch M do N)
  V ::= λx.M | s_l | u_l | tt | ff | ()

Security enforced by dynamic monitoring
- Definition (progress-insensitive security [SR09]): a symmetric relation R on reactive machines
  - M_1 = [μ_1, ξ_1, t_1, T_1] R M_2 = [μ_2, ξ_2, t_2, T_2] iff
  - 1) μ_1 =_L μ_2 and ξ_1 =_L ξ_2
  - 2) if M_1 → M'_1, then either there exists M_2 →* M'_2 such that M'_1 R M'_2, or for all M'_2 = [μ'_2, ξ'_2, t'_2, T'_2] such that M_2 →* M'_2, we have μ'_1 =_L μ'_2 and ξ'_1 =_L ξ'_2
- Definition (safe program): one that does not run into a security error in the monitoring semantics
- Our result (informal)
  - Theorem: for every program M whose translation M' is safe, and for all μ_1 =_L μ_2 and ξ_1 =_L ξ_2, we have [μ_1, ξ_1, (0, M'), Ø] R [μ_2, ξ_2, (0, M'), Ø]

A type and effect system for safety
- We also designed a type and effect system as a sound static analysis for the safety property
- A fairly standard system, extending one for higher-order languages
- Our results
  - Lemma: if a program is typable, then its translation is safe
  - Lemma: the semantics of a typable program is almost identical to that of its translation in the monitoring semantics
  - Theorem: typable programs themselves satisfy progress-insensitive security

Conclusion
- Controlling information flow in reactive programs
- Dynamic checking: a monitoring semantics
  - by itself, it does not imply any security property
- A simple type-directed translation
  - if the translated program is safe, then it satisfies progress-insensitive security
- A type safety result
  - type-checked programs need no monitoring

Thank you! Q & A