Safety in Discretionary Access Control for Logic-based Publish-subscribe Systems. Kazuhiro Minami, Nikita Borisov, and Carl A. Gunter, University of Illinois at Urbana-Champaign.



Aggregation in a Publish-subscribe (Pub-sub) System
- Publish high-level events derived from raw sensor data
- Eliminate duplicate tasks from multiple subscribers
[Diagram: location, motion, and door sensors feed raw data into the pub-sub system, which aggregates it and delivers location events to a location-tracker application and an intelligent building management system.]

Deriving High-level Events Based on Logic
- Represent events as logical statements
- Maintain event derivation rules in Datalog
- Derive high-level events in a bottom-up way
[Diagram: a publisher sends location(bob, room10) to the pub-sub system's knowledge base; the inference engine applies the rule occupied(L) ← location(P, L) and publishes occupied(room10) to the subscriber.]
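The bottom-up derivation on this slide can be sketched in a few lines of Python. This is an illustration, not the system's implementation: events are represented as ground atoms in strings, and the rule shown is the ground instance of occupied(L) ← location(P, L) relevant to the example.

```python
def bottom_up(facts, rules):
    """Repeatedly apply derivation rules until no new events are derived (a fixpoint).

    facts: set of ground events known to be true
    rules: list of (head, body) pairs, where body is a list of ground events
    """
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if all(b in known for b in body) and head not in known:
                known.add(head)
                changed = True
    return known

# The slide's example: the publisher reports location(bob, room10), and the
# ground instance of occupied(L) <- location(P, L) derives occupied(room10).
facts = {"location(bob, room10)"}
rules = [("occupied(room10)", ["location(bob, room10)"])]
print(sorted(bottom_up(facts, rules)))
# -> ['location(bob, room10)', 'occupied(room10)']
```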

Events in Pervasive Environments Contain Users' Private Information
- Concern with location privacy
- A combination of low-level sensor data could reveal the types of user activities (i.e., high-level events)
  – E.g., power usage in a household

Protection with Discretionary Access Control (DAC) Policies Is a Good Start
- A pub-sub system defines discretionary access control policies dacl: E → 2^P, where:
  – E is the set of events that the pub-sub system could maintain
  – P is the set of subscriber principals
- Event e is protected with an access control list dacl(e)
  – E.g., dacl(location(alice, L)) = {bob, dave}
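A minimal sketch of how such DACL policies might be represented; the dictionary layout and the helper name `may_receive` are illustrative, not from the paper.

```python
# dacl maps each event to the set of subscriber principals allowed to
# receive it, i.e. dacl: E -> 2^P. Events are opaque strings here.
dacl = {
    "location(alice, L)": {"bob", "dave"},   # the slide's example ACL
    "occupied(room10)":   set(),             # no subscriber is authorized
}

def may_receive(principal, event, dacl):
    """Event e is protected with the access control list dacl(e)."""
    return principal in dacl.get(event, set())

print(may_receive("bob", "location(alice, L)", dacl))   # -> True
print(may_receive("tom", "location(alice, L)", dacl))   # -> False
```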

However, a Malicious Subscriber Could Learn Confidential Events Through Inferences
- The subscriber (Tom) knows PS's derivation rules I and its DACL policies dacl
- Even though dacl(location(P, L)) = ∅, Tom is authorized to receive occupied(L) (dacl(occupied(L)) = {Tom}) and can infer the truth of the confidential location events from it using the rules in I
[Diagram: Tom combines the events he receives with the AND/OR structure of the rules in I to infer events whose dacl is ∅.]

Our Approach: Additional Protection with Operational Discretionary Access Control (OACL) Policies
- Define oacl: E → 2^P such that:
  – Subscriber p_i receives event e iff p_i ∈ oacl(e)
  – For every event e: oacl(e) ⊆ dacl(e)
[Diagram: events pass through the DACL policies and then the OACL policies before reaching a subscriber; access on event e may be granted by dacl but denied by oacl to keep the subscriber from inferring the truth of a confidential event e'.]
- Question: Is system PS[E, I, dacl, oacl] safe w.r.t. subscriber p_i?
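The OACL layer can be sketched the same way as the DACL layer above; the containment invariant oacl(e) ⊆ dacl(e) is easy to validate. The representation and helper name are illustrative assumptions, not the paper's code.

```python
def check_containment(dacl, oacl):
    """Validate the invariant oacl(e) subseteq dacl(e) for every event."""
    return all(oacl.get(e, set()) <= acl for e, acl in dacl.items())

# The running example: Tom is on the DACL for occupied(L), but the OACL
# denies him the event to block the inference from the previous slide.
dacl = {"occupied(L)": {"tom"}, "location(P, L)": set()}
oacl = {"occupied(L)": set(),  "location(P, L)": set()}

print(check_containment(dacl, oacl))   # -> True
print("tom" in oacl["occupied(L)"])    # -> False: delivery denied despite DACL
```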

Outline
- Safety definition based on nondeducibility
- Safety verification algorithm and its complexity analysis
- Experiments with a SAT solver
- Conclusion

Nondeducibility Considers Information Flow Between Two Information Functions Regarding the System Configuration
- A configuration of PS[E, I, dacl, oacl] is a set of events E_PS ⊆ E
- Function v1: 2^E → 2^E gives the non-confidential events that subscriber p_i receives: v1(E_PS) = {e | e ∈ E_PS ∧ p_i ∈ oacl(e)}
- Function v2: 2^E → 2^E gives the confidential events that subscriber p_i is NOT authorized to receive: v2(E_PS) = {e | e ∈ E_PS ∧ p_i ∉ dacl(e)}
[Diagram: potential information flow between v2(E_PS) and v1(E_PS).]

Safety Definition
A pub-sub system PS[E, I, dacl, oacl] is safe if for all E_PS ⊆ E and all e ∈ E with p_i ∉ dacl(e), there exist E'_PS and E''_PS such that:
1. v1(E_PS) = v1(E'_PS) = v1(E''_PS)
2. e ∈ v2(E'_PS)
3. e ∉ v2(E''_PS)

Example
- E = {loc(bob, bldg12), loc(alice, bldg12), occupied(bldg12)}
- I = {occupied(B) ← loc(P, B)}
- dacl(loc(P, B)) = ∅, dacl(occupied(bldg12)) = {dave}
- oacl(loc(P, B)) = ∅, oacl(occupied(bldg12)) = {dave}
- For E_PS = {loc(bob, bldg12), occupied(bldg12)}: v1(E_PS) = {occupied(bldg12)} (the events dave receives) and v2(E_PS) = {loc(bob, bldg12)} (the events that should be protected from dave)
- E'_PS = {loc(alice, bldg12), occupied(bldg12)} gives the same v1 but v2(E'_PS) = {loc(alice, bldg12)}, so dave cannot deduce which location event holds
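The two configurations in this example can be checked mechanically. The sketch below (names follow the slides; the string encoding of events is an illustrative assumption) computes v1 and v2 for dave and confirms that the two configurations are indistinguishable through v1.

```python
E_PS  = {"loc(bob, bldg12)", "occupied(bldg12)"}
E_PS2 = {"loc(alice, bldg12)", "occupied(bldg12)"}   # E'_PS in the slide

dacl = {"loc(bob, bldg12)": set(), "loc(alice, bldg12)": set(),
        "occupied(bldg12)": {"dave"}}
oacl = dacl   # in this example the OACL coincides with the DACL

def v1(S, p):   # non-confidential events subscriber p receives
    return {e for e in S if p in oacl[e]}

def v2(S, p):   # confidential events p is NOT authorized to receive
    return {e for e in S if p not in dacl[e]}

print(v1(E_PS, "dave"))                        # -> {'occupied(bldg12)'}
print(v2(E_PS, "dave"))                        # -> {'loc(bob, bldg12)'}
print(v1(E_PS2, "dave") == v1(E_PS, "dave"))   # -> True: same view for dave
print(v2(E_PS2, "dave"))                       # -> {'loc(alice, bldg12)'}
```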

Outline
- Safety definition based on nondeducibility
- Safety verification algorithm and its complexity analysis
- Experiments with a SAT solver
- Conclusion

We Represent a Subscriber's Inferences with S-inference Rules
- Use three-valued logic with the function val: E → {T, F, U}, where:
  – T: known to be true
  – F: known to be false
  – U: unknown
- Capture both bottom-up and top-down inferences with respect to the system's derivation rules I

Bottom-up Inferences
Consider a derivation rule e ← e1, …, en:
- (Bottom-up-T) If a subscriber knows that all of the events e1, …, en are true, then he knows e is also true
- (Bottom-up-F) If a subscriber knows that some event ei is false, then he knows e is also false

Top-down Inferences
Consider the set of derivation rules whose conclusion is e (e ← e1, …, e ← en):
- (Top-down-T) If a subscriber knows that event e is true, then he knows some ei is true
- (Top-down-F) If a subscriber knows that event e is false, then he knows every ei is false

Verification Algorithm with S-inference Rules
VerifySafety(E, I, dacl, oacl, p_i):
1. For each T/F assignment A: {e | p_i ∈ oacl(e)} → {T, F}:
   1) Compute a fixpoint from the initial state defined by A by applying the s-inference rules
   2) If there is an event e ∈ E such that val(e) ≠ U and p_i ∉ dacl(e), return FALSE
2. Return TRUE
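A minimal sketch of VerifySafety under one reasonable reading of the four s-inference rules. This is not the paper's implementation: rules are assumed ground, bodies are lists of events, and the propagation below only draws the conclusions that are sound for that reading (e.g., Top-down-F is applied only to single-event bodies).

```python
from itertools import product

def propagate(val, rules):
    """Apply the bottom-up and top-down s-inference rules to a fixpoint.
    val maps events to 'T', 'F', or (implicitly) 'U'; it is updated in place."""
    heads = {}
    for head, body in rules:
        heads.setdefault(head, []).append(body)
    changed = True
    while changed:
        changed = False
        def set_val(e, x):
            nonlocal changed
            if val.get(e, "U") == "U":
                val[e] = x
                changed = True
        for head, bodies in heads.items():
            # Bottom-up-T: some rule whose body is entirely true derives head.
            if any(all(val.get(b, "U") == "T" for b in body) for body in bodies):
                set_val(head, "T")
            # Bottom-up-F: every derivation of head is blocked by a false event.
            if all(any(val.get(b, "U") == "F" for b in body) for body in bodies):
                set_val(head, "F")
            if val.get(head, "U") == "T":
                # Top-down-T: if only one body can still be true, it must be true.
                alive = [b for b in bodies
                         if not any(val.get(e, "U") == "F" for e in b)]
                if len(alive) == 1:
                    for e in alive[0]:
                        set_val(e, "T")
            if val.get(head, "U") == "F":
                # Top-down-F: a false head makes every single-event body false.
                for body in bodies:
                    if len(body) == 1:
                        set_val(body[0], "F")
    return val

def verify_safety(E, rules, dacl, oacl, p):
    """Enumerate T/F assignments to the events p receives, propagate, and
    flag any confidential event whose value becomes known."""
    visible = [e for e in E if p in oacl.get(e, set())]
    for bits in product(["T", "F"], repeat=len(visible)):
        val = dict(zip(visible, bits))
        propagate(val, rules)
        for e in E:
            if p not in dacl.get(e, set()) and val.get(e, "U") != "U":
                return False          # p deduces a confidential event
    return True

# Running example: if Tom receives occupied(r), then learning it is FALSE
# reveals (Top-down-F) that both confidential location events are false.
E = ["loc(bob, r)", "loc(alice, r)", "occupied(r)"]
rules = [("occupied(r)", ["loc(bob, r)"]), ("occupied(r)", ["loc(alice, r)"])]
dacl = {"occupied(r)": {"tom"}, "loc(bob, r)": set(), "loc(alice, r)": set()}
print(verify_safety(E, rules, dacl, {"occupied(r)": {"tom"}}, "tom"))  # -> False
print(verify_safety(E, rules, dacl, {"occupied(r)": set()}, "tom"))    # -> True
```

Denying occupied(r) through the OACL makes the system safe for Tom, which is exactly the role of the OACL layer in the slides.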

Analysis of the Verification Algorithm
- Sound and complete:
  – The algorithm returns TRUE if and only if the pub-sub system PS[E, I, dacl, oacl] is safe w.r.t. subscriber p_i
- Running time is exponential because the algorithm must check every possible truth assignment to the non-confidential events

Complexity Analysis
- UNSAFE = {(PS[E, I, dacl, oacl], p_i) | VerifySafety(E, I, dacl, oacl, p_i) = FALSE}
- UNSAFE is NP-complete; that is:
  1. UNSAFE is in NP
  2. 3-CNF-SAT is polynomial-time reducible to UNSAFE

Basic Idea: Construct PS Such That a Confidential Event s Is Known Exactly When Formula Φ Is Satisfiable
- Φ = (x1 ∨ ¬x2 ∨ ¬x3) ∧ (¬x1 ∨ x2 ∨ x3)
- Introduce an event yj per clause and events xi, nxi per variable, with rules:
  y1 ← x1, y1 ← nx2, y1 ← nx3
  y2 ← nx1, y2 ← x2, y2 ← x3
  s ← y1, y2 (so s ≡ y1 ∧ y2)
- y1 is known to be true iff one of x1, nx2, nx3 is known to be true; likewise y2 for nx1, x2, x3
- By (Bottom-up-T), s is known to be true when both y1 and y2 are known to be true
- The truth assignment must be consistent: val(x1) = T iff val(nx1) = F
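The clause gadget can be generated mechanically. This sketch builds the rules for the slide's formula; the encoding is illustrative, and it omits the consistency gadget (val(xi) = T iff val(nxi) = F) described on the next slide.

```python
# Each clause is a list of (variable index, polarity) literals.
# Phi = (x1 v ~x2 v ~x3) ^ (~x1 v x2 v x3):
clauses = [[(1, True), (2, False), (3, False)],
           [(1, False), (2, True), (3, True)]]

def build_rules(clauses):
    """One event y_j per clause, events x_i / nx_i per literal, and a final
    confidential event s derivable only when every y_j is derivable."""
    rules = []
    for j, clause in enumerate(clauses, start=1):
        for i, positive in clause:
            lit = f"x{i}" if positive else f"nx{i}"
            rules.append((f"y{j}", [lit]))          # y_j <- literal
    rules.append(("s", [f"y{j}" for j in range(1, len(clauses) + 1)]))
    return rules

for head, body in build_rules(clauses):
    print(head, "<-", ", ".join(body))
# -> y1 <- x1 / y1 <- nx2 / y1 <- nx3 / y2 <- nx1 / y2 <- x2 / y2 <- x3 / s <- y1, y2
```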

Truth Assignments Must Be Consistent
- The gadget rules x1 ← nx1, z1 and x1 ← u1, z'1 force x1 and nx1 to be consistent: they are consistent iff u1 is known to be true
- If x1 is known to be true, then by (Top-down-T) val(nx1 ∧ z1) = T or val(u1 ∧ z'1) = T; since nx1 is known to be false, u1 is known to be true (S5)
- The rule s ← y1, y2, u1, … therefore makes s known only when y1 and y2 are known to be true and the assignment is consistent
- u1 is protected so that p_i ∈ dacl(u1) but p_i ∉ oacl(u1)

Experiments with a SAT Solver
- Convert PS[E, I, dacl, oacl] into a SAT formula Φj such that there is a safety violation w.r.t. principal pj iff Φj is satisfiable
- Encode in Φj a sequence of s-inference rule applications leading to a safety violation
- Measure the latency of solving the converted SAT problems with the SAT4J solver

Latency Results
[Plot: latency of safety verification as a function of the parameters #events and #rules.]

Conclusion
- Formally defined safety for a logic-based pub-sub system
- Captured a subscriber's inferences with a set of s-inference rules
- Proved that the safety problem is co-NP-complete
- Showed the feasibility of safety verification for moderate numbers of events and rules using a SAT solver

Any questions?