Belief in Information Flow. Michael Clarkson, Andrew Myers, Fred B. Schneider, Cornell University. 18th IEEE Computer Security Foundations Workshop, June 20, 2005.


Belief in Information Flow Michael Clarkson, Andrew Myers, Fred B. Schneider Cornell University 18 th IEEE Computer Security Foundations Workshop June 20, 2005

Clarkson et al.: Belief in Information Flow 2 Password Checker
Some programs require leakage of information:
PWC: if p = g then a := 1 else a := 0
p : stored password
g : guessed password
a : authentication flag

Clarkson et al.: Belief in Information Flow 3 Password Checker
PWC: if p = g then a := 1 else a := 0
a depends on p, so noninterference is too strong. But PWC is intuitively secure because it leaks only a little information about p…

Clarkson et al.: Belief in Information Flow 4 Quantitative Information Flow Quantitative security policies: –Expected rate of flow is at most k bits per second –At most k bits leak in any execution Enforcing these requires a model for quantitative information flow (QIF) This work: A model for QIF

Clarkson et al.: Belief in Information Flow 5 Traditional Model for QIF
Flow = Uncertainty(H_in) − Uncertainty(H_in | L_out)
Uncertainty measured with some variant of entropy [Denning 82; McIver and Morgan 03; Clark, Hunt, and Malacaria 05]
(Diagram: system S with inputs H_in, L_in and outputs H_out, L_out; uncertainty given by probability distributions.)
Information flows when uncertainty is decreased.

Clarkson et al.: Belief in Information Flow 6 Adding Beliefs to Model
Model the attacker's uncertainty about H inputs as a probability distribution.
We call this distribution a belief.
(Diagram: system S with inputs H_in, L_in and outputs H_out, L_out.)

Clarkson et al.: Belief in Information Flow 7 Analyzing PWC
PWC: if p = g then a := 1 else a := 0
Attacker guesses A: g = A
Attacker believes: p = A: 0.98, B: 0.01, C: 0.01
After observing a = 0, attacker believes: p = A: 0, B: 0.5, C: 0.5
(Password is really C)
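The revision on this slide is ordinary Bayesian conditioning; a minimal sketch in Python (names like `prebelief` and `revise` are illustrative, not from the paper):

```python
from fractions import Fraction

# Prebelief over the password p; the attacker guesses g = 'A'.
prebelief = {'A': Fraction(98, 100), 'B': Fraction(1, 100), 'C': Fraction(1, 100)}

def pwc(p, g):
    # PWC: if p = g then a := 1 else a := 0
    return 1 if p == g else 0

def revise(belief, g, observed_a):
    # Bayesian conditioning: zero out passwords inconsistent with the
    # observed flag, then renormalize.
    unnorm = {p: (pr if pwc(p, g) == observed_a else Fraction(0))
              for p, pr in belief.items()}
    total = sum(unnorm.values())
    return {p: pr / total for p, pr in unnorm.items()}

# Real password is C and the guess is A, so the attacker observes a = 0.
postbelief = revise(prebelief, 'A', observed_a=0)
# postbelief: A -> 0, B -> 1/2, C -> 1/2, matching the slide
```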

Clarkson et al.: Belief in Information Flow 8 Analyzing PWC
Prebelief: p = A: 0.98, B: 0.01, C: 0.01 (a little uncertainty)
Postbelief: p = A: 0, B: 0.5, C: 0.5 (more uncertainty)
Information flows when uncertainty is decreased ⇒ traditional metric: uncertainty = closeness to uniform distribution

Clarkson et al.: Belief in Information Flow 9 Why Uncertainty Fails Uncertainty-based approach addresses objective probabilities on system but not subjective probabilities of attacker

Clarkson et al.: Belief in Information Flow 10 Metric for Belief
Prebelief: p = A: 0.98, B: 0.01, C: 0.01 (distance d1 from reality)
Postbelief: p = A: 0, B: 0.5, C: 0.5 (distance d2 from reality)
Reality: p = A: 0, B: 0, C: 1
d1 > d2: postbelief is closer to reality because of the observation of the program.

Clarkson et al.: Belief in Information Flow 11 Accuracy
Accuracy: distance from a belief to reality (contrast with certainty).
Prebelief: p = A: 0.98, B: 0.01, C: 0.01 (high certainty, low accuracy)
Postbelief: p = A: 0, B: 0.5, C: 0.5 (lower certainty, higher accuracy)
Accuracy is the correct metric for QIF.

Clarkson et al.: Belief in Information Flow 12 Belief in Information Flow Experiment protocol Describes how attackers revise beliefs from observation of program execution Accuracy metric How change in accuracy of belief can be used to measure the amount of information flow Extensions Repeated experiments, other metrics, and misinformation

Clarkson et al.: Belief in Information Flow 13 Experiments
Prebelief: p = A: 0.98, B: 0.01, C: 0.01
Postbelief: p = A: 0, B: 0.5, C: 0.5
Experiment: how an attacker revises his belief.

Clarkson et al.: Belief in Information Flow 14 Experiment Protocol
(Diagram: system S with inputs H_in, L_in and outputs H_out, L_out; prebelief preB is revised into postbelief postB.)

Clarkson et al.: Belief in Information Flow 15 Experiment Protocol
1. Attacker chooses prebelief preB

Clarkson et al.: Belief in Information Flow 16 Experiment Protocol
1. Attacker chooses prebelief
2. System and attacker choose inputs H_in, L_in

Clarkson et al.: Belief in Information Flow 17 Experiment Protocol
1. Attacker chooses prebelief
2. System and attacker choose inputs H_in, L_in
3. System executes S and produces observation
Execution modeled as a distribution-transformer semantics: ⟦S⟧ : Dist → Dist
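The distribution-transformer semantics can be sketched for the deterministic PWC by pushing each input state's probability mass through the program. The `lift` helper and tuple state encoding are assumptions of this sketch, not the paper's formalism:

```python
from collections import defaultdict
from fractions import Fraction

def pwc(state):
    # PWC: if p = g then a := 1 else a := 0; state is (p, g), output is (p, g, a)
    p, g = state
    return (p, g, 1 if p == g else 0)

def lift(program):
    # [[S]] : Dist -> Dist: push each input state's mass through the program.
    def transformed(dist):
        out = defaultdict(Fraction)
        for state, prob in dist.items():
            out[program(state)] += prob
        return dict(out)
    return transformed

# Prebelief over p combined with the known guess g = 'A', as an input distribution:
d_in = {('A', 'A'): Fraction(98, 100),
        ('B', 'A'): Fraction(1, 100),
        ('C', 'A'): Fraction(1, 100)}
d_out = lift(pwc)(d_in)
# d_out: ('A','A',1) -> 0.98, ('B','A',0) -> 0.01, ('C','A',0) -> 0.01
```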

Clarkson et al.: Belief in Information Flow 18 Experiment Protocol
1. Attacker chooses prebelief
2. System and attacker choose inputs H_in, L_in
3. System executes S and produces observation
4. Attacker conducts thought-experiment to obtain prediction

Clarkson et al.: Belief in Information Flow 19 Experiment Protocol
4. Attacker conducts thought-experiment to obtain prediction: run S on preB and L_in, yielding predicted outputs H′_out and L′_out

Clarkson et al.: Belief in Information Flow 20 Experiment Protocol
1. Attacker chooses prebelief
2. System and attacker choose inputs H_in, L_in
3. System executes S and produces observation
4. Attacker conducts thought-experiment to obtain prediction
5. Attacker infers postbelief by Bayesian inference: postB = prediction | observation

Clarkson et al.: Belief in Information Flow 21 Belief Revision in PWC
preB: p = A: 0.98, B: 0.01, C: 0.01
prediction over (p, a): (A,0): 0, (A,1): 0.98, (B,0): 0.01, (B,1): 0, (C,0): 0.01, (C,1): 0
postB = prediction | (a = 0): p = A: 0, B: 0.5, C: 0.5

Clarkson et al.: Belief in Information Flow 22 Accuracy Metric
Amount of flow Q is the improvement in accuracy of belief, i.e. (initial error) − (final error):
Q = D(preB → H_in) − D(postB → H_in)

Clarkson et al.: Belief in Information Flow 23 Belief Distance
Unit is (information-theoretic) bits.
Relative entropy: D(b → b′) = Σ_σ b′(σ) lg (b′(σ) / b(σ))
When D is defined as relative entropy, Q is the amount of information that the attacker's observation contains about the high input.

Clarkson et al.: Belief in Information Flow 24 Amount of Flow from PWC
Q = D(⟨.98, .01, .01⟩ → ⟨0, 0, 1⟩) − D(⟨0, .5, .5⟩ → ⟨0, 0, 1⟩) = 6.6 − 1 = 5.6 bits
Information is in the eye of the beholder: a max leakage of lg 3 ≈ 1.6 bits would require a uniform prebelief, p = A, B, C: 1/3 each. From that prebelief, D(⟨1/3, 1/3, 1/3⟩ → ⟨0, 0, 1⟩) = 1.6 bits, so Q = 1.6 − 1 = .6 bits.
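A quick check of the arithmetic, assuming (as the slides do) that reality is a point-mass distribution, so the relative-entropy distance collapses to −lg b(real):

```python
import math

def dist_to_reality(belief, real):
    # With reality a point mass, only the real outcome's term of the
    # relative entropy survives: D(b -> reality) = -lg b(real).
    return -math.log2(belief[real])

prebelief  = {'A': 0.98, 'B': 0.01, 'C': 0.01}
postbelief = {'A': 0.0,  'B': 0.5,  'C': 0.5}
real = 'C'

Q = dist_to_reality(prebelief, real) - dist_to_reality(postbelief, real)
# -lg 0.01 = 6.64 bits and -lg 0.5 = 1 bit, so Q = 5.64, i.e. about 5.6 bits
```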

Clarkson et al.: Belief in Information Flow 25 Repeated Experiments
Experiment protocol is compositional: B → Exp → B′ → Exp → B″
Information flow is also compositional: the amount of information flow over a series of experiments is equal to the sum of the amount of information flow in each individual experiment.
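The compositionality claim follows because the flow quantities telescope; a small numeric check using the running PWC example (the second experiment's guess and beliefs are illustrative):

```python
import math

def err(belief, real):
    # Accuracy error: relative entropy from belief to point-mass reality,
    # which equals -lg belief(real).
    return -math.log2(belief[real])

real = 'C'
B  = {'A': 0.98, 'B': 0.01, 'C': 0.01}  # prebelief
B1 = {'A': 0.0,  'B': 0.5,  'C': 0.5}   # after guessing A and observing a = 0
B2 = {'A': 0.0,  'B': 0.0,  'C': 1.0}   # after also guessing B and observing a = 0

Q1 = err(B, real) - err(B1, real)       # flow in the first experiment
Q2 = err(B1, real) - err(B2, real)      # flow in the second experiment
Q_total = err(B, real) - err(B2, real)  # flow over the whole series
# The intermediate error term cancels, so Q1 + Q2 equals Q_total exactly.
```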

Clarkson et al.: Belief in Information Flow 26 Extensions of Metric So far: exact flow for a single execution Extend to: {Expected, maximum} amount of information flow {for a given experiment, over all experiments} Language for quantitative flow policies

Clarkson et al.: Belief in Information Flow 27 Misinformation
Probabilistic programs can create misinformation (accuracy can decrease); non-probabilistic programs cannot create misinformation.
FPWC: if p = g then a := 1 else a := 0; if random() < .1 then a := !a
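A sketch of why FPWC can misinform: on an unlucky run the flipped flag drives the attacker's belief further from reality, making Q negative. The `likelihood` model below is an assumption derived from FPWC's 10% flip, not code from the paper:

```python
import math

# FPWC: if p = g then a := 1 else a := 0; if random() < .1 then a := !a
def likelihood(a, p, g):
    # Probability of observing flag a given password p and guess g,
    # accounting for the 10% chance the flag was flipped.
    correct = 1 if p == g else 0
    return 0.9 if a == correct else 0.1

def revise(belief, g, a):
    # Bayesian update on the (now noisy) observation.
    unnorm = {p: pr * likelihood(a, p, g) for p, pr in belief.items()}
    total = sum(unnorm.values())
    return {p: pr / total for p, pr in unnorm.items()}

def err(belief, real):
    return -math.log2(belief[real])  # distance from belief to reality

prebelief = {'A': 0.98, 'B': 0.01, 'C': 0.01}
guess, real = 'A', 'C'

# An unlucky run: p = C, g = A, but the coin flips the flag to a = 1.
post = revise(prebelief, guess, a=1)
Q = err(prebelief, real) - err(post, real)
# post['A'] is about 0.998: the attacker becomes *more* confident in the
# wrong password, so accuracy drops and Q < 0 -- misinformation.
```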

Clarkson et al.: Belief in Information Flow 28 Summary
Attackers have beliefs.
Quantifying information flow with beliefs requires accuracy
–Traditional uncertainty model is inappropriate
Presented a more expressive, fine-grained model of quantitative information flow
–Compositional experiment protocol
–Probabilistic language semantics
–Accuracy-based metric

Clarkson et al.: Belief in Information Flow 29 Related Work
Information theory in information flow
–[Denning 82], [Millen 87], [Wittbold, Johnson 90], [Gray 91]
Quantitative information flow
–Using uncertainty: [Lowe 02], [McIver, Morgan 03], [Clark, Hunt, Malacaria 05]
–Using sampling theory: [Di Pierro, Hankin, Wiklicky]
Database privacy
–Using relative entropy: [Evfimievski, Gehrke, Srikant 03]

Clarkson et al.: Belief in Information Flow 30 Future Work Extended programming language Lattice of security levels Static analysis Quantitative security policies

Belief in Information Flow Michael Clarkson, Andrew Myers, Fred B. Schneider Cornell University 18 th IEEE Computer Security Foundations Workshop June 20, 2005

Clarkson et al.: Belief in Information Flow 32 Extra Slides

Clarkson et al.: Belief in Information Flow 33 Beliefs as Distributions Other choices: Dempster-Shafer belief functions, plausibility measures, etc. Probability distributions are: –Quantitative –Axiomatically justifiable –Straightforward –Familiar Abstract operations –Product: combine disjoint beliefs –Update: condition belief to include new information –Distance: quantification of difference between two beliefs

Clarkson et al.: Belief in Information Flow 34 Program Semantics

Clarkson et al.: Belief in Information Flow 35 Postbeliefs are Bayesian Standard techniques for inference in applied statistics –Bayesian inference –Sampling/frequentist theory Bayesian inference: –Formalization of scientific method –Consistent with principles of rationality in a betting game May be subjective but is not arbitrary

Clarkson et al.: Belief in Information Flow 36 Interpreting Flow Quantities Uncertainty (entropy) interpreted as: –Improvement of expected codeword length Accuracy (relative entropy) interpreted as: –Improvement in efficiency of optimal code