CAUSAL INFERENCE IN THE EMPIRICAL SCIENCES
Judea Pearl, University of California, Los Angeles (www.cs.ucla.edu/~judea)



OUTLINE

1. Inference: statistical vs. causal; distinctions and mental barriers
2. Formal semantics for counterfactuals: definition, axioms, graphical representations
3. Inference to three types of claims:
   1. Effects of potential interventions
   2. Attribution (causes of effects)
   3. Direct and indirect effects

TRADITIONAL STATISTICAL INFERENCE PARADIGM

Data → Inference → Q(P) (aspects of P), where P is the joint distribution.

e.g., infer whether customers who bought product A would also buy product B: Q = P(B | A).

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

Probability and statistics deal with static relations: Data → Inference → Q(P) (aspects of P), with P the joint distribution.

What happens when P changes? e.g., infer whether customers who bought product A would still buy A if we were to double the price. The query now refers to a changed joint distribution P′.

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

What remains invariant when P changes, say, to satisfy P(price = 2) = 1?

Note: P(v) ≠ P(v | price = 2); P does not tell us how it ought to change.
e.g., curing symptoms vs. curing diseases
e.g., analogy: mechanical deformation

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

1. Causal and statistical concepts do not mix.

   CAUSAL                       STATISTICAL
   Spurious correlation         Regression
   Randomization                Association / Independence
   Confounding / Effect         "Controlling for" / Conditioning
   Instrument                   Odds and risk ratios
   Holding constant             Collapsibility
   Explanatory variables        Propensity score

FROM STATISTICAL TO CAUSAL ANALYSIS: 2. MENTAL BARRIERS

2. No causes in, no causes out (Cartwright, 1989):
   causal assumptions + statistical assumptions + data ⇒ causal conclusions

3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.

4. Non-standard mathematics:
   a) Structural equation models (Wright, 1920; Simon, 1960)
   b) Counterfactuals (Neyman-Rubin Y_x, Lewis x □→ Y)

WHY CAUSALITY NEEDS SPECIAL MATHEMATICS

Scientific equations (e.g., Hooke's law) are non-algebraic.

Static information: Y = 2X. Process information: X = 1, hence Y = 2.
Had X been 3, Y would be 6. If we raise X to 3, Y would be 6. We must "wipe out" X = 1.

Correct notation: Y := 2X (or Y ← 2X), with X = 1.
e.g., pricing policy: "double the competitor's price."

THE STRUCTURAL MODEL PARADIGM

Data → Inference → Q(M) (aspects of M), where M is the data-generating model.

M = an invariant strategy (mechanism, recipe, law, protocol) by which Nature assigns values to variables in the analysis; M induces the joint distribution.

FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION

(Diagram: a causal network over nodes X, Y, Z, mapping INPUT to OUTPUT.)

STRUCTURAL CAUSAL MODELS

Definition: A structural causal model is a 4-tuple <V, U, F, P(u)>, where
  V = {V_1, ..., V_n} are observable variables,
  U = {U_1, ..., U_m} are background variables,
  F = {f_1, ..., f_n} are functions determining V: v_i = f_i(v, u),
  P(u) is a distribution over U.
P(u) and F induce a distribution P(v) over the observable variables.
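
To make the definition concrete, here is a minimal Python sketch of a structural causal model; the particular mechanisms and the Gaussian background distribution P(u) are illustrative assumptions, not part of the slides.

```python
# A minimal sketch of an SCM <V, U, F, P(u)> with V = {X, Z, Y}.
import random

def f_z(x, u2):        # z = f_z(pa_z, u_z): Z is determined by X and U2
    return x + u2

def f_y(z, u1, u3):    # y = f_y(pa_y, u_y): Y is determined by Z, U1, U3
    return 2 * z + u1 + u3

def sample_v():
    # P(u): background variables drawn independently (an assumption)
    u1, u2, u3 = (random.gauss(0, 1) for _ in range(3))
    x = u1                     # X is driven by U1 alone
    z = f_z(x, u2)
    y = f_y(z, u1, u3)
    return x, z, y             # one draw from the induced distribution P(v)

samples = [sample_v() for _ in range(1000)]   # an empirical image of P(v)
```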

STRUCTURAL MODELS AND CAUSAL DIAGRAMS

The arguments of the functions v_i = f_i(v, u) define a graph: v_i = f_i(pa_i, u_i), with PA_i ⊆ V \ {V_i} and U_i ⊆ U.

Example: price-quantity equations in economics, with observables Q (quantity) and P (price), exogenous I (income) and W (wage), background variables U_1, U_2, and PA_Q = {P, I}.

STRUCTURAL MODELS AND INTERVENTION

Let X be a set of variables in V. The action do(x) sets X to constants x regardless of the factors which previously determined X. do(x) replaces all functions f_i determining X with the constant functions X = x, to create a mutilated model M_x.

Example: setting price in the price-quantity model, do(P = p_0), removes the equation for P and yields the mutilated model M_p in which P = p_0.
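
A sketch of the do(x) operation on the toy model above: the mechanism for X is replaced by a constant, while every other function and P(u) stay untouched (the do_x argument is an illustrative device, not slide notation).

```python
# Mutilated model M_x: replace f_X with the constant function X = x.
import random

def sample_v(do_x=None):
    u1, u2, u3 = (random.gauss(0, 1) for _ in range(3))
    x = u1 if do_x is None else do_x   # do(x) overrides X's mechanism
    z = x + u2                         # all other mechanisms are unchanged
    y = 2 * z + u1 + u3
    return x, z, y

print(sample_v())           # a draw from the pre-intervention model M
print(sample_v(do_x=3.0))   # a draw from the mutilated model M_{x=3}
```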

CAUSAL MODELS AND COUNTERFACTUALS

Definition: The sentence "Y would be y (in situation u), had X been x," denoted Y_x(u) = y, means: the solution for Y in the mutilated model M_x (i.e., the equations for X replaced by X = x) with input U = u is equal to y.

The Fundamental Equation of Counterfactuals: Y_x(u) = Y_{M_x}(u).

In particular, joint probabilities of counterfactuals:
P(Y_x = y, Z_w = z) = Σ_{u : Y_x(u) = y, Z_w(u) = z} P(u)
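
The definition yields the familiar three-step recipe for evaluating a counterfactual: abduction (infer u from the evidence), action (mutilate the model to M_x), prediction (solve for Y). A minimal sketch on the toy equation Y := 2X from the earlier slide:

```python
# Counterfactual Y_x(u) = Y_{M_x}(u) on the toy model X := U, Y := 2X.
def solve(u, do_x=None):
    x = u if do_x is None else do_x   # action: X = x wipes out X := U
    y = 2 * x                         # prediction: solve the remaining equation
    return x, y

# Abduction: we observed X = 1, Y = 2, so U = 1 (trivial in this model).
u = 1
# "Had X been 3, Y would have been 6":
_, y_cf = solve(u, do_x=3)
assert y_cf == 6
```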

AXIOMS OF CAUSAL COUNTERFACTUALS

Y_x(u) = y: "Y would be y, had X been x (in state U = u)."

1. Definiteness
2. Uniqueness
3. Effectiveness
4. Composition
5. Reversibility
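
The Composition axiom, for instance, states that if Z_x(u) = z then Y_{xz}(u) = Y_x(u); it can be checked numerically on the toy SCM sketched above (the solve helper is an illustrative device):

```python
# Numeric check of Composition on the toy SCM x = u1, z = x + u2, y = 2z + u1 + u3.
def solve(u1, u2, u3, do_x=None, do_z=None):
    x = u1 if do_x is None else do_x
    z = (x + u2) if do_z is None else do_z
    y = 2 * z + u1 + u3
    return x, z, y

u = (0.3, -1.2, 0.7)                          # one background state U = u
_, z_x, y_x = solve(*u, do_x=3.0)             # Z_x(u) and Y_x(u)
_, _, y_xz = solve(*u, do_x=3.0, do_z=z_x)    # Y_{xz}(u) with z = Z_x(u)
assert y_xz == y_x                            # Composition holds for this u
```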

INFERRING THE EFFECT OF INTERVENTIONS

The problem: To predict the impact of a proposed intervention using data obtained prior to the intervention.

The solution (conditional): causal assumptions + data → policy claims

1. Mathematical tools for communicating causal assumptions formally and transparently.
2. Deciding (mathematically) whether the assumptions communicated are sufficient for obtaining consistent estimates of the prediction required.
3. Deriving (if (2) is affirmative) a closed-form expression for the predicted impact.
4. Suggesting (if (2) is negative) a set of measurements and experiments that, if performed, would render a consistent estimate feasible.

NON-PARAMETRIC STRUCTURAL MODELS

Given P(x, y, z), should we ban smoking? (X = smoking, Z = tar in lungs, Y = cancer; U_1 unobserved)

Linear analysis:            Nonparametric analysis:
x = u_1                     x = f_1(u_1)
z = αx + u_2                z = f_2(x, u_2)
y = βz + γu_1 + u_3         y = f_3(z, u_1, u_3)
Find: αβ                    Find: P(y | do(x))

EFFECT OF INTERVENTION: AN EXAMPLE

The intervention do(x) replaces the smoking equation with a constant:
x = const.,  z = f_2(x, u_2),  y = f_3(z, u_1, u_3),
and P(y | do(x)) = P(Y = y) in the new model.

Pre-intervention: U (unobserved) → X → Z → Y, with U → Y.
Post-intervention: the same graph with the arrow into X removed and X = x.

To compute P(y, z | do(x)), we must eliminate u (a graphical problem).

ELIMINATING CONFOUNDING BIAS: A GRAPHICAL CRITERION

P(y | do(x)) is estimable if there is a set Z of variables such that Z d-separates X from Y in G_X̲ (the subgraph of G with the arrows emanating from X removed).

Moreover, P(y | do(x)) = Σ_z P(y | x, z) P(z)   ("adjusting" for Z).

(Example: a graph G with covariates Z_1, ..., Z_6 between X and Y; the criterion is tested in the subgraph G_X̲.)
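
A minimal sketch of the adjustment formula as an estimator, assuming a dataset of (x, y, z) records in which Z satisfies the criterion above (the function and records are illustrative):

```python
# Estimate P(y | do(x)) = sum_z P(y | x, z) P(z) from a finite sample.
from collections import Counter

def p_y_do_x(data, x, y):
    """data: iterable of (x, y, z) tuples; Z assumed to satisfy the criterion."""
    data = list(data)
    n = len(data)
    z_counts = Counter(z for _, _, z in data)
    total = 0.0
    for z, nz in z_counts.items():
        stratum = [yi for xi, yi, zi in data if xi == x and zi == z]
        if stratum:     # P(y | x, z) estimated within the (x, z) stratum
            total += (sum(1 for yi in stratum if yi == y) / len(stratum)) * (nz / n)
    return total

# Toy usage with binary variables (made-up records):
records = [(1, 1, 0), (1, 0, 1), (0, 0, 0), (0, 1, 1), (1, 1, 1), (0, 0, 1)]
print(p_y_do_x(records, x=1, y=1))
```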

RULES OF CAUSAL CALCULUS

Rule 1 (Ignoring observations):
P(y | do{x}, z, w) = P(y | do{x}, w)   if (Y ⫫ Z | X, W) in G_X̄

Rule 2 (Action/observation exchange):
P(y | do{x}, do{z}, w) = P(y | do{x}, z, w)   if (Y ⫫ Z | X, W) in G_X̄Z̲

Rule 3 (Ignoring actions):
P(y | do{x}, do{z}, w) = P(y | do{x}, w)   if (Y ⫫ Z | X, W) in G_X̄Z̄(W)

(G_X̄: delete arrows entering X; G_Z̲: delete arrows leaving Z; Z(W) is the set of Z-nodes that are not ancestors of any W-node in G_X̄.)

DERIVATION IN CAUSAL CALCULUS

(Smoking → Tar → Cancer, with unobserved Genotype confounding Smoking and Cancer)

P(c | do{s})
  = Σ_t P(c | do{s}, t) P(t | do{s})                    Probability axioms
  = Σ_t P(c | do{s}, do{t}) P(t | do{s})                Rule 2
  = Σ_t P(c | do{s}, do{t}) P(t | s)                    Rule 2
  = Σ_t P(c | do{t}) P(t | s)                           Rule 3
  = Σ_s′ Σ_t P(c | do{t}, s′) P(s′ | do{t}) P(t | s)    Probability axioms
  = Σ_s′ Σ_t P(c | t, s′) P(s′ | do{t}) P(t | s)        Rule 2
  = Σ_s′ Σ_t P(c | t, s′) P(s′) P(t | s)                Rule 3
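
The end result is the front-door formula, and it can be sanity-checked numerically: simulate a model with an unobserved confounder, estimate the formula from observational draws only, and compare with draws from the mutilated model. All parameter values below are made up for illustration.

```python
# Numerical check of P(c | do(s)) = sum_t P(t|s) sum_s' P(c|t,s') P(s').
import random

def draw(do_s=None):
    u = random.random() < 0.5                                  # genotype (unobserved)
    s = (random.random() < (0.8 if u else 0.2)) if do_s is None else do_s
    t = random.random() < (0.9 if s else 0.1)                  # tar depends on smoking only
    c = random.random() < (0.7 if t else 0.2) + (0.1 if u else 0.0)  # cancer: tar + genotype
    return s, t, c

random.seed(0)
obs = [draw() for _ in range(200_000)]          # observational data only

def front_door(s):
    s_rows = [d for d in obs if d[0] == s]
    est = 0.0
    for t in (False, True):
        p_t_s = sum(1 for d in s_rows if d[1] == t) / len(s_rows)
        inner = 0.0
        for s2 in (False, True):
            cell = [d for d in obs if d[1] == t and d[0] == s2]
            p_c_ts = sum(1 for d in cell if d[2]) / len(cell)
            p_s2 = sum(1 for d in obs if d[0] == s2) / len(obs)
            inner += p_c_ts * p_s2
        est += p_t_s * inner
    return est

truth = sum(d[2] for d in (draw(do_s=True) for _ in range(200_000))) / 200_000
print(round(front_door(True), 3), round(truth, 3))   # the two should roughly agree
```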

INFERENCE ACROSS DESIGNS

Problem: Predict P(y | do(x)) from a study in which only Z can be controlled.
Solution: Determine whether P(y | do(x)) can be reduced to a mathematical expression involving only do(z).

COMPLETENESS RESULTS ON IDENTIFICATION

1. do-calculus is complete.
2. Complete graphical criterion for identifying causal effects (Shpitser and Pearl, 2006).
3. Complete graphical criterion for empirical testability of counterfactuals (Shpitser and Pearl, 2007).

THE CAUSAL RENAISSANCE: VOCABULARY IN ECONOMICS

(Chart from Hoover (2004), "Lost Causes".)

THE CAUSAL RENAISSANCE: USEFUL RESULTS

1. Complete formal semantics of counterfactuals
2. Transparent language for expressing assumptions
3. Complete solution to causal-effect identification
4. Legal responsibility (bounds)
5. Imperfect experiments (universal bounds for IV)
6. Integration of data from diverse sources
7. Direct and indirect effects
8. Complete criterion for counterfactual testability

DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem)

"Your Honor! My client (Mr. A) died BECAUSE he used that drug."

The court is to decide whether it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug:
PN = P(alive had he not taken the drug | A is dead, took the drug) > 0.50

THE PROBLEM

Semantical problem:
1. What is the meaning of PN(x, y): "the probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur"?
Answer: computable from M.

Analytical problem:
2. Under what conditions can PN(x, y) be learned from statistical data, i.e., observational, experimental, and combined?

TYPICAL THEOREMS (Tian and Pearl, 2000)

Bounds given combined nonexperimental and experimental data:
max{0, [P(y) − P(y_x′)] / P(x, y)} ≤ PN ≤ min{1, [P(y′_x′) − P(x′, y′)] / P(x, y)}

Identifiability under monotonicity (combined data):
PN = [P(y | x) − P(y | x′)] / P(y | x) + [P(y | x′) − P(y_x′)] / P(x, y)
(a corrected excess risk ratio)

CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?

                  Experimental           Nonexperimental
                  do(x)    do(x′)        x        x′
Deaths (y)           16        14         2        28
Survivals (y′)      984       986       998       972
                  1,000     1,000     1,000     1,000

Nonexperimental data: drug usage predicts longer life.
Experimental data: the drug has a negligible effect on survival.

Plaintiff: Mr. A is special.
1. He actually died.
2. He used the drug by choice.

The court is to decide (given both data sets): Is it more probable than not that A would be alive but for the drug?

SOLUTION TO THE ATTRIBUTION PROBLEM

WITH PROBABILITY ONE: 1 ≤ P(y′_x′ | x, y) ≤ 1, i.e., PN = 1.

Combined data tell more than each study alone.
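
A sketch of the computation behind this result, applying the Tian-Pearl bounds to the (reconstructed) table above; notation: x = drug, x′ = no drug, y = death.

```python
# Tian-Pearl bounds on PN = P(y'_{x'} | x, y) from combined data.
P_y_do_x_   = 14 / 1000          # experimental: P(y | do(x'))
P_y__do_x_  = 986 / 1000         # experimental: P(y' | do(x'))
P_y         = (2 + 28) / 2000    # nonexperimental: P(y)
P_x_y       = 2 / 2000           # nonexperimental: P(x, y)
P_x__y_     = 972 / 2000         # nonexperimental: P(x', y')

lower = max(0.0, (P_y - P_y_do_x_) / P_x_y)       # ≈ 1.0
upper = min(1.0, (P_y__do_x_ - P_x__y_) / P_x_y)  # = 1.0
print(lower, upper)   # PN is pinned to 1: more probable than not, with certainty
```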

EFFECT DECOMPOSITION (direct vs. indirect effects)

1. Why decompose effects?
2. What is the semantics of direct and indirect effects?
3. What are the policy implications of direct and indirect effects?
4. When can direct and indirect effects be estimated consistently from experimental and nonexperimental data?

WHY DECOMPOSE EFFECTS?

1. To understand how Nature works.
2. To comply with legal requirements.
3. To predict the effects of new types of interventions: signal routing, rather than variable fixing.

LEGAL IMPLICATIONS OF DIRECT EFFECT

Can data prove an employer guilty of hiring discrimination?

X (Gender) → Z (Qualifications) → Y (Hiring), with a direct path X → Y.

What is the direct effect of X on Y (averaged over z)?

Adjust for Z? No! No!

NATURAL SEMANTICS OF AVERAGE DIRECT EFFECTS

Robins and Greenland (1992): the "pure" direct effect.

Model: z = f(x, u), y = g(x, z, u), with X → Z → Y and X → Y.

Average direct effect of X on Y, DE_{x0,x1}(Y): the expected change in Y when we change X from x_0 to x_1 and, for each u, keep Z constant at whatever value it attained before the change:
DE_{x0,x1}(Y) = E[Y_{x1, Z_{x0}}] − E[Y_{x0}]

In linear models, DE = controlled direct effect.

SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS

Consider the quantity Q = E[Y_{x, Z_{x*}}].

Given <M, P(u)>, Q is well defined: for each u, Z_{x*}(u) is the solution for Z in M_{x*} (call it z), and Y_{x,z}(u) is the solution for Y in M_{xz}.

Can Q be estimated from data?
Experimental: reduce Q to a nest-free expression.
Nonexperimental: reduce Q to a subscript-free expression.

NATURAL SEMANTICS OF INDIRECT EFFECTS

Model: z = f(x, u), y = g(x, z, u).

Indirect effect of X on Y, IE_{x0,x1}(Y): the expected change in Y when we keep X constant, say at x_0, and let Z change to whatever value it would have attained had X changed to x_1:
IE_{x0,x1}(Y) = E[Y_{x0, Z_{x1}}] − E[Y_{x0}]

In linear models, IE = TE − DE.
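
On a linear toy model the natural effects, and the relation IE = TE − DE, can be computed directly; the coefficients below are illustrative.

```python
# Natural direct/indirect effects on z = a*x + u_z, y = b*x + c*z + u_y
# (zero-mean noises average out of the expectations).
a, b, c = 1.5, 2.0, 0.5
x0, x1 = 0.0, 1.0

def ez(x): return a * x             # E[Z_x]
def ey(x, z): return b * x + c * z  # E[Y | do(x), do(z)]

TE = ey(x1, ez(x1)) - ey(x0, ez(x0))   # total effect: b + c*a
DE = ey(x1, ez(x0)) - ey(x0, ez(x0))   # direct: change X, hold Z at Z_{x0}
IE = ey(x0, ez(x1)) - ey(x0, ez(x0))   # indirect: hold X, let Z track Z_{x1}
assert abs(TE - (DE + IE)) < 1e-12     # linear case: IE = TE - DE
print(TE, DE, IE)                      # 2.75 2.0 0.75
```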

POLICY IMPLICATIONS OF INDIRECT EFFECTS

What is the indirect effect of X on Y? The effect of Gender on Hiring if sex discrimination is eliminated.

X (Gender) → Z (Qualification) → Y (Hiring), with the direct link X → Y blocked (IGNORE).

Blocking a link: a new type of intervention.

RELATIONS BETWEEN TOTAL, DIRECT, AND INDIRECT EFFECTS

Theorem 5: The total, direct, and indirect effects obey the following equality:
TE_{x*,x}(Y) = DE_{x*,x}(Y) − IE_{x,x*}(Y)

In words, the total effect (on Y) associated with the transition from x* to x is equal to the difference between the direct effect associated with this transition and the indirect effect associated with the reverse transition, from x to x*.
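
The theorem can be verified on a deliberately nonlinear, deterministic toy model (the functions below are made up for illustration; note that it is the reverse-transition indirect effect that appears):

```python
# Check TE_{x*,x}(Y) = DE_{x*,x}(Y) - IE_{x,x*}(Y) on a nonlinear model.
def z_of(x): return x * x          # mediator mechanism (illustrative)
def y_of(x, z): return x + x * z   # outcome with an X-Z interaction

x_star, x = 0.0, 2.0
TE = y_of(x, z_of(x)) - y_of(x_star, z_of(x_star))        # = 10
DE = y_of(x, z_of(x_star)) - y_of(x_star, z_of(x_star))   # = 2
IE_rev = y_of(x, z_of(x_star)) - y_of(x, z_of(x))         # IE_{x,x*} = -8
assert TE == DE - IE_rev
```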

EXPERIMENTAL IDENTIFICATION OF AVERAGE DIRECT EFFECTS

Theorem: If there exists a set W such that Y_{xz} ⫫ Z_{x*} | W for all z, then the average direct effect is identifiable from experimental data and is given by
DE_{x*,x}(Y) = Σ_w Σ_z [E(Y_{xz} | w) − E(Y_{x*z} | w)] P(Z_{x*} = z | w) P(w).

GRAPHICAL CONDITION FOR EXPERIMENTAL IDENTIFICATION OF DIRECT EFFECTS

Theorem: If there exists a set W of nondescendants of {X, Z} satisfying the d-separation condition corresponding to Y_{xz} ⫫ Z_{x*} | W, then the direct-effect expression above applies. (Example graph shown on the slide.)

GENERAL PATH-SPECIFIC EFFECTS (Def.)

Definition (g-specific effect): given an active subgraph g over {X, W, Z, Y}, form a new model M*_g specific to g, in which the deactivated links are held at the values they would attain under the reference setting x*, e.g., z* = Z_{x*}(u); the g-specific effect of X on Y is the effect of X on Y computed in M*_g.

Path-specific effects are nonidentifiable in general, even in Markovian models.

SUMMARY OF RESULTS

1. Formal semantics of path-specific effects, based on signal blocking instead of value fixing.
2. Path-analytic techniques extended to nonlinear and nonparametric models.
3. Meaningful (graphical) conditions for estimating direct and indirect effects from experimental and nonexperimental data.

CONCLUSIONS

Structural-model semantics, enriched with logic and graphs, provides:
1. A complete formal basis for causal and counterfactual reasoning.
2. A unification of the graphical, potential-outcome, and structural equation approaches.
3. Friendly and formal solutions to century-old problems and confusions.

LEGAL DEFINITION OF DIRECT EFFECT (FORMALIZING DISCRIMINATION)

"The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of a different race (age, sex, religion, national origin, etc.) and everything else had been the same."
[Carson v. Bethlehem Steel Corp., 70 FEP Cases 921, 7th Cir. (1996)]

x = male, x′ = female
y = hire, y′ = not hire
z = applicant's qualifications

NO DIRECT EFFECT: Y_{x′, Z_x}(u) = Y_x(u), i.e., switching gender while holding qualifications at their actual value leaves the hiring decision unchanged.