CAUSAL INFERENCE AS A MACHINE LEARNING EXERCISE
Judea Pearl
Computer Science and Statistics, UCLA
OUTLINE
- Learning: statistical vs. causal concepts
- Causal models and identifiability
- Learnability of three types of causal queries:
  1. Effects of potential interventions
  2. Queries about attribution (responsibility)
  3. Queries about direct and indirect effects
TRADITIONAL MACHINE LEARNING PARADIGM
Data → (learning) → P, the joint distribution → Q(P), aspects of P
e.g., learn whether customers who bought product A would also buy product B: Q = P(B | A).
THE CAUSAL ANALYSIS PARADIGM
Data → (learning) → M, the data-generating model → Q(M), aspects of M
Some Q(M) cannot be inferred from P alone.
e.g., learn whether customers who bought product A would still buy A if we doubled the price.
Data mining vs. knowledge mining.
THE SECRETS OF CAUSAL MODELS
Causal model = data-generating model satisfying:
- Modularity (symbol-mechanism correspondence)
- Uniqueness (variable-mechanism correspondence)

Example: PRICE (PR), Quantity Sold (QS), Cost Projection (CP), Previous Sales (PS), Marketing (ME), and unobserved "others" (e1, e2).
Causal model: PR = f(CP, PS, e1); QS = g(ME, PR, e2); with P(e1, e2).
Joint distribution: P(PR, PS, CP, ME, QS).

Q1: P(QS | PR = 2): computable from P (and from M)
Q2: P(QS | do(PR = 2)): computable from M, not from P
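To make the contrast between Q1 and Q2 concrete, here is a minimal simulation sketch in Python. Everything quantitative is an assumption: the slide leaves f and g unspecified, so linear forms are used, and a direct effect of previous sales (PS) on QS is added to create a back-door path PR ← PS → QS, without which the two queries would coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Background and observed causes (all functional forms are hypothetical).
e1 = rng.normal(size=n)   # "others" affecting price
e2 = rng.normal(size=n)   # "others" affecting quantity sold
cp = rng.normal(size=n)   # cost projection
ps = rng.normal(size=n)   # previous sales (assumed to also affect QS directly)
me = rng.normal(size=n)   # marketing

pr = 1.0 + 0.5 * cp + 1.0 * ps + e1               # PR = f(CP, PS, e1)
qs = 10.0 - 1.0 * pr + 1.0 * me + 1.0 * ps + e2   # QS = g(ME, PR, PS, e2)

# Q1: E[QS | PR = 2] -- condition on *observing* a price near 2.
near_2 = np.abs(pr - 2.0) < 0.05
print("E[QS | PR = 2]     ~", qs[near_2].mean())

# Q2: E[QS | do(PR = 2)] -- mutilate the model: replace f by PR := 2,
# keeping g and the background distributions intact.
qs_do = 10.0 - 1.0 * 2.0 + 1.0 * me + 1.0 * ps + e2
print("E[QS | do(PR = 2)] ~", qs_do.mean())
```

Observing PR = 2 is evidence about PS (and hence about QS), so the conditional answer drifts above the interventional one, which stays at 8.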
FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES
Probability and statistics deal with static relations:
  Data → (probability / statistics) → joint distribution → predictions from passive observations
Causal analysis deals with changes (dynamics):
  Data + causal assumptions (+ experiments) → causal model → effects of interventions, causes of effects, explanations
- i.e., what remains invariant when P changes; P alone does not tell us how it ought to change.
- e.g., curing symptoms vs. curing diseases
- e.g., analogy: mechanical deformation
FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT.)
Causal and statistical concepts do not mix:
  CAUSAL: spurious correlation, randomization, confounding / effect, instrument, holding constant, explanatory variables
  STATISTICAL: regression, association / independence, "controlling for" / conditioning, odds and risk ratios, collapsibility, propensity score
- No causes in, no causes out (Cartwright, 1989):
    statistical assumptions + data + causal assumptions ⇒ causal conclusions
- Causal assumptions cannot be expressed in the mathematical language of standard statistics.
- Non-standard mathematics is needed:
  - Structural equation models (SEM)
  - Counterfactuals (Neyman-Rubin)
  - Causal diagrams (Wright, 1920)
WHAT'S IN A CAUSAL MODEL?
An oracle that assigns truth values to causal sentences:
- Action sentences: B if we do A.
- Counterfactuals: B would be different if A were true.
- Explanation: B occurred because of A.
Optional: with what probability?
FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION
[Figure: a logic-circuit diagram with inputs X, Y, Z feeding logical gates toward an output]
Here is a causal model we all remember from high school: a circuit diagram. There are four interesting points to notice in this example:
(1) It qualifies as a causal model because it contains the information to confirm or refute all action, counterfactual, and explanatory sentences concerning the operation of the circuit. For example, anyone can figure out what the output would be if we set Y to zero, or if we changed this OR gate to a NOR gate, or if we performed any of the billions of combinations of such actions.
(2) The logical function (the Boolean input-output relation) is insufficient for answering such queries.
(3) These actions were not specified in advance; they do not have special names, and they do not show up in the diagram. In fact, the great majority of the action queries that this circuit can answer have never been considered by the designer of the circuit.
(4) So how does the circuit encode this extra information? Through two encoding tricks:
  4.1 The symbolic units correspond to stable physical mechanisms (i.e., the logical gates).
  4.2 Each variable has precisely one mechanism that determines its value.
CAUSAL MODELS AND CAUSAL DIAGRAMS
Definition: A causal model is a 3-tuple M = ⟨V, U, F⟩ with a mutilation operator do(x): M → Mx, where:
(i) V = {V1, ..., Vn} are endogenous variables,
(ii) U = {U1, ..., Um} are background variables,
(iii) F is a set of n functions fi : V \ Vi × U → Vi, with
  vi = fi(pai, ui),  PAi ⊆ V \ Vi,  Ui ⊆ U.
[Figure: an example diagram over endogenous variables I, W, Q, P and background variables U1, U2, with PAQ marking the parents of Q]

CAUSAL MODELS AND MUTILATION
(iv) Mx = ⟨U, V, Fx⟩, X ⊆ V, x a realization of X, where
  Fx = {fi : Vi ∉ X} ∪ {X = x}.
(Replace all functions fi corresponding to variables in X with the constant functions X = x; e.g., do(P = p0) deletes the equation for P and holds P at p0.)

PROBABILISTIC CAUSAL MODELS
Definition (probabilistic causal model): a pair ⟨M, P(u)⟩, where P(u) is a probability assignment to the variables in U.
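A minimal executable rendering of this definition, as a sketch in Python: F is a mapping from each Vi to its function fi, solved in recursive order, and do() implements the mutilation operator by swapping in constant functions. The three-variable example model and all functional forms are hypothetical.

```python
class SCM:
    """A minimal sketch of a causal model M = <V, U, F> with do(x): M -> Mx."""

    def __init__(self, functions):
        # functions: mapping Vi -> fi, where fi reads its parents' and
        # background variables' values from a dict; insertion order must be
        # recursive (every parent listed before its children).
        self.functions = dict(functions)

    def do(self, **x):
        """Mutilation: return Mx, with fi replaced by a constant for Vi in X."""
        fx = dict(self.functions)
        for var, val in x.items():
            fx[var] = lambda vals, _v=val: _v
        return SCM(fx)

    def solve(self, u):
        """Solve all equations for one unit u (a dict of background values)."""
        vals = dict(u)
        for var, f in self.functions.items():
            vals[var] = f(vals)
        return vals


# Hypothetical model: X = U1, Z = 2X + U2, Y = Z + U1 + U3.
m = SCM({
    "X": lambda v: v["U1"],
    "Z": lambda v: 2.0 * v["X"] + v["U2"],
    "Y": lambda v: v["Z"] + v["U1"] + v["U3"],
})
u = {"U1": 1.0, "U2": -0.5, "U3": 0.2}
print(m.solve(u)["Y"])            # Y(u) in M
print(m.do(X=0.0).solve(u)["Y"])  # Y in the mutilated model M_{X=0}
```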
CAUSAL MODELS AND COUNTERFACTUALS
Definition (potential response): The sentence "Y would be y (in unit u) had X been x," denoted Yx(u) = y, is the solution for Y in the mutilated model Mx, i.e., with the equations for X replaced by X = x. ("unit-based potential outcome")

Joint probabilities of counterfactuals:
  P(Yx = y, Zw = z) = Σ{u : Yx(u) = y and Zw(u) = z} P(u)

In particular:
  P(y | do(x)) = P(Yx = y) = Σ{u : Yx(u) = y} P(u)
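Since a probabilistic causal model is just ⟨M, P(u)⟩, joint probabilities of counterfactuals can be estimated by drawing units u and solving each mutilated model. A small sketch in Python, for a hypothetical Boolean model X = U1, Y = X OR U2:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical Boolean model: X = U1, Y = X OR U2,
# with P(U1 = 1) = 0.6 and P(U2 = 1) = 0.2.
u1 = rng.random(n) < 0.6
u2 = rng.random(n) < 0.2

# Potential responses: solve Y in the mutilated models M_{x=0} and M_{x=1}
# for every unit u.
y_x0 = u2                        # Y_{X=0}(u) = 0 OR U2
y_x1 = np.ones(n, dtype=bool)    # Y_{X=1}(u) = 1 OR U2 = 1 always

# A joint probability of counterfactuals:
# P(Y_x1 = 1, Y_x0 = 0) = sum of P(u) over {u : Y_x1(u) = 1, Y_x0(u) = 0}
print("P(Y_x1=1, Y_x0=0) ~", np.mean(y_x1 & ~y_x0))

# In particular, the causal effect is a marginal of this distribution:
print("P(Y=1 | do(X=1))  ~", y_x1.mean())
print("P(Y=1 | do(X=0))  ~", y_x0.mean())
```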
CAUSAL INFERENCE MADE EASY (1985-2000)
- Inference with nonparametric structural equations made possible through graphical analysis.
- Mathematical underpinning of counterfactuals through nonparametric structural equations.
- Graphical-counterfactual symbiosis.
NON-PARAMETRIC STRUCTURAL MODELS
Given P(x,y,z), should we ban smoking?
[Diagram: X (Smoking) → Z (Tar in Lungs) → Y (Cancer), with background variables U1 (into X and Y), U2 (into Z), U3 (into Y)]
Linear analysis:  x = u1;  z = x + u2;  y = z + u1 + u3.  Find: the effect of X on Y.
Nonparametric analysis:  x = f1(u1);  z = f2(x, u2);  y = f3(z, u1, u3).  Find: P(y | do(x)).
LEARNING THE EFFECTS OF ACTIONS
Given P(x,y,z), should we ban smoking?
[Diagram: the same model after the action do(X = x): the arrow into X is cut and X is held at the constant x]
Linear analysis:  x = const.;  z = x + u2;  y = z + u1 + u3.
Nonparametric analysis:  x = const.;  z = f2(x, u2);  y = f3(z, u1, u3).
Find: P(y | do(x)) = P(Y = y) in the new (mutilated) model.
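A sketch of this nonparametric recipe in Python (the particular f1, f2, f3 and all parameters are hypothetical): replace f1 by x = const., re-solve the model on the same background variables, and read off P(Y = y) in the new model. The shared background u1 makes the observational conditional differ from the interventional answer.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Hypothetical instantiation of x = f1(u1), z = f2(x, u2), y = f3(z, u1, u3):
u1 = (rng.random(n) < 0.5).astype(int)
u2 = rng.random(n)
u3 = rng.random(n)

x = u1                                           # f1
z = (u2 < 0.2 + 0.6 * x).astype(int)             # f2: P(z=1 | x) = 0.2 + 0.6x
y = (u3 < 0.1 + 0.4 * z + 0.4 * u1).astype(int)  # f3

# Mutilated model for do(X = 1): f1 is replaced by the constant 1;
# f2, f3 and the background variables are reused unchanged.
z_do = (u2 < 0.2 + 0.6 * 1).astype(int)
y_do = (u3 < 0.1 + 0.4 * z_do + 0.4 * u1).astype(int)

print("P(Y=1 | X=1)      ~", y[x == 1].mean())   # confounded by u1
print("P(Y=1 | do(X=1))  ~", y_do.mean())        # P(Y=y) in the new model
```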
IDENTIFIABILITY
Definition: Let Q(M) be any quantity defined on a causal model M, and let A be a set of assumptions. Q is identifiable relative to A iff
  P(M1) = P(M2) ⇒ Q(M1) = Q(M2)
for all M1, M2 that satisfy A.
In other words, Q can be determined uniquely from the probability distribution P(v) of the endogenous variables V, together with the assumptions A.
In this talk:
  A: assumptions encoded in the diagram
  Q1: P(y | do(x)), the causal effect (= P(Yx = y))
  Q2: P(Yx' = y' | x, y), the probability of necessity
  Q3: the direct effect
THE FUNDAMENTAL THEOREM OF CAUSAL INFERENCE
Causal Markov theorem: Any distribution generated by a Markovian structural model M (recursive, with independent disturbances) can be factorized as
  P(v1, ..., vn) = Πi P(vi | pai),
where pai are the (values of the) parents of Vi in the causal diagram associated with M.

Corollary (truncated factorization; Manipulation Theorem): The distribution generated by an intervention do(X = x) in a Markovian model M is given by the truncated factorization
  P(v1, ..., vn | do(x)) = Π{i : Vi ∉ X} P(vi | pai),  for v1, ..., vn consistent with X = x.
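The corollary is easy to exercise on a small discrete Markovian model. A sketch in Python (the diagram X → Z → Y with a direct arrow X → Y, and all table entries, are hypothetical): the post-intervention distribution is the same product of conditionals with the factor for X removed.

```python
# Hypothetical binary Markovian model with diagram X -> Z -> Y and X -> Y.
P_X = {0: 0.4, 1: 0.6}                                          # P(x)
P_Z1_X = {0: 0.7, 1: 0.2}                                       # P(Z=1 | x)
P_Y1_XZ = {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.3, (1, 1): 0.9}  # P(Y=1 | x, z)

def p_z(z, x):
    return P_Z1_X[x] if z == 1 else 1 - P_Z1_X[x]

def p_y(y, x, z):
    return P_Y1_XZ[(x, z)] if y == 1 else 1 - P_Y1_XZ[(x, z)]

def p_joint(x, z, y):
    """Causal Markov factorization: P(x, z, y) = P(x) P(z | x) P(y | x, z)."""
    return P_X[x] * p_z(z, x) * p_y(y, x, z)

def p_post(x0, z, y):
    """Truncated factorization: drop P(x); keep surviving factors at X = x0."""
    return p_z(z, x0) * p_y(y, x0, z)

# Sanity check: the factorized pre-intervention distribution sums to 1.
print(sum(p_joint(x, z, y) for x in (0, 1) for z in (0, 1) for y in (0, 1)))

# P(Y = 1 | do(X = 1)), by marginalizing the truncated product over z:
print(sum(p_post(1, z, 1) for z in (0, 1)))   # 0.8*0.3 + 0.2*0.9 = 0.42
```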
RAMIFICATIONS OF THE FUNDAMENTAL THEOREM
Given P(x,y,z), should we ban smoking?
[Diagram: X (Smoking) → Z (Tar in Lungs) → Y (Cancer), with an unobserved U into both X and Y; the action do(X = x) cuts the arrow from U into X]
Pre-intervention:   P(x, z, y) = Σu P(u) P(x | u) P(z | x) P(y | z, u)
Post-intervention:  P(z, y | do(x)) = Σu P(u) P(z | x) P(y | z, u)
To compute P(y, z | do(x)), we must eliminate u (a graphical problem).
THE BACK-DOOR CRITERION
Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in Gx, the subgraph of G obtained by deleting all arrows emanating from X.
[Figure: a graph G over Z1, ..., Z6, X, Y and its subgraph Gx, with a set Z blocking every back-door path from X to Y]
Moreover,
  P(y | do(x)) = Σz P(y | x, z) P(z)   ("adjusting" for Z)
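A small sketch of adjustment in action, in Python (the single back-door variable Z and all parameters are hypothetical): with Z → X, Z → Y, and X → Y, naive conditioning is biased, while Σz P(y | x, z) P(z) recovers the interventional quantity.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# Hypothetical model with one back-door path X <- Z -> Y:
z = (rng.random(n) < 0.5).astype(int)
x = (rng.random(n) < np.where(z == 1, 0.8, 0.2)).astype(int)
y = (rng.random(n) < 0.2 + 0.3 * x + 0.4 * z).astype(int)

# Naive estimate P(Y=1 | X=1): biased, because X = 1 is evidence about Z.
print("P(Y=1 | X=1)      ~", y[x == 1].mean())

# Back-door adjustment: P(y | do(x)) = sum_z P(y | x, z) P(z).
adj = sum(y[(x == 1) & (z == v)].mean() * np.mean(z == v) for v in (0, 1))
print("P(Y=1 | do(X=1))  ~", adj)   # ground truth: 0.2 + 0.3 + 0.4*0.5 = 0.70
```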
RULES OF CAUSAL CALCULUS
Rule 1 (ignoring observations):
  P(y | do{x}, z, w) = P(y | do{x}, w)  if (Y ⊥⊥ Z | X, W) in GX̄
Rule 2 (action/observation exchange):
  P(y | do{x}, do{z}, w) = P(y | do{x}, z, w)  if (Y ⊥⊥ Z | X, W) in GX̄Z̲
Rule 3 (ignoring actions):
  P(y | do{x}, do{z}, w) = P(y | do{x}, w)  if (Y ⊥⊥ Z | X, W) in GX̄Z̄(W)
Here GX̄ deletes the arrows entering X, GZ̲ deletes the arrows leaving Z, and Z(W) is the set of Z-nodes that are not ancestors of any W-node in GX̄.
DERIVATION IN CAUSAL CALCULUS
[Diagram: Genotype (unobserved) into Smoking and Cancer; Smoking → Tar → Cancer]
P(c | do{s})
  = Σt P(c | do{s}, t) P(t | do{s})                    (probability axioms)
  = Σt P(c | do{s}, do{t}) P(t | do{s})                (Rule 2)
  = Σt P(c | do{s}, do{t}) P(t | s)                    (Rule 2)
  = Σt P(c | do{t}) P(t | s)                           (Rule 3)
  = Σs' Σt P(c | do{t}, s') P(s' | do{t}) P(t | s)     (probability axioms)
  = Σs' Σt P(c | t, s') P(s' | do{t}) P(t | s)         (Rule 2)
  = Σs' Σt P(c | t, s') P(s') P(t | s)                 (Rule 3)
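The last line is the front-door formula. This sketch in Python, with a hypothetical parameterization of the genotype model, checks it against the analytic interventional truth; note that the estimand uses only the observed joint P(s, t, c), never u.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Hypothetical parameterization: U (genotype, unobserved) -> S and -> C;
# S (smoking) -> T (tar) -> C (cancer).
u = (rng.random(n) < 0.5).astype(int)
s = (rng.random(n) < np.where(u == 1, 0.8, 0.2)).astype(int)
t = (rng.random(n) < np.where(s == 1, 0.9, 0.1)).astype(int)
c = (rng.random(n) < 0.1 + 0.5 * t + 0.3 * u).astype(int)

def p_c_do_s(s0):
    """Front-door estimand: sum_t P(t|s0) * sum_s' P(c|t,s') P(s')."""
    total = 0.0
    for t0 in (0, 1):
        p_t_given_s = np.mean(t[s == s0] == t0)
        inner = sum(np.mean(c[(t == t0) & (s == s1)]) * np.mean(s == s1)
                    for s1 in (0, 1))
        total += p_t_given_s * inner
    return total

print("P(C=1 | do(S=1)) ~", p_c_do_s(1))   # truth: 0.1 + 0.5*0.9 + 0.3*0.5 = 0.70
print("P(C=1 | do(S=0)) ~", p_c_do_s(0))   # truth: 0.1 + 0.5*0.1 + 0.3*0.5 = 0.30
```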
A RECENT IDENTIFICATION RESULT
Theorem [Tian and Pearl, 2001]: The causal effect P(y | do(x)) is identifiable whenever the ancestral graph of Y contains no confounding path (a path composed entirely of bidirected, i.e., confounding, arcs) between X and any of its children.
[Figure: three example graphs (a), (b), (c) over X, Z1, Z2, Y illustrating when the condition holds]
OUTLINE (CONT.)
Learnability of three types of causal queries:
- Queries about attribution (responsibility)
- Distinguishing direct from indirect effects
DETERMINING THE CAUSES OF EFFECTS (THE ATTRIBUTION PROBLEM)
"Your Honor! My client (Mr. A) died BECAUSE he used that drug."
Court to decide if it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug:
  P(A would be alive had he not taken the drug | A is dead and took the drug) > 0.50
THE PROBLEM
Theoretical problem: What is the meaning of PN(x, y), the "probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur"?
Answer:
  PN(x, y) = P(Yx' = y' | X = x, Y = y)
Empirical problem: Under what conditions can PN(x, y) be learned from statistical data, i.e., observational, experimental, and combined?
WHAT IS LEARNABLE FROM EXPERIMENTS?
- Simple experiment: Q = P(Yx = y | z), Z nondescendants of X.
- Compound experiment: Q = P(YX(z) = y | z).
- Multi-stage experiment: etc.
CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?
[Table: deaths (y) and survivals (y') in four groups of 1,000 subjects each: experimental arms do(x) and do(x'), and nonexperimental groups x and x']
- Nonexperimental data: drug usage predicts longer life.
- Experimental data: drug has negligible effect on survival.
Plaintiff: Mr. A is special.
- He actually died.
- He used the drug by choice.
Court to decide (given both data): Is it more probable than not that A would be alive but for the drug?
TYPICAL THEOREMS (Tian and Pearl, 2000)
Bounds given combined nonexperimental and experimental data:
  max{0, [P(y) - P(yx')] / P(x, y)}  ≤  PN  ≤  min{1, [P(y'x') - P(x', y')] / P(x, y)}
Identifiability under monotonicity (combined data):
  PN = [P(y) - P(yx')] / P(x, y)   (a corrected excess-risk-ratio)
Here yx' abbreviates the counterfactual event Yx' = y.
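A sketch of the bounds applied to combined data, in Python. The counts are hypothetical stand-ins for the table above, chosen so that the drug looks negligible experimentally yet strongly protective observationally, as in the deck's courtroom story.

```python
# Probability of necessity PN = P(Y_{x'} = y' | x, y): bounds from combined
# experimental and nonexperimental data (Tian and Pearl, 2000). All counts
# below are hypothetical (1,000 subjects per group, as in the table).

# Experimental arms: death rates under do(x) (drug) and do(x') (no drug).
p_y_do_x  = 16 / 1000        # P(y | do(x))   -- negligible difference...
p_y_do_xp = 14 / 1000        # P(y | do(x'))  -- ...between the two arms

# Nonexperimental data: joint distribution of drug use X and death Y.
p_xy   = 2 / 2000            # P(x, y):   drug takers who died
p_xpyp = 972 / 2000          # P(x', y'): non-takers who survived
p_y    = (2 + 28) / 2000     # P(y): overall death rate

lower = max(0.0, (p_y - p_y_do_xp) / p_xy)
upper = min(1.0, ((1 - p_y_do_xp) - p_xpyp) / p_xy)
print(f"{lower:.3f} <= PN <= {upper:.3f}")
```

With these (hypothetical) counts the two bounds collapse to PN = 1, which is the "with probability one" conclusion below.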
SOLUTION TO THE ATTRIBUTION PROBLEM (CONT.)
WITH PROBABILITY ONE:  P(yx' | x, y) = 1.
- From population data to the individual case.
- Combined data tell more than each study alone.
OUTLINE (CONT.)
Learnability of three types of causal queries:
- Queries about direct and indirect effects
QUESTIONS ADDRESSED
- What is the semantics of direct and indirect effects?
- Can we estimate them from data? From experimental data?
THE OPERATIONAL MEANING OF DIRECT EFFECTS
[Diagram: X → Z → Y and X → Y, with z = f(x, e1) and y = g(x, z, e2)]
"Natural" direct effect of X on Y: the expected change in Y per unit change of X, when we keep Z constant at whatever value it attained before the change.
In linear models, NDE = controlled direct effect.
THE OPERATIONAL MEANING OF INDIRECT EFFECTS
[Diagram: X → Z → Y and X → Y, with z = f(x, e1) and y = g(x, z, e2)]
"Natural" indirect effect of X on Y: the expected change in Y when we keep X constant, say at x0, and let Z change to whatever value it would have attained under a unit change in X.
In linear models, NIE = TE - DE (total effect minus direct effect).
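Both definitions are nested counterfactuals and can be computed directly from a structural model. Here is a sketch in Python, with hypothetical linear choices for f and g so the answers are known in closed form:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000

# Hypothetical linear instantiation of z = f(x, e1), y = g(x, z, e2):
e1 = rng.normal(size=n)
e2 = rng.normal(size=n)
f = lambda x: 1.0 * x + e1               # mediator equation
g = lambda x, z: 2.0 * x + 3.0 * z + e2  # outcome equation

x0, x1 = 0.0, 1.0   # baseline level and level after a unit change

# Natural direct effect: change X to x1 while holding Z at Z_{x0}(u),
# the value it attained before the change.
nde = np.mean(g(x1, f(x0)) - g(x0, f(x0)))

# Natural indirect effect: keep X at x0, but let Z take the value Z_{x1}(u)
# it would have attained under the change.
nie = np.mean(g(x0, f(x1)) - g(x0, f(x0)))

te = np.mean(g(x1, f(x1)) - g(x0, f(x0)))   # total effect
print(nde, nie, te)   # ~2.0, ~3.0, ~5.0; in this linear model TE = NDE + NIE
```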
LEGAL DEFINITIONS TAKE THE NATURAL CONCEPTION (FORMALIZING DISCRIMINATION)
"The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of a different race (age, sex, religion, national origin, etc.) and everything else had been the same." [Carson versus Bethlehem Steel Corp. (70 FEP Cases 921, 7th Cir. (1996))]
x = male, x' = female;  y = hire, y' = not hire;  z = applicant's qualifications
NO DIRECT EFFECT:  Y(x, Zx') = Yx' and Y(x', Zx) = Yx
(the hiring decision is unchanged when gender is switched but qualifications are held at their natural value)
SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS
Consider the quantity Q = E[ Y(x, Zx*(u))(u) ], the expectation of Y under do(x) with Z held at the value it would attain under do(x*).
- Given M and P(u), Q is well defined.
- Given u, Zx*(u) is the solution for Z in Mx*; call it z.
- Yxz(u) is then the solution for Y in Mxz.
Can Q be estimated from data?
ANSWERS TO QUESTIONS
- Graphical conditions for estimability from experimental / nonexperimental data.
- The graphical conditions hold in Markovian models.
IDENTIFICATION IN MARKOVIAN MODELS
[Diagram: the mediation model X → Z → Y with a direct arrow X → Y]
ANSWERS TO QUESTIONS (CONT.)
- Useful in answering a new type of policy question, involving mechanism blocking instead of variable fixing.
CONCLUSIONS
General theme:
- Define Q(M) as a counterfactual expression.
- Determine the conditions for reducing Q(M) to the available observational or experimental distributions.
- If the reduction is feasible, Q is learnable.
Demonstrated on three types of queries:
  Q1: P(y | do(x)), the causal effect (= P(Yx = y))
  Q2: P(Yx' = y' | x, y), the probability of necessity
  Q3: the direct effect