THE MATHEMATICS OF PROGRAM EVALUATION


THE MATHEMATICS OF PROGRAM EVALUATION
Judea Pearl
University of California Los Angeles (www.cs.ucla.edu/~judea)

OUTLINE
- Statistical vs. causal modeling: distinction and mental barriers
- Formal semantics for counterfactuals: definition, axioms, graphical representations
- The policy-evaluation problem and its solution
- Causal and counterfactual frills

TRADITIONAL STATISTICAL INFERENCE PARADIGM

Data → Inference → P (Joint Distribution) → Q(P) (Aspects of P)

e.g., infer whether customers who bought product A would also buy product B: Q = P(B | A).

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

Probability and statistics deal with static relations:

Data → Inference → P (Joint Distribution) → (change) → P′ (Joint Distribution) → Q(P′) (Aspects of P′)

What happens when P changes? e.g., infer whether customers who bought product A would still buy A if we were to double the price.

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES

What remains invariant when P changes, say, to satisfy P′(price = 2) = 1?

Note: P(v) ≠ P(v | price = 2). P does not tell us how it ought to change.
e.g., curing symptoms vs. curing diseases
e.g., analogy: mechanical deformation

FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)

CAUSAL concepts: spurious correlation, randomization, confounding/effect, instrument, holding constant, explanatory variables.
STATISTICAL concepts: regression, association/independence, "controlling for"/conditioning, odds and risk ratios, collapsibility, propensity score.

Causal and statistical concepts do not mix.

FROM STATISTICAL TO CAUSAL ANALYSIS: 2. MENTAL BARRIERS

CAUSAL concepts: spurious correlation, randomization, confounding/effect, instrument, holding constant, explanatory variables.
STATISTICAL concepts: regression, association/independence, "controlling for"/conditioning, odds and risk ratios, collapsibility, propensity score.

Causal and statistical concepts do not mix. "No causes in – no causes out" (Cartwright, 1989):

  causal assumptions + statistical assumptions + data  ⇒  causal conclusions

Causal assumptions cannot be expressed in the mathematical language of standard statistics. Non-standard mathematics is needed:
- Structural equation models (Wright, 1920; Simon, 1960)
- Counterfactuals (Neyman-Rubin: Y_x; Lewis: x □→ Y)

WHY CAUSALITY NEEDS SPECIAL MATHEMATICS

SEM equations are non-algebraic. e.g., pricing policy: "double the competitor's price."

Correct notation:  Y := 2X  (or  Y ← 2X),  not  Y = 2X.

With input X = 1 the solution is Y = 2, but the equation carries process information beyond this solution: had X been 3, Y would be 6; if we raise X to 3, Y will be 6. To compute such effects we must "wipe out" X = 1.

THE STRUCTURAL MODEL PARADIGM

Data → Inference → M (Data-Generating Model) → Q(M) (Aspects of M)

M is an oracle for computing answers to queries Q.
e.g., infer whether customer u who bought product A would still buy A if we were to double the price.

FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION

[Figure: a logic-circuit diagram with inputs X, Y, Z and an output.]

Here is a causal model we all remember from high school -- a circuit diagram. There are four interesting points to notice in this example:
(1) It qualifies as a causal model because it contains the information to confirm or refute all action, counterfactual and explanatory sentences concerned with the operation of the circuit. For example, anyone can figure out what the output would be like if we set Y to zero, or if we change this OR gate to a NOR gate, or if we perform any of the billions of combinations of such actions.
(2) The logical function (Boolean input-output relation) alone is insufficient for answering such queries.
(3) These actions were not specified in advance; they do not have special names and they do not show up in the diagram. In fact, the great majority of the action queries that this circuit can answer were never considered by the designer of this circuit.
(4) So how does the circuit encode this extra information? Through two encoding tricks:
  4.1 The symbolic units correspond to stable physical mechanisms (i.e., the logical gates).
  4.2 Each variable has precisely one mechanism that determines its value.

STRUCTURAL CAUSAL MODELS

Definition: A structural causal model is a 4-tuple ⟨V, U, F, P(u)⟩, where
- V = {V1,...,Vn} are observable variables,
- U = {U1,...,Um} are background variables,
- F = {f1,...,fn} are functions determining V: vi = fi(v, u),
- P(u) is a distribution over U.

P(u) and F induce a distribution P(v) over the observable variables.
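To make the definition concrete, here is a minimal sketch of an SCM in Python. The model, variable names, and distribution are illustrative assumptions of mine, not part of the slides:

```python
import random

# A toy SCM <V, U, F, P(u)>: U = {U1, U2}, V = {X, Y}, with mechanisms
#   x = f_X(u1) = u1
#   y = f_Y(x, u2) = 2*x + u2

def sample_u():
    """P(u): a distribution over the background variables."""
    return {"u1": random.choice([0, 1]), "u2": random.gauss(0, 1)}

# F: one function per observable variable; each reads its parents and its U's.
F = {
    "x": lambda v, u: u["u1"],
    "y": lambda v, u: 2 * v["x"] + u["u2"],
}

def solve(F, u):
    """Solve the (recursive) equations to obtain V from U."""
    v = {}
    for name in ["x", "y"]:      # evaluate in topological order
        v[name] = F[name](v, u)
    return v

# P(u) and F induce P(v): sample u, then solve for v.
samples = [solve(F, sample_u()) for _ in range(5)]
print(samples)
```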

STRUCTURAL MODELS AND CAUSAL DIAGRAMS

The arguments of the functions vi = fi(v, u) define a graph:

  vi = fi(pai, ui),   PAi ⊆ V \ {Vi},   Ui ⊆ U.

Example: price-quantity equations in economics, with observables Q (quantity) and P (price), exogenous I and W, and background variables U1, U2:

  q = b1 p + d1 i + u1
  p = b2 q + d2 w + u2

STRUCTURAL MODELS AND INTERVENTION

Let X be a set of variables in V. The action do(x) sets X to constants x regardless of the factors which previously determined X. do(x) replaces all functions fi determining X with the constant functions X = x, to create a mutilated model Mx.

Example: do(P = p0) replaces the price equation above with p = p0, creating the mutilated model Mp0 in which the arrows into P are removed.
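Continuing the sketch above, intervention is just equation replacement (again illustrative, assuming the same toy model):

```python
def do(F, interventions):
    """Return the mutilated model M_x: replace each intervened variable's
    mechanism with a constant function, leaving all other equations intact."""
    Fx = dict(F)
    for name, value in interventions.items():
        Fx[name] = lambda v, u, value=value: value
    return Fx

# P(y | do(x)): sample u from P(u), then solve the mutilated model.
Fx = do(F, {"x": 3})
intervened = [solve(Fx, sample_u()) for _ in range(5)]
print(intervened)   # x is 3 in every sample, regardless of u1
```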

CAUSAL MODELS AND COUNTERFACTUALS

Definition: The sentence "Y would be y (in situation u), had X been x," denoted Yx(u) = y, means: the solution for Y in the mutilated model Mx (i.e., the equations for X replaced by X = x), with input U = u, is equal to y.

The Fundamental Equation of Counterfactuals:

  Yx(u) = Y_Mx(u)

Joint probabilities of counterfactuals:

  P*(Yx = y, Zw = z, ...) = P({u : Yx(u) = y, Zw(u) = z, ...})

The super-distribution P* is derived from M: parsimonious, consistent, and transparent.
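Under the same toy model, the fundamental equation translates directly into code (a sketch; the abduction step of inferring u from evidence is skipped here by fixing u directly):

```python
def counterfactual(F, u, interventions, target):
    """Y_x(u): solve the mutilated model M_x with background input U = u."""
    return solve(do(F, interventions), u)[target]

u = {"u1": 1, "u2": 0.5}
print(solve(F, u)["y"])                      # factual:        y = 2*1 + 0.5 = 2.5
print(counterfactual(F, u, {"x": 3}, "y"))   # counterfactual: Y_{x=3}(u) = 6.5
```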

AXIOMS OF CAUSAL COUNTERFACTUALS

Yx(u) = y: "Y would be y, had X been x (in state U = u)."
- Definiteness
- Uniqueness
- Effectiveness
- Composition
- Reversibility
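The slide's formulas were lost in transcription; for reference, the standard statements of the last three axioms (Galles and Pearl, 1998) are:

$$
\begin{aligned}
\text{Effectiveness:}\quad & X_{x}(u) = x \\
\text{Composition:}\quad & W_{x}(u) = w \;\Rightarrow\; Y_{xw}(u) = Y_{x}(u) \\
\text{Reversibility:}\quad & \big(Y_{xw}(u) = y\big) \wedge \big(W_{xy}(u) = w\big) \;\Rightarrow\; Y_{x}(u) = y
\end{aligned}
$$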

DIFFICULTIES WITH ALGEBRAIC LANGUAGE

Consider a set of assumptions written as counterfactual formulas over Z, X, Y [formulas lost in transcription]. Unfriendly: Is the set consistent? Complete? Redundant? Arguable?

Friendly language: the same assumptions encoded as a causal graph over Z, X, Y.

GRAPHICAL – COUNTERFACTUALS SYMBIOSIS

Every causal graph expresses counterfactual assumptions, e.g., X → Y → Z:
1. Missing arrows (e.g., no arrow X → Z) encode exclusion restrictions.
2. Missing arcs (e.g., no confounding arc between Y and Z) encode independence restrictions.
These assumptions are guaranteed consistent, and they are readable from the graph. Every theorem in SEM is a theorem in the Potential-Outcome Model, and conversely.

POLICY EVALUATION: A SOLVED PROBLEM

The problem: to predict the impact of a proposed intervention using data obtained prior to the intervention.

The solution (conditional): Causal Assumptions + Data ⇒ Policy Claims
1. Mathematical tools for communicating causal assumptions formally and transparently.
2. Deciding (mathematically) whether the assumptions communicated are sufficient for obtaining consistent estimates of the prediction required.
3. Deriving (if (2) is affirmative) a closed-form expression for the predicted impact.
4. Suggesting (if (2) is negative) a set of measurements and experiments that, if performed, would render a consistent estimate feasible.

ELIMINATING CONFOUNDING BIAS: A GRAPHICAL CRITERION

P(y | do(x)) is estimable if there is a set Z of variables such that Z d-separates X from Y in Gx (the graph G with the arrows emanating from X removed).

[Figure: graphs G and Gx over Z1,...,Z6, X, Y, with a highlighted set Z blocking every back-door path from X to Y.]

Moreover,

  P(y | do(x)) = Σ_z P(y | x, z) P(z)    ("adjusting" for Z).
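Once a joint distribution over (Z, X, Y) is in hand, the adjustment formula is a short computation. A sketch, using a made-up toy distribution (the numbers are mine, chosen only so the arithmetic is well defined):

```python
# Toy joint distribution P(z, x, y) over binary variables (illustrative).
P = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.10, (0, 1, 0): 0.05, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.05, (1, 1, 0): 0.10, (1, 1, 1): 0.30,
}

def p_y_do_x(y, x):
    """Adjustment formula: P(y | do(x)) = sum_z P(y | x, z) P(z),
    licensed when Z satisfies the graphical criterion above."""
    total = 0.0
    for z in (0, 1):
        p_z = sum(P[(z, xx, yy)] for xx in (0, 1) for yy in (0, 1))
        p_xz = sum(P[(z, x, yy)] for yy in (0, 1))
        total += (P[(z, x, y)] / p_xz) * p_z
    return total

print(p_y_do_x(y=1, x=1))   # compare with P(y=1 | x=1), which is confounded
```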

RULES OF CAUSAL CALCULUS

Rule 1 (ignoring observations):        P(y | do{x}, z, w) = P(y | do{x}, w)
Rule 2 (action/observation exchange):  P(y | do{x}, do{z}, w) = P(y | do{x}, z, w)
Rule 3 (ignoring actions):             P(y | do{x}, do{z}, w) = P(y | do{x}, w)
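Each rule is licensed by a graphical condition that appeared as a formula on the original slide and was lost in transcription; for reference, the standard statements (Pearl, 1995) are:

$$
\begin{aligned}
\text{Rule 1:}\quad & (Y \perp\!\!\!\perp Z \mid X, W)_{G_{\overline{X}}} \\
\text{Rule 2:}\quad & (Y \perp\!\!\!\perp Z \mid X, W)_{G_{\overline{X}\underline{Z}}} \\
\text{Rule 3:}\quad & (Y \perp\!\!\!\perp Z \mid X, W)_{G_{\overline{X}\,\overline{Z(W)}}}
\end{aligned}
$$

where an overbar deletes arrows entering the variable, an underbar deletes arrows leaving it, and Z(W) is the set of Z-nodes that are not ancestors of any W-node in the graph with arrows into X deleted.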

DERIVATION IN CAUSAL CALCULUS

Genotype (unobserved) confounds Smoking and Cancer; Smoking → Tar → Cancer.

P(c | do{s})
  = Σ_t P(c | do{s}, t) P(t | do{s})                       [Probability axioms]
  = Σ_t P(c | do{s}, do{t}) P(t | do{s})                   [Rule 2]
  = Σ_t P(c | do{s}, do{t}) P(t | s)                       [Rule 2]
  = Σ_t P(c | do{t}) P(t | s)                              [Rule 3]
  = Σ_s′ Σ_t P(c | do{t}, s′) P(s′ | do{t}) P(t | s)       [Probability axioms]
  = Σ_s′ Σ_t P(c | t, s′) P(s′ | do{t}) P(t | s)           [Rule 2]
  = Σ_s′ Σ_t P(c | t, s′) P(s′) P(t | s)                   [Rule 3]
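The last line is the front-door adjustment formula. As a sketch of how it would be evaluated numerically (the joint table below is a made-up toy, not data from the lecture):

```python
# Front-door estimate: P(c | do(s)) = sum_t P(t|s) * sum_s' P(c|t,s') P(s'),
# computed from a toy joint distribution P(s, t, c) over binary variables.
P = {(0, 0, 0): 0.30, (0, 0, 1): 0.05, (0, 1, 0): 0.04, (0, 1, 1): 0.06,
     (1, 0, 0): 0.04, (1, 0, 1): 0.01, (1, 1, 0): 0.20, (1, 1, 1): 0.30}

def marg(**fixed):
    """Marginal probability of the fixed coordinates (keys in s, t, c order)."""
    keys = ("s", "t", "c")
    return sum(p for k, p in P.items()
               if all(k[keys.index(n)] == v for n, v in fixed.items()))

def front_door(c, s):
    total = 0.0
    for t in (0, 1):
        p_t_given_s = marg(s=s, t=t) / marg(s=s)
        inner = sum(marg(s=s2, t=t, c=c) / marg(s=s2, t=t) * marg(s=s2)
                    for s2 in (0, 1))
        total += p_t_given_s * inner
    return total

print(front_door(c=1, s=1))   # vs. the confounded P(c=1 | s=1)
```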

INFERENCE ACROSS DESIGNS

Problem: predict P(y | do(x)) from a study in which only Z can be controlled.
Solution: determine if P(y | do(x)) can be reduced to a mathematical expression involving only do(z).

COMPLETENESS RESULTS ON IDENTIFICATION
- do-calculus is complete.
- Complete graphical criterion for identifying causal effects (Shpitser and Pearl, 2006).
- Complete graphical criterion for empirical testability of counterfactuals (Shpitser and Pearl, 2007).

THE CAUSAL RENAISSANCE: VOCABULARY IN ECONOMICS

[Figure: counts of causal vocabulary in economics journals, reproduced from Hoover (2004), "Lost Causes".]

THE CAUSAL RENAISSANCE: USEFUL RESULTS
1. Complete formal semantics of counterfactuals
2. Transparent language for expressing assumptions
3. Complete solution to causal-effect identification
4. Legal responsibility (bounds)
5. Imperfect experiments (universal bounds for IV)
6. Integration of data from diverse sources
7. Direct and indirect effects
8. Complete criterion for counterfactual testability

DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem)

"Your Honor! My client (Mr. A) died BECAUSE he used that drug."

The court is to decide if it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug:

  PN = P(alive had he not taken the drug | A is dead, took the drug) > 0.50

THE PROBLEM

Semantical problem: What is the meaning of PN(x, y), the "probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur"? In counterfactual notation (with x′, y′ denoting the complements of x, y):

  PN(x, y) = P(Y_x′ = y′ | X = x, Y = y)

Answer: computable from M.

Analytical problem: Under what conditions can PN(x, y) be learned from statistical data, i.e., observational, experimental, and combined?

TYPICAL THEOREMS (Tian and Pearl, 2000)

Bounds given combined nonexperimental and experimental data:

  max{0, [P(y) − P(y_x′)] / P(x, y)}  ≤  PN  ≤  min{1, [P(y′_x′) − P(x′, y′)] / P(x, y)}

Identifiability under monotonicity (combined data), a corrected excess-risk-ratio:

  PN = [P(y|x) − P(y|x′)] / P(y|x)  +  [P(y|x′) − P(y_x′)] / P(x, y)

CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?

                  Experimental          Nonexperimental
                  do(x)    do(x′)       x        x′
  Deaths (y)         16       14          2       28
  Survivals (y′)    984      986        998      972
                  1,000    1,000      1,000    1,000

- Nonexperimental data: drug usage predicts longer life.
- Experimental data: drug has negligible effect on survival.
- Plaintiff: Mr. A is special. He actually died, and he used the drug by choice.
- Court to decide (given both data): Is it more probable than not that A would be alive but for the drug?
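Plugging the slide's own numbers into the Tian-Pearl bounds above is a one-screen computation (a sketch in exact arithmetic; the variable names are mine):

```python
from fractions import Fraction as F

# Experimental: P(y | do(x)) = 16/1000, P(y | do(x')) = 14/1000.
# Nonexperimental groups of 1000 each, so P(x) = P(x') = 1/2.
p_y_do_xp  = F(14, 1000)              # P(y | do(x'))
p_x_y      = F(1, 2) * F(2, 1000)     # P(x, y)
p_xp_y     = F(1, 2) * F(28, 1000)    # P(x', y)
p_xp_yp    = F(1, 2) * F(972, 1000)   # P(x', y')
p_y        = p_x_y + p_xp_y           # P(y) = 3/200
p_yp_do_xp = 1 - p_y_do_xp            # P(y' | do(x'))

lower = max(F(0), (p_y - p_y_do_xp) / p_x_y)
upper = min(F(1), (p_yp_do_xp - p_xp_yp) / p_x_y)
print(lower, upper)   # 1 1  ->  the bounds collapse: PN = 1
```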

SOLUTION TO THE ATTRIBUTION PROBLEM

For the data above the bounds collapse:

  1 ≤ P(y′_x′ | x, y) ≤ 1,

so PN = 1 WITH PROBABILITY ONE. Combined data tell more than each study alone.

THE FRUITS OF COUNTERFACTUAL ANALYSIS

e.g., effect decomposition: Can data prove an employer guilty of hiring discrimination?

Graph: X (gender) → Z (qualifications) → Y (hiring), with a direct arrow X → Y.

What is the direct effect of X on Y (averaged over z)?

LEGAL DEFINITION OF DIRECT EFFECT (FORMALIZING DISCRIMINATION)

"The central question in any employment-discrimination case is whether the employer would have taken the same action had the employee been of different race (age, sex, religion, national origin etc.) and everything else had been the same." [Carson versus Bethlehem Steel Corp., 70 FEP Cases 921, 7th Cir. (1996)]

x = male, x′ = female; y = hire, y′ = not hire; z = applicant's qualifications.

NO DIRECT EFFECT: Y_x′z(u) = Y_xz(u) for all z and u.

NATURAL SEMANTICS OF AVERAGE DIRECT EFFECTS

Robins and Greenland (1992) – "pure" direct effect.

Model: z = f(x, u), y = g(x, z, u), with X → Z → Y and X → Y.

Average direct effect of X on Y: the expected change in Y when we change X from x0 to x1 and, for each u, we keep Z constant at whatever value it attained before the change.

In linear models, DE = controlled direct effect.

SEMANTICS AND IDENTIFICATION OF NESTED COUNTERFACTUALS

Consider the quantity Q = E[Y_{x, Z_x*}(u)].

Given M and P(u), Q is well defined: given u, Z_x*(u) is the solution for Z in M_x*; call it z; then Y_{x,Z_x*}(u) = Y_xz(u) is the solution for Y in M_xz.

Can Q be estimated from data?
- Experimental data: if Q is reducible to a nest-free expression.
- Nonexperimental data: if Q is reducible to a subscript-free expression.

NATURAL SEMANTICS OF INDIRECT EFFECTS

Model: z = f(x, u), y = g(x, z, u).

Indirect effect of X on Y: the expected change in Y when we keep X constant, say at x0, and let Z change to whatever value it would have attained had X changed to x1.

In linear models, IE = TE − DE.
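In counterfactual notation, the two verbal definitions on the last two slides correspond to the standard formulas (Pearl, 2001), stated here for reference:

$$
\begin{aligned}
DE_{x_0,x_1}(Y) &= E\big[Y_{x_1,\,Z_{x_0}}\big] - E\big[Y_{x_0}\big] \\
IE_{x_0,x_1}(Y) &= E\big[Y_{x_0,\,Z_{x_1}}\big] - E\big[Y_{x_0}\big]
\end{aligned}
$$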

POLICY IMPLICATIONS OF INDIRECT EFFECTS

What is the indirect effect of X on Y? The effect of gender on hiring if sex discrimination is eliminated.

Graph: X (gender) → Z (qualification) → Y (hiring), with the direct link X → Y blocked ("IGNORE f").

Blocking a link – a new type of intervention.

RELATIONS BETWEEN TOTAL, DIRECT, AND INDIRECT EFFECTS

Theorem 5: The total, direct and indirect effects obey the following equality:

  TE_{x*,x}(Y) = DE_{x*,x}(Y) − IE_{x,x*}(Y)

In words, the total effect (on Y) associated with the transition from x* to x is equal to the difference between the direct effect associated with this transition and the indirect effect associated with the reverse transition, from x to x*.

EXPERIMENTAL IDENTIFICATION OF AVERAGE DIRECT EFFECTS

Theorem: If there exists a set W satisfying a certain counterfactual independence condition, then the average direct effect is identifiable from experimental data and is given by a closed-form expression. [The condition and the expression appeared as formulas on the original slide and were lost in transcription.]

SUMMARY OF RESULTS
- Formal semantics of path-specific effects, based on signal blocking instead of value fixing.
- Path-analytic techniques extended to nonlinear and nonparametric models.
- Meaningful (graphical) conditions for estimating direct and indirect effects from experimental and nonexperimental data.

CONCLUSIONS

Structural-model semantics, enriched with logic and graphs, provides:
- a complete formal basis for causal and counterfactual reasoning;
- unification of the graphical, potential-outcome and structural-equation approaches;
- reduction of the program-evaluation problem to a mathematical exercise in a friendly causal calculus.
