Comparing Models of Probabilistic Conditional Reasoning: Evidence for an Influence of Logical Form
Henrik Singmann, Karl Christoph Klauer, Sieghard Beller

Conditional Reasoning
A conditional is a rule of the form: If p then q. A conditional inference usually consists of the conditional rule (major premise), a minor premise (e.g., p), and a conclusion (e.g., q).
Example: If a person drinks a lot of coke then the person will gain weight. A person drinks a lot of coke. Conclusion: The person will gain weight.

4 Conditional Inferences
Modus Ponens (MP): If p then q. p. Conclusion: q. (valid in standard logic: the truth of the premises entails the truth of the conclusion)
Affirmation of the consequent (AC): If p then q. q. Conclusion: p. (NOT valid in standard logic: the truth of the premises does not entail the truth of the conclusion)
Modus Tollens (MT): If p then q. Not q. Conclusion: Not p. (valid in standard logic)
Denial of the antecedent (DA): If p then q. Not p. Conclusion: Not q. (NOT valid in standard logic)
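The valid/invalid classification above can be checked mechanically. The following short Python sketch (my own illustration, not part of the original slides) enumerates all truth assignments and tests, for each inference, whether the conclusion holds whenever the material conditional and the minor premise are both true:

```python
from itertools import product

# Each inference: (minor premise, conclusion), both as functions of the truth
# values of p and q; the major premise "if p then q" is read as the material
# conditional (not p) or q.
inferences = {
    "MP": (lambda p, q: p,      lambda p, q: q),
    "AC": (lambda p, q: q,      lambda p, q: p),
    "MT": (lambda p, q: not q,  lambda p, q: not p),
    "DA": (lambda p, q: not p,  lambda p, q: not q),
}

def valid(minor, conclusion):
    """Valid iff the conclusion is true in every assignment in which both
    the conditional and the minor premise are true."""
    for p, q in product([True, False], repeat=2):
        conditional = (not p) or q          # material "if p then q"
        if conditional and minor(p, q) and not conclusion(p, q):
            return False
    return True

for name, (minor, conclusion) in inferences.items():
    print(name, "valid" if valid(minor, conclusion) else "not valid")
# Expected output: MP valid, AC not valid, MT valid, DA not valid
```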

Research Question: What is the effect of the presence of the conditional in everyday probabilistic conditional reasoning?

Example Item: Knowledge Phase
Observation: A balloon is pricked with a needle. How likely is it that it will pop?

Example Item: Rule Phase
Rule: If a balloon is pricked with a needle, then it will pop.
Observation: A balloon is pricked with a needle. How likely is it that it will pop?

Klauer, Beller, & Hütter (2010): Experiment 1 (n = 15) and new data (n = 29)
[Figure: responses with the conditional absent vs. present. Error bars: difference-adjusted Cousineau-Morey-Baguley intervals; different lines represent different contents of the conditional.]

The presence of the conditional increases participants' estimates of the probability of the conclusion, especially for the formally valid inferences MP and MT.

How can we explain the effect of the presence of the conditional? Our data challenge pure Bayesian or probabilistic approaches that rely solely on background knowledge.

The Knowledge Phase
Participants estimate the conclusion given the minor premise only, e.g., "Given p, how likely is q?" The response should reflect the conditional probability of the conclusion given the minor premise, e.g., P(q|p).

Inference:           MP        MT          AC        DA
Inference step:      p ⇒ q     ¬q ⇒ ¬p     q ⇒ p     ¬p ⇒ ¬q
Response reflects:   P(q|p)    P(¬p|¬q)    P(p|q)    P(¬q|¬p)

Formalizing the Knowledge Phase
Joint probability distribution over p, q, and their negations per content:

         q            ¬q
p        P(p ∧ q)     P(p ∧ ¬q)
¬p       P(¬p ∧ q)    P(¬p ∧ ¬q)

It provides the conditional probabilities, e.g.: P(MP) = P(q|p) = P(p ∧ q) / P(p).
Three independent parameters describe the joint probability distribution (e.g., P(p), P(q), and P(¬q|p); Oaksford, Chater, & Larkin, 2000).
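To make this parametrization concrete, here is a minimal Python sketch (my own illustration; the parameter values are made up) that builds the 2×2 joint distribution from P(p), P(q), and P(¬q|p) and derives the four conditional probabilities listed above:

```python
def joint_from_params(P_p, P_q, P_notq_given_p):
    """Build the 2x2 joint distribution over {p, not-p} x {q, not-q} from the
    three free parameters (cf. Oaksford, Chater, & Larkin, 2000)."""
    p_and_notq    = P_p * P_notq_given_p
    p_and_q       = P_p - p_and_notq
    notp_and_q    = P_q - p_and_q
    notp_and_notq = 1.0 - (p_and_q + p_and_notq + notp_and_q)
    joint = {"p&q": p_and_q, "p&~q": p_and_notq,
             "~p&q": notp_and_q, "~p&~q": notp_and_notq}
    # The three parameters must yield a coherent (non-negative) distribution.
    assert all(v >= -1e-9 for v in joint.values()), "incoherent parameter values"
    return joint

def knowledge_phase_responses(joint):
    """Conditional probabilities assumed to underlie the four inferences."""
    P_p = joint["p&q"] + joint["p&~q"]
    P_q = joint["p&q"] + joint["~p&q"]
    return {
        "MP": joint["p&q"]   / P_p,        # P(q | p)
        "MT": joint["~p&~q"] / (1 - P_q),  # P(not-p | not-q)
        "AC": joint["p&q"]   / P_q,        # P(p | q)
        "DA": joint["~p&~q"] / (1 - P_p),  # P(not-q | not-p)
    }

# Illustrative (made-up) parameter values:
joint = joint_from_params(P_p=0.4, P_q=0.6, P_notq_given_p=0.1)
print(knowledge_phase_responses(joint))
```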

How do we explain the effect of the conditional?
The dual-source model for probabilistic conditional inferences (Klauer, Beller, & Hütter, 2010):
Conditional absent (knowledge phase): conditional probability of the conclusion given the minor premise, taken from background knowledge. E.g., "Given p, how likely is q?" = P(q|p).
Conditional present (rule phase): the conditional adds form-based evidence: the subjective probability to which an inference is seen as warranted by the (logical) form of the inference. E.g., "How likely is the conclusion given that the inference is MP?"
Our intuition: the conditional provides form-based information which is integrated with the background knowledge.

Formalizing the Dual-Source Model
Response(C, x) = λ × [τ(x) + (1 − τ(x)) × ξ(C, x)] + (1 − λ) × ξ(C, x)
form-based component (weight λ): τ(x) + (1 − τ(x)) × ξ(C, x)
knowledge-based component (weight 1 − λ): ξ(C, x)
C = content (1–4), x = inference (MP, MT, AC, DA)
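A minimal sketch of how this equation could be computed, assuming ξ(C, x) is the knowledge-based probability from the previous slides and τ(x) and λ are the form-based and weighting parameters (the values below are illustrative, not fitted ones):

```python
def dual_source_response(xi, tau, lam):
    """Dual-source model prediction for one content C and inference x:
    a lambda-weighted mixture of a form-based and a knowledge-based component.
    xi  = knowledge-based evidence xi(C, x), a probability in [0, 1]
    tau = form-based evidence tau(x) for the logical form of inference x
    lam = weighting parameter lambda in [0, 1]"""
    form_based      = tau + (1 - tau) * xi   # form component, filled up with knowledge
    knowledge_based = xi
    return lam * form_based + (1 - lam) * knowledge_based

# Illustrative values: high form-based evidence (e.g., for MP), moderate weighting.
print(dual_source_response(xi=0.9, tau=0.8, lam=0.5))   # 0.94
```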

How else could the effect of the conditional be explained?
Alternative (Bayesian) views: the new paradigm psychology of reasoning (e.g., Oaksford & Chater, 2007; Pfeifer & Kleiter, 2010) and causal Bayes nets (e.g., Sloman, 2005; Fernbach & Erb, in press).
Reasoning is basically probability estimation from a coherent probability distribution representing background knowledge. The presence of the conditional changes the knowledge base: form affects each item idiosyncratically.

Model Differences
Knowledge phase (all models): probability distribution over P(p), P(¬p), P(q), P(¬q) per content.
Rule phase, dual-source model: knowledge phase + form-based evidence per inference (+ 4 parameters).
Rule phase, alternative views: the probability distribution per content/conditional is updated, either P'(¬q|p) < P(¬q|p) (Oaksford, Chater, & Larkin, 2000) or P'(q|p) > P(q|p), then minimizing the KL distance to the prior distribution (Hartmann & Rafiee Rad, 2012); a sketch of the latter kind of updating follows below.
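For contrast, a rough sketch of the Hartmann & Rafiee Rad (2012)-style updating described above: fix P'(q|p) at a target value and then choose the joint distribution that is closest in Kullback-Leibler divergence to the prior. This is my own illustrative implementation using scipy, not the authors' code; the target value would be a free parameter when fitting, and the prior values here are made up.

```python
import numpy as np
from scipy.optimize import minimize

def kl_update(prior, target_q_given_p):
    """prior: length-4 array of joint probabilities [p&q, p&~q, ~p&q, ~p&~q].
    Returns the joint distribution with P'(q|p) = target that minimizes
    KL(posterior || prior)."""
    def kl(post):
        post = np.clip(post, 1e-12, 1.0)
        return float(np.sum(post * np.log(post / prior)))

    constraints = [
        {"type": "eq", "fun": lambda x: x.sum() - 1.0},            # sums to 1
        {"type": "eq",                                             # P'(q|p) = target
         "fun": lambda x: x[0] - target_q_given_p * (x[0] + x[1])},
    ]
    res = minimize(kl, x0=prior, bounds=[(1e-9, 1.0)] * 4,
                   constraints=constraints, method="SLSQP")
    return res.x

prior = np.array([0.36, 0.04, 0.24, 0.36])        # prior P(q|p) = 0.9
posterior = kl_update(prior, target_q_given_p=0.95)
print(posterior, posterior[0] / (posterior[0] + posterior[1]))
```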

Research Question: Which model provides the best account of the data?

The Data Sets
Data sets (or parts thereof) that implement the default paradigm: knowledge phase without the conditional, rule phase with the conditional, no additional manipulations.
6 data sets (N = 148): Klauer et al. (2010, Exps. 1 & 3, n = 15 and 18) and 2 new data sets (n = 26 and 29) implemented the default paradigm exactly (for Klauer et al.'s experiments, the third sessions were omitted). For Klauer et al. (2010, Exp. 4, n = 13) we omitted trials that did not follow the default paradigm. A new data set (n = 47) manipulated speaker expertise (Experiment 2, Part I).

General Methodology
4 different conditionals (i.e., contents). Participants responded to all four inferences per content and gave two estimates per inference: the original and the converse inference (pooled for analysis). Two measurements (more than one week apart): without the conditional and with the conditional → 32 data points per participant.
Exceptions: Klauer et al. (2010, Exp. 3): 5 conditionals = 40 data points per participant. New data on speaker expertise: 6 conditionals (3 expert, 3 novice speakers) = 48 data points.

Candidate Models
We fitted all five models to the individual data by minimizing the sum of squared deviations between data and predictions (a fitting sketch follows below).
Knowledge phase (all models): 3 parameters per content/conditional = 12 parameters for 4 conditionals.
Rule phase:
Dual-source model (DS): + 4 form-based parameters = 16 parameters (independent of the number of conditionals).
Oaksford et al. (2000; OCL): + 1 parameter per content = 16 parameters.
Extended Oaksford & Chater (2007; eOC and eOC*): + 2 parameters per content = 20 parameters (eOC* does not restrict the posterior probability distribution).
Minimizing the Kullback-Leibler divergence (KL): + 1 parameter per content = 16 parameters.
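As an illustration of the fitting procedure (not the original code), the following sketch fits a dual-source-style model to one participant's 32 responses by minimizing the summed squared deviations between observed and predicted values. For simplicity the knowledge-based probabilities are parameterized directly (16 values) rather than via the three joint-distribution parameters per content used in the talk, and a weighting parameter is included explicitly, so the parameter counts differ from those on the slide; data and starting values are made up.

```python
import numpy as np
from scipy.optimize import minimize

N_CONTENT = 4   # columns are the inferences in the order MP, MT, AC, DA

def ds_predictions(params, with_conditional):
    """params: 16 knowledge-based probabilities xi(C, x), 4 form parameters
    tau(x), and 1 weighting parameter lambda (21 values in total)."""
    xi  = params[:16].reshape(N_CONTENT, 4)
    tau = params[16:20]
    lam = params[20]
    if not with_conditional:                 # knowledge phase
        return xi.ravel()
    return (lam * (tau + (1 - tau) * xi) + (1 - lam) * xi).ravel()  # rule phase

def sse(params, data_knowledge, data_rule):
    """Sum of squared deviations between observed and predicted responses."""
    pred = np.concatenate([ds_predictions(params, False),
                           ds_predictions(params, True)])
    obs  = np.concatenate([data_knowledge.ravel(), data_rule.ravel()])
    return float(np.sum((pred - obs) ** 2))

# Fake observed responses (probabilities in [0, 1]) for one participant:
rng = np.random.default_rng(1)
data_knowledge = rng.uniform(0, 1, (N_CONTENT, 4))
data_rule      = rng.uniform(0, 1, (N_CONTENT, 4))

start = np.concatenate([np.full(16, 0.5), np.full(4, 0.5), [0.5]])
fit = minimize(sse, start, args=(data_knowledge, data_rule),
               bounds=[(0, 1)] * 21, method="L-BFGS-B")
print("SSE:", fit.fun)
```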

Results I: Klauer et al. (2010), Exp. 4
5 measurement points; we only use the data that are comparable to the other conditions (spread across the 5 measurement points).
[Figure: individual fit values. The dashed line shows the mean fit value of DS; numbers behind the model names give the number of parameters per model.]
Ordering: DS = eOC* > OCL > KL

Orderings for the remaining data sets: DS > KL > eOC* > OCL; DS > eOC* > OCL = KL; DS > eOC* = KL > OCL; DS > eOC* = KL = OCL

Results: Speaker Expertise (Experiment 2)
48 data points per participant. In addition to DS, we fitted DS*, which allowed separate form parameters for expert and novice speakers. Ordering: eOC* = DS* ≥ OCL = DS > KL

Summary
The dual-source model provides the overall best account of the 6 data sets. In the two cases where it shares first place (with the extended model by Oaksford & Chater, 2007), it has fewer parameters. It provides the best account for 83 of the 148 individual data sets (56%).
The extended Oaksford & Chater model (48 best accounts) performs better than the Kullback-Leibler model (17 best accounts), but has more parameters.
Strong evidence for our interpretation of the conditional: it provides formal information that cannot easily be accounted for by changes in participants' knowledge base alone.

General Discussion
(Conditional) reasoning is based on different, integrated sources of information: no single-process theory is adequate (Singmann & Klauer, 2011). "Pure" Bayesianism is adequate only for the knowledge phase (i.e., without the conditional).
The dual-source model is not tied to a normative model (e.g., Elqayam & Evans, 2011).
The parameters of the dual-source model offer insights into cognitive processes. There is a dissociation of suppression effects: disablers decrease the weighting parameter, whereas alternatives decrease the background-knowledge-based evidence for AC and DA (and both affect the form-based component). Speaker expertise affects the weighting parameter only.

Thanks for your attention.
Thanks to Karl Christoph Klauer and Sieghard Beller.
Slides are available at my website: http://www.psychologie.uni-freiburg.de/Members/singmann/