Inference: Conscious and Unconscious J. McClelland SymSys 100 April 20, 2010

A Conscious Inference
I wake up in the morning and ask myself: has it rained overnight? I look out the window and notice that the ground is wet. I consider whether I should conclude that it rained last night. I consider the fact that it is April, and rain is relatively rare. I consider that I have a sprinkler system, and ask myself whether it is set to operate on Tuesdays. I conclude that any conclusion I can reach by looking at my own back yard would be tentative; so I go outside and notice that the street and sidewalks along my block are all wet. I conclude that it did indeed rain last night.

Some Examples of “Unconscious Inferences”

Eberhardt: Seeing Black
Participants were shown very brief presentations of black faces, white faces, or nothing (20 stimuli). They then identified pictures that gradually became clearer frame by frame (500 msec/frame). Exposure to black faces led to faster recognition of crime-related objects, even though participants were not aware that they had been shown any faces.

Perception as ‘Unconscious Inference’
Helmholtz coined the term in the 19th century, drawing on ideas going back to the ancients. The term suggests a ‘little man in the head’, entertaining hypotheses, weighing evidence, and making decisions. But we can think of it differently: perhaps your brain is wired to
–Use context together with direct information
–Combine multiple sources of information
–Allow ‘hypotheses’ to exert mutual influences on each other

Rest of This Lecture
Examine a “normative theory” of inference under uncertainty:
–How can we reason from given information to conclusions when everything we know is uncertain?
–How can we combine different sources of evidence?
Consider two experiments that produce data consistent (in a sense) with the normative theory.

Next two lectures
A neuro-mechanistic examination of unconscious inference:
–How neurons can produce output matching the normative theory
–How networks of neurons can work together to produce collective behavior consistent with the normative theory

An inference question
An apartment building has a surveillance camera with automatic face-recognition software. If the camera sees a known terrorist, it will trigger an alarm with 99% probability. If the camera sees a non-terrorist, it will trigger the alarm 1% of the time. Suppose somebody triggers the alarm. What is the chance he/she is a known terrorist?

The Normative Theory: Bayesian Inference, Definitions
Hypothesis H: it’s a terrorist. Evidence E: the alarm rang.
Probability: the expected ratio of occurrences to opportunities. (NB: sometimes these are estimated or subjective probabilities.)
P(E|H): the likelihood, = P(E&H)/P(H); in this case 0.99.
P(E|~H) = P(E&~H)/P(~H); in this case 0.01.
P(H|E) = P(E&H)/P(E). Can we compute P(H|E)?

Deriving Bayes’ Rule
The definition P(H|E) = P(E&H)/P(E) implies P(H|E)P(E) = P(E&H).
Similarly, P(E|H) = P(E&H)/P(H) implies P(E|H)P(H) = P(E&H).
It follows that P(H|E)P(E) = P(E|H)P(H).
Dividing both sides by P(E), we obtain P(H|E) = P(E|H)P(H)/P(E).
Final step: compute P(E), in our case the probability that the alarm goes off when a person walks by the camera. Can we derive P(E) from quantities we’ve already discussed?
P(E) = P(E|H)P(H) + P(E|~H)P(~H)
Substituting, we get Bayes’ Rule:
P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]

Importance of ‘priors’ or ‘base rates’
If terrorists and non-terrorists are equally likely, P(H|E) = 0.99.
If only one person in 100 is a terrorist, P(H|E) = 0.5.
If only one person in 10,000 is a terrorist, P(H|E) is about 0.01!
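As a check on these numbers, here is a minimal Python sketch (illustrative, not part of the lecture) that applies Bayes’ rule from the previous slide to the alarm example at each of these priors:

```python
def posterior(p_e_given_h, p_e_given_not_h, prior):
    # Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

# Alarm example from the slides: P(E|H) = 0.99, P(E|~H) = 0.01
for prior in (0.5, 0.01, 0.0001):
    print(f"prior = {prior}: P(H|E) = {posterior(0.99, 0.01, prior):.4f}")
# prints 0.9900, 0.5000, and 0.0098, matching the three cases above
```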

Lessons From this Example
Priors are crucial if we are to derive a correct inference from the data!
For items with low ‘base rates’ (priors), people often overestimate the posterior, P(H|E).
In explicit inferences, people often neglect base rates (or underweight them).

Ganong Experiment
Spoken words such as ‘establish’ or ‘malpractice’ are presented, but the final fricative sound is replaced with an ambiguous sound somewhere on a continuum from ‘ss’ to ‘sh’.
–Stimuli are blends of natural ‘ss’ and ‘sh’ sounds.
The participant must decide: is the last sound ‘ss’ or ‘sh’?

Natural ‘sh’ and ‘ss’ [figure not included in transcript]

Data from experiment performed last year: P(‘sh’) for ‘establi…’ and ‘malpracti…’ contexts [figure not included in transcript]

Can we understand these data from a Bayesian Perspective?
We can think of the different stimuli used as differing in their probability, given that the stimulus was ‘sh’ or ‘ss’.
We can think of ‘establi…’ as increasing the prior probability of ‘sh’ relative to ‘ss’.
Similarly, we can think of ‘malpracti…’ as increasing the prior probability of ‘ss’ relative to ‘sh’.
The effect is substantial, but not overwhelming:
–p(sh|establi…)/(p(sh|establi…) + p(ss|establi…)) = 0.63
–p(sh|establi…)/p(ss|establi…) = 0.63/0.37 ≈ 1.7
Is the effect too big, or is it too small?
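To make this account concrete, here is a minimal Python sketch in which context shifts only the prior, while each step on the 1-9 stimulus continuum gets its own likelihood under ‘sh’ and ‘ss’. The Gaussian likelihoods and the 0.63/0.37 context priors are illustrative stand-ins, not the lecture’s values:

```python
import math

def p_sh(stim, prior_sh):
    # Posterior P('sh' | stimulus, context). Gaussian likelihoods
    # centered near each category's end of the 1-9 continuum (illustrative).
    like_sh = math.exp(-0.5 * ((stim - 7.0) / 1.5) ** 2)  # 'sh'-like end
    like_ss = math.exp(-0.5 * ((stim - 3.0) / 1.5) ** 2)  # 'ss'-like end
    num = like_sh * prior_sh
    return num / (num + like_ss * (1.0 - prior_sh))

# 'establi...' context raises the prior on 'sh'; 'malpracti...' lowers it.
for stim in range(1, 10):
    print(stim, round(p_sh(stim, 0.63), 2), round(p_sh(stim, 0.37), 2))
```

At the ambiguous midpoint (stimulus 5) the two likelihoods cancel and the posterior equals the context prior, which is exactly the shift-at-the-boundary pattern the data show.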

P(stim|sh) & P(stim|ss) [Hypothetical!] [figure not included in transcript]

Expected form of the data from the Ganong experiment: P(‘sh’) as a function of stimulus value (1-9) [figure not included in transcript]

Data from experiment performed last year, shown again for comparison with the expected form: P(‘sh’) for ‘establi…’ and ‘malpracti…’ contexts [figure not included in transcript]

How might we combine information from different sources?
For example, visible and heard speech (the McGurk effect).
–Auditory cue: starting frequency of the syllable
–Visible cue: degree of mouth opening
Treat A and V as conditionally independent:
–P(Ea&Ev|H) = P(Ea|H)P(Ev|H) for H = ‘ba’ or ‘da’
Then we get this normative model:
P(H|Ea&Ev) = P(Ea|H)P(Ev|H)P(H) / [P(Ea|ba)P(Ev|ba)P(ba) + P(Ea|da)P(Ev|da)P(da)]
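A minimal Python sketch of that normative model, with invented likelihood numbers for a single audio-visual stimulus (the hypotheses range over ‘ba’ and ‘da’):

```python
def combine(p_ea, p_ev, prior):
    # P(H|Ea&Ev) under conditional independence:
    # score each hypothesis by P(Ea|H)P(Ev|H)P(H), then normalize.
    scores = {h: p_ea[h] * p_ev[h] * prior[h] for h in prior}
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

# Invented numbers: audio weakly favors 'ba', vision strongly favors 'da'.
p_ea  = {'ba': 0.6, 'da': 0.4}   # P(Ea|H)
p_ev  = {'ba': 0.1, 'da': 0.9}   # P(Ev|H)
prior = {'ba': 0.5, 'da': 0.5}
print(combine(p_ea, p_ev, prior))  # {'ba': ~0.14, 'da': ~0.86}
```

With these numbers the strong visual cue dominates the weak auditory one, which is the qualitative signature of the McGurk effect.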

Summary
People make unconscious inferences all the time.
The responses they make often approximately match an estimate of the posterior, assuming that context adjusts their priors toward alternatives that are consistent with the given context.
People also often combine different sources of information in a way that is consistent with Bayes’ rule, under the additional assumption that the different sources of evidence are treated as conditionally independent.
Next time we will consider how neurons in the brain could carry out such computations.