Inference: Conscious and Unconscious J. McClelland SymSys 100 April 9, 2009.

A Conscious Inference

I wake up in the morning and ask myself: did it rain overnight? I look out the window and notice that the ground is wet. I consider whether I should conclude that it rained last night. I consider the fact that it is April, and rain is relatively rare. I consider that I have a sprinkler system, and ask myself whether it is set to operate on Wednesdays. I conclude that any conclusion I can reach by looking at my own back yard would be tentative; so I go outside and notice that the street and sidewalks along my block are all wet. I conclude that it did indeed rain (a little) in the wee hours of this morning.

Some Examples of “Unconscious Inferences”

Eberhardt: Seeing Black

Participants were shown very brief presentations of black faces, white faces, or nothing (20 stimuli). They then identified pictures that gradually became clearer frame by frame (500 msec/frame). Exposure to black faces led to faster recognition of crime-related objects, even though participants were not aware that they had been shown any faces.

Perception as ‘Unconscious Inference’

Helmholtz coined the term in the 19th century, drawing on ideas going back to the ancients. The term suggests a ‘little man in the head’ entertaining hypotheses, weighing evidence, and making decisions. But we can think of it differently: perhaps your brain is wired to
– Use context together with direct information
– Combine multiple sources of information
– Allow ‘hypotheses’ to exert mutual influences on each other

Rest of This Lecture

Examine a “normative theory” of inference under uncertainty:
– How can we reason from given information to conclusions when everything we know is uncertain?
– How can we combine different sources of evidence?
Then consider two experiments that produce data consistent (in a sense) with the normative theory.

Next Two Lectures

A neuro-mechanistic examination of unconscious inference:
– How neurons can produce output matching the normative theory
– How networks of neurons can work together to produce collective behavior consistent with the normative theory

An inference question

An apartment building has a surveillance camera with automatic face-recognition software. If the camera sees a known terrorist, it will trigger the alarm with 99% probability. If the camera sees a non-terrorist, it will trigger the alarm 1% of the time. Suppose somebody triggers the alarm. What is the chance that he or she is a known terrorist?
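The slide deliberately gives only the two likelihoods; the answer also depends on the prior probability that a passer-by is a known terrorist, which is not stated. A minimal sketch, using a purely hypothetical prior of 1 in 10,000, shows how strongly the base rate matters:

```python
def posterior(p_e_given_h, p_e_given_not_h, prior_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    # P(E) by the law of total probability
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Likelihoods from the slide; the prior is an assumed value for illustration
p = posterior(0.99, 0.01, 1 / 10_000)
print(f"P(terrorist | alarm) = {p:.4f}")  # ~0.0098: less than 1%
```

Even with a 99%-accurate alarm, almost every alarm is a false positive when true positives are rare, which is the point the next slides develop.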

The Normative Theory: Bayesian Inference

Definitions:
– Hypothesis (H), Evidence (E), Probability (NB: sometimes these are estimated or subjective probabilities)
– P(E|H), the likelihood: = P(E&H)/P(H). In this case: 0.99
– P(E|~H): = P(E&~H)/P(~H). In this case: 0.01
– P(H|E): = P(E&H)/P(E)
What is it that we want to compute in this case? P(H|E). Can we derive P(H|E)?

P(H|M) & P(H|F)

Deriving Bayes’ Rule

P(H|E) = P(E&H)/P(E), so P(H|E)P(E) = P(E&H).
Similarly, P(E|H) = P(E&H)/P(H), so P(E|H)P(H) = P(E&H).
Therefore P(H|E)P(E) = P(E|H)P(H).
Divide both sides by P(E): P(H|E) = P(E|H)P(H)/P(E).
Final step: compute P(E). In our case, this is the probability that the alarm goes off when a person walks by the camera. Can we derive P(E) from quantities we’ve already discussed?
P(E) = P(E|H)P(H) + P(E|~H)P(~H)
Substituting, we get Bayes’ Rule:
P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
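Each identity in the derivation can be checked numerically from a small joint distribution over H and E (the values below are illustrative, not from the slides):

```python
# Illustrative probabilities (not the alarm example)
p_h = 0.3
p_e_given_h, p_e_given_not_h = 0.9, 0.2

# Joint probability P(E & H), and P(E) by total probability
p_eh = p_e_given_h * p_h
p_e = p_eh + p_e_given_not_h * (1 - p_h)

# Bayes' rule as derived: P(H|E) = P(E|H)P(H) / P(E)
p_h_given_e = p_e_given_h * p_h / p_e

# Sanity check: P(H|E)P(E) and P(E|H)P(H) are both P(E & H)
assert abs(p_h_given_e * p_e - p_eh) < 1e-12
print(f"P(H|E) = {p_h_given_e:.3f}")
```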

‘Sh’ and ‘ss’

P(‘sh’|freq) & P(‘ss’|freq) [Hypothetical Data!]

Expected form of the data from the Ganong experiment [figure: P(‘sh’) as a function of stimulus value (1-9)]
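The expected form can be sketched by treating the acoustic stimulus as Gaussian evidence for ‘sh’ vs ‘ss’ and letting lexical context set the prior. Every parameter below (the category means, the spread, the context priors) is hypothetical, chosen only to reproduce the qualitative shape: two sigmoid curves shifted apart by context.

```python
import math

def gauss(x, mu, sigma=1.5):
    # Unnormalized Gaussian likelihood of the acoustic cue
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def p_sh(stim, prior_sh):
    like_sh = gauss(stim, mu=3.0)   # 'sh'-like end of the continuum
    like_ss = gauss(stim, mu=7.0)   # 'ss'-like end
    num = like_sh * prior_sh
    return num / (num + like_ss * (1 - prior_sh))

for stim in range(1, 10):
    # Context shifts the prior: a word frame favoring 'sh' vs one favoring 'ss'
    print(stim, round(p_sh(stim, 0.7), 2), round(p_sh(stim, 0.3), 2))
```

Both curves fall from near 1 to near 0 across the continuum, but the ‘sh’-favoring context keeps P(‘sh’) higher at every ambiguous stimulus value, which is the signature the class data are compared against.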

Class Data: P(‘sh’) for ‘establi..’ and ‘malpracti…’ contexts

How might we combine information from different sources?

For example, visible and heard speech (the McGurk effect):
– Auditory cue: starting frequency of the syllable
– Visible cue: degree of mouth opening
Treat A and V as ‘conditionally independent’: P(Ea&Ev|H) = P(Ea|H)P(Ev|H) for H = ‘ba’ and ‘da’.
Then we get this normative model:
P(H|Ea&Ev) = P(Ea|H)P(Ev|H)P(H) / [P(Ea|‘ba’)P(Ev|‘ba’)P(‘ba’) + P(Ea|‘da’)P(Ev|‘da’)P(‘da’)]
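The combination rule above can be sketched directly. The likelihood numbers here are hypothetical, picked to mimic a McGurk-style conflict where the auditory cue weakly favors ‘ba’ and the visual cue strongly favors ‘da’:

```python
def combine(p_ea, p_ev, prior):
    """Posterior over hypotheses under conditional independence:
    P(H|Ea&Ev) proportional to P(Ea|H) * P(Ev|H) * P(H)."""
    unnorm = {h: p_ea[h] * p_ev[h] * prior[h] for h in prior}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

post = combine(p_ea={'ba': 0.6, 'da': 0.4},   # audition weakly favors 'ba'
               p_ev={'ba': 0.1, 'da': 0.9},   # vision strongly favors 'da'
               prior={'ba': 0.5, 'da': 0.5})
print(post)  # the stronger (visual) evidence dominates: 'da' wins
```

Because the cues are multiplied, the more diagnostic cue dominates the posterior, which is one way to read the McGurk effect through the normative model.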