Exploring subjective probability distributions using Bayesian statistics
Tom Griffiths
Department of Psychology, Cognitive Science Program
University of California, Berkeley
Perception is optimal (Körding & Wolpert, 2004)
Cognition is not
Bayesian models of cognition
What would an “ideal cognizer” do?
– “rational analysis” (Anderson, 1990)
How can structured representations be combined with statistical inference?
– graphical models, probabilistic grammars, etc.
What knowledge guides human inferences?
– questions about priors and likelihoods
Exploring subjective distributions
– Natural statistics in cognition (joint work with Josh Tenenbaum)
– Markov chain Monte Carlo with people (joint work with Adam Sanborn)
Exploring subjective distributions
– Natural statistics in cognition (joint work with Josh Tenenbaum)
– Markov chain Monte Carlo with people (joint work with Adam Sanborn)
Natural statistics: images of natural scenes; prior distributions p(x)
Predicting the future: how often is Google News updated?
t = time since last update
t_total = time between updates
What should we guess for t_total given t?
Bayesian inference
p(t_total | t) ∝ p(t | t_total) p(t_total)
posterior probability ∝ likelihood × prior
Bayesian inference
p(t_total | t) ∝ p(t | t_total) p(t_total)
Assuming t is a random sample from the interval (0 < t < t_total), the likelihood is p(t | t_total) = 1/t_total, so
p(t_total | t) ∝ p(t_total) / t_total
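To make the computation concrete, here is a minimal numerical sketch in Python (an illustration, not code from the talk): it evaluates the posterior over t_total on a grid and reports the posterior median as the prediction. The grid bounds, the power-law exponent of 1.5, and the Gaussian prior parameters are illustrative assumptions.

```python
import numpy as np

def predict_t_total(t, prior, grid):
    """Posterior-median prediction of t_total given the elapsed time t.

    prior: function returning the (unnormalized) prior density p(t_total)
    grid:  array of candidate t_total values
    """
    # Random-sampling likelihood: p(t | t_total) = 1/t_total for t_total > t, else 0
    likelihood = np.where(grid > t, 1.0 / grid, 0.0)
    posterior = likelihood * prior(grid)
    posterior /= posterior.sum()
    # Report the posterior median (the optimal guess under absolute-error loss)
    return grid[np.searchsorted(np.cumsum(posterior), 0.5)]

grid = np.linspace(1.0, 1000.0, 100_000)
power_law = lambda x: x ** -1.5                               # illustrative power-law prior
gaussian = lambda x: np.exp(-0.5 * ((x - 75.0) / 15.0) ** 2)  # illustrative Gaussian prior

print(predict_t_total(30.0, power_law, grid))
print(predict_t_total(30.0, gaussian, grid))
```

In this sketch the power-law prior yields a prediction that is roughly a constant multiple of t, while the Gaussian prior keeps the prediction near the prior mean; this is the kind of qualitative difference the prediction experiments probe.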
The effects of priors
Evaluating human predictions
Different domains with different priors:
– a movie has made $60 million [power-law]
– your friend quotes from line 17 of a poem [power-law]
– you meet a 78-year-old man [Gaussian]
– a movie has been running for 55 minutes [Gaussian]
– a U.S. congressman has served for 11 years [Erlang]
Prior distributions were derived from actual data.
Five values of t were used for each domain, and people predicted t_total.
[Figure: people's predictions compared with the parametric prior, the empirical prior, and Gott's rule]
Exploring subjective distributions
– Natural statistics in cognition (joint work with Josh Tenenbaum)
– Markov chain Monte Carlo with people (joint work with Adam Sanborn)
Markov chain Monte Carlo
Metropolis-Hastings algorithm (Metropolis et al., 1953; Hastings, 1970)
Step 1: propose a new state from a proposal distribution, assumed here to be symmetric: Q(x^(t+1) | x^(t)) = Q(x^(t) | x^(t+1))
Step 2: decide whether to accept the proposed state x* over the current state x, with probability given by an acceptance function, for example
– Metropolis acceptance function: A(x*; x) = min(1, p(x*) / p(x))
– Barker acceptance function: A(x*; x) = p(x*) / (p(x*) + p(x))
where p is the target distribution.
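As an illustration (not code from the talk), here is a minimal Metropolis-Hastings sampler in Python with a symmetric Gaussian random-walk proposal, supporting either acceptance rule; the target density, proposal width, and step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(target, x0, n_steps, proposal_sd=1.0, rule="metropolis"):
    """Sample from an unnormalized 1-D target density using a symmetric Gaussian proposal."""
    x = x0
    samples = []
    for _ in range(n_steps):
        x_prop = x + proposal_sd * rng.standard_normal()  # Step 1: symmetric proposal
        p_new, p_old = target(x_prop), target(x)
        if rule == "metropolis":
            accept_prob = min(1.0, p_new / p_old)         # Metropolis acceptance function
        else:
            accept_prob = p_new / (p_new + p_old)         # Barker acceptance function
        if rng.random() < accept_prob:                    # Step 2: accept or reject
            x = x_prop
        samples.append(x)
    return np.array(samples)

# Example: both rules leave a standard normal target invariant
target = lambda x: np.exp(-0.5 * x ** 2)
samples = metropolis_hastings(target, x0=0.0, n_steps=5000, rule="barker")
print(samples.mean(), samples.std())  # roughly 0 and 1
```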
A task: ask subjects which of two alternatives comes from a target category (“Which animal is a frog?”)
A Bayesian analysis of the task. Assume that one of the two alternatives comes from the category distribution p(x|c), that each alternative is equally likely a priori to be the category member, and that subjects respond based on the posterior probability that each alternative belongs to the category.
Response probabilities: if people probability match to the posterior, the probability of choosing each alternative is equivalent to the Barker acceptance function for the target distribution p(x|c).
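Spelling out the equivalence (a sketch, assuming the two alternatives are a priori equally likely to be the category member, with x the current stimulus and x* the proposed one):

```latex
P(\text{choose } x^{*} \mid x^{*}, x, c)
  \;=\; \frac{p(x^{*} \mid c)}{p(x^{*} \mid c) + p(x \mid c)}
  \;=\; A_{\text{Barker}}(x^{*}; x)
  \quad \text{for the target distribution } p(x \mid c).
```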
Collecting the samples: “Which is the frog?” (Trial 1, Trial 2, Trial 3, ...)
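Here is a small end-to-end simulation sketch (my illustration, not the talk's code) of how the trial sequence yields samples: a simulated subject probability matches to a hypothetical one-dimensional category density, and the sequence of chosen stimuli forms a Markov chain whose stationary distribution is that density. The category density, proposal width, and trial count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical subjective category density p(x | c), e.g. over a single stick-figure parameter
category_density = lambda x: np.exp(-0.5 * ((x - 5.0) / 1.0) ** 2)

def subject_picks_new(x_current, x_proposed):
    """Simulated subject probability matches to the posterior over which stimulus is in the category."""
    p_new, p_old = category_density(x_proposed), category_density(x_current)
    return rng.random() < p_new / (p_new + p_old)  # Barker acceptance probability

def mcmc_with_people(x0, n_trials, proposal_sd=0.5):
    x = x0
    chain = []
    for _ in range(n_trials):
        x_prop = x + proposal_sd * rng.standard_normal()  # the new alternative shown on this trial
        if subject_picks_new(x, x_prop):                  # "Which is the frog?"
            x = x_prop                                    # the chosen stimulus carries over to the next trial
        chain.append(x)
    return np.array(chain)

chain = mcmc_with_people(x0=0.0, n_trials=10_000)
print(chain[1000:].mean(), chain[1000:].std())  # approaches the category mean and spread (5.0, 1.0)
```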
Sampling from natural categories
Examined distributions for four natural categories: giraffes, horses, cats, and dogs.
Stimuli were nine-parameter stick figures (Olman & Kersten, 2004).
Choice task
Samples from Subject 3 (projected onto a plane)
[Figure: mean animals by subject (S1–S8) for giraffe, horse, cat, and dog]
Marginal densities (aggregated across subjects)
– Giraffes are distinguished by neck length, body height, and body tilt.
– Horses are like giraffes, but with shorter bodies and nearly uniform necks.
– Cats have longer tails than dogs.
Markov chain Monte Carlo with people
Probabilistic models can guide the design of experiments that measure psychological variables.
Markov chain Monte Carlo can be used to sample from subjective probability distributions:
– category distributions (Metropolis-Hastings)
– prior distributions (Gibbs sampling)
The approach is effective for exploring large stimulus spaces in which the distribution of interest occupies only a small region.
Conclusion
Probabilistic models give us a way to explore the knowledge that guides people's inferences.
A basic problem for both cognition and perception is identifying subjective probability distributions.
Two strategies:
– natural statistics
– Markov chain Monte Carlo with people