Introduction to probabilistic models of cognition
Josh Tenenbaum, MIT

Why probabilistic models of cognition?

The fundamental problem of cognition
How does the mind get so much out of so little? How do we make inferences, generalizations, models, theories and decisions about the world from impoverished (sparse, incomplete, noisy) data? “The problem of induction”

Visual perception (Marr)

Goal of visual perception is to recover world structure from visual images. Why the problem is hard: many world structures can produce the same visual input. Illusions reveal the visual system’s implicit knowledge of the physical world and the processes of image formation.

Ambiguity in visual perception (Shepard)
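In Bayesian terms, ambiguity is resolved by prior knowledge: when several world structures are equally consistent with the image, P(world | image) ∝ P(image | world) × P(world), and the prior carries the visual system’s implicit assumptions. A minimal sketch with invented hypotheses and numbers (echoing the classic “light comes from above” assumption), not code from the lecture:

```python
# Two world hypotheses that render to the same shading pattern (numbers invented).
# The likelihoods tie, so the prior -- implicit knowledge about the world --
# determines the percept: P(world | image) ∝ P(image | world) · P(world).

prior = {"convex bump, lit from above": 0.9,
         "concave dent, lit from below": 0.1}
likelihood = {"convex bump, lit from above": 1.0,   # both hypotheses fully
              "concave dent, lit from below": 1.0}  # explain the observed image

z = sum(prior[w] * likelihood[w] for w in prior)
for w in prior:
    print(w, "->", prior[w] * likelihood[w] / z)
```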

Learning concepts from examples
“horse”

“tufa”
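How can one or a few labeled “tufa” examples support confident generalization? Bayesian concept learning offers one answer via the “size principle”: if examples are sampled from a concept’s extension, smaller consistent hypotheses receive higher likelihood. A minimal sketch; the hypotheses, extensions, and item names below are invented for illustration:

```python
# Minimal Bayesian concept learning with the size principle (illustrative only).
# Hypotheses are candidate extensions of a concept; examples are assumed to be
# sampled uniformly from the true extension, so P(examples | h) = (1/|h|)^n.

hypotheses = {
    "tufas (one species)": {"tufa1", "tufa2", "tufa3"},
    "all trees":           {"tufa1", "tufa2", "tufa3", "oak", "pine", "palm"},
    "all plants":          {"tufa1", "tufa2", "tufa3", "oak", "pine", "palm",
                            "fern", "moss", "rose"},
}
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}  # uniform prior

def posterior(examples):
    scores = {}
    for h, extension in hypotheses.items():
        if all(x in extension for x in examples):          # consistency check
            scores[h] = prior[h] * (1.0 / len(extension)) ** len(examples)
        else:
            scores[h] = 0.0
    z = sum(scores.values())
    return {h: s / z for h, s in scores.items()}

# One example is already diagnostic; three examples sharpen the posterior further.
print(posterior(["tufa1"]))
print(posterior(["tufa1", "tufa2", "tufa3"]))
```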

Causal inference
Don’t press this button!
Does this drug help you get over a cold faster? [Slide shows a contingency table of drug vs. no-drug groups against colds lasting one week vs. longer; the counts did not survive extraction.]
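The drug question can be posed as Bayesian model comparison: does cold duration depend on the drug (two recovery rates) or not (one shared rate)? A minimal sketch with made-up counts, since the slide’s actual numbers did not survive extraction, using Beta(1,1) priors on the rates:

```python
from math import lgamma, exp

def log_beta_binom(k, n):
    # Log marginal likelihood of k successes in n trials under a Beta(1,1) prior:
    # log B(k+1, n-k+1). Binomial coefficients cancel in the Bayes factor below.
    return lgamma(k + 1) + lgamma(n - k + 1) - lgamma(n + 2)

# Hypothetical counts: "recovered within a week" out of group total.
drug_k, drug_n = 12, 20        # with drug (invented for illustration)
ctrl_k, ctrl_n = 6, 20         # without drug (invented for illustration)

# H0: one shared recovery rate; H1: separate rates for drug and no-drug groups.
log_h0 = log_beta_binom(drug_k + ctrl_k, drug_n + ctrl_n)
log_h1 = log_beta_binom(drug_k, drug_n) + log_beta_binom(ctrl_k, ctrl_n)

print("Bayes factor in favor of a causal effect:", exp(log_h1 - log_h0))
```

A Bayes factor above 1 favors the causal model; with these invented counts the evidence is modest, matching the intuition that small samples only weakly license causal conclusions.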

Language
Parsing:
–Two cars were reported stolen by the Groveton police yesterday.
–The judge sentenced the killer to die in the electric chair for the second time.
–No one was injured in the blast, which was attributed to a buildup of gas by one town official.
–One witness told the commissioners that she had seen sexual intercourse taking place between two parked cars in front of her house.
(Pinker)

Language
Parsing
Acquisition:
–Learning the English past tense (rule vs. exceptions)
–Learning the Spanish or Arabic past tense (multiple rules plus exceptions)
–Learning verb argument structure (“give” vs. “donate”)
–Learning to be bilingual.

Intuitive theories
Physics
–Parsing: Inferring support relations, or the causal history and properties of an object.
–Acquisition: Learning about gravity and support.
  Gravity -- what’s that?
  Contact is sufficient
  Mass distribution and location is important
Psychology
–Parsing: Inferring beliefs, desires, plans.
–Acquisition: Learning about agents.
  Recognizing intentionality, but without mental state reasoning
  Reasoning about beliefs and desires
  Reasoning about plans, rationality and “other minds”.

The big questions
1. How does knowledge guide inductive learning, inference, and decision-making from sparse, noisy or ambiguous data?
2. What are the forms and contents of our knowledge of the world?
3. How is that knowledge itself learned from experience?
4. When faced with surprising data, when do we assimilate the data to our current model versus accommodate our model to the new data?
5. How can accurate inductive inferences be made efficiently, even in the presence of complex hypothesis spaces?

A toolkit for answering these questions
1. Bayesian inference in probabilistic generative models
2. Probabilities defined over structured representations: graphs, grammars, predicate logic, schemas
3. Hierarchical probabilistic models, with inference at all levels of abstraction
4. Adaptive nonparametric or “infinite” models, which can grow in complexity or change form in response to the observed data.
5. Approximate methods of learning and inference, such as belief propagation, expectation-maximization (EM), Markov chain Monte Carlo (MCMC), and sequential Monte Carlo (particle filtering). (A minimal MCMC sketch follows this list.)
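As promised above, a minimal sketch of item 5’s MCMC: a random-walk Metropolis sampler for the posterior over a coin’s bias, with invented data and a uniform prior. This is illustrative only, not code from the course:

```python
import math
import random

# Metropolis-Hastings sampling of P(theta | data) for a coin's bias theta,
# with a uniform prior on [0, 1] and a Bernoulli likelihood.

data = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]        # invented coin flips (1 = heads)
heads, tails = sum(data), len(data) - sum(data)

def log_posterior(theta):
    if not 0.0 < theta < 1.0:
        return float("-inf")                  # zero prior density outside [0, 1]
    return heads * math.log(theta) + tails * math.log(1.0 - theta)

samples, theta = [], 0.5
for step in range(20000):
    proposal = theta + random.gauss(0.0, 0.1)            # random-walk proposal
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal                                  # accept; else keep theta
    if step >= 2000:                                      # discard burn-in
        samples.append(theta)

# The posterior mean should be near the Beta(heads+1, tails+1) mean, 8/12 ≈ 0.67.
print("posterior mean ≈", sum(samples) / len(samples))
```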

Phrase structure
Grammar G → phrase structure S → utterance U
Top-down (generation): P(S | G), P(U | S)
Bottom-up (parsing): P(S | U, G) ∝ P(U | S) × P(S | G)
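A toy numerical sketch of this equation (grammar, rules, and probabilities all invented for illustration): two parses of a prepositional-phrase ambiguity, where each parse yields the utterance deterministically, so P(U | S) = 1 and the posterior reduces to the normalized parse priors P(S | G):

```python
# Toy posterior over parses: P(S | U, G) ∝ P(U | S) · P(S | G).
# Rule probabilities are invented for illustration.
rule_prob = {
    "S -> NP VP": 1.0,
    "VP -> V NP": 0.7,          # "saw (the man with the telescope)"
    "VP -> V NP PP": 0.3,       # "saw (the man) (with the telescope)"
    "NP -> NP PP": 0.2,
    # lexical and remaining rules collapsed into the numbers above
}

# Each candidate parse is a bag of rules; both yield the same utterance,
# so P(U | S) = 1 and the posterior is the normalized product of rule priors.
parses = {
    "attach PP to NP": ["S -> NP VP", "VP -> V NP", "NP -> NP PP"],
    "attach PP to VP": ["S -> NP VP", "VP -> V NP PP"],
}

def prior(parse):
    p = 1.0
    for rule in parses[parse]:
        p *= rule_prob[rule]
    return p

z = sum(prior(s) for s in parses)
for s in parses:
    print(s, "posterior =", prior(s) / z)
```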

“Universal Grammar” → Grammar → Phrase structure → Utterance → Speech signal
Hierarchical phrase structure grammars (e.g., CFG, HPSG, TAG)
P(grammar | UG), P(phrase structure | grammar), P(utterance | phrase structure), P(speech | utterance)
The joint distribution factorizes down the hierarchy: P(speech, utterance, phrase structure, grammar | UG) = P(speech | utterance) × P(utterance | phrase structure) × P(phrase structure | grammar) × P(grammar | UG)

Vision as probabilistic parsing (Han and Zhu, 2006)

Learning word meanings
Hierarchical model: Principles → Structure → Data
Principles: whole-object principle, shape bias, taxonomic principle, contrast principle, basic-level bias

Causal learning and reasoning
Hierarchical model: Principles → Structure → Data

Goal-directed action (production and comprehension) (Wolpert et al., 2003)

Why probabilistic models of cognition?
–A framework for understanding how the mind can solve fundamental problems of induction.
–Strong, principled quantitative models of human cognition.
–Tools for studying people’s implicit knowledge of the world.
–Beyond classic limiting dichotomies: “structure vs. statistics”, “nature vs. nurture”, “domain-general vs. domain-specific”.
–A unifying mathematical language for all of the cognitive sciences: AI, machine learning and statistics, psychology, neuroscience, philosophy, linguistics….
–A bridge between engineering and “reverse-engineering”.
Why now? Much recent progress in computational resources, theoretical tools, and interdisciplinary connections.

Summer school plan
Weekly plan:
–Week 1: Basic probabilistic models. Applications to visual perception, categorization, causal learning.
–Week 2: More advanced probabilistic models (grammars, logic, MDPs). Applications to reasoning, language, scene understanding, decision-making, neuroscience.
–Week 3: Further applications to memory, motor control, sensory integration, unsupervised learning and cognitive development. Symposia on open challenges and student research.

Summer school plan
Daily plan:
–5 (or 6) lectures per day.
–Starting Wednesday, break-out sessions after lunch, for discussion with speakers.
–Evening tutorials: Matlab, probability basics, Bayes net toolbox (for Matlab), SamIam, BUGS, Markov logic networks and Alchemy.
–Psych computer lab (available afternoons).
–Self-organizing activities: sign up for the calendar on 30boxes.com: address: Password: “ipam07”

Background poll
–Bayes’ rule
–Conjugate prior
–Bayesian network
–Plate notation for graphical models
–Mixture model
–Hidden Markov model
–Expectation-maximization (EM) algorithm
–Dynamic programming
–Gaussian processes
–Dirichlet processes
–First-order logic
–(Stochastic) context-free grammar
–Probabilistic relational models
–MCMC
–Particle filtering
–Partially observable Markov decision process
–Serotonin

Poll for tonight
–Matlab tutorial?
–Probability basics?