Bayesian Learning By Porchelvi Vijayakumar

Cognitive Science
Current Problem: How do children learn, and how do they get it right?

Connectionists and Associationists
Associationism maintains that all knowledge is represented in terms of associations between ideas, and that complex ideas are built up from combinations of more primitive ideas, which, in accordance with empiricist philosophy, are ultimately derived from the senses.
Connectionism is a more powerful associationist theory than its predecessors (Shanks, 1995); it seeks to model cognitive processes in a way that broadly reflects the computational style of the brain.

Developmental Scientists
Developmental scientists believe that behavior involves both abstract representation and learning, i.e., inductive learning.

How do we reason?
- Pure logic
- Reasoning with beliefs (probability)
Associationists and connectionists vs. developmental cognitive scientists.
Taken from: l2004

Pure Logic
If A is TRUE, then B is also TRUE.
A: My car isn’t where I left it.
B: My car was stolen.
Taken from: l2004

Introduction to Bayesian Networks
Basics: probability, joint probability, conditional probability, Bayes’ law, Markov condition.

Conditional Probability and Independence
Conditional probability: P(E|F) = P(E AND F) / P(F).
Independence: P(E|F) = P(E); equivalently, P(E AND F) = P(E) * P(F) when E and F are independent.
Conditional independence: P(E | F AND G) = P(E | G).
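To make the definitions concrete, here is a minimal Python sketch; the joint-distribution numbers are invented for illustration.

```python
# A minimal sketch (toy numbers, not from the slides): checking the
# definitions above numerically on a small joint distribution.

# Joint distribution P(E, F) over two binary events E and F.
joint = {
    (True, True): 0.12,
    (True, False): 0.18,
    (False, True): 0.28,
    (False, False): 0.42,
}

p_E = sum(v for (e, f), v in joint.items() if e)   # P(E) = 0.30
p_F = sum(v for (e, f), v in joint.items() if f)   # P(F) = 0.40
p_EF = joint[(True, True)]                         # P(E AND F) = 0.12

# Conditional probability: P(E|F) = P(E AND F) / P(F).
print(round(p_EF / p_F, 10))   # 0.3
# Here P(E|F) equals P(E), so E and F are independent...
print(round(p_E, 10))          # 0.3
# ...and, equivalently, P(E AND F) = P(E) * P(F).
print(round(p_E * p_F, 10))    # 0.12
```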

Bayes’ Theorem
Inference: P(E|F) = P(F|E) * P(E) / P(F),
where P(E|F) is the posterior probability, P(F|E) the likelihood, P(E) the prior probability, and P(F) the marginal probability.
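Applied to the earlier car example, a small Python sketch (all three input probabilities are invented) shows how the four terms interact:

```python
# A sketch of Bayes' theorem with invented numbers, reusing the earlier
# example: E = "my car was stolen", F = "my car isn't where I left it".

p_E = 0.001            # prior P(E): cars are rarely stolen
p_F_given_E = 0.99     # likelihood P(F|E): a stolen car is almost surely gone
p_F_given_notE = 0.01  # P(F | not E): towed, forgot where I parked, ...

# Marginal P(F) by the law of total probability.
p_F = p_F_given_E * p_E + p_F_given_notE * (1 - p_E)

# Posterior P(E|F) = P(F|E) * P(E) / P(F).
p_E_given_F = p_F_given_E * p_E / p_F
print(round(p_E_given_F, 3))   # ~0.09: the missing car makes theft
                               # possible, but far from certain
```

The low prior dominates here: even a strong likelihood only raises the posterior to about 9%.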

Bayesian Network
A Bayesian net is a DAG (directed acyclic graph) that satisfies the Markov condition.
Nodes: variables in the causal system. Edges: direct influence.
[Figure: a five-node example network with nodes H, B, L, F, C and conditional probabilities p(h1), p(b1|h1), p(l1|h1), p(f1|b1,l1), p(c1|l1). From: Learning Bayesian Networks by Richard E. Neapolitan.]
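The conditional probabilities listed in the figure imply that the joint distribution factors as p(h) p(b|h) p(l|h) p(f|b,l) p(c|l). A minimal Python sketch of that factorization follows; the CPT numbers are invented for illustration, not Neapolitan's.

```python
# A minimal sketch of the factorization
#   P(h, b, l, f, c) = p(h) p(b|h) p(l|h) p(f|b,l) p(c|l)
# implied by the Markov condition for the network above.
from itertools import product

p_H = 0.2                                          # P(H=1)
p_B = {True: 0.25, False: 0.05}                    # P(B=1 | h)
p_L = {True: 0.003, False: 0.0005}                 # P(L=1 | h)
p_F = {(True, True): 0.75, (True, False): 0.10,    # P(F=1 | b, l)
       (False, True): 0.50, (False, False): 0.05}
p_C = {True: 0.60, False: 0.02}                    # P(C=1 | l)

def bern(p_true, value):
    """Probability of a binary outcome given P(value=True) = p_true."""
    return p_true if value else 1.0 - p_true

def joint(h, b, l, f, c):
    # One factor per node, each conditioned only on its parents.
    return (bern(p_H, h) * bern(p_B[h], b) * bern(p_L[h], l)
            * bern(p_F[(b, l)], f) * bern(p_C[l], c))

# The factored joint sums to 1 over all 2**5 assignments.
total = sum(joint(*v) for v in product([True, False], repeat=5))
print(round(total, 10))   # 1.0
```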

Bayesian Network: Markov Condition
Markov condition: each variable X ∈ V is conditionally independent of the set of all its nondescendants, given the set of all its parents.

Patterns in a Causal Chain
A → B → C → D is Markov equivalent to A ← B ← C ← D: the two chains have the same pattern of dependence and conditional independence.
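As a numerical check (a sketch with toy numbers, not from the slides), the same joint distribution can be factored in either direction along the chain:

```python
# A minimal sketch (toy numbers): the chain A -> B -> C -> D and its
# reversal A <- B <- C <- D encode exactly the same joint distribution,
# which is why they are Markov equivalent.
from itertools import product

vals = [True, False]
bern = lambda p, v: p if v else 1 - p

pA = 0.3
pB = {True: 0.8, False: 0.2}   # P(B=1 | a)
pC = {True: 0.6, False: 0.1}   # P(C=1 | b)
pD = {True: 0.9, False: 0.4}   # P(D=1 | c)

# Joint under the forward chain: P(a) P(b|a) P(c|b) P(d|c).
fwd = {(a, b, c, d): bern(pA, a) * bern(pB[a], b) * bern(pC[b], c) * bern(pD[c], d)
       for a, b, c, d in product(vals, repeat=4)}

def marg(keep):
    """Marginal of the forward joint over the variable positions in keep."""
    out = {}
    for assign, p in fwd.items():
        key = tuple(assign[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

pd_, pc_, pb_ = marg([3]), marg([2]), marg([1])
pcd, pbc, pab = marg([2, 3]), marg([1, 2]), marg([0, 1])

# Rebuild the joint in the reverse direction: P(d) P(c|d) P(b|c) P(a|b).
bwd = {(a, b, c, d): pd_[(d,)] * (pcd[(c, d)] / pd_[(d,)])
                     * (pbc[(b, c)] / pc_[(c,)]) * (pab[(a, b)] / pb_[(b,)])
       for a, b, c, d in product(vals, repeat=4)}

print(all(abs(fwd[k] - bwd[k]) < 1e-12 for k in fwd))   # True
```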

Learning Causal Bayesian Networks
A causal Bayesian network:
- provides an account of inductive inference;
- defines a joint probability distribution, thereby specifying how likely any joint setting of the variables is;
- can be used to predict the variables when the graph structure is known (see the enumeration sketch below);
- can be used to learn the graph structure when it is unknown, by observing which settings of the variables tend to occur together more or less often.
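For the prediction case, here is a minimal sketch (toy numbers, a hypothetical three-node chain A → B → C) of inference by enumeration: P(C=1 | A=1) is computed by summing the joint over the unobserved variable B.

```python
# A minimal sketch (toy numbers, hypothetical chain A -> B -> C):
# prediction from a known structure by enumeration.
from itertools import product

pA = 0.4
pB = {True: 0.9, False: 0.2}   # P(B=1 | a)
pC = {True: 0.7, False: 0.1}   # P(C=1 | b)
bern = lambda p, v: p if v else 1 - p

def joint(a, b, c):
    # P(a, b, c) = P(a) P(b|a) P(c|b), by the Markov condition.
    return bern(pA, a) * bern(pB[a], b) * bern(pC[b], c)

# Sum out the unobserved variable B.
num = sum(joint(True, b, True) for b in [True, False])                      # P(A=1, C=1)
den = sum(joint(True, b, c) for b, c in product([True, False], repeat=2))  # P(A=1)
print(round(num / den, 10))   # 0.64 = P(C=1 | A=1)
```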

Intervention and the Mutilated Graph
- An intervention on a particular variable X changes the probabilistic dependencies over all the variables in the network.
- Two networks that would otherwise imply identical patterns of probabilistic dependence may become distinguishable under intervention.
- The mutilated graph is the graph in which all incoming arrows to X are cut.

Intervention and the Mutilated Graph
Chain 1: A → B → C → D (pattern before intervention); cutting the intervened variable’s incoming arrow gives its mutilated graph.
Chain 2: A ← B ← C ← D (pattern before intervention); cutting the same variable’s incoming arrow gives a different mutilated graph.
Thus two chains that had the same pattern of dependencies differ from each other after intervention. This is constraint-based learning.
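The same point can be made with a deliberately smaller example. The sketch below (my own two-variable simplification, with toy numbers) contrasts A → B with A ← B: observationally identical, but distinguishable once B is set by intervention.

```python
# A minimal two-variable sketch (toy numbers): A -> B and A <- B are
# Markov equivalent, yet an intervention do(B=1) tells them apart,
# because intervening cuts all arrows INTO B (the mutilated graph).

# Structure 1: A -> B.
pA1 = 0.5
pB_given_A = {True: 0.8, False: 0.2}   # P(B=1 | a)
# Structure 2: A <- B, tuned to encode the SAME observational joint.
pB2 = 0.5
pA_given_B = {True: 0.8, False: 0.2}   # P(A=1 | b)

# Observationally identical: same P(A=1, B=1) under both structures.
print(pA1 * pB_given_A[True], pB2 * pA_given_B[True])   # 0.4 0.4

# Intervene: do(B=1).
# Structure 1: the arrow A -> B is cut; A keeps its own distribution.
p_A_do1 = pA1                    # P(A=1 | do(B=1)) = 0.5
# Structure 2: B has no incoming arrows, so nothing is cut and A still
# depends on B.
p_A_do2 = pA_given_B[True]       # P(A=1 | do(B=1)) = 0.8

print(p_A_do1, p_A_do2)   # 0.5 0.8 -- now distinguishable
```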

Intervention and the Mutilated Graph
Given the observed patterns of independence and conditional independence among a set of variables, perhaps under different conditions of intervention, these algorithms can work backward to figure out the set of causal structures compatible with the constraints of the evidence.

Bayesian Learning
- Humans are inclined to judge one causal structure more likely than another.
- This degree of belief may be strongly influenced by prior expectations about which causal structures are more likely.
- Example: people’s knowledge of the causal mechanisms at work.

Bayesian Learning
H: a space of possible causal models.
d: some data, i.e., observations of the states of one or more variables in the causal system for different cases, individuals, or situations.
P(h|d): the posterior probability distribution over models,
P(h|d) = P(d|h) * P(h) / Σ_{h' ∈ H} P(d|h') * P(h').
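A minimal sketch of this computation, with an invented hypothesis space of two fixed-parameter models and a handful of invented observations:

```python
# A minimal sketch of P(h|d) = P(d|h) P(h) / sum_{h' in H} P(d|h') P(h').
# Hypotheses, parameters, and data are all invented for illustration.

# Data d: observed (a, b) states for eight independent cases.
data = [(1, 1), (1, 1), (0, 0), (1, 1), (0, 0), (1, 0), (0, 0), (1, 1)]

# h1: A causes B. P(A=1) = 0.5; P(B=1|A=1) = 0.8, P(B=1|A=0) = 0.1.
def lik_h1(a, b):
    pa = 0.5
    pb = 0.8 if a else 0.1
    return (pa if a else 1 - pa) * (pb if b else 1 - pb)

# h2: A and B are unrelated. P(A=1) = P(B=1) = 0.5.
def lik_h2(a, b):
    return 0.5 * 0.5

prior = {"h1": 0.5, "h2": 0.5}
models = {"h1": lik_h1, "h2": lik_h2}

def likelihood(lik):
    out = 1.0
    for a, b in data:            # cases are assumed independent
        out *= lik(a, b)
    return out

unnorm = {h: likelihood(lik) * prior[h] for h, lik in models.items()}
z = sum(unnorm.values())         # the sum over h' in H
posterior = {h: round(p / z, 3) for h, p in unnorm.items()}
print(posterior)   # {'h1': 0.939, 'h2': 0.061}: the data favor "A causes B"
```

With equal priors, the posterior is driven entirely by how well each model predicts the co-occurrence pattern in the data; an informative prior would shift it further.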

Conclusion
- Posterior probabilities: the probability of any event given any evidence.
- Most likely explanation: the scenario that best explains the evidence.
- Rational decision making: maximize expected utility; value of information.
- Effect of intervention: causal analysis.
Bayesian models may traditionally have been limited by a focus on learning representations at only a single level of abstraction.

References
- Neapolitan, Richard E. Learning Bayesian Networks.
- Gopnik, A., and Tenenbaum, J. B. Bayesian Networks, Bayesian Learning and Cognitive Development. Developmental Science.