PATTERN RECOGNITION AND MACHINE LEARNING
Computational Neuroeconomics and Neuroscience
Laboratory for Social & Neural Systems Research (SNS)
Institute of Empirical Research in Economics (IEW)
Course schedule
Topic (Chapter): Presenters
- Density Estimation, Bayesian Inference (Ch. 2): Adrian Etter, Marco Piccirelli, Giuseppe Ugazio
- Linear Models for Regression (Ch. 3): Susanne Leiberg, Grit Hein
- Linear Models for Classification (Ch. 4): Friederike Meyer, Chaohui Guo
- Kernel Methods I: Gaussian Processes (Ch. 6): Kate Lomakina
- Kernel Methods II: SVM and RVM (Ch. 7): Christoph Mathys, Morteza Moazami
- Probabilistic Graphical Models (Ch. 8): Justin Chumbley
Course schedule (continued)
- Mixture Models and EM (Ch. 9): Bastiaan Oud, Tony Williams
- Approximate Inference I: Deterministic Approximations (Ch. 10): Falk Lieder
- Approximate Inference II: Stochastic Approximations (Ch. 11): Kay Brodersen
- Inference on Continuous Latent Variables: PCA, Probabilistic PCA, ICA (Ch. 12): Lars Kasper
- Sequential Data: Hidden Markov Models, Linear Dynamical Systems (Ch. 13): Chris Burke, Yosuke Morishima
CHAPTER 1: PROBABILITY, DECISION, AND INFORMATION THEORY
Sandra Iglesias
Laboratory for Social & Neural Systems Research (SNS)
Institute of Empirical Research in Economics (IEW)
Outline
- Introduction
- Probability Theory
  - Probability Rules
  - Bayes' Theorem
  - Gaussian Distribution
- Decision Theory
- Information Theory
Pattern recognition
- computer algorithms for the automatic discovery of regularities in data
- use of these regularities to take actions, such as classifying the data into different categories
- data (patterns) are classified based either on a priori knowledge or on statistical information extracted from the patterns
Machine learning
'How can we program systems to automatically learn and to improve with experience?'
- the machine is programmed to learn from an incomplete set of examples (the training set)
- the core objective of a learner is to generalize from its experience
Polynomial Curve Fitting
Fit the training data with a polynomial of order M:
y(x, w) = w_0 + w_1 x + w_2 x^2 + \dots + w_M x^M = \sum_{j=0}^{M} w_j x^j
Sum-of-Squares Error Function
E(w) = \frac{1}{2} \sum_{n=1}^{N} \{ y(x_n, w) - t_n \}^2
Plots of polynomials
[Figure: polynomial fits of increasing order M to the training data]
Over-fitting
A high-order polynomial can fit the training set perfectly while generalizing poorly to new data.
Root-Mean-Square (RMS) Error:
E_{RMS} = \sqrt{2 E(w^*) / N}
Regularization (M = 9)
Penalize large coefficient values:
\tilde{E}(w) = \frac{1}{2} \sum_{n=1}^{N} \{ y(x_n, w) - t_n \}^2 + \frac{\lambda}{2} \| w \|^2
Regularization: E_RMS vs. ln λ (M = 9)
[Figure: training and test RMS error as a function of the regularization parameter]
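A minimal numerical sketch of regularized polynomial fitting in Python/numpy (the slides contain no code; the data-generating function sin(2πx), the noise level, and the λ values are illustrative assumptions following Bishop's running example):

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative training data: noisy samples of sin(2*pi*x).
    N = 10
    x = np.linspace(0, 1, N)
    t = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=N)

    def design_matrix(x, M):
        """Polynomial design matrix with columns x^0, x^1, ..., x^M."""
        return np.vander(x, M + 1, increasing=True)

    def fit(x, t, M, lam=0.0):
        """Minimize the regularized sum-of-squares error by solving
        (Phi^T Phi + lam * I) w = Phi^T t."""
        Phi = design_matrix(x, M)
        A = Phi.T @ Phi + lam * np.eye(M + 1)
        return np.linalg.solve(A, Phi.T @ t)

    def rms_error(x, t, w, M):
        """E_RMS = sqrt(2 E(w) / N) = sqrt(mean squared residual)."""
        y = design_matrix(x, M) @ w
        return np.sqrt(np.mean((y - t) ** 2))

    # M = 9 with no regularization over-fits; a small lambda tames the coefficients.
    for lam in (0.0, np.exp(-18)):
        w = fit(x, t, M=9, lam=lam)
        print(f"lambda = {lam:.3g}: E_RMS = {rms_error(x, t, w, 9):.3f}, "
              f"max |w| = {np.abs(w).max():.3g}")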
Outline
- Introduction
- Probability Theory
- Decision Theory
- Information Theory
Probability Theory
Uncertainty arises from:
- noise on measurements
- the finite size of data sets
Probability theory provides a consistent framework for the quantification and manipulation of uncertainty.
Probability Theory
Consider two random variables, X taking values x_i (i = 1, ..., M) and Y taking values y_j (j = 1, ..., L), observed over N trials:
- n_ij: number of trials in which X = x_i and Y = y_j
- c_i: number of trials in which X = x_i, irrespective of the value of Y
- r_j: number of trials in which Y = y_j, irrespective of the value of X
Joint probability: p(X = x_i, Y = y_j) = n_{ij} / N
Marginal probability: p(X = x_i) = c_i / N
Conditional probability: p(Y = y_j \mid X = x_i) = n_{ij} / c_i
Probability Theory
Sum Rule: p(X = x_i) = \sum_{j=1}^{L} p(X = x_i, Y = y_j)
Probability Theory
Product Rule: p(X = x_i, Y = y_j) = p(Y = y_j \mid X = x_i) \, p(X = x_i)
The Rules of Probability
Sum rule: p(X) = \sum_Y p(X, Y)
Product rule: p(X, Y) = p(Y \mid X) \, p(X)
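Both rules can be checked numerically on a discrete joint distribution. A small Python/numpy sketch; the probability values are made up for illustration:

    import numpy as np

    # Illustrative joint distribution p(X, Y), X in {0,1,2}, Y in {0,1}
    # (rows index X, columns index Y).
    p_xy = np.array([[0.10, 0.20],
                     [0.15, 0.25],
                     [0.05, 0.25]])
    assert np.isclose(p_xy.sum(), 1.0)

    # Sum rule: p(X) = sum_Y p(X, Y)
    p_x = p_xy.sum(axis=1)

    # Product rule: p(X, Y) = p(Y | X) p(X)
    p_y_given_x = p_xy / p_x[:, None]
    assert np.allclose(p_y_given_x * p_x[:, None], p_xy)

    print("p(X) =", p_x)
    print("p(Y|X) =", p_y_given_x)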
Bayes' Theorem (T. Bayes, 1701–1761; P.-S. Laplace, 1749–1827)
Since p(X, Y) = p(Y, X), the product rule gives p(Y \mid X) \, p(X) = p(X \mid Y) \, p(Y), and therefore
p(Y \mid X) = \frac{p(X \mid Y) \, p(Y)}{p(X)}
Bayes' Theorem
posterior ∝ likelihood × prior
The denominator p(X) = \sum_Y p(X \mid Y) \, p(Y) normalizes the posterior.
Applied to the polynomial curve-fitting problem: p(w \mid D) ∝ p(D \mid w) \, p(w), i.e. the posterior over the coefficients w combines the data likelihood with a prior over w.
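A minimal sketch of a Bayesian update for a two-class problem in Python/numpy (the prior and likelihood values are hypothetical, chosen only to illustrate the mechanics):

    import numpy as np

    # Hypothetical prior p(C_k) and likelihood p(x | C_k) for one observed x.
    prior = np.array([0.3, 0.7])        # p(C1), p(C2)
    likelihood = np.array([0.8, 0.1])   # p(x | C1), p(x | C2)

    # Bayes' theorem: posterior = likelihood * prior / evidence,
    # where the evidence p(x) = sum_k p(x | C_k) p(C_k) normalizes the result.
    evidence = likelihood @ prior
    posterior = likelihood * prior / evidence

    print("p(C_k | x) =", posterior)    # sums to 1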
Probability Densities
For a continuous variable x, the probability density p(x) satisfies
p(x \in (a, b)) = \int_a^b p(x) \, dx, \qquad p(x) \geq 0, \qquad \int_{-\infty}^{\infty} p(x) \, dx = 1
Expectations
The expectation of f(x) is the average value of the function f(x) under a probability distribution p(x).
Expectation for a discrete distribution: E[f] = \sum_x p(x) \, f(x)
Expectation for a continuous distribution: E[f] = \int p(x) \, f(x) \, dx
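An expectation can also be approximated by averaging f over samples drawn from p(x), E[f] ≈ (1/N) Σ_n f(x_n). A short Python/numpy sketch with an illustrative distribution and f(x) = x²:

    import numpy as np

    rng = np.random.default_rng(0)

    # Discrete expectation: E[f] = sum_x p(x) f(x).
    p = np.array([0.2, 0.5, 0.3])
    x = np.array([1.0, 2.0, 3.0])
    f = lambda v: v ** 2
    exact = np.sum(p * f(x))

    # Sampling approximation: E[f] ~ (1/N) sum_n f(x_n), x_n drawn from p(x).
    samples = rng.choice(x, size=100_000, p=p)
    approx = f(samples).mean()

    print(f"exact E[f] = {exact:.4f}, sample estimate = {approx:.4f}")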
The Gaussian Distribution
\mathcal{N}(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{ -\frac{(x - \mu)^2}{2\sigma^2} \right\}
Gaussian Parameter Estimation
Likelihood function for N i.i.d. observations x = (x_1, ..., x_N):
p(x \mid \mu, \sigma^2) = \prod_{n=1}^{N} \mathcal{N}(x_n \mid \mu, \sigma^2)
Maximum (Log) Likelihood
\ln p(x \mid \mu, \sigma^2) = -\frac{1}{2\sigma^2} \sum_{n=1}^{N} (x_n - \mu)^2 - \frac{N}{2} \ln \sigma^2 - \frac{N}{2} \ln(2\pi)
Setting the derivatives to zero gives
\mu_{ML} = \frac{1}{N} \sum_{n=1}^{N} x_n, \qquad \sigma^2_{ML} = \frac{1}{N} \sum_{n=1}^{N} (x_n - \mu_{ML})^2
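The closed-form estimators above are easy to verify numerically. A Python/numpy sketch; the true parameter values are arbitrary illustrations:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative data drawn from a Gaussian with known parameters.
    mu_true, sigma_true = 1.5, 0.8
    x = rng.normal(mu_true, sigma_true, size=1000)

    # Closed-form maximum-likelihood estimates (from setting the
    # derivatives of the log likelihood to zero):
    mu_ml = x.mean()                      # (1/N) sum_n x_n
    var_ml = ((x - mu_ml) ** 2).mean()    # (1/N) sum_n (x_n - mu_ML)^2  (biased)

    print(f"mu_ML = {mu_ml:.3f}  (true {mu_true})")
    print(f"sigma2_ML = {var_ml:.3f}  (true {sigma_true**2:.3f})")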
Curve Fitting Re-visited
Assume the target t is Gaussian-distributed around the polynomial prediction, with precision β = 1/σ²:
p(t \mid x, w, \beta) = \mathcal{N}(t \mid y(x, w), \beta^{-1})
Maximum Likelihood
Maximizing the log likelihood with respect to w is equivalent to minimizing the sum-of-squares error: determine w_ML by minimizing E(w).
The precision then follows as 1/\beta_{ML} = \frac{1}{N} \sum_{n=1}^{N} \{ y(x_n, w_{ML}) - t_n \}^2
Outline
- Introduction
- Probability Theory
- Decision Theory
- Information Theory
Decision Theory
Used together with probability theory to make optimal decisions.
Input vector x, target vector t.
Regression: t is continuous. Classification: t consists of class labels.
The joint distribution p(x, t) gives a complete summary of the associated uncertainty.
Inference step: determine p(x, t) (or p(t | x)) from the data.
Decision step: for a given x, make a specific prediction for the value of t and take specific actions based on t.
Medical Diagnosis Problem
- X-ray image of a patient; decide whether the patient has cancer or not
- Input vector x: the set of pixel intensities
- Output variable t: whether cancer or not; C_1 = cancer, C_2 = no cancer
- The general inference problem is to determine p(x, C_k), which gives the most complete description of the situation
- In the end we need to decide whether to give treatment or not; decision theory helps us do this
Bayes' Decision
How do probabilities play a role in making a decision? Given input x and classes C_k, Bayes' theorem gives
p(C_k \mid x) = \frac{p(x \mid C_k) \, p(C_k)}{p(x)}
All quantities in Bayes' theorem can be obtained from the joint distribution p(x, C_k), either by marginalizing or by conditioning with respect to the appropriate variable.
Minimum Expected Loss
Example: classify medical images as 'cancer' or 'normal'. The two kinds of mistake have unequal importance: missing a cancer is far worse than a false alarm.
The loss (or cost) function is given by a loss matrix L_{kj} (truth k, decision j); utility is the negative of loss.
Minimize the average loss
E[L] = \sum_k \sum_j \int_{R_j} L_{kj} \, p(x, C_k) \, dx
where the decision regions R_j are chosen to minimize E[L].
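A minimal sketch of this decision rule in Python/numpy. The loss value 1000 follows Bishop's illustration of the cancer example and is not prescriptive; the posterior values are hypothetical:

    import numpy as np

    # Loss matrix L[k, j]: true class k (0 = cancer, 1 = normal),
    # decision j (0 = treat as cancer, 1 = treat as normal).
    L = np.array([[0, 1000],
                  [1,    0]])

    def decide(posterior, L):
        """Pick the decision j minimizing sum_k L[k, j] * p(C_k | x)."""
        expected_loss = posterior @ L
        return np.argmin(expected_loss), expected_loss

    # Even with only 10% posterior probability of cancer, the asymmetric
    # loss makes "treat as cancer" the optimal decision.
    posterior = np.array([0.1, 0.9])
    j, el = decide(posterior, L)
    print("expected losses:", el, "-> decision:", ["cancer", "normal"][j])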
Why Separate Inference and Decision?
The classification problem breaks into two separate stages:
- Inference stage: training data are used to learn a model for p(C_k | x)
- Decision stage: the posterior probabilities are used to make optimal class assignments
Three distinct approaches to solving decision problems:
1. Generative models
2. Discriminative models
3. Discriminant functions
Generative models
1. solve the inference problem of determining the class-conditional densities p(x | C_k) for each class separately, and the prior probabilities p(C_k)
2. use Bayes' theorem to determine the posterior probabilities p(C_k | x)
3. use decision theory to determine class membership
Discriminative models
1. solve the inference problem of determining the posterior class probabilities p(C_k | x) directly
2. use decision theory to determine class membership
Discriminant functions
Find a function f(x) that maps each input x directly to a class label.
e.g. in a two-class problem, f(·) is binary valued: f = 0 represents C_1, f = 1 represents C_2.
Probabilities play no role.
Decision Theory for Regression
Inference step: determine p(x, t).
Decision step: for a given x, make an optimal prediction, y(x), for t.
Loss function: for squared loss, the expected loss
E[L] = \int\int \{ y(x) - t \}^2 \, p(x, t) \, dx \, dt
is minimized by the conditional mean, y(x) = E[t \mid x].
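A quick numerical check that the conditional mean minimizes the average squared loss, sketched in Python/numpy (the data-generating model and the alternative prediction are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative conditional distribution of t at a fixed input x0:
    # t = sin(2*pi*x0) + Gaussian noise.
    x0 = 0.25
    t = np.sin(2 * np.pi * x0) + rng.normal(scale=0.3, size=100_000)

    # For squared loss, the optimal prediction is the conditional mean E[t | x].
    candidates = {"conditional mean": t.mean(),
                  "some other value": t.mean() + 0.5}
    for name, y in candidates.items():
        print(f"{name}: average squared loss = {np.mean((y - t) ** 2):.4f}")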
Outline
- Introduction
- Probability Theory
- Decision Theory
- Information Theory
Information theory
Quantification of information, based on probability theory.
Information as degree of surprise:
- highly improbable event: a lot of information
- highly probable event: less information
- certain event: no information
The information content of an event is h(x) = -\log_2 p(x); the most important quantity is the entropy.
Entropy
H[x] = -\sum_x p(x) \log_2 p(x)
Entropy is the average amount of information expected, weighted by the probability of the random variable; it quantifies the uncertainty involved when we encounter this random variable.
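A short Python/numpy sketch of the entropy formula, with illustrative distributions showing that a uniform distribution carries maximal entropy while a peaked one carries little:

    import numpy as np

    def entropy(p, base=2):
        """H[x] = -sum_x p(x) log p(x); terms with p(x) = 0 contribute 0."""
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log(p[nz])) / np.log(base)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (uniform)
    print(entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits (peaked)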
The Kullback-Leibler Divergence
KL(p \| q) = -\int p(x) \ln \frac{q(x)}{p(x)} \, dx
A non-symmetric measure of the difference between two probability distributions; also called relative entropy. It satisfies KL(p \| q) \geq 0, with equality if and only if p = q.
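The non-negativity and asymmetry are easy to see numerically for discrete distributions. A Python/numpy sketch with made-up distributions:

    import numpy as np

    def kl(p, q):
        """KL(p || q) = sum_x p(x) ln( p(x) / q(x) ), for discrete p, q."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        nz = p > 0
        return np.sum(p[nz] * np.log(p[nz] / q[nz]))

    p = [0.5, 0.4, 0.1]
    q = [0.4, 0.4, 0.2]
    print(kl(p, q), kl(q, p))  # both non-negative and, in general, unequal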
Mutual Information
Two sets of variables: x and y.
If independent: p(x, y) = p(x) \, p(y).
If not independent, the mutual information measures how far the joint distribution is from independence:
I[x, y] = KL( p(x, y) \| p(x) \, p(y) ) = -\int\int p(x, y) \ln \frac{p(x) \, p(y)}{p(x, y)} \, dx \, dy
Mutual Information
Mutual information captures mutual dependence: the information shared between x and y.
It is related to the conditional entropy:
I[x, y] = H[x] - H[x \mid y] = H[y] - H[y \mid x]
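A Python/numpy sketch computing mutual information from a discrete joint table; the two joint distributions are illustrative, one independent and one strongly coupled:

    import numpy as np

    def mutual_information(p_xy):
        """I[x, y] = KL( p(x, y) || p(x) p(y) ) for a discrete joint table."""
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0
        return np.sum(p_xy[nz] * np.log((p_xy / (p_x * p_y))[nz]))

    # Independent variables share no information ...
    indep = np.outer([0.3, 0.7], [0.6, 0.4])
    print(mutual_information(indep))       # 0.0

    # ... while a strongly coupled joint distribution does.
    coupled = np.array([[0.45, 0.05],
                        [0.05, 0.45]])
    print(mutual_information(coupled))     # > 0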
Course schedule
- Probability, Decision, and Information Theory (Ch. 1)
- Density Estimation, Bayesian Inference (Ch. 2)
- Linear Models for Regression (Ch. 3)
- Linear Models for Classification (Ch. 4)
- Kernel Methods I: Gaussian Processes (Ch. 6)
- Kernel Methods II: SVM and RVM (Ch. 7)
- Probabilistic Graphical Models (Ch. 8)
- Mixture Models and EM (Ch. 9)
- Approximate Inference I: Deterministic Approximations (Ch. 10)
- Approximate Inference II: Stochastic Approximations (Ch. 11)
- Inference on Continuous Latent Variables: PCA, Probabilistic PCA, ICA (Ch. 12)
- Sequential Data: Hidden Markov Models, Linear Dynamical Systems (Ch. 13)