Slide 1: Pattern Recognition and Machine Learning, Chapter 1: Introduction
Slide 2: Example: Handwritten Digit Recognition
Slide 3: Polynomial Curve Fitting
Slide 4: Sum-of-Squares Error Function
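In PRML, the error minimized when fitting the polynomial y(x, \mathbf{w}) = \sum_{j=0}^{M} w_j x^j is the sum-of-squares error

E(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \left\{ y(x_n, \mathbf{w}) - t_n \right\}^2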
Slide 5: 0th Order Polynomial
Slide 6: 1st Order Polynomial
Slide 7: 3rd Order Polynomial
Slide 8: 9th Order Polynomial
Slide 9: Over-fitting; Root-Mean-Square (RMS) Error
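The RMS error used in PRML to compare training and test performance is

E_{\mathrm{RMS}} = \sqrt{ 2 E(\mathbf{w}^{\star}) / N }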
Slide 10: Polynomial Coefficients
Slide 11: Data Set Size: 9th Order Polynomial (N = 15)
Slide 12: Data Set Size: 9th Order Polynomial (N = 100)
Slide 13: Regularization: penalize large coefficient values
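The regularized error function from PRML, with \lambda controlling the penalty on the coefficient magnitudes, is

\tilde{E}(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \left\{ y(x_n, \mathbf{w}) - t_n \right\}^2 + \frac{\lambda}{2} \| \mathbf{w} \|^2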
Slide 14: Regularization: ln λ = −18
Slide 15: Regularization: ln λ = 0
Slide 16: Regularization: E_RMS vs. ln λ
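As a rough illustration of slides 3–16 (not part of the original deck), here is a minimal NumPy sketch of fitting a 9th-order polynomial with and without an L2 penalty; the function names, the sin(2πx) toy data, the noise level, and the λ value are illustrative assumptions chosen to mirror the book's figures.

import numpy as np

def fit_polynomial(x, t, M, lam=0.0):
    # Design matrix with columns x^0, x^1, ..., x^M
    Phi = np.vander(x, M + 1, increasing=True)
    # Regularized least squares: (lam*I + Phi^T Phi) w = Phi^T t
    return np.linalg.solve(lam * np.eye(M + 1) + Phi.T @ Phi, Phi.T @ t)

def rms_error(x, t, w):
    Phi = np.vander(x, len(w), increasing=True)
    return np.sqrt(np.mean((Phi @ w - t) ** 2))

# Toy data in the style of the slides: sin(2*pi*x) plus Gaussian noise
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 10)
t_train = np.sin(2 * np.pi * x_train) + rng.normal(scale=0.3, size=x_train.shape)
x_test = np.linspace(0.0, 1.0, 100)
t_test = np.sin(2 * np.pi * x_test) + rng.normal(scale=0.3, size=x_test.shape)

w_unreg = fit_polynomial(x_train, t_train, M=9)                  # over-fits
w_reg = fit_polynomial(x_train, t_train, M=9, lam=np.exp(-18))   # ln(lambda) = -18
print("test RMS, unregularized:", rms_error(x_test, t_test, w_unreg))
print("test RMS, regularized:  ", rms_error(x_test, t_test, w_reg))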
Slide 17: Polynomial Coefficients
Slide 18: Probability Theory: Apples and Oranges
Slide 19: Probability Theory: Marginal, Conditional, and Joint Probability
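In PRML these are introduced through a two-variable counting argument: with n_{ij} the number of trials in which X = x_i and Y = y_j, c_i = \sum_j n_{ij}, and N trials in total,

p(X = x_i, Y = y_j) = \frac{n_{ij}}{N} \;\text{(joint)}, \qquad
p(X = x_i) = \frac{c_i}{N} \;\text{(marginal)}, \qquad
p(Y = y_j \mid X = x_i) = \frac{n_{ij}}{c_i} \;\text{(conditional)}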
Slide 20: Probability Theory: Sum Rule and Product Rule
Slide 21: The Rules of Probability: Sum Rule and Product Rule
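The two rules as stated in PRML:

\text{sum rule:} \quad p(X) = \sum_{Y} p(X, Y), \qquad
\text{product rule:} \quad p(X, Y) = p(Y \mid X) \, p(X)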
Slide 22: Bayes’ Theorem: posterior ∝ likelihood × prior
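Bayes' theorem, which follows from the product rule and the symmetry p(X, Y) = p(Y, X):

p(Y \mid X) = \frac{p(X \mid Y) \, p(Y)}{p(X)}, \qquad p(X) = \sum_{Y} p(X \mid Y) \, p(Y)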
Slide 23: Probability Densities
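For a continuous variable, the density p(x) satisfies

p\big(x \in (a, b)\big) = \int_a^b p(x) \, dx, \qquad p(x) \ge 0, \qquad \int_{-\infty}^{\infty} p(x) \, dx = 1

with cumulative distribution P(z) = \int_{-\infty}^{z} p(x) \, dx.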
Slide 24: Expectations; Conditional Expectation (discrete); Approximate Expectation (discrete and continuous)
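The definitions used in PRML:

\mathbb{E}[f] = \sum_{x} p(x) f(x) \;\text{(discrete)}, \qquad
\mathbb{E}[f] = \int p(x) f(x) \, dx \;\text{(continuous)}, \qquad
\mathbb{E}_x[f \mid y] = \sum_{x} p(x \mid y) f(x)

and, for samples x_n drawn from p(x), the approximation \mathbb{E}[f] \approx \frac{1}{N} \sum_{n=1}^{N} f(x_n).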
Slide 25: Variances and Covariances
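The variance and covariance definitions from PRML:

\mathrm{var}[f] = \mathbb{E}\big[ (f(x) - \mathbb{E}[f(x)])^2 \big] = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2, \qquad
\mathrm{cov}[x, y] = \mathbb{E}_{x,y}\big[ (x - \mathbb{E}[x])(y - \mathbb{E}[y]) \big] = \mathbb{E}_{x,y}[x y] - \mathbb{E}[x]\,\mathbb{E}[y]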
Slide 26: The Gaussian Distribution
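The univariate Gaussian density:

\mathcal{N}(x \mid \mu, \sigma^2) = \frac{1}{(2\pi\sigma^2)^{1/2}} \exp\left\{ -\frac{(x - \mu)^2}{2\sigma^2} \right\}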
Slide 27: Gaussian Mean and Variance
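Its first two moments:

\mathbb{E}[x] = \mu, \qquad \mathbb{E}[x^2] = \mu^2 + \sigma^2, \qquad \mathrm{var}[x] = \mathbb{E}[x^2] - \mathbb{E}[x]^2 = \sigma^2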
Slide 28: The Multivariate Gaussian
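For a D-dimensional vector \mathbf{x} with mean \boldsymbol\mu and covariance \boldsymbol\Sigma:

\mathcal{N}(\mathbf{x} \mid \boldsymbol\mu, \boldsymbol\Sigma) = \frac{1}{(2\pi)^{D/2} |\boldsymbol\Sigma|^{1/2}} \exp\left\{ -\frac{1}{2} (\mathbf{x} - \boldsymbol\mu)^{\mathrm T} \boldsymbol\Sigma^{-1} (\mathbf{x} - \boldsymbol\mu) \right\}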
Slide 29: Gaussian Parameter Estimation; Likelihood Function
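For i.i.d. observations \mathbf{x} = (x_1, \dots, x_N)^{\mathrm T}, the likelihood is

p(\mathbf{x} \mid \mu, \sigma^2) = \prod_{n=1}^{N} \mathcal{N}(x_n \mid \mu, \sigma^2)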
Slide 30: Maximum (Log) Likelihood
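Maximizing the log likelihood

\ln p(\mathbf{x} \mid \mu, \sigma^2) = -\frac{1}{2\sigma^2} \sum_{n=1}^{N} (x_n - \mu)^2 - \frac{N}{2} \ln \sigma^2 - \frac{N}{2} \ln(2\pi)

gives the maximum likelihood estimates

\mu_{\mathrm{ML}} = \frac{1}{N} \sum_{n=1}^{N} x_n, \qquad
\sigma^2_{\mathrm{ML}} = \frac{1}{N} \sum_{n=1}^{N} (x_n - \mu_{\mathrm{ML}})^2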
Slide 31: Properties of μ_ML and σ²_ML
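Under the true distribution,

\mathbb{E}[\mu_{\mathrm{ML}}] = \mu, \qquad \mathbb{E}[\sigma^2_{\mathrm{ML}}] = \left( \frac{N-1}{N} \right) \sigma^2

so the ML variance is biased low; the unbiased estimator is \tilde\sigma^2 = \frac{N}{N-1} \sigma^2_{\mathrm{ML}}.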
Slide 32: Curve Fitting Re-visited
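Here the target is modelled as the polynomial plus Gaussian noise with precision \beta:

p(t \mid x, \mathbf{w}, \beta) = \mathcal{N}\big( t \mid y(x, \mathbf{w}), \beta^{-1} \big)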
Slide 33: Maximum Likelihood: determine w_ML by minimizing the sum-of-squares error
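Maximizing the likelihood over \mathbf{w} is equivalent to minimizing E(\mathbf{w}); maximizing over \beta then gives

\frac{1}{\beta_{\mathrm{ML}}} = \frac{1}{N} \sum_{n=1}^{N} \left\{ y(x_n, \mathbf{w}_{\mathrm{ML}}) - t_n \right\}^2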
Slide 34: Predictive Distribution
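Plugging in the ML estimates gives the predictive distribution

p(t \mid x, \mathbf{w}_{\mathrm{ML}}, \beta_{\mathrm{ML}}) = \mathcal{N}\big( t \mid y(x, \mathbf{w}_{\mathrm{ML}}), \beta_{\mathrm{ML}}^{-1} \big)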
Slide 35: MAP: A Step towards Bayes: determine w_MAP by minimizing the regularized sum-of-squares error
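With a Gaussian prior p(\mathbf{w} \mid \alpha) = \mathcal{N}(\mathbf{w} \mid \mathbf{0}, \alpha^{-1} \mathbf{I}), maximizing the posterior is equivalent to minimizing

\frac{\beta}{2} \sum_{n=1}^{N} \left\{ y(x_n, \mathbf{w}) - t_n \right\}^2 + \frac{\alpha}{2} \mathbf{w}^{\mathrm T} \mathbf{w}

i.e. the regularized sum-of-squares error with \lambda = \alpha / \beta.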
Slide 36: Bayesian Curve Fitting
Slide 37: Bayesian Predictive Distribution
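Integrating over \mathbf{w} rather than using a point estimate gives

p(t \mid x, \mathbf{x}, \mathbf{t}) = \int p(t \mid x, \mathbf{w}) \, p(\mathbf{w} \mid \mathbf{x}, \mathbf{t}) \, d\mathbf{w} = \mathcal{N}\big( t \mid m(x), s^2(x) \big)

with

m(x) = \beta \, \boldsymbol\phi(x)^{\mathrm T} \mathbf{S} \sum_{n=1}^{N} \boldsymbol\phi(x_n) t_n, \qquad
s^2(x) = \beta^{-1} + \boldsymbol\phi(x)^{\mathrm T} \mathbf{S} \, \boldsymbol\phi(x), \qquad
\mathbf{S}^{-1} = \alpha \mathbf{I} + \beta \sum_{n=1}^{N} \boldsymbol\phi(x_n) \boldsymbol\phi(x_n)^{\mathrm T}

where \boldsymbol\phi(x) has elements \phi_i(x) = x^i for i = 0, \dots, M.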
Slide 38: Model Selection: Cross-Validation
Slide 39: Curse of Dimensionality
Slide 40: Curse of Dimensionality: polynomial curve fitting with M = 3; Gaussian densities in higher dimensions
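A standard illustration from PRML: the fraction of the volume of a unit sphere in D dimensions lying between radius 1 - \epsilon and 1 is

1 - (1 - \epsilon)^D

which approaches 1 as D grows, so in high dimensions most of the volume sits in a thin outer shell; the probability mass of a high-dimensional Gaussian similarly concentrates in a thin shell.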
Slide 41: Decision Theory. Inference step: determine either p(t|x) or p(x, t). Decision step: for given x, determine the optimal t.
Slide 42: Minimum Misclassification Rate
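For two classes with decision regions \mathcal{R}_1 and \mathcal{R}_2,

p(\text{mistake}) = p(\mathbf{x} \in \mathcal{R}_1, \mathcal{C}_2) + p(\mathbf{x} \in \mathcal{R}_2, \mathcal{C}_1)
= \int_{\mathcal{R}_1} p(\mathbf{x}, \mathcal{C}_2) \, d\mathbf{x} + \int_{\mathcal{R}_2} p(\mathbf{x}, \mathcal{C}_1) \, d\mathbf{x}

which is minimized by assigning each \mathbf{x} to the class with the larger posterior p(\mathcal{C}_k \mid \mathbf{x}).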
Slide 43: Minimum Expected Loss. Example: classify medical images as ‘cancer’ or ‘normal’, with a loss matrix indexed by decision and true class.
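The loss matrix used for this example in PRML (rows: true class, columns: decision):

                 decide cancer   decide normal
  true cancer          0             1000
  true normal          1                0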
Slide 44: Minimum Expected Loss: the decision regions R_j are chosen to minimize the expected loss.
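The quantity being minimized is

\mathbb{E}[L] = \sum_{k} \sum_{j} \int_{\mathcal{R}_j} L_{kj} \, p(\mathbf{x}, \mathcal{C}_k) \, d\mathbf{x}

achieved by assigning each \mathbf{x} to the class j that minimizes \sum_{k} L_{kj} \, p(\mathcal{C}_k \mid \mathbf{x}).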
Slide 45: Reject Option
Slide 46: Why Separate Inference and Decision? Minimizing risk (the loss matrix may change over time); reject option; unbalanced class priors; combining models.
Slide 47: Decision Theory for Regression. Inference step: determine p(x, t). Decision step: for given x, make the optimal prediction, y(x), for t. Loss function: L(t, y(x)).
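The expected loss to be minimized is

\mathbb{E}[L] = \iint L\big(t, y(\mathbf{x})\big) \, p(\mathbf{x}, t) \, d\mathbf{x} \, dt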
Slide 48: The Squared Loss Function
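With squared loss,

\mathbb{E}[L] = \iint \left\{ y(\mathbf{x}) - t \right\}^2 p(\mathbf{x}, t) \, d\mathbf{x} \, dt

which is minimized by the conditional mean (the regression function) y(\mathbf{x}) = \mathbb{E}_t[t \mid \mathbf{x}] = \int t \, p(t \mid \mathbf{x}) \, dt.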
Slide 49: Generative vs. Discriminative. Generative approach: model the joint distribution p(x, t) and use Bayes’ theorem to obtain p(t|x). Discriminative approach: model p(t|x) directly.