Regression: “A new perspective on freedom”

Classification

[Image: an animal photo labeled “?”, to be classified as Cat or Dog]

[Scatter plot: examples plotted by Cleanliness vs. Size, separating the two classes]

[Image: an item labeled “?”, whose price ($) is to be predicted]

Regression

[Scatter plot: Price (y, from $ to $$$$) vs. Top speed (x)]

Regression. Data: $(x_1, y_1), \dots, (x_n, y_n)$. Goal: given a new $x$, predict $y$; i.e., find a prediction function $f$ with $y \approx f(x)$.

Nearest neighbor

Nearest neighbor. To predict at a query point $x$:
– Find the training point $x_i$ closest to $x$
– Choose $y = y_i$
+ No training
– Finding the closest point can be expensive
– Overfitting
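A minimal sketch of this predictor in code. The deck's demos are in Matlab; this illustrative version uses NumPy, and all names are ours:

```python
import numpy as np

def nn_predict(X_train, y_train, x):
    """Nearest-neighbor prediction: copy the target of the closest
    training point. X_train is (n, d), x is (d,)."""
    dists = np.sum((X_train - x) ** 2, axis=1)  # squared distances to x
    return y_train[np.argmin(dists)]
```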

Kernel Regression. To predict at a query point $x$:
– Give each data point $x_i$ weight $w_i = K(x, x_i)$, e.g. the Gaussian kernel $K(x, x_i) = \exp\!\left(-\|x - x_i\|^2 / 2\sigma^2\right)$
– Normalize the weights: $w_i \leftarrow w_i / \sum_j w_j$
– Predict $f(x) = \sum_i w_i\, y_i$
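The same steps in code, assuming the Gaussian kernel above (a NumPy sketch; the function name and `sigma` parameter are illustrative):

```python
import numpy as np

def kernel_predict(X_train, y_train, x, sigma=1.0):
    """Kernel regression (Nadaraya-Watson) with a Gaussian kernel."""
    w = np.exp(-np.sum((X_train - x) ** 2, axis=1) / (2 * sigma ** 2))
    w /= w.sum()        # normalize the weights
    return w @ y_train  # weighted average of the training targets
```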

Kernel Regression [Matlab demo]

Kernel Regression
+ No training
+ Smooth prediction
– Slower than nearest neighbor
– Must choose the kernel width $\sigma$

Linear regression

Temperature [start Matlab demo lecture2.m]. Given examples $(x_i, y_i)$, $i = 1, \dots, n$, predict $y$ given a new point $x$. [Plot: temperature data]

Linear regression. Prediction: $\hat{y} = w_0 + w_1 x$. [Plot: temperature data with fitted line]

Linear Regression. Error or “residual”: the difference between observation $y_i$ and prediction $\hat{y}_i = w_0 + w_1 x_i$. Sum squared error: $\sum_i (y_i - \hat{y}_i)^2$.

Linear Regression. Stack the inputs into the $n \times d$ matrix $X$ and the outputs into the vector $y$; the least-squares weights satisfy $X^\top X\, w = X^\top y$. Solve the system (it’s better not to invert the matrix).

Minimize the sum squared error $E(w) = \|Xw - y\|^2$. Setting the gradient to zero gives a linear equation, $\nabla E(w) = 2\,X^\top (Xw - y) = 0$, i.e. the linear system $X^\top X\, w = X^\top y$.
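To make “solve the system, don’t invert” concrete, a NumPy sketch on toy data (the data and names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
X = np.column_stack([np.ones(50), x])          # constant column for the bias
y = 3.0 + 2.0 * x + rng.normal(0, 1, size=50)  # noisy line

# Solve the linear system X^T X w = X^T y -- no explicit inverse
w = np.linalg.solve(X.T @ X, X.T @ y)

# Even better numerically: least squares on X directly
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
```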

LMS Algorithm (Least Mean Squares): $w \leftarrow w + \eta\,(y_i - w^\top x_i)\,x_i$, where $\eta$ is the step size. An online algorithm: update after each example.
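The update rule as code (an illustrative sketch; `eta` is the step size):

```python
import numpy as np

def lms_step(w, x_i, y_i, eta=0.01):
    """One online LMS update on a single example:
    w <- w + eta * (y_i - w^T x_i) * x_i."""
    return w + eta * (y_i - w @ x_i) * x_i
```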

Beyond lines and planes: everything is the same with basis functions, $f(x) = \sum_j w_j\, \phi_j(x)$, e.g. $\phi(x) = (1, x, x^2, \dots)$; the model is still linear in $w$.

Linear Regression [summary]. Given examples $(x_i, y_i)$, $i = 1, \dots, n$. Let $X$ be the $n \times d$ matrix whose rows are the feature vectors $\phi(x_i)$, for example $\phi(x) = (1, x, x^2)$, and let $y$ be the vector of targets. Minimize $\|Xw - y\|^2$ by solving $X^\top X\, w = X^\top y$. Predict $\hat{y} = w^\top \phi(x)$.
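The whole summary in a few lines of NumPy, with a cubic basis expansion (toy data and names are ours):

```python
import numpy as np

def poly_features(x, degree):
    """Basis expansion phi(x) = (1, x, x^2, ..., x^degree) for scalar x."""
    return np.vander(x, degree + 1, increasing=True)

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=20)

X = poly_features(x, degree=3)                  # n x d feature matrix
w, *_ = np.linalg.lstsq(X, y, rcond=None)       # minimize ||Xw - y||^2
y_hat = poly_features(np.array([0.25]), 3) @ w  # predict at a new point
```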

Probabilistic interpretation: assume $y_i = w^\top x_i + \varepsilon_i$ with noise $\varepsilon_i \sim \mathcal{N}(0, \sigma^2)$. Likelihood: $p(y \mid X, w) = \prod_i \mathcal{N}(y_i;\, w^\top x_i,\, \sigma^2)$. Maximizing the likelihood is equivalent to minimizing the sum squared error.
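Spelled out, the equivalence is one line of algebra under the Gaussian-noise model:

```latex
-\log p(y \mid X, w)
  = \frac{1}{2\sigma^2} \sum_{i=1}^{n} \left(y_i - w^\top x_i\right)^2
  + \frac{n}{2}\log\!\left(2\pi\sigma^2\right)
```

The second term does not depend on $w$, so maximizing the likelihood over $w$ minimizes the sum squared error.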

Overfitting [Matlab demo]: degree-15 polynomial.

Ridge Regression (Regularization). Minimize $\|Xw - y\|^2 + \lambda \|w\|^2$; solve $(X^\top X + \lambda I)\, w = X^\top y$. [Plot: effect of regularization on a degree-19 polynomial fit with “small” $\lambda$]
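The ridge solve differs from ordinary least squares only by the $\lambda I$ term (a NumPy sketch; `lam` is the regularization strength):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge regression: solve (X^T X + lam * I) w = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```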

Probabilistic interpretation. Likelihood: $p(y \mid X, w) = \prod_i \mathcal{N}(y_i;\, w^\top x_i,\, \sigma^2)$. Prior: $w \sim \mathcal{N}(0, \tau^2 I)$. Posterior: $p(w \mid X, y) \propto p(y \mid X, w)\, p(w)$; its maximum (the MAP estimate) is ridge regression with $\lambda = \sigma^2 / \tau^2$.
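Taking the negative log of the posterior makes the ridge connection explicit (standard derivation; $\tau^2$ is the prior variance):

```latex
p(w \mid X, y) \propto
\exp\!\left(-\frac{\|Xw - y\|^2}{2\sigma^2}\right)
\exp\!\left(-\frac{\|w\|^2}{2\tau^2}\right)
\;\Longrightarrow\;
\hat{w}_{\mathrm{MAP}} = \arg\min_w \; \|Xw - y\|^2 + \frac{\sigma^2}{\tau^2}\,\|w\|^2
```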

Locally Linear Regression

[Plot: global temperature increase]

Locally Linear Regression. To predict at a query point $x$:
– Give each data point $x_i$ weight $w_i = K(x, x_i)$, e.g. the Gaussian kernel $K(x, x_i) = \exp\!\left(-\|x - x_i\|^2 / 2\sigma^2\right)$
– Fit a weighted linear regression centered on $x$ (next slide)

Locally Linear Regression. To minimize $\sum_i w_i\,(v^\top x_i - y_i)^2$, solve $(X^\top W X)\, v = X^\top W y$, where $W = \mathrm{diag}(w_1, \dots, w_n)$; predict $\hat{y} = v^\top x$.
+ Good even at the boundary (more important in high dimension)
– Solve a linear system for each new prediction
– Must choose the kernel width $\sigma$
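A weighted least-squares sketch of the per-query solve (illustrative NumPy; names are ours):

```python
import numpy as np

def llr_predict(X_train, y_train, x, sigma=1.0):
    """Locally linear regression: weighted least squares around x.
    X_train (and x) should include a constant column/entry so the
    local fit has an intercept."""
    w = np.exp(-np.sum((X_train - x) ** 2, axis=1) / (2 * sigma ** 2))
    W = np.diag(w)
    v = np.linalg.solve(X_train.T @ W @ X_train, X_train.T @ W @ y_train)
    return v @ x
```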

Locally Linear Regression, Gaussian kernel. [Plot: fit to the global temperature data]

Locally Linear Regression, Laplacian kernel. [Plot: fit to the global temperature data]

$L_1$ Regression

Sensitivity to outliers. The squared loss gives high weight to outliers: its influence function (the derivative of the loss with respect to the residual) grows linearly with the residual.

$L_1$ Regression: minimize $\sum_i |y_i - w^\top x_i|$. This can be written as a linear program. Influence function: bounded ($\pm 1$), so outliers pull the fit far less than under the squared loss.
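One standard LP formulation, sketched with SciPy (illustrative; variables are $(w, t)$ with $-t \le Xw - y \le t$):

```python
import numpy as np
from scipy.optimize import linprog

def l1_fit(X, y):
    """L1 regression as a linear program: minimize sum_i t_i
    subject to -t <= Xw - y <= t, t >= 0."""
    n, d = X.shape
    c = np.concatenate([np.zeros(d), np.ones(n)])  # objective: sum of t
    A_ub = np.block([[ X, -np.eye(n)],             #   Xw - y <= t
                     [-X, -np.eye(n)]])            # -(Xw - y) <= t
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * d + [(0, None)] * n  # w free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:d]
```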

Spline Regression: split the input range into intervals and fit a separate regression on each interval.

Spline Regression with equality constraints: require the pieces to agree at the interval boundaries (knots), so the fitted function is continuous.
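One way to get the same continuity without explicit equality constraints (a related construction, not necessarily the slide’s method) is a truncated hinge basis, which is continuous at the knots by design:

```python
import numpy as np

def linear_spline_features(x, knots):
    """Continuous piecewise-linear basis: (1, x, max(0, x - t_k)).
    Each hinge term bends the fit at knot t_k while staying
    continuous, so no equality constraints are needed."""
    cols = [np.ones_like(x), x] + [np.maximum(0.0, x - t) for t in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 100)
y = np.sin(x) + 0.1 * rng.normal(size=100)
F = linear_spline_features(x, knots=[2.5, 5.0, 7.5])
w, *_ = np.linalg.lstsq(F, y, rcond=None)  # ordinary least squares
```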

Spline Regression with an $L_1$ cost.

To learn more: The Elements of Statistical Learning, Hastie, Tibshirani, and Friedman, Springer.