1 Regression “A new perspective on freedom”

2 Classification

3 [Image: is it a cat or a dog?]

4 [Plot: examples placed on cleanliness and size axes]

5 [Image: what price ($) should we predict?]

6 Regression

7 [Plot: price (y, $ to $$$$) against top speed (x)]

8 Regression. Data: examples (x_1, y_1), ..., (x_n, y_n). Goal: given a new x, predict y, i.e. find a prediction function f with y ≈ f(x).

9 Nearest neighbor [Plot: 1-D example data]

10 Nearest neighbor. To predict at x: find the data point x_i closest to x and choose y = y_i.
+ No training
– Finding the closest point can be expensive
– Prone to overfitting
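
A minimal Matlab sketch of this prediction rule, assuming 1-D training vectors x and y and a query point xq (the variable names are illustrative):

% Nearest-neighbor prediction: copy the output of the closest training point.
% x, y are n-by-1 vectors of training inputs/outputs; xq is the query point.
[~, i] = min(abs(x - xq));   % index of the closest training input (1-D data)
yq = y(i);                   % predicted output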

11 Kernel Regression. To predict at x: give data point x_i weight w_i = K(x, x_i), normalize the weights, and let the prediction be the weighted average sum_i w_i y_i / sum_i w_i, e.g. with a Gaussian kernel K(x, x_i) = exp(-(x - x_i)^2 / (2 h^2)).

12 Kernel Regression [matlab demo]
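
The Matlab demo itself is not part of this transcript; below is a minimal sketch of the prediction rule from slide 11, assuming 1-D training vectors x and y and a Gaussian kernel whose width h is chosen by the user:

% Kernel regression at a query point xq (Gaussian kernel of width h).
k = exp(-(x - xq).^2 / (2*h^2));   % weight for each training point
k = k / sum(k);                    % normalize the weights
yq = k' * y;                       % weighted average of the training outputs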

13 Kernel Regression
+ No training
+ Smooth prediction
– Slower than nearest neighbor
– Must choose the width of the kernel

14 Linear regression

15 [start Matlab demo lecture2.m] Given examples (x_i, y_i), predict y given a new point x. [Plots: temperature data]

16 Linear regression [Plot: temperature data with the linear regression prediction]

17 Linear Regression. Error or “residual”: r_i = y_i - f(x_i), the difference between the observation y_i and the prediction f(x_i). Sum squared error: E = sum_i (y_i - f(x_i))^2.

18 Linear Regression. Stack the inputs as rows of the n-by-d matrix A and the outputs in the n-by-1 vector y. Solve the system A'A w = A'y (it's better to solve the system than to invert the matrix).

19 Minimize the sum squared error E(w) = ||Aw - y||^2. Setting the gradient 2 A'(Aw - y) to zero gives the linear equation A'A w = A'y, a d-by-d linear system.
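
A sketch of the solve in Matlab, assuming the n-by-d matrix A and the vector y have already been built:

% Least squares: minimize ||A*w - y||^2.
w = A \ y;                  % QR-based solve, preferable to inv(A'*A)*(A'*y)
% equivalently, solve the normal equations directly:
% w = (A'*A) \ (A'*y);
yhat = A * w;               % fitted values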

20 LMS Algorithm (Least Mean Squares): an online algorithm. Process one example (a_i, y_i) at a time and update w <- w + eta (y_i - a_i'w) a_i, where eta is the step size.
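
A sketch of the LMS loop in Matlab, using the same A and y; the step-size value is only illustrative:

% LMS (online) update: one stochastic gradient step per example.
eta = 0.01;                          % step size (illustrative value)
w = zeros(size(A,2), 1);
for i = 1:size(A,1)
    a = A(i,:)';                     % current input row as a column vector
    err = y(i) - a' * w;             % prediction error on this example
    w = w + eta * err * a;           % nudge w to reduce the squared error
end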

21 Beyond lines and planes: everything is the same with basis functions, f(x) = w'phi(x), for example phi(x) = (1, x, x^2, ...); the model is still linear in w. [Plot: polynomial fit to the data]
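
For instance, a polynomial fit only changes how A is built; the solve itself is unchanged (a sketch, with x an n-by-1 vector and an illustrative degree):

% Polynomial basis: columns 1, x, x.^2, ..., x.^deg; the model stays linear in w.
deg = 3;                             % polynomial degree (illustrative)
A = ones(length(x), 1);
for j = 1:deg
    A = [A, x.^j];                   % append the j-th power as a new column
end
w = A \ y;                           % same least-squares solve as before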

22 Linear Regression [summary]. Given examples (x_i, y_i), i = 1, ..., n. Let A be the n-by-d matrix with rows phi(x_i)' (for example phi(x) = (1, x, x^2)) and let y = (y_1, ..., y_n)'. Minimize ||Aw - y||^2 by solving A'A w = A'y. Predict f(x) = phi(x)'w.

23 Probabilistic interpretation: model y_i = w'phi(x_i) + noise with independent Gaussian noise; maximizing the likelihood of the data is then equivalent to minimizing the sum squared error.
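
A short version of the argument, as a sketch assuming independent Gaussian noise of variance sigma^2:

p(y \mid A, w) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma}
  \exp\!\Big(-\frac{(y_i - a_i^\top w)^2}{2\sigma^2}\Big)
\;\Longrightarrow\;
\arg\max_w \, \log p(y \mid A, w) \;=\; \arg\min_w \, \sum_{i=1}^{n} (y_i - a_i^\top w)^2 .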

24 Overfitting [Matlab demo: degree-15 polynomial fit]

25 Ridge Regression (Regularization). Minimize ||Aw - y||^2 + lambda ||w||^2; solve (A'A + lambda I) w = A'y. [Plot: effect of regularization on a degree-19 fit with “small” lambda]
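
A sketch of the ridge solve in Matlab; the value of lambda is only illustrative:

% Ridge regression: minimize ||A*w - y||^2 + lambda*||w||^2.
lambda = 0.1;                        % regularization strength (illustrative)
d = size(A, 2);
w = (A'*A + lambda * eye(d)) \ (A'*y);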

26 Probabilistic interpretation: Gaussian likelihood as before, plus a Gaussian prior on w; the posterior mode (MAP estimate) is the ridge regression solution.
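
A sketch of why the prior leads to ridge, assuming noise variance sigma^2 and prior w ~ N(0, tau^2 I):

\arg\max_w \, \log p(w \mid y)
 = \arg\max_w \, \big[\log p(y \mid w) + \log p(w)\big]
 = \arg\min_w \, \sum_{i=1}^{n} (y_i - a_i^\top w)^2 + \frac{\sigma^2}{\tau^2}\,\|w\|^2 ,

so the MAP estimate is the ridge solution with lambda = sigma^2 / tau^2.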

27 Locally Linear Regression

28 Global temperature increase, 1840 to 2020 [source: http://www.cru.uea.ac.uk/cru/data/temperature]

29 Locally Linear Regression. To predict at x: give data point x_i weight w_i = K(x, x_i), e.g. a Gaussian kernel, and fit a linear regression with those weights.

30 Locally Linear Regression. To minimize the weighted error sum_i w_i (y_i - a_i'beta)^2, solve A'WA beta = A'Wy with W = diag(w_1, ..., w_n), and predict phi(x)'beta, where a_i = phi(x_i).
+ Good even at the boundary (more important in high dimension)
– Must solve a linear system for each new prediction
– Must choose the width of the kernel
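
A Matlab sketch of one locally linear prediction, assuming 1-D data x, y and a Gaussian kernel of width h; as noted above, a new weighted system is solved for every query point:

% Locally linear regression at a query point xq (1-D data, Gaussian kernel).
k  = exp(-(x - xq).^2 / (2*h^2));     % kernel weight for each training point
W  = diag(k);                         % diagonal weight matrix
A  = [ones(length(x),1), x];          % local model: intercept + slope
beta = (A'*W*A) \ (A'*W*y);           % weighted least-squares solve
yq = [1, xq] * beta;                  % evaluate the local line at the query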

31 Locally Linear Regression, Gaussian kernel (width 180) [source: http://www.cru.uea.ac.uk/cru/data/temperature]

32 Locally Linear Regression, Laplacian kernel (width 180) [source: http://www.cru.uea.ac.uk/cru/data/temperature]

33 L1 Regression

34 Sensitivity to outliers: the squared error gives high weight to outliers, because its influence function grows with the size of the residual.

35 L1 Regression: minimize sum_i |y_i - a_i'w| instead of the squared error. The problem can be written as a linear program, and the influence function is bounded, so outliers have less effect.
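
One way to write the linear program, as a sketch: introduce a slack t_i per residual with |y_i - a_i'w| <= t_i and minimize sum_i t_i. The code assumes the Optimization Toolbox function linprog and the design matrix A and outputs y from before:

% L1 regression as a linear program: minimize sum_i t_i
% subject to  -t_i <= y_i - a_i'*w <= t_i.   Variables z = [w; t].
[n, d] = size(A);
f = [zeros(d,1); ones(n,1)];              % objective: sum of the slacks t
Aineq = [ A, -eye(n);                     %  A*w - t <= y
         -A, -eye(n)];                    % -A*w - t <= -y
bineq = [ y; -y];
z = linprog(f, Aineq, bineq);             % requires the Optimization Toolbox
w_l1 = z(1:d);                            % the L1 regression weights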

36 Spline Regression: a separate regression on each interval. [Plot: piecewise fit]

37 Spline Regression: with equality constraints so the pieces agree at the interval boundaries. [Plot: continuous piecewise fit]

38 Spline Regression: with an L1 cost. [Plot]
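
One simple way to get the continuous piecewise-linear fit of slide 37 is a hinge basis, which builds the continuity constraint into the features; a sketch assuming 1-D data x, y and illustrative knot locations:

% Linear spline via a hinge basis: f(x) = w0 + w1*x + sum_j wj*max(0, x - knot_j).
% The fit is continuous and piecewise linear, with slope changes at the knots.
knots = [5400, 5600];                     % knot locations (illustrative values)
A = [ones(length(x),1), x];
for j = 1:length(knots)
    A = [A, max(0, x - knots(j))];        % hinge feature for knot j
end
w = A \ y;                                % least-squares fit (squared cost)
% For the L1-cost version of slide 38, replace this solve with the
% linear program from the L1 regression sketch above.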

39 To learn more: The Elements of Statistical Learning, Hastie, Tibshirani, Friedman. Springer.

