Function approximation
Linear models
Line fitting
Hyper-plane fitting
Discriminant analysis
Least-squares method
Unconstrained optimization
A target function y = g(x), where x ∈ R^d
A sample from the surface of g: paired data S = {(x_i, y_i)}_i with y_i = g(x_i) + ε_i
Let G(x; θ) be an approximating function to g, where θ collects the built-in parameters
Minimizing the mean square approximating error induces an unconstrained optimization problem
Data-driven function approximation
Mean square approximating error
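The equation for this slide did not survive the transcript; written out, the criterion used throughout the deck is the mean square error between the sampled targets and the model outputs (n is the sample size, θ the parameter vector of G):

E(\theta) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - G(x_i;\theta)\bigr)^{2}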
Nonlinear system
Minima
A minimum satisfies dE/dθ = 0
Local minima: unsatisfactory approximation
Global minima: reliable and effective approximation
Nonlinear system
The severe local-minimum problem needs to be overcome
Approximating function
Dimensionality
  One-dimensional functions
  High-dimensional functions
  Extremely high-dimensional functions
Linear functions, quadratic functions, and nonlinear functions
Line fitting
Given paired data (x_i, y_i), fit a one-dimensional linear model y ≈ a_1 x + a_2 by minimizing the fitting error
Paired data
n=100;                              % number of samples
x=rand(1,n);                        % inputs drawn uniformly from [0,1]
y=1.5*x+2+rand(1,n)*0.1-0.05;       % line y = 1.5x + 2 plus uniform noise in [-0.05, 0.05]
plot(x,y,'.')                       % scatter plot of the sample
Fitting criteria
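The criterion itself is not preserved on this slide; for the line model above it is the mean square residual over the n points:

E(a_1, a_2) = \frac{1}{n}\sum_{i=1}^{n}\bigl(a_1 x_i + a_2 - y_i\bigr)^{2}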
Pseudo inverse
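The slide's equation is also lost; consistent with the MATLAB code on the following slides, the least-squares coefficients are obtained from the pseudo-inverse of the design matrix (assuming X has full column rank):

a = X^{+} b = (X^{T} X)^{-1} X^{T} b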
Form X and b
X=[x' ones(n,1)];   % n-by-2 design matrix: inputs plus a column of ones for the intercept
b=y';               % column vector of targets
Line fitting
>> a=inv(X'*X)*X'*b
a =
    1.4816
    2.0113
Strategy I: line fitting with the MATLAB built-in function pinv
>> a=pinv(X)*b
a =
    1.4816
    2.0113
Demo: demo_line_fitting.m
Stand-alone executable file
mcc -m demo_line_fitting.m
The MATLAB compiler produces demo_line_fitting.exe and demo_line_fitting.ctf.
Strategy II: conjugate gradient method
Comparison: pinv versus the conjugate gradient method (d=1, n=100)
d=1; n=100;
x=rand(d,n);                          % random inputs
y=rand(1,d)*x+2+rand(1,n)*0.1-0.05;   % linear target with a random slope, intercept 2, and small uniform noise
plot(x,y,'.');
X=[x' ones(n,1)]; b=y';               % design matrix and target vector

tstart = tic;
a=pinv(X)*b                           % solution via the pseudo-inverse
telapsed = toc(tstart)

tstart = tic;
x0=rand(1,d+1)';                      % random initial guess
a2 = conjugate(X'*X,X'*b,x0)          % solution of the normal equations by conjugate gradient
telapsed = toc(tstart)
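The conjugate routine called above is a course-provided helper whose listing is not included in this transcript; a minimal sketch with the same calling convention, assuming A = X'*X is symmetric positive definite, could look like this (saved as conjugate.m):

function x = conjugate(A, b, x0)
% Conjugate gradient method for A*x = b with A symmetric positive definite.
% Calling convention assumed from the slides: conjugate(X'*X, X'*b, x0).
x = x0;
r = b - A*x;                        % initial residual
p = r;                              % initial search direction
tol = 1e-10;
for k = 1:length(b)
    Ap = A*p;
    alpha = (r'*r) / (p'*Ap);       % step length along p
    x = x + alpha*p;                % update the estimate
    rnew = r - alpha*Ap;            % update the residual
    if norm(rnew) < tol
        break
    end
    beta = (rnew'*rnew) / (r'*r);   % direction-update coefficient
    p = rnew + beta*p;              % new conjugate direction
    r = rnew;
end
end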
Hyper-plane fitting
Sampling
A sample from a hyper-plane or a point cloud
A mapping from R^2 to R
Paired data: S = {(x_i, y_i)}, x_i ∈ R^2, y_i ∈ R
n=100, d=2
General coordinates of a point: (s_1, s_2, s_3), or equivalently (x_1, x_2, y)
Linear relation: s_1 a_1 + s_2 a_2 + a_3 = s_3, equivalently x_1 a_1 + x_2 a_2 + a_3 = y
d < n
The number of unknowns is less than the number of constraints
One hundred data points in R^3
A data point exactly on the hyper-plane satisfies s_i1 a_1 + s_i2 a_2 + a_3 - s_i3 = 0
In general, x_i1 a_1 + x_i2 a_2 + a_3 - y_i = e_i, the fitting error of point i
Goal: minimization of the mean square error
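The slide's own expression is lost; written out from the residuals above, the objective is the mean square error over the n data points:

E(a) = \frac{1}{n}\sum_{i=1}^{n} e_i^{2}
     = \frac{1}{n}\sum_{i=1}^{n}\bigl(x_{i1}a_1 + x_{i2}a_2 + a_3 - y_i\bigr)^{2}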
m = d + 1 unknowns (the d slope coefficients plus the intercept)
Strategy I: pseudo inverse
Strategy II: minimizing the mean square error
Minimization
Derivative
Vector form
Linear system: the normal equations
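The formulas for the minimization, derivative, and vector-form steps above are not preserved in this transcript; a compact reconstruction in the matrix notation already used for X, a, and b is:

E(a) = \frac{1}{n}\lVert Xa - b \rVert^{2},\qquad
\nabla_a E(a) = \frac{2}{n} X^{T}(Xa - b) = 0
\;\Longrightarrow\; X^{T}X\,a = X^{T}b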
Comparison: pinv versus the conjugate gradient method (d=2, n=100)
d=2; n=100;
x=rand(d,n);
y=rand(1,d)*x+2+rand(1,n)*0.1-0.05;   % planar target with random slopes, intercept 2, and small uniform noise
plot(x,y,'.');
X=[x' ones(n,1)]; b=y';

tstart = tic;
a=pinv(X)*b
telapsed = toc(tstart)

tstart = tic;
x0=rand(1,d+1)';
a2 = conjugate(X'*X,X'*b,x0)
telapsed = toc(tstart)
Hyper-plane fitting
Step 1. Input paired data (x_i, y_i), i = 1, ..., n
Step 2. Form matrix X and vector b
Step 3. Set a to pinv(X)*b
Step 4. Set c to inv(X'*X)*(X'*b)
>> n=30; S=rand(n,2); y=S*[1 2]'+1;   % 30 points on the plane y = s1 + 2*s2 + 1
>> b=y;
>> X=[S ones(n,1)];
>> a=pinv(X)*b; c=inv(X'*X)*(X'*b);   % pseudo-inverse versus normal-equations solution
>> sum(abs(a-c))
ans =
  1.0547e-015
Performance evaluation
TRAINING: generate training data to get S and y; hyper-plane fitting maps (S, y) to the coefficient vector a
TESTING: generate testing data to get S_test and y_test; use a to generate y_hat from S_test and compare it with y_test to obtain the testing error
Generate y_hat
>> n=10; S_test=rand(n,2); y_test=S_test*[1 2]'+1;   % testing data from the same plane
>> X_test = [S_test ones(n,1)];
>> y_hat = X_test * a;                               % predictions of the fitted model
Testing error
>> error_rate = mean((y_test-y_hat).^2)   % mean square error on the testing data
demo_hp_fitting
>> demo_hp_fitting
a1:1
a2:2
a3:3
a =
    0.9959
    2.0035
    3.0141
HP Tool
MLP_Tool.m
MLP_Tool.fig
Mesh
fstr=input('input a 2D function: x1.^2+x2.^2+cos(x1) :','s');   % read the function as a string
fx=inline(fstr);                 % build a callable function of x1 and x2
range=2*pi;
x1=-range:0.1:range; x2=x1;      % grid coordinates
for i=1:length(x1)
    C(i,:)=fx(x1(i),x2);         % evaluate the function along each grid row
end
mesh(x1,x2,C);                   % plot the surface
Nonlinear function approximation
Target function and sample
Hyper-plane fitting yields an unfaithful approximation
Linear projection
Two linear projections
Add two linear projections
Post-nonlinear projection
A linear projection followed by the tanh nonlinearity produces the output y
Two post-nonlinear projections
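The plots for these slides are not preserved; as a small illustration of the idea, the sketch below evaluates the sum of two post-nonlinear (tanh) projections of a one-dimensional input. The weights w1, w2, biases b1, b2, and output coefficients c1, c2 are arbitrary values chosen here for illustration, not parameters taken from the slides.

x = linspace(-3, 3, 200);            % one-dimensional input grid
w1 = 2;    b1 = -1;                  % first linear projection (illustrative values)
w2 = -1.5; b2 = 0.5;                 % second linear projection (illustrative values)
c1 = 1;    c2 = 0.8;                 % output weights of the two units
z1 = tanh(w1*x + b1);                % first post-nonlinear projection
z2 = tanh(w2*x + b2);                % second post-nonlinear projection
y  = c1*z1 + c2*z2;                  % adding the two projections gives a nonlinear curve
plot(x, z1, '--', x, z2, '--', x, y, '-');
legend('tanh projection 1', 'tanh projection 2', 'sum');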
Data-driven function approximation
Linearly non-separable data
Classify the blue and red dots into two categories
The classes are not linearly separable, so hyper-plane fitting cannot separate them
Error rate: 22.48%
Classification
Discriminant analysis, linear discriminant analysis
Data: win.rar, 178 paired samples (s, y)
s ∈ R^13: predictors (features), the last 13 columns
y ∈ {1, 2, 3}: three categories, the first column
Training and testing data: win.dat
load win.dat                       % 178-by-14 data matrix
ind=randperm(178);                 % random permutation of the sample indices
win_train=win(ind(1:140),:);       % 140 samples for training
win_test=win(ind(141:178),:);      % 38 samples for testing
save win_train.mat win_train;
save win_test.mat win_test;
Discriminant analysis workflow
TRAINING: load the training data to get S and y; hyper-plane fitting maps (S, y) to the coefficient vector a
TESTING: load the testing data to get S_test and y_test; generate y_hat from S_test and a, compare it with y_test, and compute the testing error rate
Linear assumption
Predictor x = [x_1, ..., x_13]^T
Model: y = a_1*x_1 + a_2*x_2 + ... + a_13*x_13
Find a that minimizes the mean square error between the model output and the class labels
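The demo script itself is not listed on the slides; a minimal sketch of the workflow, assuming the win_train.mat/win_test.mat files saved above (first column = class label, remaining 13 columns = features) and a least-squares fit with an intercept as in the earlier hyper-plane fitting, might look like this. The rounding-and-clipping rule that turns the continuous output into a class label is one simple choice, not necessarily the one used in the course's Demo_wine_fitting.

load win_train.mat; load win_test.mat;             % produced by the splitting script above
S = win_train(:,2:14);  y = win_train(:,1);        % features and class labels (1, 2 or 3)
X = [S ones(size(S,1),1)];                         % design matrix with an intercept column
a = pinv(X)*y;                                     % least-squares fit of the linear model

S_test = win_test(:,2:14);  y_test = win_test(:,1);
X_test = [S_test ones(size(S_test,1),1)];
y_hat = X_test*a;                                  % continuous predictions
labels = min(max(round(y_hat),1),3);               % map to the nearest class in {1,2,3}
error_rate = mean(labels ~= y_test)                % fraction of misclassified test samples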
Demo_wine_fitting
Error rate: 3.93%