Software Implementation and Computational Experiments (軟體實作與計算實驗), Lecture 9: Classification

Slide 1: Lecture 9, Classification
- Two mappings from position to color: (I) a linear combination of distances to centers; (II) RBF, a linear combination of exponential distances to centers
- Four computational steps: (A) K-means, (B) cross distances, (C) optimal coefficients, (D) coloring
- Five flow charts: OC (optimal coefficients), coloring, ER (error rate), TRAINING, and TESTING

Slide 2: kmeans, data and MATLAB code (data_9.zip, demo_kmeans.m)
  load data_9.mat
  plot(X(:,1),X(:,2),'.');
  [cidx, Y] = kmeans(X,10);
  hold on;
  plot(Y(:,1),Y(:,2),'ro');

Slide 3: my_kmeans, data and MATLAB code (data_9.zip, demo_my_kmeans.m, my_kmeans.m)
  load data_9.mat
  plot(X(:,1),X(:,2),'.');
  [Y] = my_kmeans(X,10);
  hold on;
  plot(Y(:,1),Y(:,2),'ro');
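The my_kmeans routine itself is not listed on the slide. A minimal pure-Python sketch of the same Lloyd iteration, assuming Euclidean distance and a simple deterministic initialization from the first K points (MATLAB's kmeans samples at random):

```python
import math

def my_kmeans(X, K, iters=50):
    """Lloyd's K-means sketch: start from the first K points
    (a simple deterministic choice), then alternate assigning
    each point to its nearest center and moving each center
    to the mean of its assigned points."""
    centers = [list(p) for p in X[:K]]
    for _ in range(iters):
        # Assignment step: index of the nearest center for each point.
        clusters = [[] for _ in range(K)]
        for p in X:
            j = min(range(K), key=lambda k: math.dist(p, centers[k]))
            clusters[j].append(p)
        # Update step: move each center to its cluster mean.
        for k, pts in enumerate(clusters):
            if pts:  # keep a center in place if its cluster empties
                centers[k] = [sum(v) / len(pts) for v in zip(*pts)]
    return centers
```

On two well-separated blobs with K=2, the centers settle on the blob means after a few iterations, matching the converging paths shown on the following slides.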

Slide 4: ClusteringTest.rar, a GUI for my_kmeans.m

Slide 6: Paths of 4 means to the centers of the clusters (figure)

Slide 7: Tracking the convergence of K-means

Slide 8: Paths to convergence (data_2c.zip)

Slide 9: Color points. A color point pairs a POSITION with a COLOR; we seek a mapping from position to color.

Slide 10: Classification of color points. A mapping from position to color serves as a rule for discriminating red points from blue points.

Slide 11: A mapping from position to color. Each point has a membership in one of the partitioned regions, and the blue and red points are well separated. The rule is a mathematical mapping, NOT a table look-up.

Slide 12: Color points (data_2c.zip)
  load data_2c.mat
  X=data_2c(:,1:2);
  Y=data_2c(:,3);
X: positions of the points. Y: 0 or 1 (blue or red).

Slide 13: Two-dimensional color points. Load the data for classification and plot it.
  function show_color_data(X,Y)
  figure;
  ind = find(Y==0);
  plot(X(ind,1), X(ind,2), 'b.');
  hold on
  ind = find(Y==1);
  plot(X(ind,1), X(ind,2), 'r.')

  load data_2c
  X=data_2c(:,1:2);
  Y=data_2c(:,3);
  show_color_data(X,Y)

Slide 14: Position and color. X and Y are paired data: X collects the positions of the points and Y collects their colors. A mapping from position to color is derived for classifying color points.

Slide 15: Sampling for training. Sample one half of the points as training data.
  N=size(X,1);
  ind=randperm(N);
  n = floor(N/2);
  x_train=X(ind(1:n),:);
  y_train=Y(ind(1:n),:);
  show_color_data(x_train,y_train)
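The randperm-based half split above can be mirrored in Python. An illustrative sketch (half_split is a hypothetical helper name; X and Y are plain lists standing in for the MATLAB matrices):

```python
import random

def half_split(X, Y, seed=0):
    """Randomly permute the indices and keep the first half
    (n = floor(N/2)) as the training set, as on the slide."""
    N = len(X)
    ind = list(range(N))
    random.Random(seed).shuffle(ind)   # plays the role of randperm(N)
    n = N // 2                         # floor(N/2)
    train = ind[:n]
    x_train = [X[i] for i in train]
    y_train = [Y[i] for i in train]
    return x_train, y_train
```

Indexing X and Y with the same permuted indices keeps each position paired with its color label.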

Slide 16: Sampling for training (flow chart). X and Y are sampled down to x_train and y_train with n = floor(N/2).

Slide 17: TRAINING (flow chart). x_train and y_train feed OC (optimal coefficients, STEPS A, B, C), which returns centers and a. STEP D (coloring) maps x_train to y_hat, and ER (error rate) compares y_hat with y_train: the error rate on the training half of the data.

Slide 18: TESTING (flow chart). With centers and a fixed, STEP D (coloring) maps X to Y_hat, and ER (error rate) compares Y_hat with Y: the error rate over all data.

Slide 19: A mapping with a lower error rate over all data performs better in testing.

Slide 20: Linear cut (load data_2c.txt). Error rate: zero.

Slide 21: Linear cut (figure; axes x1 and x2).

Slide 22: Linear cut fails (load data_3c.txt). Error rate: 0.289.

Slide 23: Computations
- STEP A: K-means. Apply K-means to find K centers.
- STEP B: cross distances. Calculate the cross distances between the K means and the N data points.
- STEP C: optimal coefficients. Determine a mapping from position to color.

Slide 24: Sampling of training data (data_3c.zip)
  load data_3c
  X=data_3c(:,1:2);
  Y=data_3c(:,3);
  N=size(X,1);
  ind=randperm(N);
  n=floor(N/2);
  x_train=X(ind(1:n),:);
  y_train=Y(ind(1:n),:);
  show_color_data(x_train, y_train)

Slide 25: Step A: K-means.

Slide 26: Step A in code.
  centers = my_kmeans(x_train,3);

Slide 27: Step B: distances between centers and data points. D(i,j) stores the distance between the ith point and the jth center (cross_distance.m).
  D = cross_distance(x_train,centers)
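cross_distance.m itself is not shown on the slide; the D(i,j) it produces is just the Euclidean distance between point i and center j. A Python sketch of the same matrix:

```python
import math

def cross_distance(points, centers):
    """D[i][j] = Euclidean distance between the ith point
    and the jth center."""
    return [[math.dist(p, c) for c in centers] for p in points]
```

For n training points and K centers, D is an n-by-K matrix, which is exactly the shape STEP C fits coefficients against.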

Slide 28: Step C: optimal coefficients. Mapping I is a linear combination of distances to centers: y(x) = a_1 d(x,c_1) + ... + a_K d(x,c_K), where d(x,c_k) is the distance from x to the kth center (diagram: weights a_1 ... a_K feeding a sum node).

Slide 29: A mapping for coloring (mapping I).

Slide 30: Coloring one point. The estimated color of a point x is y_hat(x) = a_1 d(x,c_1) + ... + a_K d(x,c_K), rounded to 0 or 1.
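Evaluating mapping I at a single point is just a weighted sum of its K distances to the centers. A small sketch (color_one_point is a hypothetical helper name; centers and coefficients a are assumed already fitted):

```python
import math

def color_one_point(x, centers, a):
    """Mapping I at one point: y_hat(x) = sum_k a_k * dist(x, c_k).
    Rounding y_hat to 0 or 1 picks blue or red."""
    return sum(ak * math.dist(x, ck) for ak, ck in zip(a, centers))
```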

Slide 33: STEP D: coloring n points. Stacking the distances of all n points into the n-by-K matrix D gives y_hat = D*a.

Slide 34: STEP C: optimal coefficients. The coefficients a are the least-squares solution of D*a ≈ y_train, obtained with the pseudo-inverse: a = pinv(D)*y_train.

Slide 35: Flow chart I: optimal coefficients.
  function [centers,a]=OC(x_train,y_train)
  centers = my_kmeans(x_train,3);        % STEP A
  D = cross_distance(x_train,centers);   % STEP B
  a = pinv(D)*y_train;                   % STEP C
Input: x_train, y_train. Output: centers, a.
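pinv(D)*y_train is MATLAB's least-squares solution of D*a ≈ y_train. Assuming D'D is invertible, the same coefficients come from the normal equations a = (D'D)^(-1) D'y. A stdlib-only Python sketch (solve and optimal_coefficients are illustrative helpers, not part of the lecture code):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small
    square system A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def optimal_coefficients(D, y):
    """STEP C via the normal equations: a = (D'D)^(-1) D'y,
    the least-squares answer pinv(D)*y gives when D'D is invertible."""
    n, K = len(D), len(D[0])
    DtD = [[sum(D[i][r] * D[i][c] for i in range(n)) for c in range(K)]
           for r in range(K)]
    Dty = [sum(D[i][r] * y[i] for i in range(n)) for r in range(K)]
    return solve(DtD, Dty)
```

With more training points than centers (n > K) the system is overdetermined, and this returns the coefficients minimizing the squared error ||D*a - y||^2.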

Slide 36: Flow chart II: coloring.
  function y_hat=coloring(x_train,centers,a)
  D = cross_distance(x_train,centers);   % STEP B
  y_hat=D*a;                             % STEP D

Slide 37: Flow chart III: error rate for training.
  function er=err_rate(y_hat,y_train)
  e=abs(y_train-round(y_hat));
  n=length(y_train);
  er=sum(e)/n;
Error rate: ans = 0
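The error rate above just counts how often the rounded prediction misses the true 0/1 label. The same computation in Python:

```python
def err_rate(y_hat, y_true):
    """Fraction of points whose rounded prediction differs
    from the true 0/1 color label."""
    wrong = sum(round(p) != t for p, t in zip(y_hat, y_true))
    return wrong / len(y_true)
```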

Slide 38: Flow chart IV: TRAINING. Given x_train and y_train:
  [centers,a]=OC(x_train,y_train);       % flow chart I
  y_hat=coloring(x_train,centers,a);     % flow chart II
  er=err_rate(y_hat,y_train)             % flow chart III

Slide 39: Flow chart V: TESTING. With centers and a from training, color all data:
  load data_3c
  X=data_3c(:,1:2);
  Y=data_3c(:,3);
  y_hat=coloring(X,centers,a);           % flow chart II
  er=err_rate(y_hat,Y)                   % flow chart III
Error rate: ans =

Slide 40: Mapping II from position to color: a linear combination of exponential distances to centers, y(x) = a_1 φ_1(x) + ... + a_K φ_K(x), where each φ_k(x) decays exponentially with the distance from x to the kth center (an RBF; diagram: weights a_1 ... a_K feeding a sum node).

Slide 41: Discriminating rule.

Slide 43: Step D: coloring. For simplicity, set all variances to a single bandwidth h, so that y_hat = exp(-D/h)*a.

Slide 44: Optimal coefficients: a = pinv(exp(-D/h))*y_train.

Slide 45: Flow chart I: optimal coefficients (mapping II).
  function [centers,a]=OC(x_train,y_train)
  centers = my_kmeans(x_train,3);        % STEP A
  D = cross_distance(x_train,centers);   % STEP B
  h=200;
  a = pinv(exp(-D/h))*y_train;           % STEP C
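Relative to mapping I, the only change is the design matrix: each distance D(i,j) is replaced by the activation exp(-D(i,j)/h) before the least-squares fit. A Python sketch of that substitution (rbf_features and rbf_coloring are hypothetical helper names; h = 200 as on the slide):

```python
import math

H = 200.0  # bandwidth from the slide: all variances set to h

def rbf_features(D, h=H):
    """Turn the cross-distance matrix D into RBF activations
    exp(-D(i,j)/h), the design matrix of mapping II."""
    return [[math.exp(-d / h) for d in row] for row in D]

def rbf_coloring(D, a, h=H):
    """STEP D for mapping II: y_hat = exp(-D/h) * a."""
    Phi = rbf_features(D, h)
    return [sum(f * ak for f, ak in zip(row, a)) for row in Phi]
```

Fitting a then proceeds exactly as in mapping I, with Phi in place of D.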

Slide 46: Flow chart II: coloring (mapping II).
  function y_hat=coloring(x_train,centers,a)
  D = cross_distance(x_train,centers);   % STEP B
  h=200;
  y_hat=exp(-D/h)*a;                     % STEP D
(h must be the same bandwidth, 200, used when fitting a.)

Slide 47: Flow chart III: error rate for training.
  function er=err_rate(y_hat,y_train)
  e=abs(y_train-round(y_hat));
  n=length(y_train);
  er=sum(e)/n;
Error rate: ans = 0

Slide 48: Flow chart IV: TRAINING. Given x_train and y_train:
  [centers,a]=OC(x_train,y_train);       % flow chart I
  y_hat=coloring(x_train,centers,a);     % flow chart II
  er=err_rate(y_hat,y_train)             % flow chart III

Slide 49: Flow chart V: TESTING. With centers and a from training, color all data:
  load data_3c
  X=data_3c(:,1:2);
  Y=data_3c(:,3);
  y_hat=coloring(X,centers,a);           % flow chart II
  er=err_rate(y_hat,Y)                   % flow chart III
Error rate: ans =