Presentation of the paper "Combining Regression Trees and Radial Basis Function Networks" by M. Orr, J. Hallam, K. Takezawa, A. Murray, S. Ninomiya, M. Oide, T. Leonard.


Combining Regression Trees and Radial Basis Function Networks paper by: M. Orr, J. Hallam, K. Takezawa, A. Murray, S. Ninomiya, M. Oide, T. Leonard presentation by: Vladimir Vacić

Contents:
- Regression Trees
- Radial Basis Function Neural Networks
- Combining RTs and RBF NNs
- Method
- Experimental Results
- Conclusion

Regression Trees: (diagram) each internal node splits on X_ik < b, sending examples to the left subtree S_L or the right subtree S_R.

Radial Basis Function Neural Networks
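As a reminder of what the slide's network computes, here is a minimal sketch of a Gaussian RBF network's forward pass (illustrative names, not the authors' code): each unit responds to distance from its center, and the output is a weighted sum of unit responses.

```python
import math

def gaussian_rbf(x, center, radius):
    """Gaussian basis function: response decays with squared
    distance from the unit's center, scaled by its radius."""
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-d2 / (2.0 * radius ** 2))

def rbf_predict(x, centers, radii, weights, bias=0.0):
    """Network output: weighted sum of basis-function responses."""
    return bias + sum(w * gaussian_rbf(x, c, r)
                      for w, c, r in zip(weights, centers, radii))
```

At a unit's center the response is exactly 1, so a single unit with weight w predicts w there and decays smoothly away from it.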

Combining RTs and RBF NNs:
- the RT generates candidate units for the RBF NN
- the RT specifies RBF centers and radii
- the RT influences the order in which candidate units are evaluated

Method: Generating the regression tree:
- recursively cut along the k dimensions
- determine the output for each node
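The recursive cutting can be sketched as standard CART-style growth (a simplified illustration under assumed conventions, not the paper's implementation): at each node, try every dimension and candidate threshold, keep the cut that most reduces squared error, and store the mean target as the node's output.

```python
def node_output(ys):
    return sum(ys) / len(ys)

def sse(ys):
    m = node_output(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(X, y):
    # try every dimension k and midpoint threshold b; keep the cut
    # that minimizes the summed squared error of the two sides
    best = None
    for k in range(len(X[0])):
        vals = sorted(set(x[k] for x in X))
        for lo, hi in zip(vals, vals[1:]):
            b = (lo + hi) / 2.0
            left = [yi for xi, yi in zip(X, y) if xi[k] < b]
            right = [yi for xi, yi in zip(X, y) if xi[k] >= b]
            cost = sse(left) + sse(right)
            if best is None or cost < best[0]:
                best = (cost, k, b)
    return best  # (cost, dimension, threshold) or None

def grow_tree(X, y, min_size=2):
    if len(y) < min_size or len(set(y)) == 1:
        return {"output": node_output(y)}
    split = best_split(X, y)
    if split is None:
        return {"output": node_output(y)}
    _, k, b = split
    L = [(xi, yi) for xi, yi in zip(X, y) if xi[k] < b]
    R = [(xi, yi) for xi, yi in zip(X, y) if xi[k] >= b]
    return {"output": node_output(y), "dim": k, "threshold": b,
            "left": grow_tree([x for x, _ in L], [v for _, v in L], min_size),
            "right": grow_tree([x for x, _ in R], [v for _, v in R], min_size)}
```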

Method: Transforming tree nodes into RBFs: each node's hyper-rectangle determines the RBF's center and radius.
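A plausible sketch of this mapping (the exact scaling in the paper is controlled by a free parameter; `alpha` here is that assumed scaling knob): the center is the midpoint of the node's hyper-rectangle, and each per-dimension radius is proportional to the side length.

```python
def node_to_rbf(lower, upper, alpha=0.5):
    """Map a tree node's hyper-rectangle [lower, upper] (one bound
    per dimension) to an RBF: center at the midpoint, per-dimension
    radius proportional to the side length (alpha scales the ratio
    of radius to rectangle size)."""
    center = [(lo + hi) / 2.0 for lo, hi in zip(lower, upper)]
    radii = [alpha * (hi - lo) for lo, hi in zip(lower, upper)]
    return center, radii
```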

Method: Selecting RBF units from the set of candidates. This is necessary because:
- so far we have not performed any pruning of the regression tree
- an overly complex RBF network risks over-fitting
- a complex RBF network is computationally expensive

Method: Selecting RBF units from the set of candidates:
- standard selection methods are forward selection, backward elimination, a combination of the two, full combinatorial search, etc.
- the problem with forward selection is that one choice may block subsequent informative choices

Method: Using the tree to guide selection:
- put the root node into the list of active nodes
- for each active node, consider the effect of adding one or both of its children and keeping or removing the parent
- choose the combination which improves performance the most and update the active list
- repeat
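The active-list loop above can be sketched as a greedy search (a simplified illustration: it tries adding both children with the parent kept or removed, and `evaluate` stands in for whatever model-quality score is used, e.g. BIC, lower being better):

```python
def tree_guided_selection(root, evaluate):
    """Greedy tree-guided selection over nodes stored as dicts with
    optional 'left'/'right' children. 'evaluate' scores a candidate
    set of active nodes (lower is better); any change that improves
    the score is accepted, and the search repeats until no move helps."""
    active = [root]
    best = evaluate(active)
    improved = True
    while improved:
        improved = False
        for node in list(active):
            children = [c for c in (node.get("left"), node.get("right"))
                        if c is not None]
            if not children:
                continue
            for keep_parent in (True, False):
                trial = [n for n in active
                         if keep_parent or n is not node]
                # add children not already active (identity comparison)
                trial += [c for c in children
                          if not any(c is n for n in trial)]
                score = evaluate(trial)
                if score < best:
                    best, active, improved = score, trial, True
    return active, best
```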

Method: Calculating the weights: least-squares minimization.
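Once the units are fixed, the output weights follow from ordinary least squares. A self-contained sketch (solving the normal equations directly; in practice a library routine such as a QR-based solver would be preferred for numerical stability):

```python
def lstsq_weights(H, y):
    """Solve the normal equations (H^T H) w = H^T y by Gaussian
    elimination with partial pivoting. H is the design matrix of
    RBF responses (rows = training points, columns = selected units)."""
    m = len(H[0])
    A = [[sum(H[r][i] * H[r][j] for r in range(len(H))) for j in range(m)]
         for i in range(m)]
    b = [sum(H[r][i] * y[r] for r in range(len(H))) for i in range(m)]
    # forward elimination
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    w = [0.0] * m
    for i in range(m - 1, -1, -1):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, m))) / A[i][i]
    return w
```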

Method: Model selection criterion: the Bayesian information criterion (BIC). BIC imposes a penalty for model complexity and hence leads to smaller networks.
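For a Gaussian-noise regression model, one common form of BIC is n ln(SSE/n) + k ln(n), where k counts the free parameters; the k ln(n) term is the complexity penalty the slide refers to. A small sketch (the paper's exact variant may differ):

```python
import math

def bic(y_true, y_pred, num_params):
    """BIC for Gaussian-noise regression: n*ln(SSE/n) + k*ln(n).
    Lower is better; the k*ln(n) term penalizes larger networks,
    so growing the RBF network only pays off if the fit improves
    enough to offset the penalty."""
    n = len(y_true)
    sse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    return n * math.log(sse / n) + num_params * math.log(n)
```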

Method: Note that so far we have had 2 free parameters: p (controls the resulting network size) and α (controls the ratio of the RBF radii to the corresponding hyper-rectangle size).

Experimental Results: the authors report that the best experimentally determined p and α on the training set do not always yield the best performance on the test set. Instead, they suggest taking a set of the best values of p and α from training and then finding the best combination on the test set.

Experimental Results:
- 2D sine wave problem
- simulated circuit problem

Experimental Results: Comparison with other learning methods:
- linear least-squares regression
- k-nearest neighbor
- ensembles of multilayer perceptrons
- multilayer perceptrons trained using MCMC
- multivariate adaptive regression splines (MARS) with bagging

Experimental Results: Datasets:
- DELVE dataset (non-linear, high noise, 8- and 32-dimensional examples), generated from simulated robotic arms
- soybean classification into three classes (good, fair, poor) from digital images

Experimental Results: DELVE, 8-dimensional examples:

Experimental Results: DELVE, 32-dimensional examples:

Conclusion: an improvement on, and analysis of, previous work by Kubat; combining RTs and RBF NNs as a technique is competitive with some leading modern methods.