Regularization of Evolving Polynomial Models

Regularization of Evolving Polynomial Models
IWIM 2007 workshop, Sept. 23-26, 2007, Prague
Pavel Kordík, CTU Prague, Faculty of Electrical Engineering, Department of Computer Science

External criteria in GMDH theory

GAME model

Encoding into chromosomes

Polynomial units encoded into neurons
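The slides do not give the unit's equation here; the sketch below assumes the classic two-input Ivakhnenko polynomial common in GMDH-type networks, with least-squares coefficient fitting (the function names are illustrative, not from the talk):

    import numpy as np

    def fit_polynomial_unit(x1, x2, y):
        # Assumed unit form (two-input Ivakhnenko polynomial):
        # y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1**2 + a5*x2**2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coeffs, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs

    def eval_polynomial_unit(coeffs, x1, x2):
        # Evaluate the fitted unit on new inputs.
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        return X @ coeffs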

Data set division
- Training data (set A, 2/3 of the data): optimize coefficients (learning of units)
- Validation data (set B, 1/3 of the data): select surviving units
- Testing data: check whether the model overfits the data
- Open question: adaptive division? (a minimal split sketch follows)
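A minimal sketch of the split above, assuming a random 2/3 vs. 1/3 partition (the testing set is assumed to be held out separately beforehand):

    import numpy as np

    def split_train_validation(X, y, train_fraction=2/3, seed=0):
        # Set A (2/3) trains the units' coefficients; set B (1/3)
        # decides which units survive the selection step.
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(y))
        cut = int(train_fraction * len(y))
        a, b = idx[:cut], idx[cut:]
        return X[a], y[a], X[b], y[b]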

Fitness function – which units survive?
- RMS error of the unit on the training data: feedback for the optimization methods
- RMS error of the unit on the validation data: used to compute the fitness of units
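A minimal sketch of these two error roles (unit_predict stands for any fitted unit, e.g. the polynomial sketch above; the name is illustrative):

    import numpy as np

    def rmse(y_true, y_pred):
        # Root-mean-square error between targets and predictions.
        return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

    # Training RMSE gives feedback to the coefficient optimizer;
    # validation RMSE scores the unit for evolutionary selection:
    #   train_error = rmse(y_train, unit_predict(X_train))
    #   val_error   = rmse(y_val, unit_predict(X_val))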

External criterion
- Computed as the error on the validation set
- Fitness = 1/CR
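The slide gives only Fitness = 1/CR; a minimal sketch, assuming the simplest case where CR is the RMS error on the validation set (regularized variants such as CRrms-r-val add a complexity penalty, as the next slides discuss):

    def fitness_from_criterion(cr):
        # Lower external criterion (validation error) -> higher fitness.
        return 1.0 / cr

    # Example: a unit with validation RMSE 0.25 gets fitness 4.0.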

CRrms-r-val criterion on real data: the optimal value of R is 300 on the Antro data set.

CRrms-r-val criterion on real data: the optimal value of R is 725 on the Building data set.

CR should be sensitive to noise

How to estimate the penalization strength (1/R)? Variance of the output variable?
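The talk leaves this as an open question, so the sketch below only illustrates the idea; tying 1/R to Var(y), and the direction of that relation, are assumed placeholders rather than formulas from the slides:

    import numpy as np

    def estimate_penalization_strength(y, c=1.0):
        # Placeholder assumption: make the penalization strength 1/R a
        # function of the output variance; the constant c (and whether the
        # relation is direct or inverse) would need per-data-set calibration
        # (e.g. Ropt = 300 on Antro, 725 on Building).
        return c / np.var(y)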

Experiments with synthetic data: validate only on the validation set! Regularization works, but the difference between the R = 300 and p-n variants is not significant.

Theoretical and experimental aspects of regularization

So which criterion is the best?

Regularized polynomial models on the Antro data set: it is evident that the optimal value of R lies between 100 and 1000 – the same result as in our previous experiments with the Antro data set (Ropt = 300). Linear models are still better than the best polynomial!

Conclusion
- Experiments with regularization of polynomial models
- Every data set requires a different level of penalization for complexity
- The penalization can be partially derived from the variance of the output variable
- The regularization is still not sufficient: linear models perform better on highly noisy data sets!

Thank you!