NEURAL NETWORKS: Least Squares Estimation
Prepared by PROF. DR. YUSUF OYSAL


System Identification (NEURAL NETWORKS – Least Squares Estimation)

Goal: Determine a mathematical model for an unknown system (the target system) by observing its input-output data pairs.

Purposes:
1. To predict a system's behavior, as in time series prediction and weather forecasting
2. To explain the interactions and relationships between the inputs and outputs of a system

Example: Find the line b = ax that best represents the linear relationship between observed data pairs (x, b).
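For the one-parameter line b = ax above, the least-squares slope has a simple closed form. A minimal sketch, using hypothetical (x, b) observations since the slide's data are not shown:

```python
import numpy as np

# Hypothetical observations for the model b = a*x (illustrative values only).
x = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([2.1, 3.9, 6.2, 7.8])

# Minimizing sum_i (b_i - a*x_i)^2 over a gives the closed-form
# estimate a = (x . b) / (x . x).
a = np.dot(x, b) / np.dot(x, x)
print(a)  # slope of the best-fit line through the origin
```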

Parameter Identification

The data set composed of m desired input-output pairs (u_i, y_i), i = 1, …, m, is called the training data. System identification performs both structure and parameter identification repeatedly until a satisfactory model is found, as follows:
1. Specify and parameterize a class of mathematical models representing the system to be identified.
2. Perform parameter identification to choose the parameters that best fit the training data set.
3. Conduct a validation test to see whether the identified model responds correctly to an unseen data set.
4. Terminate the procedure once the results of the validation test are satisfactory. Otherwise, select another class of models and repeat steps 2 to 4.

Parameter Identification

The structure of the model is known; we need to apply optimization techniques to determine the parameter vector such that the resulting model describes the system appropriately.

General form: y = θ_1 f_1(u) + θ_2 f_2(u) + … + θ_n f_n(u), where
u = (u_1, …, u_p)^T is the model input vector,
f_1, …, f_n are known functions of u,
θ_1, …, θ_n are unknown parameters to be estimated.
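A model of this general form is linear in the parameters θ even when the basis functions f_i are nonlinear in u. A minimal sketch of assembling the basis-function values into a design matrix; the particular basis (constant, identity, square) is an illustrative choice:

```python
import numpy as np

def design_matrix(u, basis):
    """Stack f_j(u_i) into an m-by-n matrix A with A[i, j] = f_j(u_i)."""
    return np.column_stack([f(u) for f in basis])

# Illustrative inputs and basis functions f_1(u)=1, f_2(u)=u, f_3(u)=u^2.
u = np.array([0.0, 1.0, 2.0, 3.0])
basis = [lambda u: np.ones_like(u), lambda u: u, lambda u: u**2]
A = design_matrix(u, basis)
print(A.shape)  # (m, n) = (4, 3)
```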

Parameter Identification

Substituting each training pair (u_i, y_i) into the general form yields the matrix equation A θ = y, where A is the m × n matrix with entries A_ij = f_j(u_i), θ = (θ_1, …, θ_n)^T is the parameter vector, and y = (y_1, …, y_m)^T is the output vector.

Parameter Identification

Since m is usually greater than n, an exact solution of A θ = y generally does not exist; an error vector e is introduced to account for the modeling error: A θ + e = y.

Least-Squares Estimator

Theorem (Least-Squares Estimator, LSE): The squared error ||y − A θ||^2 is minimized when θ = θ̂ (called the least-squares estimator) satisfies the normal equation A^T A θ̂ = A^T y. If A^T A is nonsingular, θ̂ is unique and is given by θ̂ = (A^T A)^(-1) A^T y.
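The theorem can be checked numerically by solving the normal equation directly. A minimal sketch with illustrative, noise-free data, so the LSE recovers the true parameters exactly:

```python
import numpy as np

# Illustrative design matrix: an intercept column and one input column.
A = np.column_stack([np.ones(10), np.linspace(0.0, 1.0, 10)])
theta_true = np.array([1.0, 2.0])
y = A @ theta_true  # noise-free outputs

# Solve the normal equation A^T A theta = A^T y; A^T A is nonsingular here,
# so this yields the unique least-squares estimate.
theta_hat = np.linalg.solve(A.T @ A, A.T @ y)
print(theta_hat)
```

In practice `np.linalg.lstsq(A, y, rcond=None)` is preferred over forming A^T A explicitly, since it is numerically better conditioned, but the normal-equation solve mirrors the theorem as stated.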

Least-Squares Estimator Example

Spring example: the relationship between spring length and the applied force is modeled as L = k_1 f + k_0 (a linear model). Goal: find θ = (k_0, k_1)^T that best fits the data for the given force and length values. The training data for the spring example are tabulated as:

Experiment | Force (Newtons) | Length of Spring
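The spring fit is an ordinary least-squares problem with design matrix columns for k_0 and k_1. A minimal sketch; the numeric force/length pairs below are hypothetical stand-ins, since the table's entries are not reproduced above:

```python
import numpy as np

# Hypothetical force (N) and length measurements for L = k0 + k1*f.
f = np.array([1.0, 2.0, 4.0, 6.0])
L = np.array([5.1, 5.9, 7.8, 9.7])

# Design matrix: a column of ones for k0 and the forces for k1.
A = np.column_stack([np.ones_like(f), f])

# Least-squares solution of A @ (k0, k1) ≈ L.
k0, k1 = np.linalg.lstsq(A, L, rcond=None)[0]
print(k0, k1)
```

Here k1 estimates the spring's compliance (length change per Newton) and k0 its unloaded length.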

Least-Squares Estimator Example

The least-squares estimator θ̂ = (k̂_0, k̂_1)^T minimizes the sum of squared errors Σ_i (L_i − k_1 f_i − k_0)^2 over the training data.