SYSTEMS Identification


SYSTEMS Identification. Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad. Reference: "System Identification: Theory for the User", Lennart Ljung.

Lecture 4: Models of linear time-invariant systems. Topics to be covered: linear models and sets of linear models; a family of transfer function models; state-space models; identifiability of some model structures.

Linear models and sets of linear models

Linear models and sets of linear models. A complete model is given by

y(t) = G(q) u(t) + H(q) e(t),

with e(t) having the probability density function fe(x). A particular model thus corresponds to a specification of the functions G and H and of fe. Most often fe is not specified as a function; only the first and second moments are specified:

E e(t) = 0, E e^2(t) = λ.

It is also common to assume that e(t) is Gaussian.

Linear models and sets of linear models. Sets of models: we parameterize the coefficients of G and H by a vector θ,

y(t) = G(q, θ) u(t) + H(q, θ) e(t),

where θ ranges over a subset DM of R^d. We thus have a set of models

M = { (G(q, θ), H(q, θ)) | θ in DM }.

A family of transfer function models

A family of transfer function models. Equation error model structure. The input-output relation is

y(t) + a1 y(t-1) + ... + ana y(t-na) = b1 u(t-1) + ... + bnb u(t-nb) + e(t),

where the AR part acts on the output and the X (exogenous) part on the input. The adjustable parameters in this case are

θ = [a1 ... ana b1 ... bnb]^T.

Define A(q) = 1 + a1 q^-1 + ... + ana q^-na and B(q) = b1 q^-1 + ... + bnb q^-nb. So we have the ARX model

A(q) y(t) = B(q) u(t) + e(t),

i.e. G(q, θ) = B(q)/A(q) and H(q, θ) = 1/A(q). (Block diagram: the ARX model structure, with u entering through B, e added, and 1/A acting on the sum to give y.)

A family of transfer function models. Equation error model structure. The one-step-ahead predictor is

yhat(t|θ) = B(q) u(t) + [1 - A(q)] y(t).

Now if we introduce the regression vector

φ(t) = [-y(t-1) ... -y(t-na) u(t-1) ... u(t-nb)]^T,

the predictor becomes yhat(t|θ) = φ^T(t) θ, which is known as a linear regression in statistics.
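Because the ARX predictor is linear in θ, the parameters can be estimated by ordinary least squares. A minimal numerical sketch (the second-order system and its coefficients below are hypothetical, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a second-order ARX system (hypothetical coefficients):
#   y(t) = -a1*y(t-1) - a2*y(t-2) + b1*u(t-1) + b2*u(t-2) + e(t)
a1, a2 = 0.5, -0.2
b1, b2 = 1.0, 0.5
N = 2000
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
y = np.zeros(N)
for t in range(2, N):
    y[t] = -a1*y[t-1] - a2*y[t-2] + b1*u[t-1] + b2*u[t-2] + e[t]

# Regression vectors phi(t) = [-y(t-1), -y(t-2), u(t-1), u(t-2)]^T,
# stacked row-wise for t = 2, ..., N-1
Phi = np.column_stack([-y[1:-1], -y[:-2], u[1:-1], u[:-2]])

# Least-squares estimate of theta = [a1, a2, b1, b2]^T
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print(theta)  # close to [0.5, -0.2, 1.0, 0.5]
```

With persistently exciting input and white equation error, the least-squares estimate is consistent, which is the main practical appeal of the ARX structure.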

A family of transfer function models. Exercise (4E.1): Consider the ARX model structure, where b1 is known to be 0.5. Write the corresponding predictor in the linear regression form

yhat(t|θ) = φ^T(t) θ + μ(t),

where μ(t) is a known, data-dependent term.

A family of transfer function models. ARMAX model structure. Adding a moving-average (MA) description of the noise to the AR and X parts gives

y(t) + a1 y(t-1) + ... + ana y(t-na) = b1 u(t-1) + ... + bnb u(t-nb) + e(t) + c1 e(t-1) + ... + cnc e(t-nc).

With A(q) and B(q) as before and C(q) = 1 + c1 q^-1 + ... + cnc q^-nc, we have the ARMAX model

A(q) y(t) = B(q) u(t) + C(q) e(t),

so G(q, θ) = B(q)/A(q) and H(q, θ) = C(q)/A(q), where now

θ = [a1 ... ana b1 ... bnb c1 ... cnc]^T.

A family of transfer function models. Then we have the predictor

yhat(t|θ) = B(q) u(t) + [1 - A(q)] y(t) + [C(q) - 1][y(t) - yhat(t|θ)],

or

C(q) yhat(t|θ) = B(q) u(t) + [C(q) - A(q)] y(t).

To start this recursion up at time t = 0 and predict y(1) requires knowledge of data and predictions prior to t = 0. One can simply take these as zero, but the resulting predictor then differs from the true one by a term that decays as c·μ^t, where μ is the maximum magnitude of the zeros of C(z). Exercise (4G.1): Show that the effect of an erroneous initial condition on yhat(t|θ) is bounded by c·μ^t.
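The geometric decay of an initial-condition error can be seen directly from the predictor recursion. A sketch for a first-order ARMAX model (all coefficients hypothetical): the recursion is run from two different initial values, and the difference between the two runs decays exactly as |c1|^t, the magnitude of the zero of C(z).

```python
import numpy as np

# ARMAX(1,1,1) predictor recursion (hypothetical coefficients):
#   C(q) yhat(t) = B(q) u(t) + [C(q) - A(q)] y(t)
# written out: yhat(t) = -c1*yhat(t-1) + b1*u(t-1) + (c1 - a1)*y(t-1)
a1, b1, c1 = -0.7, 1.0, 0.5
rng = np.random.default_rng(1)
N = 40
u = rng.standard_normal(N)
e = rng.standard_normal(N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = -a1*y[t-1] + b1*u[t-1] + e[t] + c1*e[t-1]

def predictor(yhat0):
    """Run the predictor recursion from a given initial condition."""
    yhat = np.zeros(N)
    yhat[0] = yhat0
    for t in range(1, N):
        yhat[t] = -c1*yhat[t-1] + b1*u[t-1] + (c1 - a1)*y[t-1]
    return yhat

# Two runs that differ only in the initial condition:
diff = np.abs(predictor(0.0) - predictor(5.0))
# diff[t] equals 5 * |c1|**t: the error decays with the zero of C(z)
```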

A family of transfer function models. We saw that starting the recursion at time t = 0 and predicting y(1) requires knowledge of data before t = 0. It is also possible to start the recursion at time max(na, nb) and include the unknown initial conditions ε(k|θ), k = 1, 2, …, nc, in the vector θ. Now if we introduce the prediction error ε(t, θ) = y(t) - yhat(t|θ) and the vector

φ(t, θ) = [-y(t-1) ... -y(t-na) u(t-1) ... u(t-nb) ε(t-1, θ) ... ε(t-nc, θ)]^T,

the predictor can be written yhat(t|θ) = φ^T(t, θ) θ. Since φ(t, θ) itself depends on θ, this is called a pseudolinear regression.

A family of transfer function models. Other equation-error-type model structures. Adding an AR description of the error gives the ARARX model:

A(q) y(t) = B(q) u(t) + [1/D(q)] e(t),

with D(q) = 1 + d1 q^-1 + ... + dnd q^-nd. We could instead use an ARMA description for the error, which gives the ARARMAX model:

A(q) y(t) = B(q) u(t) + [C(q)/D(q)] e(t).

(Block diagram: the equation error model family, with u entering through B, the disturbance through its error model, and 1/A acting on the sum to give y.)

A family of transfer function models. Output error model structure. If we suppose that the relation between the input and the undisturbed output w(t) can be written as

w(t) + f1 w(t-1) + ... + fnf w(t-nf) = b1 u(t-1) + ... + bnb u(t-nb),

and that the measured output is y(t) = w(t) + e(t), then with F(q) = 1 + f1 q^-1 + ... + fnf q^-nf we obtain the OE model

y(t) = [B(q)/F(q)] u(t) + e(t),

with θ = [b1 ... bnb f1 ... fnf]^T. (Block diagram: the output error model structure, with u passing through B/F and the white noise e added directly at the output.)

A family of transfer function models. Output error model structure. Note that w(t) is never observed; instead it is constructed from past inputs alone:

w(t, θ) = [B(q)/F(q)] u(t).

So the predictor is yhat(t|θ) = w(t, θ), and with

φ(t, θ) = [u(t-1) ... u(t-nb) -w(t-1, θ) ... -w(t-nf, θ)]^T

we again have a pseudolinear regression yhat(t|θ) = φ^T(t, θ) θ.
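The construction of w(t, θ) from the input alone is a simple recursion. A sketch for a second-order OE model (coefficients hypothetical); fed with a unit pulse, the predictor output is the delayed impulse response of B(q)/F(q):

```python
import numpy as np

# Output-error predictor: yhat(t|theta) = w(t, theta), where w is built
# from past inputs only. Hypothetical OE(2,2) coefficients:
#   F(q) = 1 + f1 q^-1 + f2 q^-2,  B(q) = b1 q^-1 + b2 q^-2
b1, b2 = 1.0, 0.5
f1, f2 = -1.5, 0.7

def oe_predict(u):
    """Construct w(t, theta) = [B(q)/F(q)] u(t) recursively."""
    w = np.zeros(len(u))
    for t in range(2, len(u)):
        w[t] = -f1*w[t-1] - f2*w[t-2] + b1*u[t-1] + b2*u[t-2]
    return w

u = np.zeros(20)
u[2] = 1.0                 # unit pulse at t = 2
yhat = oe_predict(u)       # delayed impulse response of B(q)/F(q)
```

Note that the predictor ignores past outputs entirely: this is what makes OE predictions depend only on u, in contrast to the equation error family.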

A family of transfer function models. Box-Jenkins model structure. A natural development of the output error model is to further model the properties of the output error. Describing the output error with an ARMA model gives

y(t) = [B(q)/F(q)] u(t) + [C(q)/D(q)] e(t),

the BJ model of Box and Jenkins (1970). (Block diagram: the BJ model structure, with u through B/F and e through C/D, summed to give y.)

A family of transfer function models. A general family of model structures. The structures we have discussed in this section may give rise to 32 different model sets, depending on which of the five polynomials A, B, C, D, F are used. For convenience, we shall therefore use a generalized model structure:

A(q) y(t) = [B(q)/F(q)] u(t) + [C(q)/D(q)] e(t).

(Block diagram: the general model structure.)

A family of transfer function models. Sometimes the dynamics from u to y contains a delay of nk samples, so

B(q) = b_nk q^-nk + b_(nk+1) q^-(nk+1) + ... + b_(nk+nb-1) q^-(nk+nb-1).

But for simplicity we take nk = 1 in the sequel; a larger delay is handled simply by fixing the corresponding leading b-coefficients to zero.

A family of transfer function models. Some common black-box SISO models arise as special cases of the generalized model structure, depending on which of the five polynomials A, B, C, D, F are used:

Polynomials used    Name of model structure
B                   FIR (finite impulse response)
AB                  ARX
ABC                 ARMAX
AC                  ARMA
ABD                 ARARX
ABCD                ARARMAX
BF                  OE (output error)
BFCD                BJ (Box-Jenkins)

A family of transfer function models. General model structure. The predictor is

yhat(t|θ) = [D(q)B(q) / (C(q)F(q))] u(t) + [1 - D(q)A(q)/C(q)] y(t),

and the prediction error is

ε(t, θ) = [D(q)/C(q)] [A(q) y(t) - (B(q)/F(q)) u(t)].

A pseudolinear form for the general model structure is obtained, as before, by collecting delayed values of y and u together with the auxiliary signals w(t, θ) and ε(t, θ) in a vector φ(t, θ), so that yhat(t|θ) = φ^T(t, θ) θ.

A family of transfer function models. Other model structures. Consider the FIR model

y(t) = B(q) u(t) + e(t) = b1 u(t-1) + ... + bnb u(t-nb) + e(t).

It is a linear regression (being a special case of ARX), so the model can be estimated effectively. It is at the same time an output error model (being a special case of OE), so it is robust against noise modeling errors. The basic disadvantage is that many parameters may be needed if the impulse response decays slowly, i.e., if the dominating time constant is large compared with the sampling interval. A natural question is whether it is possible to retain the linear regression and output error features while offering better possibilities to treat slowly decaying impulse responses; this leads to structures built from orthonormal filter banks, such as Laguerre models.
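The FIR model inherits the least-squares machinery of ARX directly. A sketch with a hypothetical quickly decaying impulse response, where the regression matrix contains only delayed inputs:

```python
import numpy as np

rng = np.random.default_rng(2)

# FIR(4) system (hypothetical, quickly decaying impulse response):
#   y(t) = b1*u(t-1) + b2*u(t-2) + b3*u(t-3) + b4*u(t-4) + e(t)
b_true = np.array([1.0, 0.6, 0.36, 0.216])
N = 5000
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
y = np.zeros(N)
for t in range(4, N):
    y[t] = b_true @ u[t-4:t][::-1] + e[t]   # [u(t-1), ..., u(t-4)]

# phi(t) = [u(t-1), u(t-2), u(t-3), u(t-4)]^T: a plain linear
# regression that is simultaneously an output-error model
Phi = np.column_stack([u[3:-1], u[2:-2], u[1:-3], u[:-4]])
theta, *_ = np.linalg.lstsq(Phi, y[4:], rcond=None)
```

Since the regressors contain no past outputs, the estimate stays consistent even if the additive output disturbance were colored, which is the robustness property noted above.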

State space models

State space models. For most physical systems it is easier to construct models with physical insight in continuous time:

xdot(t) = A(θ) x(t) + B(θ) u(t),
η(t) = C(θ) x(t).

Here θ is a vector of parameters that typically correspond to unknown values of physical coefficients, material constants, and the like, and η(t) denotes the measurements that would be obtained with ideal, noise-free sensors. We can derive the transfer operator from u to η:

η(t) = Gc(p, θ) u(t), with Gc(p, θ) = C(θ) [pI - A(θ)]^-1 B(θ),

where p is the differentiation operator.

State space models. Sampling the transfer function. Let the input be constant over the sampling intervals: u(t) = u(kT) for kT ≤ t < kT + T. Then for 0 ≤ τ < T the state x(kT + τ) is

x(kT + τ) = e^{A(θ)τ} x(kT) + [∫_0^τ e^{A(θ)s} B(θ) ds] u(kT),

so at the next sampling instant x(kT + T) is

x(kT + T) = AT(θ) x(kT) + BT(θ) u(kT),

with AT(θ) = e^{A(θ)T} and BT(θ) = ∫_0^T e^{A(θ)s} B(θ) ds. From this sampled model we can again derive the transfer operator from u to η.
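Both AT(θ) and BT(θ) can be computed from a single matrix exponential of an augmented matrix, a standard trick. A sketch using scipy.linalg.expm, with the double integrator as a worked example:

```python
import numpy as np
from scipy.linalg import expm

def c2d_zoh(A, B, T):
    """Zero-order-hold sampling: AT = exp(A*T), BT = int_0^T exp(A*s) B ds.
    Both follow from one matrix exponential of the augmented matrix
    [[A, B], [0, 0]]."""
    n, m = B.shape
    M = np.zeros((n + m, n + m))
    M[:n, :n] = A
    M[:n, n:] = B
    E = expm(M * T)
    return E[:n, :n], E[:n, n:]

# Example: double integrator x1' = x2, x2' = u, sampled with T = 0.1
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
AT, BT = c2d_zoh(A, B, 0.1)
# AT = [[1, T], [0, 1]] and BT = [[T^2/2], [T]]
```

Because AT and BT are smooth functions of (A(θ), B(θ)), the sampled model inherits the physical parameterization of the continuous-time one.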

State Space models Example 4.1: DC servomotor

State space models. Example 4.1: DC servomotor. Let the armature inductance La ≈ 0; the motor then reduces to a second-order model from applied voltage u to shaft angle. With the state x = [angle; angular velocity], it takes the form

xdot(t) = [0 1; 0 -θ1] x(t) + [0; θ2] u(t), η(t) = [1 0] x(t),

where θ1 and θ2 are built from the mechanical and electrical constants of the motor.

State space models. Example 4.1: DC servomotor. Assume that the actual measurement is made with a certain noise, so

y(kT) = η(kT) + v(kT),

with v being white noise. The natural predictor is then obtained by sampling the model and simulating it from the input alone (an output-error-type predictor). This predictor is parameterized using only two parameters, whereas an ARX or OE model of the same order contains four adjustable parameters. On the other hand, this two-parameter method is far more complicated to work with than ARX or OE.

State space models. A standard discrete-time state-space model is

x(t+1) = A(θ) x(t) + B(θ) u(t),
y(t) = C(θ) x(t) + v(t),

corresponding to G(q, θ) = C(θ) [qI - A(θ)]^-1 B(θ). Although sampling a continuous-time model is a natural way to obtain the discrete-time model, for certain applications a direct discrete-time parameterization is better, since the matrices A, B and C are then parameterized directly in terms of θ.

State space models. Noise representation and the time-invariant Kalman filter. A straightforward but entirely valid approach would be to add a transfer function noise model:

y(t) = G(q, θ) u(t) + H(q, θ) e(t),

with {e(t)} being white noise with variance λ. Note: the θ-parameters in H(q, θ) could be partly in common with those in G(q, θ), or be extra. A more physical alternative is to add noise directly to the state-space equations:

x(t+1) = A(θ) x(t) + B(θ) u(t) + w(t) (process noise),
y(t) = C(θ) x(t) + v(t) (measurement noise),

where {w(t)} and {v(t)} are assumed to be sequences of independent random variables with zero mean and covariances

E w(t) w^T(t) = R1(θ), E v(t) v^T(t) = R2(θ), E w(t) v^T(t) = R12(θ).

State space models. Noise representation and the time-invariant Kalman filter. {w(t)} and {v(t)} may often be signals whose physical origins are known. In the DC-motor example, the load variation Tl(t) is a "process noise", and the inaccuracy of the potentiometer angular sensor is the "measurement noise". In such cases it may of course not always be realistic to assume that these signals are white noise.

State space models. Exercise (4G.2): Colored measurement noise in the state-space model (I).

State space models. For state-space descriptions, the conditional expectation of y(t), given data y(s), u(s), s ≤ t-1, is

yhat(t|θ) = C(θ) xhat(t, θ),

where the conditional expectation xhat(t, θ) of x(t) is given by the Kalman filter:

xhat(t+1, θ) = A(θ) xhat(t, θ) + B(θ) u(t) + K(θ) [y(t) - C(θ) xhat(t, θ)].

Here the gain K(θ) is given by

K(θ) = [A(θ) Pbar(θ) C^T(θ) + R12(θ)] [C(θ) Pbar(θ) C^T(θ) + R2(θ)]^-1,

where Pbar(θ) is obtained as the positive semidefinite solution of the stationary Riccati equation (θ-arguments suppressed):

Pbar = A Pbar A^T + R1 - [A Pbar C^T + R12][C Pbar C^T + R2]^-1 [A Pbar C^T + R12]^T.
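The stationary Riccati equation can be solved numerically with scipy; the filtering form is the dual of the control Riccati equation, so A^T and C^T are passed in. A sketch on a hypothetical second-order system:

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Hypothetical second-order system, for illustration only
A = np.array([[0.9, 0.2], [0.0, 0.7]])
C = np.array([[1.0, 0.0]])
R1 = 0.1 * np.eye(2)       # process-noise covariance
R2 = np.array([[1.0]])     # measurement-noise covariance
R12 = np.zeros((2, 1))     # cross-covariance

# The stationary filtering Riccati equation is the dual of the
# control ARE, so pass A^T and C^T to solve_discrete_are
P = solve_discrete_are(A.T, C.T, R1, R2, s=R12)

# Stationary Kalman (predictor) gain
K = (A @ P @ C.T + R12) @ np.linalg.inv(C @ P @ C.T + R2)

# Residual of Pbar = A Pbar A^T + R1 - K (C Pbar C^T + R2) K^T
resid = A @ P @ A.T + R1 - K @ (C @ P @ C.T + R2) @ K.T - P
```

The residual check at the end confirms that the returned P indeed satisfies the stationary Riccati equation written in the gain form used above.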

State space models. The predictor filter can thus be written as

yhat(t|θ) = C(θ) [qI - A(θ) + K(θ)C(θ)]^-1 B(θ) u(t) + C(θ) [qI - A(θ) + K(θ)C(θ)]^-1 K(θ) y(t).

Exercise: Show that the covariance matrix of the state estimation error x(t) - xhat(t, θ) is Pbar(θ).

State space models. Innovation representation. The innovation is the part of y(t) that cannot be predicted from past data:

e(t) = y(t) - yhat(t|θ) = y(t) - C(θ) xhat(t, θ).

Rewriting the Kalman filter in terms of e(t) gives the innovation form of the state-space description:

xhat(t+1, θ) = A(θ) xhat(t, θ) + B(θ) u(t) + K(θ) e(t),
y(t) = C(θ) xhat(t, θ) + e(t).

Exercise: Show that the covariance of e(t) is Λ(θ) = C(θ) Pbar(θ) C^T(θ) + R2(θ).

State space models. Innovation representation. Instead of computing K(θ) from R1, R2 and R12 via the Riccati equation, we can let the entries of K themselves be parameters: the directly parameterized innovations form. Which one involves fewer parameters? Either, depending on the situation.

State space models. Innovation representation. Writing out the transfer functions of the innovation form for a single-output system in companion form shows that it is an ARMAX model.

State space models. Example 4.2: Companion form parameterization. For a third-order SISO system, let

A(θ) = [-a1 1 0; -a2 0 1; -a3 0 0], B(θ) = [b1; b2; b3], C = [1 0 0], K(θ) = [k1; k2; k3].

So we have an ARMAX model with

A(q) = 1 + a1 q^-1 + a2 q^-2 + a3 q^-3,
B(q) = b1 q^-1 + b2 q^-2 + b3 q^-3,
C(q) = 1 + c1 q^-1 + c2 q^-2 + c3 q^-3, where ci = ai + ki.
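The A(q) and C(q) polynomials can be read off numerically: A(q) corresponds to the characteristic polynomial of the matrix A, and C(q) to that of A - K·C, which makes the relation ci = ai + ki visible. A sketch with hypothetical coefficients:

```python
import numpy as np

# Observer companion form (hypothetical third-order coefficients)
a = [0.5, -0.3, 0.1]
k = [0.2, 0.1, 0.05]
A = np.array([[-a[0], 1.0, 0.0],
              [-a[1], 0.0, 1.0],
              [-a[2], 0.0, 0.0]])
C = np.array([[1.0, 0.0, 0.0]])
K = np.array([[k[0]], [k[1]], [k[2]]])

# A(q) <-> characteristic polynomial of A;
# C(q) <-> characteristic polynomial of A - K C, i.e. c_i = a_i + k_i
Apoly = np.poly(A)          # [1, a1, a2, a3]
Cpoly = np.poly(A - K @ C)  # [1, a1+k1, a2+k2, a3+k3]
```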

Identifiability of some model structures

Identifiability of some model structures. Some notation. It is convenient to introduce the more compact notation

T(q, θ) = [G(q, θ) H(q, θ)].

The one-step-ahead predictor is

yhat(t|θ) = W_u(q, θ) u(t) + W_y(q, θ) y(t) = W(q, θ) z(t),

with W_u = H^-1 G, W_y = 1 - H^-1, W = [W_u W_y], and z(t) = [u(t); y(t)].

Identifiability of some model structures. Definition 4.1. A predictor model of a linear, time-invariant system is a stable filter W(q). Definition 4.2. A complete probabilistic model of a linear, time-invariant system is a pair (W(q), fe(x)) of a predictor model W(q) and the PDF fe(x) of the associated prediction errors. Clearly, we can also have models where the PDF is only partially specified (e.g., by the variance of e). We shall say that two models W1(q) and W2(q) are equal if

W1(e^{iω}) = W2(e^{iω}) for almost all ω.

Identifiability of some model structures. Identifiability properties. The problem is whether the identification procedure will yield a unique value of the parameter θ, and/or whether the resulting model is equal to the true system.

Definition 4.6. A model structure M is globally identifiable at θ* if M(θ) = M(θ*), θ in DM, implies θ = θ*.

Definition 4.7. A model structure M is strictly globally identifiable if it is globally identifiable at all θ* in DM. This definition is quite demanding; a weaker and more realistic property is:

Definition 4.8. A model structure M is globally identifiable if it is globally identifiable at almost all θ* in DM.

For the corresponding local property, the most natural definition of local identifiability of M at θ* is to require that there exist an ε > 0 such that M(θ) = M(θ*), |θ - θ*| ≤ ε, implies θ = θ*.
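Definition 4.6 can be probed numerically: if two distinct θ-values produce identical transfer functions, the structure is not globally identifiable at those points. A sketch with a hypothetical over-parameterized structure containing a common factor in numerator and denominator:

```python
import numpy as np

def freq_resp(num, den, w):
    """Evaluate num/den at q = e^{iw}; coefficient lists are in
    ascending powers of q^-1."""
    z = np.exp(-1j * w)
    return np.polyval(num[::-1], z) / np.polyval(den[::-1], z)

# Over-parameterized structure: a common factor (1 + c q^-1) in both
# polynomials means every value of c yields the same transfer
# function b q^-1 / (1 + a q^-1)
w = np.linspace(0.01, np.pi, 50)
a, b = 0.5, 1.0
Gs = []
for c in (0.2, 0.9):
    num = np.convolve([0.0, b], [1.0, c])   # B(q) = b q^-1 (1 + c q^-1)
    den = np.convolve([1.0, a], [1.0, c])   # A(q) = (1+a q^-1)(1+c q^-1)
    Gs.append(freq_resp(num, den, w))
# Gs[0] == Gs[1]: distinct theta, identical model -> not identifiable
```

This pole-zero cancellation is the classical mechanism by which over-parameterized black-box structures lose identifiability.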

Identifiability of some model structures. Use of the identifiability concept. The identifiability concept concerns the unique representation of a given system description in a model structure. Let such a description be

S: y(t) = G0(q) u(t) + H0(q) e(t),

and let M be a model structure based on one-step-ahead predictors for

M(θ): y(t) = G(q, θ) u(t) + H(q, θ) e(t).

Then define the set DT(S, M) as those θ-values in DM for which S = M(θ):

DT(S, M) = { θ in DM | G(e^{iω}, θ) = G0(e^{iω}) and H(e^{iω}, θ) = H0(e^{iω}) for almost all ω }.

The set is empty in case S does not belong to the model set. Now suppose that S does, so that S = M(θ0) for some θ0.

Identifiability of some model structures. A model structure is globally identifiable at θ* if and only if

DT(M(θ*), M) = {θ*}.

Parameterization in terms of physical parameters: for physically parameterized models, identifiability amounts to whether the physical parameters can be uniquely recovered from the resulting transfer functions.
