EE-M110 2006/7, EF L17 1/12, v1.0 Lecture 17: ARMAX and other Linear Model Structures Dr Martin Brown Room: E1k, Control Systems Centre

EE-M110 2006/7, EF L17 2/12, v1.0 L17: Resources & Learning Objectives
Core texts: Ljung, Chapters 2, 3 & 4.
In this lecture we're looking at the basic ARMAX model structure and considering:
1. How it differs from the ARX representation
2. What disturbance signals can be modelled
3. How the parameters are represented and estimated
4. Other discrete-time polynomial models
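For reference, the structure these objectives build towards is the general ARMAX model in Ljung's notation, which combines the ARX part with a moving-average noise polynomial C(q):

```latex
% General ARMAX structure: A(q) y(t) = B(q) u(t) + C(q) e(t)
A(q)\,y(t) = B(q)\,u(t) + C(q)\,e(t), \qquad
\begin{aligned}
A(q) &= 1 + a_1 q^{-1} + \dots + a_{n_a} q^{-n_a},\\
B(q) &= b_1 q^{-1} + \dots + b_{n_b} q^{-n_b},\\
C(q) &= 1 + c_1 q^{-1} + \dots + c_{n_c} q^{-n_c}.
\end{aligned}
```

Setting C(q) = 1 recovers the ARX model used earlier in the course.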

EE-M110 2006/7, EF L17 4/12, v1.0 Non-Gaussian, Additive Disturbances
Disturbances are characterised by the fact that their value is not known beforehand; however, they are important for making predictions about future values. We use a probabilistic framework to describe disturbances, and generally describe e(t) by its mean and variance (iid). Modelling the disturbance v(t) as the noise e(t) passed through a transfer function h gives dynamic disturbance terms:
v(t) = Σ_{k=0..∞} h(k) e(t−k)
An example of a non-Gaussian noise source is a seldom-occurring disturbance: e(t) = 0 with probability 1−μ and e(t) = r with probability μ, where μ is small and r ~ N(0, σ²).
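As a numerical illustration, the seldom-occurring disturbance above can be simulated and passed through a filter h to produce a dynamic disturbance v(t). This is a sketch only: the impulse probability `mu`, the amplitude std `sigma`, and the first-order filter h(k) = 0.9^k are illustrative choices, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
mu = 0.05      # probability that a disturbance impulse occurs (assumed small)
sigma = 1.0    # std of the impulse amplitude r ~ N(0, sigma^2)

# Non-Gaussian white-noise source e(t): zero most of the time,
# occasionally a Gaussian impulse of amplitude r
e = np.where(rng.random(N) < mu, rng.normal(0.0, sigma, N), 0.0)

# Pass e(t) through a filter to give the disturbance its dynamics:
# v(t) = 0.9 v(t-1) + e(t), i.e. h(k) = 0.9^k (an illustrative choice of h)
v = np.zeros(N)
for t in range(1, N):
    v[t] = 0.9 * v[t - 1] + e[t]
```

Plotting v(t) shows occasional jumps that decay slowly, the kind of drifting disturbance an ARX model with white additive noise cannot represent.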

EE-M110 2006/7, EF L17 9/12, v1.0 Example: ARMAX Model
First order model: y(t) = 0.5y(t−1) + u(t−1) + e(t) + 0.2e(t−1)
We assume that e(t) is normal, iid noise. This is not true for v(t) = e(t) + 0.2e(t−1); hence we can't use an ARX model and must use a first order ARMAX system.
The poles of both the disturbance→output and the control→output transfer functions are given by A = 1 − 0.5q⁻¹.
The zeros of the disturbance→output transfer function are given by C = 1 + 0.2q⁻¹.
The zeros of the control→output transfer function are given by B = q⁻¹.
In forming a prediction, we use ε(t) = y(t) − ŷ(t); hence the model is non-linear in its parameters.
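To make the example concrete, we can simulate this first-order ARMAX system and run its one-step-ahead predictor; with the true parameters, the prediction error ε(t) recovers the driving noise e(t). The binary input signal and the noise variance are illustrative assumptions, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
e = rng.normal(0.0, 0.1, N)          # iid Gaussian noise e(t) (assumed variance)
u = np.sign(rng.standard_normal(N))  # a simple binary input signal (assumed)

# First-order ARMAX system from the slide:
#   y(t) = 0.5 y(t-1) + u(t-1) + e(t) + 0.2 e(t-1)
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.5 * y[t - 1] + u[t - 1] + e[t] + 0.2 * e[t - 1]

# One-step-ahead predictor: yhat(t) = 0.5 y(t-1) + u(t-1) + 0.2 eps(t-1),
# where eps(t) = y(t) - yhat(t). The predictor feeds back its own past
# prediction errors, which is why the model is non-linear in its parameters.
yhat = np.zeros(N)
eps = np.zeros(N)
for t in range(1, N):
    yhat[t] = 0.5 * y[t - 1] + u[t - 1] + 0.2 * eps[t - 1]
    eps[t] = y[t] - yhat[t]
```

Because C = 1 + 0.2q⁻¹ is stable, the effect of the unknown initial condition decays geometrically and ε(t) converges to e(t) after a few samples.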

EE-M110 2006/7, EF L17 13/12, v1.0 L17 Summary
Whilst much of this course has concentrated on a simple ARX model, this is very limiting in the type of disturbances that can be modelled.
ARMAX, Output Error, Box-Jenkins, … models all generalise the basic ARX transfer function and can model disturbance/noise terms with dynamics.
However, the parameter estimation problem is no longer a quadratic optimisation and iterative algorithms must be used.
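One common iterative scheme of the kind the summary alludes to is pseudo-linear regression (extended least squares): since e(t) is unobserved, its occurrences in the regressor are replaced by the current residuals and an ordinary least-squares step is repeated. The sketch below, for the first-order ARMAX example, is an illustration under assumed names and iteration count, not the specific algorithm from the lecture.

```python
import numpy as np

def els_armax1(y, u, n_iter=10):
    """Extended least squares (pseudo-linear regression) sketch for the
    first-order ARMAX model  y(t) = a y(t-1) + b u(t-1) + e(t) + c e(t-1)."""
    N = len(y)
    eps = np.zeros(N)      # current estimate of the unobserved noise e(t)
    theta = np.zeros(3)
    for _ in range(n_iter):
        # Pseudo-linear regressor: includes the *estimated* noise eps(t-1),
        # which is why this is not a one-shot quadratic optimisation
        Phi = np.column_stack([y[:-1], u[:-1], eps[:-1]])
        theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
        eps[1:] = y[1:] - Phi @ theta   # refresh residuals for the next pass
    return theta                         # estimates of (a, b, c)
```

Each pass is a standard least-squares problem, so the machinery from the earlier lectures on parameter estimation is reused; only the regressor changes between iterations.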

EE-M110 2006/7, EF L17 14/12, v1.0 L17 Lab