Gaussian Process Model Identification: a Process Engineering Case Study
Juš Kocijan 1,2, Kristjan Ažman 1
1 Jožef Stefan Institute, Department of Systems and Control, Ljubljana, Slovenia
2 University of Nova Gorica, Nova Gorica, Slovenia
Systems Science XVI, September 2007, Wroclaw

Motivation
- Topic: identification of nonlinear dynamic systems
- Problem: imbalance between the number of measurements in equilibrium and out of equilibrium
- Theoretical solution: a Gaussian process model with incorporated linear local models → solves the problem and provides a measure of confidence in the prediction
- Validation of the theory: application in a process engineering case study

Overview
- Modelling with Gaussian process (GP) priors
- Incorporation of linear local models
- Modelling case study: gas-liquid separator

Identification – why and how
- Dynamic system identification → model → used e.g. for prediction, automatic control, ...
- Nonlinear dynamic system identification problems → ANN, fuzzy models, ... are difficult to use (structure determination, large number of parameters, lots of training data) → the GP model reduces some of these problems

GP model
A probabilistic, non-parametric model, constituted of:
- a covariance function
- input/output data pairs (points, not signals)
The prediction of the output is based on the similarity between the test input and the training inputs. The output is normally distributed.
[Figure: training points in the x–y plane and the predictive distribution p(y) at a test input x = x0.]
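The predictive distribution itself is not preserved as text in the transcript; the standard GP regression form that the slide presumably shows is

\[
y^* \mid \mathbf{x}^*, \mathcal{D} \sim \mathcal{N}\big(\mu(\mathbf{x}^*), \sigma^2(\mathbf{x}^*)\big), \quad
\mu(\mathbf{x}^*) = \mathbf{k}(\mathbf{x}^*)^{\top} \mathbf{K}^{-1} \mathbf{y}, \quad
\sigma^2(\mathbf{x}^*) = k(\mathbf{x}^*, \mathbf{x}^*) - \mathbf{k}(\mathbf{x}^*)^{\top} \mathbf{K}^{-1} \mathbf{k}(\mathbf{x}^*),
\]

where K is the covariance matrix of the training inputs, k(x*) is the vector of covariances between the test input and the training inputs, and y is the vector of training outputs.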

Gaussian processes
- A Gaussian process is a set of normally distributed random variables, described by a mean μ(X) and a covariance matrix K(X)
- Covariance function: Gaussian
- Optimisation of the hyperparameters: cost function – log-density; method – maximum likelihood, using conjugate gradients
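The "Gaussian" covariance function is not written out in the transcript; in this line of work it is usually the squared exponential form with a separate width for each input dimension, trained by maximising the log marginal likelihood. As a sketch (parametrisation assumed, not taken from the slide):

\[
C(\mathbf{x}_p, \mathbf{x}_q) = v_1 \exp\!\Big( -\tfrac{1}{2} \sum_{d=1}^{D} w_d \,(x_{p,d} - x_{q,d})^2 \Big) + v_0\,\delta_{pq},
\qquad
\mathcal{L}(\boldsymbol{\theta}) = -\tfrac{1}{2}\log|\mathbf{K}| - \tfrac{1}{2}\mathbf{y}^{\top}\mathbf{K}^{-1}\mathbf{y} - \tfrac{N}{2}\log 2\pi,
\]

with hyperparameters θ = [w_1, ..., w_D, v_0, v_1] obtained by maximising L with a conjugate-gradient method.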

GP model attributes (vs. e.g. ANN)
- Smaller number of parameters
- Measure of confidence in the prediction, depending on the data
- Incorporation of prior knowledge *
- Easy to use in practice
- Computational cost increases with the amount of data
- Recent method, still in development
- Nonparametric model
* (also possible in some other models)

Static example: GP model of y = f(x) = cos(6x²)
[Figure: GP model fit, y vs. x.]
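As an illustration of this static example, the following minimal Python sketch fits a GP with a squared exponential covariance to noisy samples of cos(6x²). It is not the authors' code; the hyperparameter values, noise level and sample sizes are assumptions chosen for illustration.

```python
# Minimal GP regression sketch for the static example y = cos(6*x^2).
# Hyperparameters (v1, w), noise variance v0 and sample sizes are assumed values.
import numpy as np

def sq_exp_cov(a, b, v1=1.0, w=50.0):
    """Squared exponential covariance between 1-D input vectors a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2        # pairwise squared distances
    return v1 * np.exp(-0.5 * w * d2)

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, 30)            # training inputs
y_train = np.cos(6 * x_train**2) + 0.05 * rng.standard_normal(30)
x_test = np.linspace(0.0, 1.0, 200)

v0 = 0.05**2                                   # assumed noise variance
K = sq_exp_cov(x_train, x_train) + v0 * np.eye(len(x_train))
k_star = sq_exp_cov(x_test, x_train)           # test-train covariances

mean = k_star @ np.linalg.solve(K, y_train)    # predictive mean
var = sq_exp_cov(x_test, x_test).diagonal() - np.einsum(
    "ij,ji->i", k_star, np.linalg.solve(K, k_star.T))
std = np.sqrt(np.maximum(var, 0.0))            # +/- 2*std gives the confidence band
```

In practice the hyperparameters would not be fixed but optimised by maximising the log marginal likelihood, as described on the previous slide.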

GP model of a dynamic system
- Input/output training pairs x_i / y_i:
  x_i ... regressor values [u(t−1), ..., u(t−k), y(t−1), ..., y(t−k)]
  y_i ... system output y(t)
- Simulation: "naive" – the predicted mean m(k) is fed back in place of the measured delayed outputs (a sketch follows below)
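A sketch of how the regressor construction and the naive simulation can be coded, under the same assumptions as the previous sketch; gp_predict is a hypothetical function returning the predictive mean and variance of a trained GP model, and the lag order k is illustrative.

```python
# Sketch: NARX regressors and "naive" simulation with a GP model.
# gp_predict(x) -> (mean, variance) is a hypothetical trained-model interface.
import numpy as np

def build_regressors(u, y, k):
    """Stack [u(t-1)..u(t-k), y(t-1)..y(t-k)] as inputs, y(t) as targets."""
    X, T = [], []
    for t in range(k, len(y)):
        X.append(np.concatenate([u[t-k:t][::-1], y[t-k:t][::-1]]))
        T.append(y[t])
    return np.array(X), np.array(T)

def naive_simulation(gp_predict, u, y_init, k):
    """Feed the predicted mean back as the delayed-output regressors."""
    y_sim = list(y_init[:k])                   # known initial outputs
    for t in range(k, len(u)):
        x = np.concatenate([u[t-k:t][::-1],
                            np.asarray(y_sim[t-k:t])[::-1]])
        mean, _var = gp_predict(x)             # variance is ignored -> "naive"
        y_sim.append(mean)
    return np.array(y_sim)
```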

Problem of nonlinear dynamic systems identification
Engine example – longitudinal dynamics

Incorporation of local linear models (LMGP model)
- Derivatives of the function are observed in addition to the function values
- The derivatives are the coefficients of a linear local model in an equilibrium point (prior knowledge)
- Only the covariance function has to be replaced; otherwise the procedure is the same as with the usual GP (see the sketch below)
- Well suited to the data distribution typically found in practice
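The replacement covariance function itself is not preserved in the transcript. The standard identities used when derivative observations are included in a GP (a sketch, assuming a differentiable covariance function C such as the squared exponential above) are

\[
\operatorname{cov}\!\left(\frac{\partial f(\mathbf{x}_p)}{\partial x_{p,d}},\, f(\mathbf{x}_q)\right) = \frac{\partial C(\mathbf{x}_p, \mathbf{x}_q)}{\partial x_{p,d}},
\qquad
\operatorname{cov}\!\left(\frac{\partial f(\mathbf{x}_p)}{\partial x_{p,d}},\, \frac{\partial f(\mathbf{x}_q)}{\partial x_{q,e}}\right) = \frac{\partial^2 C(\mathbf{x}_p, \mathbf{x}_q)}{\partial x_{p,d}\,\partial x_{q,e}},
\]

so the local-model coefficients (derivatives at the equilibrium points) can be placed in the training set alongside ordinary function observations, and training then proceeds as for the usual GP.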


Case study: gas-liquid separator

Nonlinearity of the system
Model structure: [given as an equation on the original slide; not preserved in the transcript]

Model identification
- Seven equilibrium points
- Seven linear local models (14 points)
- 60 off-equilibrium points

Model validation
SE = [value not preserved in the transcript], LD = −1.97 (the measures are described in the note below)
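SE and LD are not defined on the slide itself; in related GP-modelling publications from the same group they typically stand for the mean squared error and the average log predictive density of the validation data. As an assumption (the exact definitions used here are not preserved):

\[
SE = \frac{1}{N}\sum_{i=1}^{N}\big(y_i - \hat{y}_i\big)^2,
\qquad
LD = \frac{1}{2N}\sum_{i=1}^{N}\Big(\log 2\pi + \log \sigma_i^2 + \frac{(y_i - \hat{y}_i)^2}{\sigma_i^2}\Big),
\]

where \hat{y}_i and \sigma_i^2 are the predicted mean and variance; lower values of both measures indicate a better model.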

Model validation

Conclusions
- The Gaussian process model is a flexible, probabilistic, nonparametric model with an inherent measure of prediction uncertainty.
- The GP model with incorporated local linear models (LMGP) is a possible solution to the problem of unevenly distributed measurement data in and out of equilibrium.
- The application of the LMGP modelling method to a gas-liquid separator demonstrated the feasibility of this solution in practice.