Comparison of the autoregressive and autocovariance prediction results on different stationary time series
Wiesław Kosek, University of Agriculture in Krakow, Poland


Abstract. The advantages and disadvantages of the autoregressive and autocovariance prediction methods are presented using different model time series similar to observed geophysical ones, e.g. Earth orientation parameters or sea level anomaly data. In the autocovariance prediction method the first predicted value is determined by the principle that the autocovariances estimated from the series extended by this first prediction value coincide as closely as possible with the autocovariances estimated from the given series. In the autoregressive prediction method an autoregressive model is used to estimate the first prediction value, which depends on the autoregressive order and on the coefficients computed from the autocovariance estimate. In both methods the autocovariance estimates of the time series must be computed, so applying them makes sense only when these series are stationary. The autoregressive prediction is more suitable for less noisy data and can be applied to series with a short time span. The autocovariance prediction is recommended for longer time series and, unlike the autoregressive method, can be applied to noisier data. The autoregressive method can be applied to time series containing oscillations with close frequencies, while the autocovariance prediction is not suitable for such data. In the case of the autocovariance prediction, the problem of estimating the appropriate forecast amplitude is also discussed.

European Geosciences Union General Assembly 2015, Vienna, Austria, 12 – 17 April 2015
Outline: Autocovariance prediction · Autoregressive prediction · Results · Conclusions

MODEL DATA

MODEL 1 (L = 3, M = 100): periods 20, 30, 50; amplitudes all equal to 1.0; phases all equal to 0.0; noise standard deviations: 0, 3
MODEL 2 (L = 3, M = 100): periods 50, 57, 60; amplitudes all equal to 1.0; phases all equal to 0.0; noise standard deviations: 0, 1, 2, 3
MODEL 3 (L = 9, M = 100): periods 10, 15, 20, 25, 40, 60, 90, 120, 180; amplitudes all equal to 1.0; phases all equal to 0.0; noise standard deviations: 0, 1, 2, 3, 5
MODEL 4 (L = 1): phases: random walk computed by integration of white noise with standard deviations equal to 1°, 2°, 3°; noise standard deviation: 0.1
MODEL 5 (L = 2): amplitudes …, 0.5; phases: random walk computed by integration of white noise with standard deviation equal to 2°; noise standard deviation: 0.03
MODEL 6 (L = 2): amplitudes …, 0.016; phases: random walk computed by integration of white noise with standard deviation equal to 2°; noise standard deviation: …
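Model series of this kind (sums of sinusoids plus white noise, and oscillations whose phase is a random walk obtained by integrating white noise) can be generated with a short sketch; the function names and the use of NumPy are illustrative, not part of the original study.

```python
import numpy as np

def harmonic_series(n, periods, amps, phases, noise_sd, rng):
    """Sum of L sinusoids plus white noise, as in MODELs 1-3."""
    t = np.arange(n)
    x = np.zeros(n)
    for T, A, f in zip(periods, amps, phases):
        x += A * np.sin(2.0 * np.pi * t / T + f)
    return x + rng.normal(0.0, noise_sd, n)

def random_walk_phase_series(n, period, amp, walk_sd_deg, noise_sd, rng):
    """Oscillation with a random-walk phase, as in MODELs 4-6: the phase
    is the integral (cumulative sum) of white noise with the given
    standard deviation in degrees."""
    t = np.arange(n)
    phase = np.cumsum(rng.normal(0.0, np.deg2rad(walk_sd_deg), n))
    return amp * np.sin(2.0 * np.pi * t / period + phase) + rng.normal(0.0, noise_sd, n)

rng = np.random.default_rng(0)
x1 = harmonic_series(100, [20, 30, 50], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0], 3.0, rng)
```

The period and amplitude passed to `random_walk_phase_series` in any test run are illustrative values, since those entries of MODELs 4-6 are not fully specified above.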

Autocovariance prediction

x_t, t = 1, 2, …, n : complex-valued stationary time series
x_{n+1} : prediction
n : the number of data
c_k : biased autocovariance estimate,
  c_k = (1/n) Σ_{t=1}^{n−k} (x_{t+k} − x̄)(x_t − x̄)*
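A minimal sketch of the biased autocovariance estimate for a complex-valued series; the 1/n normalization (rather than 1/(n−k)) is what makes the estimate biased:

```python
import numpy as np

def biased_autocov(x, max_lag):
    """Biased autocovariance estimate of a (possibly complex-valued) series:
    c_k = (1/n) * sum_{t=1}^{n-k} (x_{t+k} - mean) * conj(x_t - mean).
    The 1/n normalization makes the estimate biased but guarantees a
    positive semidefinite autocovariance sequence."""
    x = np.asarray(x, dtype=complex)
    n = x.size
    d = x - x.mean()
    return np.array([np.sum(d[k:] * np.conj(d[:n - k])) / n
                     for k in range(max_lag + 1)])
```

By the Cauchy-Schwarz inequality, |c_k| ≤ c_0 for this estimator, which is one reason it is preferred over the unbiased variant in prediction work.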

The biased autocovariances of a complex-valued stationary time series can be expressed by the real-valued auto- and cross-covariances of its real and imaginary parts. After the time series is extended by the first prediction point, a new estimate of the autocovariance can be computed from the previous one by a recursion formula, which can then be used to compute the next prediction point, and so on.
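The closed-form recursion belongs to the method as published (see the Kosek references). As a rough numerical illustration of the matching principle only, the sketch below picks each new point of a real-valued series by a brute-force grid search, so that the autocovariances of the extended series coincide as closely as possible (in a least-squares sense) with those of the given series; this grid minimization is a stand-in, not the original algorithm.

```python
import numpy as np

def biased_autocov_real(x, max_lag):
    """Biased autocovariance estimate of a real series, lags 0..max_lag."""
    x = np.asarray(x, float)
    n = x.size
    d = x - x.mean()
    return np.array([np.dot(d[k:], d[:n - k]) / n for k in range(max_lag + 1)])

def autocov_predict(x, n_pred, max_lag=20):
    """Each new point is chosen over a grid of candidate values so that
    the autocovariances of the extended series stay as close as possible
    to those of the original series (illustrative grid search, not the
    closed-form recursion of the autocovariance prediction method)."""
    x = list(map(float, x))
    c_ref = biased_autocov_real(x, max_lag)
    lo, hi = min(x), max(x)
    span = hi - lo
    grid = np.linspace(lo - span, hi + span, 601)
    ext = list(x)
    for _ in range(n_pred):
        misfits = [np.sum((biased_autocov_real(ext + [v], max_lag) - c_ref) ** 2)
                   for v in grid]
        ext.append(float(grid[int(np.argmin(misfits))]))
    return np.array(ext[len(x):])
```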

Autoregressive prediction

Autoregressive order: chosen with the Akaike goodness-of-fit criterion.
Autoregressive coefficients: computed from the autocovariance estimate.
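As an illustration of this scheme, the sketch below fits the AR coefficients by solving the Yule-Walker equations from the biased autocovariances and selects the order with the standard AIC; the exact criterion and coefficient estimator of the original work (a maximum entropy estimator, per the references) may differ.

```python
import numpy as np

def yule_walker(c, p):
    """AR(p) coefficients from autocovariances c[0..p] via the
    Yule-Walker equations; returns (phi, innovation variance)."""
    R = np.array([[c[abs(i - j)] for j in range(p)] for i in range(p)])
    r = np.asarray(c[1:p + 1])
    phi = np.linalg.solve(R, r)
    return phi, c[0] - phi @ r

def ar_predict(x, n_pred, max_order=10):
    """Fit AR models up to max_order, keep the order with the smallest
    AIC, then extrapolate recursively: each forecast point is the
    phi-weighted combination of the preceding values."""
    x = np.asarray(x, float)
    n = x.size
    d = x - x.mean()
    c = [np.dot(d[k:], d[:n - k]) / n for k in range(max_order + 1)]
    best_aic, best_phi = np.inf, None
    for p in range(1, max_order + 1):
        phi, s2 = yule_walker(c, p)
        aic = n * np.log(max(s2, 1e-12)) + 2.0 * p
        if aic < best_aic:
            best_aic, best_phi = aic, phi
    ext = list(d)
    for _ in range(n_pred):
        ext.append(float(np.dot(best_phi, ext[-1:-best_phi.size - 1:-1])))
    return np.asarray(ext[n:]) + x.mean()
```

Because the biased autocovariances yield a positive definite Toeplitz system, the fitted AR model is stationary and the recursive forecasts remain bounded.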

Uncorrected autocovariance predictions (red) of the deterministic and noise data

Correction to the amplitudes of the autocovariance prediction

1. n = 0.7 × N, where N is the total number of data.
2. Computation of the autocovariances c_k of the x_t time series for k = 0, 1, …, n−1.
3. Computation of the uncorrected autocovariance predictions x_{n+m} for m = 1, 2, …, N−n+1.
4. Computation of the autocovariances c′_k of the prediction time series x_{n+m} for k = 0, 1, …, m−1.
5. Computation of the amplitude coefficient β = sqrt[(|c_1| + |c_2| + … + |c_8|)/(|c′_1| + |c′_2| + … + |c′_8|)].
6. Computation of the corrected autocovariance predictions β × x_{N+L} for L = 1, 2, …, M.
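A minimal sketch of this correction, assuming that the numerator of β uses the autocovariances of the observed segment and the denominator those of the uncorrected prediction (the two sums are distinguished here only by that assumption):

```python
import numpy as np

def autocov(x, k):
    """Biased autocovariance of a real series at lag k."""
    d = np.asarray(x, float) - np.mean(x)
    return np.dot(d[k:], d[:d.size - k]) / d.size

def amplitude_correction(observed, prediction, n_lags=8):
    """beta is the square root of the ratio between the summed absolute
    autocovariances (lags 1..n_lags) of the observed segment and of the
    uncorrected prediction; the corrected prediction is beta * prediction.
    Assumption: numerator from the observed series, denominator from the
    prediction."""
    num = sum(abs(autocov(observed, k)) for k in range(1, n_lags + 1))
    den = sum(abs(autocov(prediction, k)) for k in range(1, n_lags + 1))
    beta = float(np.sqrt(num / den))
    return beta, beta * np.asarray(prediction, float)
```

Since autocovariances scale with the square of the amplitude, a prediction at half the observed amplitude gives β = 2, restoring the full amplitude.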

Autocovariance (red) and autoregressive (green) predictions of the model data [T = 20, 30, 50; A = 1, 1, 1]

Autocovariance (red) and autoregressive (green) predictions of the model data with close frequencies [T = 50, 57, 60; A = 1, 1, 1; noise sd = 0, 1, 2, 3]

Autocovariance predictions (red) of the deterministic model data with close frequencies [T = 50, 57, 60; A = 1, 1, 1; sd = 0.0]

Autocovariance (red) and autoregressive (green) predictions of the model data with many frequencies [T = 10, 15, 20, 25, 40, 60, 90, 120, 180; A = 1 (all); f = 0 (all); noise sd = 0, 1, 2]

Autocovariance predictions (red) of the model data with a large number of frequencies [T = 10, 15, 20, 25, 40, 60, 90, 120, 180; A = 1 (all); f = 0 (all)]

Autocovariance (red) and autoregressive (green) predictions of the seasonal model data with random-walk phase: random walk computed by integration of white noise (sd = 1°, 2°, 3°)

Autocovariance (red) and autoregressive (green) predictions of the model data with random-walk phase: random walk computed by integration of white noise (sd = 2°)

Autocovariance (red) and autoregressive (green) predictions of noise data [sd = 1.0]

Conclusions

The input time series for the computation of the autocovariance and autoregressive predictions should be stationary, because both methods need autocovariance estimates, which should be functions of the time lag only.

The autocovariance prediction formulae cannot estimate the appropriate prediction amplitude, so the prediction must be rescaled by a constant amplitude coefficient β estimated empirically.

The accuracy of the autocovariance predictions depends on the length of the time series and on the noise level in the data. The predictions may become unstable when the time series is short, the noise level is high, or the frequencies of the oscillations are too close.

The autoregressive prediction is not recommended for noisy time series, but it can be applied when oscillation frequencies are close.

The autocovariance prediction method can be applied to noisy time series if they are long enough, but it is not recommended when the frequencies of the oscillations are close.

The autocovariance predictions of noise data resemble noise with a smaller standard deviation, while the autoregressive predictions are close to zero.

References

Barrodale I. and Erickson R. E., 1980, Algorithms for least-squares linear prediction and maximum entropy spectral analysis - Part II: Fortran program, Geophysics, 45.

Brzeziński A., 1994, Algorithms for estimating maximum entropy coefficients of the complex valued time series, Allgemeine Vermessungs-Nachrichten, Heft 3/1994, Herbert Wichmann Verlag GmbH, Heidelberg.

Kosek W., 1993, The autocovariance prediction of the Earth rotation parameters, Proc. 7th International Symposium "Geodesy and Physics of the Earth", IAG Symposium No. 112, Potsdam, Germany, Oct. 5-10, H. Montag and Ch. Reigber (eds.), Springer Verlag.

Kosek W., 1997, Autocovariance prediction of short period Earth rotation parameters, Artificial Satellites, Journal of Planetary Geodesy, 32.

Kosek W., 2002, Autocovariance prediction of complex-valued polar motion time series, Advances in Space Research, 30.

Acknowledgments

This work was supported by the Polish Ministry of Science and Education, project UMO-2012/05/B/ST10/02132, led by Prof. A. Brzeziński.