Adv DSP Spring-2015 Lecture#11 Spectrum Estimation Parametric Methods.

Introduction
 One limitation of non-parametric methods is that they cannot incorporate a priori information about the process into the estimate.
 For example, in speech processing an acoustic tube model for the vocal tract imposes an AR model on the speech waveform.
 If a model for the process can be incorporated directly into the spectrum estimation, a more accurate, higher-resolution estimate can be found.

Parametric Method
 The first step is to select an appropriate model for the process. This selection is based upon:
    A priori knowledge about how the process is generated
    Experimental results indicating that a particular model “works well”
 The models used are:
    Autoregressive (AR) model
    Moving Average (MA) model
    Autoregressive Moving Average (ARMA) model
    Harmonic model (complex exponentials in noise)

Parametric Methods
 Once the model is selected, the next step is to estimate the model parameters from the given data.
 The final step is to estimate the power spectrum by incorporating the estimated parameters into the parametric form for the spectrum.
 Example: for an ARMA(p,q) model with estimated coefficients â_p(k) and b̂_q(k), the spectrum estimate is

      P̂_x(e^jω) = |B̂_q(e^jω)|² / |Â_p(e^jω)|²

   where B̂_q(e^jω) = Σ_{k=0..q} b̂_q(k) e^{−jkω} and Â_p(e^jω) = 1 + Σ_{k=1..p} â_p(k) e^{−jkω}.
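This final step can be sketched in a few lines of NumPy: plug the estimated coefficients into the ARMA spectrum expression and evaluate it on a frequency grid. The coefficient values below are illustrative, not from the lecture.

```python
import numpy as np

def poly_freq(coeffs, w):
    """Evaluate C(e^{jw}) = sum_k coeffs[k] e^{-jkw} on the frequency grid w."""
    k = np.arange(len(coeffs))
    return np.exp(-1j * np.outer(w, k)) @ coeffs

def arma_psd(b, a, n_freq=512):
    """ARMA spectrum estimate |B(e^{jw})|^2 / |A(e^{jw})|^2, with a[0] == 1."""
    w = np.linspace(0, np.pi, n_freq, endpoint=False)
    return w, np.abs(poly_freq(b, w)) ** 2 / np.abs(poly_freq(a, w)) ** 2

# Hypothetical ARMA(1,1) parameter estimates, for illustration only
w, P = arma_psd(np.array([1.0, 0.5]), np.array([1.0, -0.9]))
```

With `a = [1]` and `b = [1]` this reduces to the flat unit spectrum of white noise, which makes a convenient sanity check.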

Autoregressive Spectrum Estimation
 An AR process x[n] may be represented as the output of an all-pole filter that is driven by unit-variance white noise.
 The power spectrum of a pth-order AR process is

      P_x(e^jω) = |b(0)|² / |1 + Σ_{k=1..p} a_p(k) e^{−jkω}|²

 Therefore, if b(0) and a_p(k) can be estimated from the data, an estimate of the power spectrum may be formed using

      P̂_x(e^jω) = |b̂(0)|² / |1 + Σ_{k=1..p} â_p(k) e^{−jkω}|²

Autoregressive Spectrum Estimation
 The accuracy of this estimate depends on how accurately the model parameters can be estimated, and on whether an AR model is consistent with the way the data is generated.
 AR spectrum estimation therefore requires that an all-pole model be found; a variety of techniques are available for all-pole modeling.

AR: The Autocorrelation Method
 In the autocorrelation method (the Yule-Walker method), the AR coefficients are found by solving the following normal equations:

      Σ_{l=1..p} â_p(l) r̂_x(k−l) = −r̂_x(k),   k = 1, …, p

   where r̂_x(k) is the (biased) sample autocorrelation; the gain follows from |b̂(0)|² = r̂_x(0) + Σ_{k=1..p} â_p(k) r̂_x*(k).
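A minimal NumPy sketch of this solution, assuming real-valued data and the biased autocorrelation estimate:

```python
import numpy as np

def autocorr(x, p):
    """Biased sample autocorrelation r[k] = (1/N) sum_n x[n] x[n-k], k = 0..p."""
    N = len(x)
    return np.array([x[k:] @ x[:N - k] / N for k in range(p + 1)])

def yule_walker(x, p):
    """Solve the normal equations for a_p(1..p); return ([1, a], |b(0)|^2)."""
    r = autocorr(x, p)
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz
    a = np.linalg.solve(R, -r[1:])
    eps = r[0] + a @ r[1:]      # modeling error, equals |b(0)|^2
    return np.concatenate(([1.0], a)), eps
```

For example, on data from the AR(1) process x[n] = 0.5 x[n−1] + w[n], `yule_walker(x, 1)` returns a coefficient vector whose second entry is close to −0.5.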

AR: The Autocorrelation Method
 The autocorrelation method effectively applies a rectangular window to the data when estimating the autocorrelation sequence; the data is, in effect, extrapolated with zeros.
 As a result, the autocorrelation method generally produces a lower-resolution estimate.
 For short data records, therefore, the autocorrelation method is generally not used.

AR: The Autocorrelation Method
 An artifact that may be observed with the autocorrelation method is “spectral line splitting”: a single spectral line splits into two separate and distinct peaks.
 Spectral line splitting occurs when x[n] is over-modeled, i.e. when the model order p is too large.

AR: The Covariance Method
 The covariance method requires finding the solution to the set of linear equations

      Σ_{l=1..p} â_p(l) c_x(k,l) = −c_x(k,0),   k = 1, …, p,   where c_x(k,l) = Σ_{n=p..N−1} x(n−l) x*(n−k)

 The advantage is that no windowing of the data is required.
 For short data records this method therefore produces a higher-resolution estimate.
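The covariance method is equivalently an unwindowed least-squares prediction problem, which gives a compact sketch (real-valued data assumed):

```python
import numpy as np

def covariance_method(x, p):
    """Minimize sum_{n=p}^{N-1} |x[n] + sum_k a_p(k) x[n-k]|^2.

    Only samples inside the record are used -- no windowing or zero padding.
    """
    N = len(x)
    # Row n holds the predictor samples x[n-1], ..., x[n-p]
    X = np.array([[x[n - k] for k in range(1, p + 1)] for n in range(p, N)])
    a, *_ = np.linalg.lstsq(X, -x[p:], rcond=None)
    return np.concatenate(([1.0], a))
```

Solving this least-squares problem via the normal equations X^T X a = −X^T x reproduces exactly the c_x(k,l) equations above.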

AR Method: Example
 Consider an AR process generated by the difference equation shown on the slide, driven by unit-variance white Gaussian noise w[n].
 Data records of length 128 were used, and an ensemble of 50 spectrum estimates was computed with both the Yule-Walker (autocorrelation) method and the covariance method.
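The experiment can be sketched as follows. Since the slide's actual difference-equation coefficients are not reproduced in the transcript, a hypothetical AR(2) model with poles at 0.9 e^{±jπ/3} stands in:

```python
import numpy as np

A_TRUE = np.array([1.0, -0.9, 0.81])  # hypothetical: poles at 0.9 e^{+/- j pi/3}

def simulate_ar(a, N, rng, burn=200):
    """Drive the all-pole filter 1/A(z) with unit-variance white Gaussian noise."""
    p = len(a) - 1
    w = rng.standard_normal(N + burn)
    x = np.zeros(N + burn)
    for n in range(len(x)):
        x[n] = w[n] - sum(a[k] * x[n - k] for k in range(1, min(n, p) + 1))
    return x[burn:]                    # discard the start-up transient

def yule_walker_coeffs(x, p):
    """Yule-Walker estimate of a_p(1..p) from the biased autocorrelation."""
    N = len(x)
    r = np.array([x[k:] @ x[:N - k] / N for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, -r[1:])

# Ensemble of 50 estimates from records of length 128, as in the lecture
rng = np.random.default_rng(0)
ests = np.array([yule_walker_coeffs(simulate_ar(A_TRUE, 128, rng), 2)
                 for _ in range(50)])
mean_est = ests.mean(axis=0)           # should lie near A_TRUE[1:]
```

Plotting the 50 individual spectrum estimates on top of each other (as the lecture's figures do) shows the trial-to-trial variability of each method.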

AR Method: Example
 [Figure: pole-zero plot (MATLAB zplane) of the AR model, showing its pair of complex poles.]

AR: The Autocorrelation Method
 [Figure: overlay of the 50 Yule-Walker spectrum estimates.]

AR: The Covariance Method
 [Figure: overlay of the 50 covariance-method spectrum estimates.]

Model Order Selection
 How should the model order p of the AR process be selected?
 If the model order is too small, the resulting spectrum will be over-smoothed and will have poor resolution.
 If the model order is too large, the spectrum may contain spurious peaks and may exhibit spectral line splitting.
 It is therefore useful to have a criterion that indicates the appropriate model order.

Model Order Selection
 One approach is to increase the model order until the modeling error is minimized.
 Several criteria of the following general form have been proposed:

      J(p) = N ln ε_p + p F(N)

   where p is the model order, N is the data record length, ε_p is the modeling error, and F(N) is a constant that depends upon N.

Model Order Selection
 Akaike Information Criterion (AIC):  AIC(p) = N ln ε_p + 2p
 Minimum Description Length (MDL):  MDL(p) = N ln ε_p + p ln N
 Akaike's Final Prediction Error (FPE):  FPE(p) = ε_p (N + p + 1)/(N − p − 1)
 Criterion Autoregressive Transfer function (CAT)
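AIC and MDL can be computed directly from the modeling errors ε_p; a small sketch (the other criteria follow the same pattern, and the error values below are hypothetical):

```python
import numpy as np

def aic(N, p, eps_p):
    """Akaike Information Criterion: N ln(eps_p) + 2p (minimize over p)."""
    return N * np.log(eps_p) + 2 * p

def mdl(N, p, eps_p):
    """Minimum Description Length: N ln(eps_p) + p ln(N) (minimize over p)."""
    return N * np.log(eps_p) + p * np.log(N)

# Hypothetical modeling errors eps_p for p = 0..4: once the error stops
# decreasing appreciably, the penalty term makes the criterion turn upward.
errors = [1.00, 0.50, 0.45, 0.449, 0.4489]
best_p = min(range(len(errors)), key=lambda p: mdl(1000, p, errors[p]))
```

Here the error drop from p = 2 to p = 3 is too small to pay for the extra parameter, so the criterion selects p = 2.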

Moving Average (MA) Spectrum Estimation
 An MA process may be generated by filtering unit-variance white noise w[n] with an FIR filter as follows:

      x[n] = Σ_{k=0..q} b_q(k) w[n−k]

 The relationship between the power spectrum of an MA process and the coefficients b_q(k) is

      P_x(e^jω) = |B_q(e^jω)|² = |Σ_{k=0..q} b_q(k) e^{−jkω}|²
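Generating an MA(q) process is just FIR filtering of white noise; a short sketch with illustrative coefficients (not from the lecture):

```python
import numpy as np

b = np.array([1.0, 0.8, 0.3])          # illustrative b_q(k), q = 2
rng = np.random.default_rng(1)
w = rng.standard_normal(100_000)        # unit-variance white noise
x = np.convolve(w, b)[:len(w)]          # x[n] = sum_k b_q(k) w[n-k]
# For unit-variance white input, var(x) = sum_k b_q(k)^2
```

The sample variance of `x` should come out close to Σ b_q(k)² = 1.73 for these coefficients, which follows from the autocorrelation relation on the next slide evaluated at lag zero.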

Moving Average (MA) Spectrum Estimation
 Equivalently, the power spectrum may be written in terms of the autocorrelation sequence r_x(k) as

      P_x(e^jω) = Σ_{k=−q..q} r_x(k) e^{−jkω}

 where r_x(k) is related to the filter coefficients b_q(k) through the Yule-Walker equations:

      r_x(k) = Σ_{l=0..q−|k|} b_q(l+|k|) b_q*(l) for |k| ≤ q, and r_x(k) = 0 for |k| > q
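A sketch that forms r_x(k) from the coefficients and evaluates P_x(e^jω) as the finite sum over lags; agreement with |B_q(e^jω)|² provides a consistency check (real coefficients assumed):

```python
import numpy as np

def ma_autocorr(b):
    """r_x(k) = sum_l b(l + k) b(l) for k = 0..q; zero beyond lag q."""
    q = len(b) - 1
    return np.array([b[k:] @ b[:len(b) - k] for k in range(q + 1)])

def ma_psd(b, n_freq=512):
    """P_x(e^{jw}) = sum_{k=-q}^{q} r_x(k) e^{-jkw} (real b assumed)."""
    r = ma_autocorr(b)
    w = np.linspace(0, np.pi, n_freq, endpoint=False)
    k = np.arange(1, len(r))
    # Conjugate-symmetric sum collapses to a cosine series
    return w, r[0] + 2 * (np.cos(np.outer(w, k)) @ r[1:])

w, P = ma_psd(np.array([1.0, 0.5]))
```

For b = [1, 0.5] this gives P_x(e^jω) = 1.25 + cos ω, matching |1 + 0.5 e^{−jω}|² at every frequency.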

ARMA Spectrum Estimation
 An ARMA process has a power spectrum of the form

      P_x(e^jω) = |B_q(e^jω)|² / |A_p(e^jω)|²

 Such a process can be generated by filtering unit-variance white noise with a filter having both poles and zeros:

      H(z) = B_q(z)/A_p(z) = ( Σ_{k=0..q} b_q(k) z^{−k} ) / ( 1 + Σ_{k=1..p} a_p(k) z^{−k} )

ARMA Spectrum Estimation
 Following the approach used for AR(p) and MA(q) spectrum estimation, the spectrum of an ARMA(p,q) process may be estimated by substituting the estimated model parameters (Ch. 4) into this parametric form.