Slide 1: Dynamical Invariants from a Time Series
Dynamical Invariants of an Attractor and Potential Applications for Speech Data
Saurabh Prasad
Intelligent Electronic Systems, Human and Systems Engineering
Department of Electrical and Computer Engineering
Slide 2: Estimating the correlation integral from a time series

Correlation integral of an attractor's trajectory: the correlation sum of a system's attractor is a (probabilistic) measure quantifying the average number of neighbors within a neighborhood of radius \epsilon along the trajectory:

C(\epsilon) = \lim_{N \to \infty} \frac{2}{N(N-1)} \sum_{i < j} \Theta\left(\epsilon - \lVert \vec{x}_i - \vec{x}_j \rVert\right)

where \vec{x}_i represents the i-th point on the trajectory, \lVert \cdot \rVert is a valid norm, and \Theta is the Heaviside unit step function (serving as a count function here).

Correlation dimension: in the limit of an infinitely large data set and a very small neighborhood, the correlation integral is expected to scale as a power of \epsilon, with exponent equal to the correlation dimension of the attractor: C(\epsilon) \propto \epsilon^{D_2}. Hence,

D_2 = \lim_{\epsilon \to 0} \lim_{N \to \infty} \frac{\partial \log C(\epsilon)}{\partial \log \epsilon}
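The pair-counting definition above can be sketched directly in code. This is a minimal illustration, not the deck's own implementation; the function name `correlation_sum` and the choice of the Euclidean norm are assumptions for the example.

```python
import numpy as np

def correlation_sum(points, eps):
    """Estimate the correlation sum C(eps) for points on a trajectory.

    points: (N, d) array of state-space points x_i
    eps:    neighborhood radius epsilon
    """
    n = len(points)
    count = 0
    for i in range(n):
        # Euclidean distances from x_i to all later points x_j (j > i)
        dists = np.linalg.norm(points[i + 1:] - points[i], axis=1)
        # The Heaviside step acts as a counter: 1 when ||x_i - x_j|| < eps
        count += np.sum(dists < eps)
    # Normalize by the number of distinct pairs, matching 2/(N(N-1))
    return 2.0 * count / (n * (n - 1))
```

In practice D_2 is then read off as the slope of log C(eps) versus log eps over a range of small eps values, e.g. via a least-squares line fit in log-log coordinates.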
Slide 3: Correlation integral and dimension estimation: practical considerations

Temporal correlations vs. geometric correlations: choosing neighbors in a small neighborhood about a point forces the inclusion of temporally correlated points. This biases the estimator, yielding a lower dimension estimate.

Theiler's correction: the solution is simple: exclude temporally correlated points from the analysis by counting only pairs with |i - j| > w, where w is Theiler's correction factor. An optimal value for w may be found from a space-time separation plot of the data set.
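Theiler's correction amounts to restricting the pair sum to |i - j| > w. A minimal sketch, with the function name `correlation_sum_theiler` and the Euclidean norm assumed for illustration:

```python
import numpy as np

def correlation_sum_theiler(points, eps, w):
    """Correlation sum that excludes temporally correlated pairs
    (Theiler's correction): only pairs with |i - j| > w are counted.

    points: (N, d) array; eps: radius; w: Theiler window in samples.
    """
    n = len(points)
    count = 0
    pairs = 0
    for i in range(n):
        j0 = i + w + 1          # skip pairs with |i - j| <= w
        if j0 >= n:
            break
        dists = np.linalg.norm(points[j0:] - points[i], axis=1)
        count += np.sum(dists < eps)
        pairs += n - j0         # number of admissible pairs for this i
    return count / pairs if pairs else 0.0
```

Setting w = 0 recovers the uncorrected correlation sum; increasing w removes the near-in-time (hence near-in-space) pairs that bias the dimension estimate downward.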
Slide 4: Some results on the Lorenz attractor
Slide 5: Fitting the current discussion into the scope of pattern recognition

Input time series (scalar/vector) → RPS (reconstructed phase space) → invariant features:
- Lyapunov spectra
- K2 entropy
- Correlation dimension
- Higher-order statistics
Measure discriminability in a space comprised of all possible combinations of these "features".
Slide 6: A note on vector embedding

Scalar embedding:
x(t) = [s(t), s(t - \tau), s(t - 2\tau), \ldots, s(t - (d-1)\tau)]

Vector embedding (for an m-channel observation stream):
x(t) = [s_1(t), s_2(t), \ldots, s_m(t), s_1(t - \tau), s_2(t - \tau), \ldots, s_m(t - \tau), \ldots, s_1(t - (d-1)\tau), s_2(t - (d-1)\tau), \ldots, s_m(t - (d-1)\tau)]

It is typically assumed that the delay coordinates are chosen such that the components of the embedded vectors are uncorrelated. For scalar embedding, an optimal choice of the delay \tau ensures this. For vector embedding, a strong correlation between components of the (observed) vector stream may hurt the embedding procedure. PCA/SVD-based de-correlation may help remove correlations in second-order statistics; if correlations in the higher-order statistics of the data stream are also removed, it is hoped that this will provide a more meaningful reconstruction.