Presentation transcript:

ESTIMATION OF LYAPUNOV SPECTRA FROM A TIME SERIES
S. Srinivasan, S. Prasad, S. Patil, G. Lazarou and J. Picone
Intelligent Electronic Systems, Center for Advanced Vehicular Systems, Mississippi State University
URL:

Page 1 of 24: Motivation
- Analysis of chaotic signals: reconstructing a phase space from a scalar observable
- Lyapunov exponents as a tool to analyze chaos
- Lyapunov spectra of chaotic and non-chaotic time series
- Optimizing the parameters of Lyapunov spectra estimation

Page 2 of 24: Definitions
- Deterministic signal or system: every event is the result of preceding events and actions, and hence is completely predictable.
- Stochastic noise: a signal that is not deterministic, i.e., inherently unpredictable.
- Chaotic signal or system: sensitive to initial conditions (the "Butterfly Effect"). In chaos, predictability holds only in principle, so chaotic signals are also called deterministic noise.
- Dimension of a system: the number of degrees of freedom possessed by the system.
Deterministic chaos or stochastic noise?
- Both have continuous power spectra, so they are not easily distinguished spectrally.
- Noise is infinite-dimensional.
- Chaotic signals are finite-dimensional, but the dimension is no longer associated with the number of independent frequencies; it is a statistical feature related to both the temporal evolution and the geometric aspect (the self-similar structure) of the attractor.

Page 3 of 24: Power Spectrum of a Lorenz Signal
Power spectra of chaotic signals are continuous even though the system is finite-dimensional. For example, the power spectrum of a 3-dimensional chaotic Lorenz signal is continuous and broadband. Stochastic systems have similar spectra even though they are infinite-dimensional.
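
A minimal sketch of how such a broadband spectrum could be reproduced, assuming the classic Lorenz parameters (sigma = 10, rho = 28, beta = 8/3), which the transcript does not list, and an arbitrary sampling rate:

```python
# Sketch: broadband power spectrum of the x-component of a Lorenz system.
# The parameters and sampling rate are assumptions, not the slide's values.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import welch

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

fs = 100.0                                   # sampling rate of the observable
t = np.arange(0, 300, 1.0 / fs)
sol = solve_ivp(lorenz, (t[0], t[-1]), [1.0, 1.0, 1.0], t_eval=t, rtol=1e-8)
x = sol.y[0][int(50 * fs):]                  # discard the initial transient

f, pxx = welch(x, fs=fs, nperseg=4096)       # continuous, broadband spectrum
print(f[np.argmax(pxx)], pxx.max())          # no single dominant spectral line
```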

Page 4 of 24: Attractors for Dynamical Systems
- Attractor: trajectories approach a limit with increasing time, irrespective of the initial conditions within a region.
- Basin of attraction: the set of initial conditions converging to a particular attractor.
- Attractors are non-chaotic (point, limit cycle, or torus) or chaotic (strange attractors).
- Example: point and limit-cycle attractors of the logistic map (a discrete nonlinear chaotic map).
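
A small sketch of the logistic map example: for different values of the parameter r (chosen here for illustration, not necessarily the slide's values) the orbit settles onto a fixed point, a limit cycle, or a chaotic attractor:

```python
# Sketch: attractors of the logistic map x_{n+1} = r * x_n * (1 - x_n).
import numpy as np

def logistic_attractor(r, x0=0.2, transient=1000, keep=8):
    x = x0
    for _ in range(transient):            # let the orbit settle onto the attractor
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):                 # record a few post-transient iterates
        x = r * x * (1.0 - x)
        orbit.append(round(x, 4))
    return orbit

print(logistic_attractor(2.8))   # point attractor: all iterates identical
print(logistic_attractor(3.2))   # limit cycle: alternates between two values
print(logistic_attractor(3.9))   # chaotic regime: no repeating pattern
```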

Page 5 of 24: Strange Attractors
Strange attractors are attractors whose shapes are neither points nor limit cycles. They typically have a fractal structure, i.e., their dimensions are fractional rather than integer. Example: a Lorenz system.

Page 6 of 24: Characterizing Chaos
Exploit the geometrical (self-similar structure) aspect of an attractor, or its temporal evolution, to characterize the system.
Geometry of a strange attractor:
- Most strange attractors show a similar structure at various scales, i.e., the parts are similar to the whole.
- Fractal dimensions (e.g., the Hausdorff and correlation dimensions) can be used to quantify this self-similarity; a short sketch follows this slide.
Temporal aspect of chaos:
- Characteristic exponents, or Lyapunov exponents (LEs), capture the rate of divergence (or convergence) of nearby trajectories.
- The correlation entropy captures similar information.
Any characterization presupposes that the phase space is available. What if only one scalar time-series measurement of the system (and not its actual phase space) is available?
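
The slide only names the correlation dimension; a generic Grassberger-Procaccia style sketch (not the authors' implementation; `points` is assumed to be an N x m phase-space matrix, and the radii must be chosen within the scaling region) could look like this:

```python
# Sketch: correlation sum and correlation-dimension estimate.
import numpy as np

def correlation_sum(points, r):
    """Fraction of distinct point pairs closer than r (points: N x m array)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n = len(points)
    return (np.sum(d < r) - n) / (n * (n - 1))      # subtract the n self-pairs

def correlation_dimension(points, radii):
    """Slope of log C(r) versus log r over the chosen radii (scaling region)."""
    c = np.array([correlation_sum(points, r) for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
    return slope
```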

Page 7 of 24: Reconstructed Phase Space (RPS): Embedding
- Embedding: a mapping from a one-dimensional signal to an m-dimensional signal.
- Takens' theorem: a phase space "equivalent" to the original phase space can be reconstructed by embedding with m >= 2d + 1, where d is the system dimension.
- Embedding dimension: this bound is theoretically sufficient; in practice, embedding with a smaller dimension is often adequate.
- Equivalence means that the system invariants characterizing the attractor are the same; it does not mean the reconstructed phase space (RPS) is exactly the same as the original phase space.
- RPS construction techniques include differential embedding, integral embedding, time-delay embedding, and SVD embedding.

Page 8 of 24: Reconstructed Phase Space (RPS): Time-Delay Embedding
Delayed copies of the original time series x[n] are used as the components of the RPS, forming a matrix whose n-th row is
  [x[n], x[n+τ], x[n+2τ], ..., x[n+(m-1)τ]],
where m is the embedding dimension and τ is the delay parameter. Each row of the matrix is a point in the RPS.
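
A minimal sketch of this construction (the helper name delay_embed and the sine example are illustrative, not from the slides):

```python
# Sketch: time-delay embedding of a scalar series into an m-dimensional RPS.
# Each row of the returned matrix is one reconstructed phase-space point.
import numpy as np

def delay_embed(x, m, tau):
    """Rows are [x[n], x[n+tau], ..., x[n+(m-1)*tau]]."""
    x = np.asarray(x)
    n_points = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n_points] for i in range(m)])

# Example: embed a sine wave in 3 dimensions with a delay of 4 samples.
x = np.sin(2 * np.pi * np.arange(200) / 16.0)
rps = delay_embed(x, m=3, tau=4)
print(rps.shape)          # (192, 3)
```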

Page 9 of 24: Reconstructed Phase Space (RPS)
Time-delay embedding of a Lorenz time series.

Page 10 of 24: Reconstructed Phase Space (RPS): Time-Delay Embedding
- A very small delay leads to highly correlated vector elements concentrated around the diagonal of the embedding space; structure perpendicular to the diagonal is not captured adequately.
- A very large delay leads the elements of the vector to behave as if they were independent; the evolutionary information in the system is lost.
- Quantitative tools for choosing the delay: plots of the autocorrelation and the auto-mutual information are useful guides (a sketch of both heuristics follows this slide).
- Advantages: easy to compute; the attractor structure is not distorted, since no extra processing is applied to it.
- Disadvantages: the choice of the delay parameter is not obvious, and the method yields a poor RPS in the presence of noise.
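
A rough sketch of the two heuristics just mentioned, assuming the 1/e-decay rule for the autocorrelation and the first minimum of a histogram-based mutual information (thresholds, lag range, and bin count are illustrative choices, not the authors' settings):

```python
# Sketch: two common heuristics for choosing the embedding delay.
import numpy as np

def autocorr_delay(x, threshold=1.0 / np.e):
    """First lag at which the normalized autocorrelation drops below the threshold."""
    x = np.asarray(x) - np.mean(x)
    acf = np.correlate(x, x, mode='full')[len(x) - 1:]
    acf = acf / acf[0]
    return int(np.argmax(acf < threshold))

def mutual_info_delay(x, max_lag=50, bins=16):
    """Lag at the first local minimum of the auto-mutual information."""
    x = np.asarray(x)
    mi = []
    for lag in range(1, max_lag + 1):
        h, _, _ = np.histogram2d(x[:-lag], x[lag:], bins=bins)
        pxy = h / h.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        mi.append(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
    mi = np.array(mi)
    idx = np.where((mi[1:-1] < mi[:-2]) & (mi[1:-1] < mi[2:]))[0]
    return int(idx[0] + 2) if len(idx) else int(np.argmin(mi) + 1)
```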

Page 11 of 24: Reconstructed Phase Space (RPS): SVD-Based Embedding
Works in two stages (a code sketch follows this slide's notes):
1. Delay embed, with a one-sample delay, to a dimension larger than twice the actual embedding dimension.
2. Reduce this matrix with the SVD so that the number of columns equals the embedding dimension (each row is projected onto the first few singular directions, i.e., reconstructed in a lower-dimensional space).
SVD window size: the dimension of the time-delay-embedded matrix over which the SVD operates.
Advantages: there is no delay parameter to set, and the method is more robust to noise because of the SVD stage.
Disadvantages: the noise-reducing property of the SVD may also distort the attractor's properties.
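
A minimal sketch of the two-stage procedure, under the assumption that the window size and final dimension shown are merely illustrative (the helper name svd_embed is also an assumption):

```python
# Sketch: SVD embedding = wide lag-1 delay embedding, then SVD projection.
import numpy as np

def svd_embed(x, window, m):
    """window: SVD window size (lag-1 embedding dimension); m: final RPS dimension."""
    x = np.asarray(x)
    n = len(x) - window + 1
    X = np.column_stack([x[i:i + n] for i in range(window)])   # stage 1: lag-1 embedding
    X = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:m].T                                        # stage 2: keep the first m directions

# Example: window size 15 (the clean-data setting used later), target dimension 3.
x = np.sin(2 * np.pi * np.arange(200) / 16.0)
rps = svd_embed(x, window=15, m=3)
print(rps.shape)
```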

Page 12 of 24: Reconstructed Phase Space (RPS): Reconstruction
Attractor reconstruction using SVD embedding (for a Lorenz system).

Page 13 of 24: Lyapunov Exponents
- Lyapunov exponents quantify the separation in time between trajectories, assuming that the rate of growth (or decay) of the separation is exponential in time; the exponents are obtained from the Jacobian matrix J evaluated at each point p along the trajectory.
- They capture sensitivity to initial conditions: the separation in time of two trajectories with close initial points is analyzed as they evolve under the system's evolution function.
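
The slide's own equations did not survive the transcript; the following is a reconstruction in standard notation (with f denoting the evolution function, J the Jacobian, p_k points along the trajectory, and sigma_i the i-th singular value), not the exact expressions from the original slide:

```latex
% Separation of two nearby trajectories after time t:
\|\delta \mathbf{x}(t)\| \approx e^{\lambda t}\,\|\delta \mathbf{x}(0)\|,
\qquad
\delta \mathbf{x}(t) \approx \mathbf{J}_t\,\delta \mathbf{x}(0),
\qquad
\mathbf{J}_t = \prod_{k=0}^{t-1} \mathbf{J}(\mathbf{p}_k),
\quad
\mathbf{J}(\mathbf{p}) = \left.\frac{\partial \mathbf{f}}{\partial \mathbf{x}}\right|_{\mathbf{p}}

% The i-th Lyapunov exponent:
\lambda_i = \lim_{t \to \infty} \frac{1}{t}\,\ln \sigma_i\!\left(\mathbf{J}_t\right)
```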

Page 14 of 24: Lyapunov Exponents: Some Properties
- An m-dimensional system has m LEs.
- Each LE is a measure averaged over the whole attractor.
- The sum of the first k LEs gives the rate of growth of a k-dimensional Euclidean volume element.
- For a bounded attractor, the sum of all LEs is zero (conservative system) or negative (dissipative system).
- A zero exponent indicates a periodic attractor (limit cycle) or a flow.
- Negative exponents pull points in the basin of attraction onto the attractor.
- A positive exponent indicates divergence: the signature of chaos.

Page 15 of 24: Lyapunov Exponents: Computation
1. Embed the time series to form the RPS matrix; the rows represent points in phase space.
2. Take the first point as the center.
3. Form the neighborhood matrix, each row of which is obtained by subtracting a neighbor from the center.
4. Find the evolution of each neighbor, and form the evolved neighborhood matrix by subtracting each evolved neighbor from the evolved center.
5. Compute the trajectory matrix at the center by multiplying the pseudo-inverse of the neighborhood matrix with the evolved neighborhood matrix.
6. Advance the center to a new point and go to step 3, averaging the trajectory matrices over the iterations.
The LEs are given by the average of the eigenvalues (diagonal elements) of the resulting R matrices. Direct averaging has numerical problems, so an iterative QR decomposition method (treppen-iteration) is used; a sketch of this procedure follows.
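
A condensed sketch of these six steps with the QR (treppen-iteration) averaging; this is an assumption-laden illustration, not the authors' implementation, and the default neighborhood size, evolution step, and time step are illustrative. `rps` is an N x m RPS matrix, e.g. the output of the svd_embed sketch above:

```python
# Sketch: Lyapunov spectrum from an RPS via local trajectory matrices and QR averaging.
import numpy as np

def lyapunov_spectrum(rps, n_neighbors=15, evolve=1, dt=1.0):
    n, m = rps.shape
    q = np.eye(m)                       # orthonormal frame carried along the trajectory
    log_r = np.zeros(m)
    steps = 0
    for c in range(n - evolve):
        # steps 2-3: neighborhood matrix (center minus neighbors, excluding the center)
        d = np.linalg.norm(rps[:n - evolve] - rps[c], axis=1)
        d[c] = np.inf
        nbrs = np.argsort(d)[:n_neighbors]
        B = rps[c] - rps[nbrs]
        # step 4: evolved neighborhood matrix
        A = rps[c + evolve] - rps[nbrs + evolve]
        # step 5: local trajectory matrix via the pseudo-inverse
        T = (np.linalg.pinv(B) @ A).T
        # step 6 (treppen-iteration): accumulate via QR instead of direct averaging
        q, r = np.linalg.qr(T @ q)
        log_r += np.log(np.abs(np.diag(r)))
        steps += 1
    return log_r / (steps * evolve * dt)
```

For example, lyapunov_spectrum(svd_embed(x, window=15, m=3)) would return three exponent estimates for the embedded series x from the earlier sketches.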

Page 16 of 24: Lyapunov Exponents: Computation Flowchart
Input time series -> Embed the data -> Locate nearest points -> Form neighborhood -> Evolve "a" steps -> Calculate trajectory matrix -> Perform QR decomposition -> Calculate exponents from R -> Evolve center (and loop back to locating nearest points).

Page 17 of 24: Experimental Design
Three systems were tested, two chaotic (Lorenz and Rossler) and one non-chaotic (a sine signal), under two conditions: clean and noisy (10 dB white noise).
Lorenz system
  Parameters:
  Expected LEs: (+1.37, 0, )
Rossler system
  Parameters: a = 0.15, b = 0.2, c = 10
  Expected LEs: (0.090, 0.00, -9.8)
Sine signal
  Parameters: frequency = 1 Hz, sampling frequency = 16 Hz, amplitude = 1
  Expected LEs: (0.00, 0.00, -1.85)

Page 18 of 24: Experimental Design
- Experiments were performed to optimize the parameters of the estimation algorithm.
- 30,000 points were generated for each series in both conditions.
- 5,000 iterations (the number of evolution steps) were used for averaging with the QR treppen-iteration.
- The variation of the LEs with the SVD window size and the number of nearest neighbors was studied:
  - the number of neighbors was varied with the SVD window size held at 15 for clean data and 50 for noisy data;
  - the SVD window size was varied with the number of neighbors held at 15 for clean data and 50 for noisy data.

Page 19 of 24: Experimental Results: Lyapunov Exponents (LEs) for a Lorenz System
- Clean data: the positive and zero exponents are almost constant at the expected values.
- Noisy data: the positive and zero exponents converge to the expected values for an SVD window size of about 50 and about 50 nearest neighbors.
- Negative LE estimation is not reliable.

Page 20 of 24: Experimental Results: Lyapunov Exponents (LEs) for a Rossler System
- Clean data: the positive and zero exponents are almost constant at the expected values.
- Noisy data: the positive and zero exponents converge to the expected values for an SVD window size of about 60 and about 50 nearest neighbors.
- Negative LE estimation is not reliable.

Page 21 of 24: Experimental Results: Lyapunov Exponents (LEs) for a Sine Signal
- Clean data: the positive and zero exponents are almost constant at the expected values.
- Noisy data: the positive and zero exponents converge to the expected values for an SVD window size of about 40 and about 30 nearest neighbors.
- Negative LE estimation is not reliable.

Page 22 of 24: Summary and Future Work
- LEs are useful in quantifying how chaotic a system is.
- SVD embedding helps reconstruct phase spaces in noisy conditions.
- The parameters of the LE computation algorithm were optimized experimentally to obtain reliable estimates.
- With the optimized parameters, both the positive and zero LEs are estimated close to the actual values.
- Negative LE estimation is unreliable (but this is of little concern in chaotic systems).
- The code for LE estimation is publicly available.
- Future work: apply Lyapunov exponents to model nonlinearities in speech for better automatic speech recognition.

Page 23 of 24: Resources
- Pattern recognition applet: compare popular linear and nonlinear algorithms on standard or custom data sets.
- Speech recognition toolkits: a state-of-the-art ASR toolkit for testing the efficacy of these algorithms on recognition tasks.
- Foundation classes: generic C++ implementations of many popular statistical modeling approaches.

Page 24 of 24: References
1. J. P. Eckmann and D. Ruelle, "Ergodic Theory of Chaos and Strange Attractors," Reviews of Modern Physics, vol. 57, pp. 617-656, July.
2. M. Banbrook, "Nonlinear Analysis of Speech from a Synthesis Perspective," PhD Thesis, The University of Edinburgh, Edinburgh, UK.
3. E. Ott, T. Sauer, and J. A. Yorke, Coping with Chaos, Wiley Interscience, New York, New York, USA.
4. M. Sano and Y. Sawada, "Measurement of the Lyapunov Spectrum from a Chaotic Time Series," Physical Review Letters, vol. 55.
5. G. Ushaw, "Sigma Delta Modulation of a Chaotic Signal," PhD Thesis, The University of Edinburgh, Edinburgh, UK, 1996.