
1 Quantifying Chaos
1. Introduction
2. Time Series of Dynamical Variables
3. Lyapunov Exponents
4. Universal Scaling of the Lyapunov Exponents
5. Invariant Measures
6. Kolmogorov-Sinai Entropy
7. Fractal Dimensions
8. Correlation Dimension & a Computational History
9. Comments & Conclusions

2 1. Introduction
Why quantify chaos?
- To distinguish chaos from noise and other complexities.
- To determine the number of active degrees of freedom.
- To discover universality classes.
- To relate chaotic parameters to physical quantities.

3 2. Time Series of Dynamical Variables
(Discrete) time series data:
- x(t0), x(t1), ..., x(tn)
- Time-sampled (stroboscopic) measurements
- Poincaré section values
Real measurements & calculations are always discrete.
Time series of 1 variable of an n-D system:
- If properly chosen, essential features of the system can be reconstructed: bifurcations, chaos onset.
- Choice of sampling interval is crucial if noise is present (see Chap 10).
Quantification of chaos:
- Dynamical: Lyapunov exponents, Kolmogorov-Sinai (K-S) entropy.
- Geometrical: fractal dimension, correlation dimension.
Only 1-D dissipative systems are discussed in this chapter.

4 9.3. Lyapunov Exponents
Time series: x0, x1, x2, ...
Given i & j, let dn = | x_{i+n} − x_{j+n} |.
The system is chaotic if dn = d0 e^{nλ} with Lyapunov exponent λ > 0.
Technical details:
- Check the exponential dependence explicitly.
- λ is x dependent → average over the trajectory: λ = Σ_i λ(x_i) / N.
- N (the number of iterations used) can't be too large for bounded systems, since dn saturates at the attractor size.
- λ = 0 for a periodic system.
- i & j shouldn't be too close.
Bit version (base 2): dn = d0 2^{nλ}.
Example: the logistic map.
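The dn = d0 e^{nλ} picture can be checked directly; here is a minimal sketch for the logistic map at A = 4 (the initial condition and d0 are arbitrary illustrative choices):

```python
import math

# Two nearby logistic-map trajectories at A = 4 (chaotic regime): their
# separation d_n should grow as d_0 e^{n lambda} until it saturates at the
# attractor size, so we keep n small.
A, d0 = 4.0, 1e-12
x, y = 0.3, 0.3 + d0
n_steps = 25                         # stop while d_n is still << 1
for _ in range(n_steps):
    x, y = A * x * (1 - x), A * y * (1 - y)

lam_est = math.log(abs(y - x) / d0) / n_steps
print(lam_est)                       # should be near ln 2 ≈ 0.693 for A = 4
```

A single trajectory pair gives a fluctuating estimate; averaging λ(x_i) along the orbit, as the slide notes, is the more reliable procedure.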

5 9.4. Universal Scaling of the Lyapunov Exponents
Period-doubling route to chaos. Logistic map: A∞ = 3.5699... (LyapunovExponents.nb)
- λ < 0 in the periodic regime.
- λ = 0 at a (period-doubling) bifurcation point.
- λ > 0 in the chaotic regime.
- λ tends to increase with A → more chaotic as A increases.
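The sign pattern of λ across the two regimes can be seen with the time-average formula λ = (1/N) Σ ln|f'(x_n)|; a sketch (the two parameter values are representative choices, one periodic, one chaotic):

```python
import math

def lyapunov(A, n_transient=1000, n_avg=10000):
    """Time-average Lyapunov exponent of the logistic map x -> A x (1 - x)."""
    x = 0.4
    for _ in range(n_transient):          # discard the transient
        x = A * x * (1.0 - x)
    s = 0.0
    for _ in range(n_avg):
        x = A * x * (1.0 - x)
        s += math.log(abs(A * (1.0 - 2.0 * x)))   # ln |f'(x_n)|
    return s / n_avg

print(lyapunov(3.2))   # period-2 regime: lambda < 0
print(lyapunov(3.9))   # chaotic regime:  lambda > 0
```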

6 Huberman & Rudnick: λ(A > A∞) is universal for period-doubling systems:
  λ(A) = λ0 (A − A∞)^{ln 2 / ln δ}
where δ = 4.6692... is the Feigenbaum δ and λ0 ≈ 0.9.
Analogy with critical phenomena: λ ~ order parameter, A − A∞ ~ T − T_C.
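The universal exponent ln 2 / ln δ is a pure number and can be evaluated directly (λ0 and A∞ below are the values quoted on this slide, used only to show a sample prediction):

```python
import math

delta = 4.6692016             # Feigenbaum delta
nu = math.log(2) / math.log(delta)
print(nu)                     # ≈ 0.4498

# Predicted Lyapunov exponent just above A_inf, per lambda = lambda0 (A - A_inf)^nu
A_inf, lam0 = 3.5699456, 0.9
A = 3.7
lam_pred = lam0 * (A - A_inf) ** nu
print(lam_pred)
```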

7 Derivation of the Universal Law for λ
Chaotic bands merge via "period-undoubling" for A > A∞.
The ratio of convergence of the band-merging points tends to the Feigenbaum δ.
Example: the logistic map.

8 Let 2^m bands merge to 2^{m−1} bands at A = Ā_m.
(Reminder: on the periodic side, 2^m bands bifurcate to 2^{m+1} bands at A = A_m.)
Divergence of trajectories within 1 band: d_{2^m} = d_0 e^{Λ}, where Λ = effective Lyapunov exponent denoting 2^m iterations as one.
Divergence among the 2^m bands: d_n = d_0 e^{nλ}, where λ = Lyapunov exponent per single iteration, so Λ = 2^m λ.
If Λ is the same for all bands (universality at the merging points), then λ(Ā_m) = Λ / 2^m.
Ex. 2.4-1: Assuming δ_n = δ gives Ā_m − A∞ ∝ δ^{−m}; similarly A_m − A∞ ∝ δ^{−m}.

9 From Ā_m − A∞ = C δ^{−m}:
  m = ln[ C / (Ā_m − A∞) ] / ln δ
→ 2^{−m} = e^{−m ln 2} = [ (Ā_m − A∞) / C ]^{ln 2 / ln δ}
→ λ(Ā_m) = Λ 2^{−m} ∝ (Ā_m − A∞)^{ln 2 / ln δ}
i.e., λ(A) = λ0 (A − A∞)^{ln 2 / ln δ}.

10 9.5. Invariant Measures
- Definition of Probability
- Invariant Measures
- Ergodic Behavior
For systems with many degrees of freedom, geometric analysis becomes unwieldy.
Alternative approach: statistical methods.
Basic quantity of interest: the probability that a trajectory passes through a given region of state space.

11 Definition of Probability
Consider an experiment with N possible results (outcomes).
After M runs (trials) of the experiment, let there be m_i occurrences of the ith outcome.
The probability p_i of the ith outcome is defined as
  p_i = lim_{M→∞} m_i / M,  with Σ_i p_i = 1  (normalization).
If the outcomes are described by a set of continuous parameters x, then N = ∞.
Since the m_i stay finite while M = ∞, we get p_i = 0 for all i.
Remedy: divide the range of x into cells/bins; m_i = number of outcomes falling in the ith cell.
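The binning remedy for continuous outcomes can be sketched in a few lines (sample size, bin count, and the uniform outcome distribution are arbitrary illustrative choices):

```python
import random

random.seed(0)
M = 100_000                        # number of trials
n_bins = 10
counts = [0] * n_bins

for _ in range(M):
    x = random.random()            # continuous outcome x in [0, 1)
    counts[int(x * n_bins)] += 1   # m_i: occurrences in the ith cell

probs = [m / M for m in counts]    # p_i ~= m_i / M
total = sum(probs)
print(total)                       # normalization: sum of p_i = 1
```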

12 Invariant Measures
For an attractor in state space:
1. Divide the attractor into cells.
2. 1-D case: p_i ≈ m_i / M.
The set {p_i} is a natural probability measure if it is independent of (almost all) initial conditions.
Let p(x) dx = probability of the trajectory visiting the interval [x, x+dx] (or [x − dx/2, x + dx/2]); then
  μ_i = ∫_{cell i} p(x) dx = probability of the trajectory visiting cell i,
and μ is an invariant probability measure if it is unchanged by the time evolution.
Treating M as a total mass → p(x) plays the role of a mass density ρ(x).

13 Example: Logistic Map, A = 4
From § 4.8: for A = 4 the logistic map is equivalent to the Bernoulli shift,
  x = sin²(π y)  with  y_{n+1} = 2 y_n  (mod 1).
The invariant density of the shift is uniform in y, so
  p(x) = 1 / [ π √( x (1 − x) ) ].
Numerical check: 1024 iterations sorted into 20 bins reproduces this density.
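The numerical check is easy to reproduce; a sketch using a longer orbit than the slide's 1024 iterates, to reduce sampling noise (orbit length and initial condition are arbitrary choices):

```python
import math

# Histogram the logistic-map orbit at A = 4 and compare with the
# invariant density p(x) = 1 / (pi sqrt(x (1 - x))).
A, x = 4.0, 0.3
n_iter, n_bins = 200_000, 20
counts = [0] * n_bins
for _ in range(n_iter):
    x = A * x * (1.0 - x)
    counts[min(int(x * n_bins), n_bins - 1)] += 1

width = 1.0 / n_bins
for i in (0, 9, 19):
    xc = (i + 0.5) * width                        # bin center
    p_emp = counts[i] / (n_iter * width)          # empirical density
    p_exact = 1.0 / (math.pi * math.sqrt(xc * (1.0 - xc)))
    print(i, round(p_emp, 3), round(p_exact, 3))
```

The density rises sharply toward x = 0 and x = 1 (the edge bins collect far more points than the central ones), as the arcsine-law form predicts.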

14 Ergodic Behavior
Time average of B(x):
  ⟨B⟩_t = lim_{T→∞} (1/T) ∫_{t0}^{t0+T} B(x(t)) dt
⟨B⟩_t should be independent of t0 as T → ∞.
Ensemble average of B(x):
  ⟨B⟩_p = ∫ B(x) p(x) dx
The system is ergodic if ⟨B⟩_t = ⟨B⟩_p.
Comments:
- ⟨B⟩_p is meaningful only for invariant probability measures.
- p(x) may not exist as a smooth density, e.g., on strange attractors.
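A quick ergodicity check for the logistic map at A = 4, with B(x) = x: since p(x) is symmetric about x = 1/2, the ensemble average ∫ x p(x) dx = 1/2, and the time average along one orbit should match (orbit length and initial condition are arbitrary choices):

```python
# Time average of B(x) = x along one logistic-map orbit at A = 4;
# ergodicity predicts it equals the ensemble average 1/2.
x = 0.3
n_iter = 200_000
total = 0.0
for _ in range(n_iter):
    x = 4.0 * x * (1.0 - x)
    total += x
time_avg = total / n_iter
print(time_avg)            # should be close to 0.5
```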

15 Example: Logistic Map, A = 4
Local values of the Lyapunov exponent: λ(x) = ln | f'(x) | = ln | 4 − 8x |.
Ensemble average value of the Lyapunov exponent:
  λ = ∫ p(x) ln | f'(x) | dx = ln 2
(same as the Bernoulli shift).
Same as that calculated by the time average (c.f. § 5.4):
  λ = lim_{N→∞} (1/N) Σ_n ln | f'(x_n) | = ln 2.
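Both routes to λ = ln 2 can be checked numerically; a sketch (orbit length and quadrature resolution are arbitrary choices, and the midpoint rule is used because the integrand has integrable singularities at x = 0, 1/2, 1):

```python
import math

# (1) Time average of ln|f'(x_n)| along an orbit of the A = 4 logistic map.
x, n_iter, s = 0.3, 200_000, 0.0
for _ in range(n_iter):
    x = 4.0 * x * (1.0 - x)
    s += math.log(abs(4.0 - 8.0 * x))
lam_time = s / n_iter

# (2) Ensemble average: integral of p(x) ln|f'(x)| dx with
#     p(x) = 1/(pi sqrt(x(1-x))), via the midpoint rule.
n = 200_000
lam_ens = sum(
    math.log(abs(4.0 - 8.0 * ((k + 0.5) / n)))
    / (math.pi * math.sqrt(((k + 0.5) / n) * (1.0 - (k + 0.5) / n)))
    for k in range(n)
) / n

print(lam_time, lam_ens, math.log(2))   # all three should agree
```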

16 9.6. Kolmogorov-Sinai Entropy
Brief review of entropy:
- Microcanonical ensemble (closed, isolated system in thermal equilibrium):
  S = k ln N = −k ln p,  with p = 1/N.
- Canonical ensemble (small closed subsystem):
  S = −k Σ_i p_i ln p_i,  with Σ_i p_i = 1.
- 2nd law: ΔS ≥ 0 for spontaneous processes in a closed, isolated system
  → S is maximum at thermodynamic equilibrium.
Issue: there is no natural way to count states in classical mechanics
→ S is defined only up to a constant (only ΔS is physically meaningful).
Quantum mechanics: phase-space volume of each state = h^n, n = number of DoF.

17 Entropy for State Space Dynamics
1. Divide state space into cells (e.g., hypercubes of volume L^d, d = DoF).
2. For dissipative systems, replace the state space with the attractor.
3. Start the evolution for an ensemble of M initial conditions (usually all located in 1 cell).
4. After n time steps, count the number of ensemble members in each cell and form S = −k Σ_i p_i ln p_i (k = Boltzmann constant).
Note:
- Non-chaotic motion: the number of cells visited (& hence S) is independent of t & M on the macroscopic time scale.
- Chaotic motion: the number of cells visited (& hence S) increases with t but is independent of M.
- Random motion: the number of cells visited (& hence S) increases with both t & M.

18 Only ΔS is physically significant.
The Kolmogorov-Sinai entropy rate (K-S entropy) K is defined as
  K = lim_{τ→0} lim_{L→0} lim_{N→∞} (1/(Nτ)) ( S_N − S_0 ).
For iterated maps or Poincaré sections, τ = 1, so
  K = lim_{L→0} lim_{N→∞} (1/N) ( S_N − S_0 ).
E.g., if the number of occupied cells grows as N_n = N_0 e^{λn}, and all occupied cells have the same probability p_n = 1/N_n, then
  S_n = k ln N_n = k ( ln N_0 + λn )  →  K = k λ.
Pesin identity: K = k Σ_i λ_i^+, where the λ_i^+ are the positive average Lyapunov exponents.
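The cell-counting picture can be sketched numerically: an ensemble started in one small cell of the A = 4 logistic map spreads out, and S_n grows (at a rate of order λ = ln 2 per step, consistent with K = kλ for a single positive exponent) until the ensemble covers the attractor. Cell size, ensemble size, and the initial cell are arbitrary illustrative choices; k is set to 1.

```python
import math, random

random.seed(1)
n_cells, n_members = 100, 50_000
# Ensemble of initial conditions, all inside one cell of width 1e-3.
pts = [0.3 + 1e-3 * random.random() for _ in range(n_members)]

def entropy(points):
    """S = -sum p_i ln p_i over occupied cells (k = 1)."""
    counts = [0] * n_cells
    for x in points:
        counts[min(int(x * n_cells), n_cells - 1)] += 1
    return -sum(c / len(points) * math.log(c / len(points))
                for c in counts if c > 0)

S = []
for n in range(15):
    S.append(entropy(pts))
    pts = [4.0 * x * (1.0 - x) for x in pts]
print([round(s, 2) for s in S])   # grows from 0, then saturates
```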

19 Alternative Definition of the K-S Entropy
See Schuster.
1. Map out the attractor by running a single trajectory for a long time.
2. Divide the attractor into cells.
3. Start a trajectory of N steps & record the cell it is in at t = nτ as b(n), giving a cell sequence {b(0), b(1), ..., b(N−1)}.
4. Do the same for a series of other, slightly different trajectories starting from the same initial cell.
5. Calculate the fraction p(i) of trajectories described by the ith cell sequence. Then
  K = −lim_{N→∞} (1/(Nτ)) Σ_i p(i) ln p(i).
Exercise: Show that both definitions of K give roughly the same result for all 3 types of motion discussed earlier.
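A common practical variant of the cell-sequence idea reads the sequences off one long orbit: for the A = 4 logistic map with the two-cell partition [0, 1/2), [1/2, 1] (a generating partition for this map) and τ = 1, the entropy H_n of length-n cell sequences grows like K n, and H_{n+1} − H_n estimates K, which should approach ln 2 here. A sketch (orbit length and block lengths are arbitrary choices):

```python
import math
from collections import Counter

# Symbol sequence of the A = 4 logistic map under the partition at x = 1/2.
x, n_iter = 0.3, 200_000
symbols = []
for _ in range(n_iter):
    x = 4.0 * x * (1.0 - x)
    symbols.append(0 if x < 0.5 else 1)

def block_entropy(n):
    """H_n = -sum p(i) ln p(i) over observed length-n cell sequences."""
    blocks = Counter(tuple(symbols[i:i + n]) for i in range(len(symbols) - n))
    total = sum(blocks.values())
    return -sum(c / total * math.log(c / total) for c in blocks.values())

K_est = block_entropy(9) - block_entropy(8)
print(K_est, math.log(2))
```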

20 9.7. Fractal Dimensions
Geometric aspects of attractors: the distribution of state-space points of a long time series → dimension of the attractor.
Importance of dimensionality:
- Determines the range of possible dynamical behavior.
- Dictates long-term dynamics.
- Reveals active degrees of freedom.
For a dissipative system:
- D < d, where D = dimension of the attractor and d = dimension of state space.
- D* < D, where D* = dimension of the attractor on a Poincaré section.

21 For a Hamiltonian system, D ≤ d − 1, where D = dimension of the set of points generated by one trajectory (the trajectory is confined to a constant-energy surface); D* < D, where D* = dimension of the points on a Poincaré section.
The dimension is further reduced if there are other constants of motion.
Example: in a 3-D state space, an attractor must shrink to a point or a curve → the system can't be quasi-periodic (no torus) → no quasi-periodic solutions for the Lorenz system.
Dissipative system: strange attractor = attractor with fractional dimension (a fractal).
Caution: there are many inequivalent definitions of fractal dimension. See J. D. Farmer, E. Ott, J. A. Yorke, Physica D 7, 153-80 (1983).

22 Capacity (Box-Counting) Dimension D_b
Let N(R) = number of boxes of side R needed to cover the object. Then
  D_b = lim_{R→0} ln N(R) / ln (1/R).
- Easy to understand.
- Not good for high-d systems.
- First used by Kolmogorov.

23 Example 1: Points in 2-D space.
A single point: box = square of side R → N(R) = 1 → D_b = 0.
A set of N isolated points: box = square of side R, with R < ½ (minimal distance between points) → N(R) = N → D_b = 0.
Example 2: Line segment of length L in 2-D space.
Box = square of side R → N(R) = L / R → D_b = 1.

24 Example 3: Cantor Set
Starting with a line segment of length 1, repeatedly take out the middle third of each remaining segment.
At step M, there remain 2^M segments, each of length 1/3^M, so with R = 1/3^M:
  D_b = lim_{M→∞} ln 2^M / ln 3^M = ln 2 / ln 3 ≈ 0.6309.
Caution:
- For finite M, the set consists of 2^M line segments → D_b = 1.
- For infinite M at fixed R, the set consists of discrete points → D_b = 0.
∴ The limits M → ∞ and R → 0 must be taken simultaneously.
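The box-counting procedure can be run directly on a finite-depth Cantor-set approximation; a sketch (the construction depth and the range of box sizes are arbitrary choices, with box sizes kept coarser than the finest construction level per the caution above):

```python
import math
from itertools import product

# Points of the depth-10 Cantor set: base-3 expansions with digits 0 and 2,
# nudged to the interior of each depth-10 interval to avoid boundary effects.
depth = 10
pts = [sum(d * 3.0 ** -(i + 1) for i, d in enumerate(digits)) + 0.5 * 3.0 ** -depth
       for digits in product((0, 2), repeat=depth)]

ks = range(1, 9)                       # box sizes R = 3^-k, k < depth
logN = [math.log(len({int(x * 3 ** k) for x in pts})) for k in ks]
logInvR = [k * math.log(3) for k in ks]

# Least-squares slope of ln N(R) vs ln(1/R) gives D_b.
n = len(ks)
mx, my = sum(logInvR) / n, sum(logN) / n
D_b = sum((a - mx) * (b - my) for a, b in zip(logInvR, logN)) / \
      sum((a - mx) ** 2 for a in logInvR)
print(D_b)                             # ≈ ln 2 / ln 3 ≈ 0.6309
```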

25 Measure of the Cantor set:
Length of the set = lim_{M→∞} 2^M (1/3)^M = lim_{M→∞} (2/3)^M = 0.
Ex. 9.7-5: Fat Fractal.

26 Example 4: Koch Curve
Start with a line segment of length 1.
a) Construct an equilateral triangle with the middle-third segment as base.
b) Discard the base segment.
Repeat a) and b) for each remaining segment.
At step M, there are 4^M segments of length 1/3^M each, so
  D_b = ln 4 / ln 3 ≈ 1.2619.

27 Types of Fractals
- Fractals with self-similarity: a small section of the object, when magnified, is identical to the whole.
- Fractals with self-affinity: same as self-similarity, but with anisotropic magnification.
- Deterministic fractals: fixed construction rules.
- Random fractals: stochastic construction rules (see Chap 11).

28 Fractal Dimensions of State Space Attractors
Difficulty: R → 0 is not achievable due to the finite precision of data.
Remedy: alternate definitions of fractal dimension (see § 9.8).
Logistic map at A∞, renormalization method: D_b = 0.5388... (universal).
Elementary estimates: consider A → A∞+ (from above).
- Sarkovskii's theorem → chaotic bands undergo doubling splits as A → A∞+.
- Feigenbaum universality → the split bands are narrower by factors 1/α and 1/α².
- Assuming the points in each band are distributed uniformly, the splitting is Cantor-set-like.

29 1st estimate: assume R decreases by the single factor 1/α at each splitting, while the number of bands doubles:
  D_b = ln 2 / ln α ≈ 0.76  (α = 2.5029...).
2nd estimate: treat the splitting as a two-scale Cantor set, with bands narrowing by 1/α and by 1/α², and use the similarity dimension:
  (1/α)^D + (1/α²)^D = 1  →  D ≈ 0.52,
closer to the renormalization value 0.5388.
→ D_b is procedure dependent. An infinity of dimensional measures is needed to characterize the object (see Chap 10).

30 The Similarity Dimensions for Nonuniform Fractals
For a fractal made of N pieces scaled by ratios r_i, the similarity dimension D_s solves Σ_i r_i^{D_s} = 1; for N equal pieces with r_i = r, this reduces to D_s = ln N / ln(1/r).
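The similarity-dimension condition Σ_i r_i^{D_s} = 1 has no closed form for unequal ratios, but it is monotone in D_s and easy to solve by bisection; a sketch, applied to the two-scale model of the logistic map's band splitting (the bracketing interval and tolerance are arbitrary choices):

```python
import math

def similarity_dimension(ratios, lo=0.0, hi=5.0, tol=1e-12):
    """Solve sum_i r_i**D = 1 for D by bisection (all 0 < r_i < 1)."""
    f = lambda D: sum(r ** D for r in ratios) - 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:       # sum still too big -> D must be larger
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

alpha = 2.5029079
print(similarity_dimension([1/3, 1/3]))             # classic Cantor set: ln 2 / ln 3
print(similarity_dimension([1/alpha, 1/alpha**2]))  # two-scale estimate, ≈ 0.52
```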

31 9.8. Correlation Dimension & a Computational History

32 9.9. Comments & Conclusions

