Presentation on theme: "Central limit theorem revisited. Throw a die twelve times: the distribution of values is not Gaussian."— Presentation transcript:

1 Central limit theorem revisited. Throw a die twelve times: the distribution of values is not Gaussian. [Figure: histogram of the twelve throws; x axis: die value, y axis: number of occurrences]

2 Repeat the twelve die throws twelve times and find the average of each set of twelve. [Figure: histograms of the sets; x axis: die value, y axis: number of occurrences]

3 What is the distribution of the AVERAGE of the 12 sets? [Figure: histogram; x axis: mean die value, y axis: number of occurrences]

4 Same as the last slide, but for 180 sets. [Figure: histogram; x axis: average die value of each set, y axis: number of occurrences] Key point: even though the distribution within each set is NOT Gaussian, the average of the sets has a Gaussian distribution.
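
A minimal numpy sketch of this demonstration (the twelve throws per set and the 180 sets follow the slides; the random seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)

# 180 sets of 12 die throws; within each set the values are uniform on 1..6.
throws = rng.integers(1, 7, size=(180, 12))

# The per-set averages cluster around 3.5 with a near-Gaussian spread,
# even though the underlying die values are not Gaussian at all.
set_means = throws.mean(axis=1)
print("mean of the set averages:", set_means.mean())       # ~3.5
print("std of the set averages: ", set_means.std(ddof=1))  # ~1.71/sqrt(12) ~ 0.49
```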

5 TS1 = 5·cos(2t), TS2 = cos(2t): R² = 1, perfectly correlated. [Figure: the two time series]

6 TS1 = 5·cos(2t), TS2 = cos(t): R² = 0, no LINEAR correlation. [Figure: the two time series]

7 TS1 = sin(t), TS2 = cos(t): R² = 0, no correlation. [Figure: the two time series]

8 TS1 = cos(t - π/4), TS2 = cos(t): R² = 0.5, partially correlated. [Figure: the two time series]
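
A short numpy check of the R² values quoted on the last four slides (the time grid is an arbitrary choice):

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 1000, endpoint=False)

def r_squared(x, y):
    # Squared Pearson correlation: the fraction of variance linearly shared.
    return np.corrcoef(x, y)[0, 1] ** 2

print(r_squared(5 * np.cos(2 * t), np.cos(2 * t)))   # 1.0: perfectly correlated
print(r_squared(5 * np.cos(2 * t), np.cos(t)))       # ~0.0: no LINEAR correlation
print(r_squared(np.sin(t), np.cos(t)))               # ~0.0: no correlation
print(r_squared(np.cos(t - np.pi / 4), np.cos(t)))   # ~0.5: partial correlation
```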

9 Which linear fit minimizes the error in a least-squares sense: A, B, or C? Hint: what fraction of the variance explained is implied by each fit? R² = a₁² · var(x') / var(y'), where a₁ is the fitted slope and the primes denote anomalies.

10 Answer: B. Its RMS error of 0.707 implies R² = 0.5; the competing fit has an RMS error of 0.767.
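
A sketch of the bookkeeping behind this answer, using invented data built so that R² is about 0.5; it verifies that the squared normalized RMS error and R² sum to one:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = x + rng.standard_normal(500)      # built so that R^2 ~ 0.5 in expectation

xp, yp = x - x.mean(), y - y.mean()   # primed (anomaly) variables
a1 = (xp * yp).mean() / (xp ** 2).mean()   # least-squares slope

# Normalized RMS error of the fit, and the fraction of variance explained.
rms = np.sqrt(((yp - a1 * xp) ** 2).mean() / (yp ** 2).mean())
r2 = a1 ** 2 * (xp ** 2).mean() / (yp ** 2).mean()

print(f"RMS error = {rms:.3f}, R^2 = {r2:.3f}")   # ~0.707 and ~0.5
print("explained + unexplained = 1:", np.isclose(r2 + rms ** 2, 1.0))
```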

11 Buying/selling nuts and bolts. Usually when you buy one nut, you buy one bolt as well. Sometimes you miscount a little, but most of the time you go to the store to buy sets of nuts and bolts.

12 If I ask you to go to the store to get 12 nuts and 10 bolts, you trace out 12 units on the nut axis and 10 units on the bolt axis. [Figure: the point (12 nuts, 10 bolts)]

13 Alternatively, I could ask for 10 sets of nuts and bolts and 2 extra nuts: trace 10 units on the nut+bolt axis and 2 units on the nut-only axis. Advantage: we explain more of the data with a single (rotated) axis.

14 Some additional considerations: maintaining orthogonality of our axes. Keep the principal axis and make the secondary axis orthogonal to it: 11 units on the nut+bolt axis, -1 unit on the bolt-excess axis.

15 Some additional considerations: rotate the axes about the mean of the data (x and y). The principal axis is then the axis which explains the maximum fraction of the VARIANCE.
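
A minimal sketch of finding the principal axis for two-dimensional nut/bolt data; the purchases here are invented, and the rotated axes come from the eigenvectors of the covariance matrix of the mean-removed data:

```python
import numpy as np

# Invented purchases: mostly matched sets, plus small miscounts.
rng = np.random.default_rng(1)
sets = rng.integers(1, 20, size=100)
nuts = sets + rng.integers(-2, 3, size=100)
bolts = sets + rng.integers(-2, 3, size=100)

# Rotate about the mean: work with anomalies.
data = np.column_stack([nuts, bolts]).astype(float)
data -= data.mean(axis=0)

# The eigenvectors of the covariance matrix are the rotated axes; the one
# with the largest eigenvalue is the principal axis (maximum variance).
eigvals, eigvecs = np.linalg.eigh(np.cov(data.T))   # ascending eigenvalues
print("principal axis:", eigvecs[:, -1])            # ~[0.707, 0.707], the 'set' direction
print("fraction of variance explained:", eigvals[-1] / eigvals.sum())
```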

16 In EOF speak: the Empirical Orthogonal Function (also called the eigenvector) is the definition of the axes; the Principal Component (also called the time series) is the trace of each purchase (data realization) onto the axis.

17 What about buying nuts, bolts, and washers? [Figure: scatter of purchases; axis label: number of nuts]

18 Subtract the mean from each axis. [Figure: the same scatter with the mean removed]

19 Define the principal axis, which explains the most variance. It is defined by 1 nut, 1 bolt, 2 washers, NORMALIZED.

20 The secondary axis is orthogonal to the primary axis and explains the greatest possible fraction of the REMAINING variance. [Figure: scatter with both axes drawn]

21 How do we represent the data that fall on the primary axis? Regress or correlate each realization of the data against the rotated axis (eigenvector). Data point = [-9 nuts, -9 bolts, -18 washers]. Regression onto the principal axis = [-9, -9, -18] · [0.408, 0.408, 0.816] = -22 units: the dimensional principal component of the first EOF associated with a normalized axis.
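
The arithmetic from this slide in numpy:

```python
import numpy as np

axis = np.array([1.0, 1.0, 2.0])
axis /= np.linalg.norm(axis)              # [0.408, 0.408, 0.816]

anomaly = np.array([-9.0, -9.0, -18.0])   # nuts, bolts, washers (mean removed)
print(anomaly @ axis)                     # ~ -22: the dimensional PC value
```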

22 The third and final axis is orthogonal to the first two; it is therefore uniquely defined by the first two. [Figure: scatter with all three axes drawn]

23 Alternatively, we could normalize the PC over all realizations. Normalized first PC:

Buyer  Nuts bought  Bolts bought  Washers bought  Projection onto primary axis  Normalized PC
 1     14           16            32                16.32993162                   1.074337606
 2      2            2             4               -17.14642820                  -1.128054487
 3     20           21            39                26.53613888                   1.745798611
 4     12           10            20                 3.265986324                  0.214867521
 5      6            4             9               -10.61445555                  -0.698319444
 6      4            4             9               -11.43095213                  -0.752036325
 7      9            9            18                 0                            0
 8     15           15            28                13.06394529                   0.859470085
 9      7            8            17                -2.041241452                 -0.134292201
10      1            1             2               -19.59591794                  -1.289205128
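
The last column of the table is the projection divided by its sample standard deviation; a quick check using the rounded projection values:

```python
import numpy as np

projections = np.array([16.33, -17.15, 26.54, 3.27, -10.61,
                        -11.43, 0.0, 13.06, -2.04, -19.60])

# Dividing by the sample standard deviation gives a unit-variance PC.
normalized_pc = projections / projections.std(ddof=1)
print(normalized_pc[0])   # ~1.074, matching buyer 1 in the table
```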

24 Now regress the normalized PC against the original nut and bolt data. Regression coefficient = the number of bolts sold associated with a one-standard-deviation purchase of the first EOF/PC pair, about 6 bolts in this case. Alternatively, we could express the PC's projection onto the bolt data as a correlation.
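
A sketch of this regression using the table's bolt counts (the exact coefficient depends on the averaging convention; here it is the covariance of the bolt anomalies with the unit-variance PC):

```python
import numpy as np

bolts = np.array([16, 2, 21, 10, 4, 4, 9, 15, 8, 1], dtype=float)
projections = np.array([16.33, -17.15, 26.54, 3.27, -10.61,
                        -11.43, 0.0, 13.06, -2.04, -19.60])
pc = projections / projections.std(ddof=1)   # unit-variance first PC

# Bolts sold per one-standard-deviation purchase along the first EOF.
coef = ((bolts - bolts.mean()) * pc).mean()
print(coef)   # ~6 bolts
```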

25 Repeat the regression with bolts and washers and we now have the combination of nuts, bolts, and washers associated with a standard purchase on the primary EOF/axis: [6 NUTS, 6 BOLTS, 12 WASHERS] if the data were perfect. Multiply this standard purchase by the normalized PC and we recover a large percentage of our data.
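
A sketch of the standard purchase: if the data fell exactly on the primary axis, the regression would simply rescale the normalized EOF by the raw PC's standard deviation, about 15 units for the table above:

```python
import numpy as np

eof = np.array([1.0, 1.0, 2.0]) / np.sqrt(6.0)   # normalized primary axis
pc_std = 15.2                                    # std dev of the raw projections

# Rescaling the unit EOF by the PC's original spread gives the purchase
# associated with one standard deviation along the primary axis.
print(pc_std * eof)   # ~[6, 6, 12]: the standard purchase
```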

26 Again in EOF speak: the axis definitions are the EOFs (eigenvectors, spatial patterns); the trace or projection onto the axes for each data realization is the principal component (time series); and the fraction of the variance explained along each axis is the eigenvalue or singular value (we will get to this soon).

27 Some thoughts. The 1st EOF answers the question: what linear combination of the data best explains the variance? The 2nd EOF answers: what pattern best explains the REMAINING variance? (It is severely constrained by orthogonality.) Both PCs and EOFs are orthogonal. The direction of an EOF is ambiguous to 180 degrees. EOF/PC pairs are uniquely defined by the data set, not the physics!!! Key idea: explain the majority of a large data set with a small number of spatial/temporal patterns.

28 How does this relate to spatial-temporal data? At any instant, we represent a spatial pattern by an Nx-by-Ny matrix holding the data at each grid point. We could also think of the data at each time as a vector of Nx·Ny observations; let M = Nx·Ny = the number of observations at each time.

29 Compose a matrix out of the vector of M observations at each of the N observation times, where M = number of observations at each time (spatial locations) and N = number of time steps in the data set. Columns refer to the set of M spatial observations at a given time; rows correspond to the time series of observations at a given location.
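
A sketch of this reshaping for an invented gridded data set (the grid and record lengths are arbitrary):

```python
import numpy as np

# Invented gridded anomalies: an ny-by-nx map at each of n_time steps.
ny, nx, n_time = 20, 30, 100
rng = np.random.default_rng(2)
fields = rng.standard_normal((n_time, ny, nx))

# Flatten each map to a length-M vector and stack the maps as columns:
# X is M x N; columns = one time's spatial map, rows = one point's time series.
M = ny * nx
X = fields.reshape(n_time, M).T
print(X.shape)   # (600, 100) -> M x N
```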

30 Singular Value Decomposition (SVD): X = U Σ Vᵀ, where X is the M×N data matrix; the columns of U (M×M) are the EOFs, the normalized spatial structures, with the leading EOF (eigenvector) a vector of length M normalized in space; Σ (M×N) is diagonal, and its diagonal squared gives the amount of variance explained by each PC/EOF pair; and the rows of Vᵀ (N×N) are the principal components, the time series normalized in time.
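
A minimal numpy sketch of the decomposition, with random data standing in for a real field:

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 600, 100                       # spatial points, time steps
X = rng.standard_normal((M, N))
X -= X.mean(axis=1, keepdims=True)    # remove the time mean at each point

U, s, Vt = np.linalg.svd(X, full_matrices=False)

eofs = U.T                            # rows: normalized spatial patterns
pcs = Vt                              # rows: normalized time series
frac = s ** 2 / (s ** 2).sum()        # diagonal squared -> variance explained

print("X recovered:", np.allclose(X, (U * s) @ Vt))
print("EOF 1 explains", round(100 * frac[0], 1), "% of the variance")
```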

31 Eigenvectors, eigenvalues, PCs. Eigenvectors explain variance in the spatial (observation) dimension; principal components explain variance in the time (data realization) dimension. Each eigenvector has a corresponding principal component; the PAIR define a mode that explains variance. Each eigenvector/PC pair has an associated eigenvalue which relates to how much of the total variance is explained by that mode.

32 Fabricated example of spatial-temporal data. What spatial and temporal patterns most efficiently describe this data set? Are they orthogonal in space? In time?

33 [Figure: EOF 1 (60% variance explained) and PC 1; EOF 2 (40% variance explained) and PC 2]

34 [Figure: EOF 1 (60% variance explained) and PC 1; EOF 2 (40% variance explained) and PC 2; eigenvalue spectrum]

35 Another fabricated example of spatial-temporal data. What spatial and temporal patterns most efficiently describe this data set? Are they orthogonal in space? In time?

36 [Figure: EOF 1 and PC 1; EOF 2 and PC 2]

37 [Figure: EOF 1 (65% variance explained) and PC 1; EOF 2 (35% variance explained) and PC 2; the "Data"]


40 EOFs of real data: winter SLP anomalies. EOF 1: AO/NAM (23% explained). EOF 2: PNA (13% explained). EOF 3: non-distinct (10% explained).

41 [Figure: EOF 1 (AO/NAM), EOF 2 (PNA), EOF 3 (?); PC 1 (AO/NAM), PC 2 (PNA), PC 3 (?)]

42 Correlation maps vs. regression maps. PNA correlation map: r values of each point with the index. PNA regression map: meters per standard deviation of the index. Correlation maps put each point on 'equal footing'; regression maps show the magnitude of typical variability.
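
A sketch of both map types, with random data standing in for the SLP field and a standardized index standing in for the PNA index:

```python
import numpy as np

rng = np.random.default_rng(4)
M, N = 600, 100
anom = rng.standard_normal((M, N))        # anomalies at M points, N times
anom -= anom.mean(axis=1, keepdims=True)

index = rng.standard_normal(N)            # stand-in for a PNA index
index = (index - index.mean()) / index.std()

reg_map = anom @ index / N                # field units per std dev of the index
cor_map = reg_map / anom.std(axis=1)      # r values: every point on equal footing
print(reg_map.shape, cor_map.shape, np.abs(cor_map).max() <= 1.0)
```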

43 Regress the PCs of SLP against surface air temperature anomalies to get the surface air temperature field associated with a one-standard-deviation NAM or PNA event.

44 Significance. Each EOF/PC pair comes with an associated eigenvalue. The normalized eigenvalues (each eigenvalue divided by the sum of all the eigenvalues) tell you the percent of variance explained by that EOF/PC pair. Eigenvalues need to be well separated from each other to be considered distinct modes. [Figure: first 25 eigenvalues for DJF SLP]

45 Significance: the North test. North et al. (1982) provide an estimate of the error in estimating eigenvalues; it requires estimating the degrees of freedom of the data set. If eigenvalues overlap, those EOFs cannot be considered distinct: any linear combination of overlapping EOFs is an equally viable structure. [Figure: first 25 eigenvalues for DJF SLP, with an example of overlapping eigenvalues; two just barely overlap, and physical intuition is needed to help judge]
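
A sketch of the North et al. (1982) rule of thumb, δλ ≈ λ·sqrt(2/N*), applied to an invented eigenvalue spectrum; the effective sample size N* is an assumption:

```python
import numpy as np

# Invented spectrum (percent variance) and an assumed effective sample size.
eigvals = np.array([23.0, 13.0, 10.0, 8.5, 7.0])
n_star = 50                               # effective degrees of freedom (assumed)

# North et al. (1982): sampling error delta(lambda) ~ lambda * sqrt(2 / N*).
err = eigvals * np.sqrt(2.0 / n_star)

for i in range(len(eigvals) - 1):
    distinct = eigvals[i] - err[i] > eigvals[i + 1] + err[i + 1]
    print(f"EOF {i+1} vs EOF {i+2}:", "distinct" if distinct else "overlap")
```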

46 Validity of PCA modes: questions to ask. Is the variance explained more than expected under a null hypothesis (red noise, white noise, etc.)? Do we have an a priori reason for expecting this structure? Does it fit with a physical theory? Are the EOFs sensitive to the choice of spatial domain? Are the EOFs sensitive to the choice of sample? If the data set is subdivided (in time), do you still get the same EOFs?

47 Practical considerations. EOFs are easy to calculate, difficult to interpret. There are no hard and fast rules; physical intuition is a must. Due to the constraint of orthogonality, EOFs tend to create wave-like structures, even in data sets of pure noise. So pretty… so suggestive… so meaningless. Beware of this.

48 Practical considerations. EOFs are created using linear methods, so they only capture linear relationships. By nature, EOFs are fixed spatial patterns that vary only in strength and sign; e.g., the 'positive' phase of an EOF looks exactly like the negative phase with the sign flipped. Many phenomena in the climate system don't exhibit this kind of symmetry, so EOFs can't resolve them properly.
