Negative Power Law Noise, Reality vs Myth. Victor S. Reinhardt, Raytheon Space and Airborne Systems, El Segundo, CA, USA. Precise Time and Time Interval Systems and Applications Meeting 2009.


Negative Power Law Noise, Reality vs Myth Victor S. Reinhardt Raytheon Space and Airborne Systems El Segundo, CA, USA Precise Time and Time Interval Systems and Applications Meeting 2009

Page 2 PTTI V. S. Reinhardt

Negative Power Law (Neg-p) Noise, Reality vs Myth

- Not questioning the reality that Δ-variances
  - Like the Allan and Hadamard variances
  - Are convergent measures of neg-p noise
- But will show it is myth that neg-p divergences in other variances, like the standard & N-sample variances
  - Are mathematical defects in these variances
  - That should be "fixed" by replacing them with Δ-variances without further action
- Will show each type of variance is a statistical answer to a different error question
  - And variance divergences are real indicators of physical problems that can't be glossed over by swapping variances

PSD L_p(f) ∝ f^p for p < 0
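The contrast above can be illustrated numerically. The following is a minimal numpy sketch (not from the presentation): it simulates f^-2 noise as a random walk and compares the ordinary sample variance with a two-sample variance built from second differences (the simplest Δ-variance form). The standard variance grows with the record length while the Δ-style measure stays bounded.

```python
import numpy as np

rng = np.random.default_rng(42)

def delta_variance(x, tau=1):
    """Two-sample variance built from 2nd differences of x (a Δ-variance form)."""
    d2 = x[2*tau:] - 2*x[tau:-tau] + x[:-2*tau]
    return 0.5 * np.mean(d2**2)

# f^-2 noise simulated as a random walk (cumulative sum of white noise)
results = {}
for n in (1_000, 100_000):
    x = np.cumsum(rng.normal(size=n))
    results[n] = (np.var(x), delta_variance(x))
    print(n, results[n])  # standard variance grows with n; delta_variance does not
```

The growing standard variance is not a defect of the estimator; it reflects the unbounded low-frequency power of the noise, which is the point the slides develop next.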

Page 3

Negative Power Law Noise, Reality vs Myth

- Will also show it is myth that one can properly separate polynomial deterministic behavior & (non-highpass-filtered) neg-p noise
  - Using least squares & Kalman filters
  - Except under certain conditions
- Will show the reality is that such neg-p noise is infinitely correlated & non-ergodic
  - Ensemble averages ≠ time averages
  - And this makes neg-p noise act more like systematic error than conventional noise in statistical estimation

Page 4

Myth 1: Can "Fix" Variance Divergences Just by Swapping Variances

Simple statistical estimation model: x(t_n) = x_c(t_n) + x_r(t_n), N samples of measured data over a data collection interval T, where x_c(t) = true or deterministic behavior and x_r(t) = noise.

- Generate x_a,M(t), an M-parameter estimate of x_c(t)
  - Using a technique like a least squares or Kalman filter
- Note x(t) here is any data variable
  - Not necessarily the time error
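The estimation model on this slide can be sketched in a few lines of numpy (an illustration, not code from the presentation; the sample count, interval, and noise level are arbitrary). With well-behaved white noise the least-squares estimate x_a,M(t) recovers x_c(t) closely, which is the baseline the later slides contrast against:

```python
import numpy as np

rng = np.random.default_rng(0)

# x(t_n) = x_c(t_n) + x_r(t_n): N samples over a collection interval T
t = np.linspace(0.0, 10.0, 200)            # N = 200, T = 10 (illustrative units)
x_c = 1.5 + 0.3 * t                        # true deterministic behavior (a line)
x_r = rng.normal(scale=0.2, size=t.size)   # white noise -- NOT neg-p noise
x = x_c + x_r

# M = 2 parameter least-squares estimate x_a,M(t) of x_c(t)
coeffs = np.polyfit(t, x, deg=1)
x_a = np.polyval(coeffs, t)

worst = np.max(np.abs(x_a - x_c))
print(worst)  # small: white noise averages away in the fit
```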

Page 5

Basic Error Measures and the Questions They Address

- Accuracy x_w,M(t_n): Error of fit from true behavior?
- Data precision x_j,M(t_n): Data variation from fit?
- Mth order Δ-measures Δ(τ)^M x(t): Stability over τ?
  - Δ(τ)x(t) = x(t+τ) - x(t)
- Can form variances from these error measures
  - Point variance: σ(t)² = E[x(t)²] (Kalman)
    - E[..] = ensemble average; assumes E[x(t)] = 0
  - Average variance: σ² = a weighted average of σ(t)² over T (LSQF)

Page 6

Interpreting Δ-Measures as Mth Order Stability Measures

- Definition of Mth order stability
  - Extrapolation or interpolation error x_j,M(t_m) at an (M+1)th point
  - After removing a perfect M-parameter fit over only M of those points
- Can show when x_a,M(t) = (M-1)th order polynomial & points separated by τ
  - x_j,M(t_m) ≈ Δ(τ)^M x(t_0)
- Thus Δ-variances are statistical measures of such stability
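Both halves of this slide check out numerically. The sketch below (numpy, illustrative signal chosen arbitrarily) shows that the Mth difference annihilates an (M-1)th order polynomial, and that for M = 2 the extrapolation error of a perfect polynomial fit through the first M points equals the Mth difference at t_0:

```python
import numpy as np

def mth_difference(x, M, tau=1):
    """Apply the first-difference operator D(tau)x(t) = x(t+tau) - x(t), M times."""
    for _ in range(M):
        x = x[tau:] - x[:-tau]
    return x

t = np.arange(10.0)

# 1) The Mth difference annihilates an (M-1)th order polynomial (here M = 3)
poly = 2.0 - 3.0*t + 0.5*t**2
print(mth_difference(poly, M=3))           # all zeros

# 2) Extrapolation error of a perfect (M-1)th order polynomial fit through the
#    first M points, evaluated at the (M+1)th point, equals the Mth difference
x = np.sin(0.3 * t)                        # arbitrary signal
M = 2
c = np.polyfit(t[:M], x[:M], M - 1)        # exact fit through the first M points
err = x[M] - np.polyval(c, t[M])           # error at the (M+1)th point
print(err, mth_difference(x, M)[0])        # equal
```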

Page 7

A Derived Error Measure: The Fit Precision (Error Bars)

- Fit precision: a statistical estimate of accuracy based on the measured data precision variance and correction factors from a specific theoretical noise model
  - σ_wj,M(t)² = κ_d(t) σ_j,M(t)²  and  σ_wj,M² = κ_d σ_j,M²
- For uncorrelated noise & an unweighted linear least squares fit (LSQF)
  - κ_d = M/(N-M)
  - Not true for correlated noise
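The uncorrelated-noise correction factor M/(N-M) can be verified by Monte Carlo. Below is a numpy sketch (not from the presentation; N, M, and trial count are arbitrary choices): with white noise and true behavior x_c = 0, the ratio of the mean-square fit error (accuracy) to the mean-square residual (data precision) reproduces M/(N-M):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, trials = 100, 2, 2000
t = np.linspace(0.0, 1.0, N)
A = np.vander(t, M)                 # design matrix for an M = 2 parameter line fit

acc_sum = prec_sum = 0.0
for _ in range(trials):
    noise = rng.normal(size=N)      # uncorrelated (white) noise, true x_c = 0
    beta, *_ = np.linalg.lstsq(A, noise, rcond=None)
    fit = A @ beta
    acc_sum += np.mean(fit**2)              # accuracy: fit error from the truth (0)
    prec_sum += np.mean((noise - fit)**2)   # data precision: residual variation

kappa = acc_sum / prec_sum
print(kappa, M / (N - M))  # agree for uncorrelated noise
```

The same experiment run with correlated noise breaks this ratio, which is the slide's caveat.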

Page 8

The Neg-p Convergence Properties of Variances for Polynomial x_a,M(t)

- For Δ-variances, the variance kernel K(f) ∝ f^2M for |f|T << 1
- Also true for Mth order data precision
- Both converge for 2M ≥ -p
- But for accuracy, K(f) is a lowpass filter
  - Accuracy won't converge unless the system response |H_s(f)|² provides sufficient HP filtering
- Thus the temptation to "fix" a neg-p variance divergence by swapping variances

(Figure: variance kernel K(f) for data precision vs log(fT), in dB: ∝ f^2 for M=1, f^4 for M=2, f^6 for M=3, f^8 for M=4, f^10 for M=5)

Page 9

This Is No "Fix" Because Each Variance Addresses a Different Error Question

- Arbitrarily swapping variances just misleadingly changes the question
- Does not remove the divergence problem for the original question
  - Accuracy σ_w,M(t): Error of fit from true behavior?
  - Data precision σ_j,M(t): Data variation from fit?
  - Fit precision 2σ_wj,M(t): Estimate of accuracy from data/model?
  - Stability σ_Δ,M(t): Extrapolation/interpolation error over τ?

Page 10

Then What Does a Neg-p Variance Divergence Mean?

(Figure: f^-2 ensemble members x_j,1 over interval T, noise starting at t_0 before the data, with 2σ_w,1 and 2σ_j,1 bands shown)

- Accuracy variance → ∞ as t_0 → ∞ while precision remains misleadingly finite
- Thus an accuracy variance infinity is a real indicator of a severe modeling problem
  - To truly fix it, must modify the system design or the error question being asked
- Note for f^-3 noise σ_j,1 → ∞ but σ_j,2 remains finite
- 1/f noise is a marginal case: σ_w,M² ∝ ln(f_h t_0)
  - 1/f contribution can be small even for t_0 = age of universe
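The divergence/finite split on this slide can be reproduced in simulation. The numpy sketch below (illustrative, not from the presentation; window length, trial count, and noise age are arbitrary) fits a line to f^-2 noise whose start precedes the data window by `pre` steps, standing in for the age t_0. The fit-vs-truth error (accuracy) grows with the noise age while the residuals (precision) do not:

```python
import numpy as np

rng = np.random.default_rng(3)
N, trials = 200, 500
t = np.arange(N)

def fit_stats(pre):
    """Accuracy & precision of a line fit to f^-2 noise started `pre` steps
    before the N-sample data window (true x_c = 0, so any fit output is error)."""
    acc = prec = 0.0
    for _ in range(trials):
        w = np.cumsum(rng.normal(size=pre + N))[pre:]   # noise inside the window
        fit = np.polyval(np.polyfit(t, w, 1), t)
        acc += np.mean(fit**2) / trials                 # accuracy vs true x_c = 0
        prec += np.mean((w - fit)**2) / trials          # residual (data) precision
    return acc, prec

a_young, p_young = fit_stats(pre=0)
a_old, p_old = fit_stats(pre=20_000)
print(a_young, a_old)   # accuracy variance grows with the age of the noise
print(p_young, p_old)   # precision stays about the same -- misleadingly finite
```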

Page 11

Myth 2: Can Use Least Squares & Kalman Filters to Properly Separate True Polynomial Behavior from Data Containing (Non-HP Filtered) Neg-p Noise

- Will show that such noise acts like systematic error in statistical estimation
  - Which generates anomalous fit results
  - Except under certain conditions
- Will show this occurs
  - Because (non-HP filtered) neg-p noise is infinitely correlated & non-ergodic

Page 12

NS & WSS Pictures of a Random Process x_p(t)

- NS picture: x_p(t) = 0 for t < 0, so x_p(t) starts at a finite time
  - Can have R_p(t_g, τ) → ∞ as t_g → ∞
  - The Wigner-Ville function (a Fourier transform of R_p over τ) is a t_g-dependent PSD
- WSS picture: x_p(t) ≠ 0 for all t, so x_p(t) is active for all t
  - For the WSS picture to apply, x_p(t) must have a bounded steady-state PSD (two important NS ↔ WSS theorems)
- Notation: τ = time difference, t_g = time from x_p start, E[..] = ensemble average

Page 13

The Properties of (Non-Highpass Filtered) Neg-p Noise

- R_p(t_g, τ) is finite for finite t_g
  - But R_p(t_g, τ) → ∞ as t_g → ∞ for all τ (unbounded as t_g → ∞)
- The WSS R_p(τ) is infinite for all τ
  - Can define L_p(f) without R_p(τ)
- Its correlation time τ_c is infinite
- It is inherently non-ergodic
  - Theorem (Parzen): An NS random process is ergodic (E[..] → ⟨..⟩_T as T → ∞) if and only if R_p(t_g, τ) < ∞ as t_g → ∞ and τ_c < ∞
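The growth of R_p(t_g, τ) with t_g is easy to see by ensemble simulation. A numpy sketch (illustrative parameters, not from the presentation) estimates the NS autocorrelation of f^-2 noise started at t = 0 across many independent members:

```python
import numpy as np

rng = np.random.default_rng(5)
members, tau = 2000, 10

# f^-2 noise started at t = 0: an ensemble of random walks
walk = np.cumsum(rng.normal(size=(members, 1000)), axis=1)

# NS autocorrelation R_p(t_g, tau) estimated across the ensemble
R = {t_g: np.mean(walk[:, t_g] * walk[:, t_g + tau]) for t_g in (50, 500)}
print(R)  # finite at any t_g, but grows roughly like t_g -- unbounded as t_g -> inf
```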

Page 14

Ergodicity and Practical Fitting Behavior

- Fitting theory is based on the ensemble average E[..]
- Practical fits rely on the time average ⟨..⟩_T
- So noise must be ergodic-like over T (⟨..⟩_T ≈ E[..]) for a practical fit to work as theoretically predicted
- Even if noise is strictly ergodic (⟨..⟩_T → E[..] as T → ∞)
  - May not have ⟨..⟩_T ≈ E[..] for the T in question
- Most theory assumes ⟨..⟩_T = E[..] for any T as N → ∞
  - Called local ergodicity for T → 0
- For correlated noise, T >> τ_c is also required for ergodic-like behavior (intermediate ergodicity)
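The ensemble-vs-time-average distinction can be made concrete with a numpy sketch (illustrative, not from the presentation). For white (f^0) noise every member's time average agrees with the ensemble mean, but for f^-2 noise each member's time average is wildly different, so ⟨..⟩_T never stands in for E[..]:

```python
import numpy as np

rng = np.random.default_rng(7)
members, T = 8, 50_000

# White (f^0) noise: every member's time average agrees with the ensemble mean 0
white = rng.normal(size=(members, T))
print(white.mean(axis=1))   # all close to 0 -- ergodic-like

# f^-2 noise: each member's time average is different -- non-ergodic
walk = np.cumsum(rng.normal(size=(members, T)), axis=1)
print(walk.mean(axis=1))    # scattered over O(sqrt(T))
```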

Page 15

Practical Examples of Neg-p Non-Ergodicity Fitting Effects

- A single neg-p ensemble member has polynomial-like behavior over T
  - x_p(t) is systematic with polynomial-like x_c(t)
  - Even an augmented Kalman filter can only separate the non-systematic parts of x_p(t) & x_c(t)
- Fitting methodologies can only separate linearly independent variables
- Cannot truly separate (non-HP filtered) neg-p noise and poly-like deterministic behavior
  - Noise whitening cannot be used in such cases

(Figure: LSQF and Kalman filter with f^0 noise; standard Kalman with f^-2 noise; augmented Kalman with f^-2 and f^-3 noise; curves show x_a,M ± σ_wj,M, x, and x_c)

Page 16

Non-Ergodic-Like Behavior in Correlated WSS Processes

- T/τ_c = N_i ≈ number of independent samples
  - Must have N_i >> 1 for a fit to be meaningful
- For non-HP filtered neg-p noise, τ_c = ∞
  - No T will produce ergodic-like fit behavior

(Figure: Kalman filter output vs true behavior, measured data, and fit precision for T/τ_c = 2, 20, 200, and 2000)
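The N_i ≈ T/τ_c idea can be demonstrated with a correlated WSS process whose τ_c is known. Below is a numpy sketch (illustrative, not from the presentation) using AR(1) noise with correlation time τ_c samples: the variance of the sample mean over T implies an effective number of independent samples that collapses as τ_c grows toward T:

```python
import numpy as np

rng = np.random.default_rng(11)
T, members = 10_000, 400

def ar1_ensemble(tau_c):
    """WSS AR(1) noise with correlation time tau_c (in samples), unit variance."""
    phi = np.exp(-1.0 / tau_c)
    x = np.empty((members, T))
    x[:, 0] = rng.normal(size=members)
    w = rng.normal(scale=np.sqrt(1.0 - phi**2), size=(members, T))
    for k in range(1, T):
        x[:, k] = phi * x[:, k - 1] + w[:, k]
    return x

n_eff = {}
for tau_c in (1, 100):
    means = ar1_ensemble(tau_c).mean(axis=1)
    n_eff[tau_c] = 1.0 / np.var(means)   # unit-variance noise: N_i ~ 1/var(mean)
print(n_eff)  # effective sample count shrinks sharply as tau_c grows
```

With τ_c infinite, as for non-HP filtered neg-p noise, no T recovers N_i >> 1, which is the slide's point.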

Page 17

Conclusions

- Arbitrarily swapping variances does not really fix a neg-p divergence problem
  - Such divergences indicate real problems that must be physically addressed
- Non-HP filtered neg-p noise acts more like systematic error than conventional noise
  - When fitting to (full order) polynomials
  - The surest way to reduce such problems is to develop frequency standards with lower neg-p noise
  - Good news for frequency standards developers
- See http:// for preprints & other related material