How Extracting Information from Data Highpass Filters its Additive Noise
Victor S. Reinhardt, Raytheon Space and Airborne Systems, El Segundo, CA, USA
PTTI 2007: Thirty-Ninth Annual Precise Time and Time Interval (PTTI) Systems and Applications Meeting, November 2007, Long Beach, California

Page 2: Various Measures of Random Error Are Used Across the EE Community
- Mean square (observable) residual error: measured after removing a model-function estimate of the true causal behavior of the data.
- Jitter (and wander): residual errors plus HP & LP filtering.
- Difference (Δ) variances: Allan = mean square of Δ(τ)y(t); Hadamard or Picinbono = mean square of Δ(τ)²y(t).
[Figures: jitter occupies the HP-filtered band above f_c and wander the LP-filtered band below f_c; Δ(τ)v(t_n) = v(t_n + τ) − v(t_n); residual error = data minus causal estimate; v(t) stands for x(t), y(t), or φ(t).]

Page 3: Some Common Wisdoms About These Error Measures
- Residual error diverges in the presence of negative power law (neg-p) noise.
  - Power-law noise: PSD L_v(f) ∝ f^p; neg-p means p < 0 (p = -1, -2, -3, -4); white noise is p = 0.
  - Jitter and Δ-variances are used to correct this problem.
- A Δ-variance doesn't measure the same thing as the MS residual error ("need a 'real' variance, not the Allan variance").
- Will show these common wisdoms are not true, and that all three essentially measure the same thing for polynomial models of causal behavior: Δ-variances ↔ residual error ↔ jitter.

Page 4: Will Demonstrate This by Showing
- The estimation of causal behavior from data HP filters the noise in the residual error: the residual error in general converges for neg-p noise, and is guaranteed to converge if one is free to choose the model for the causal behavior.
- Jitter can be defined simply as the observable residual error.
- Δ-variances are measures of residual error, for any number of samples, when the causal model is a polynomial.
- Δ-variances ↔ residual error ↔ jitter.

Page 5: Residual Error
- Consider N samples of data over an observation interval T.
- The data v(t_n) are modeled as a true causal function v_c(t_n) plus true noise v_p(t_n) with L_v(f) ∝ f^p.
- v_p(t_n) is also the true residual error, but it is not measurable from the data over T alone.
[Figure: N data samples v(t_n) over T = N·τ_o, the true causal function v_c(t), and the true noise v_p(t_n), which is also the true residual error.]
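For reference in the sketches that follow, this is the data model of the slide written out explicitly (it simply restates the slide in equation form; τ_o denotes the sample spacing):

```latex
v(t_n) = v_c(t_n) + v_p(t_n), \qquad t_n = n\,\tau_o,\quad n = 0,1,\dots,N-1,\quad T = N\,\tau_o,
\qquad L_v(f) \propto f^{\,p}\quad (p < 0:\ \text{neg-}p;\ \ p = 0:\ \text{white noise}).
```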

Page 6: Residual Error (continued)
- Can estimate the residual error by fitting a model function v_w,M(t,A) to the data, where A = (a_o, a_1, …, a_M-1) is a set of M adjustable parameters; A is the information extracted from the data.
- Class of v_w,M(t,A) considered here: (M-1)th-order polynomials.
- Will use a least-squares fit (LSQF) to estimate v_c(t_n); the LSQF is equivalent to many other estimation methods.
[Figure: the model-function estimate v_w,M(t,A) fitted to the N data samples v(t_n) over T = N·τ_o, shown with v_c(t).]

Page 7: Residual Error (continued)
- Observable residual error = data minus estimated function: v_j(t_n) = v(t_n) − v_w,M(t_n,A). Jitter ≡ v_j(t_n) with no ad hoc filtering.
- True function error = estimated minus true function: v_w(t_n) = v_w,M(t_n,A) − v_c(t_n). v_w(t_n) ≡ wander (not observable from the data over T alone).
- σ_v-j² = mean square of v_j(t_n); σ_v-w² = mean square of v_w(t_n).
[Figure: the observable residual error v_j(t_n), the true function error v_w(t), the model-function estimate v_w,M(t,A), and v_c(t) for the N data samples over T = N·τ_o.]
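A minimal numerical sketch of Pages 6-7, assuming a quadratic causal function and white noise purely for illustration (the function name and parameter values are mine, not from the paper): it fits an (M-1)th-order polynomial by least squares, then forms the observable residual error (jitter) and the true function error (wander).

```python
import numpy as np

def lsqf_decompose(t, v, v_true_causal, M):
    """Fit an (M-1)th-order polynomial v_w,M(t,A) to the data v(t_n) by least squares,
    then return the observable residual error v_j and the true function error v_w."""
    A = np.polyfit(t, v, M - 1)          # the M extracted parameters
    v_wM = np.polyval(A, t)              # model-function estimate of the causal behavior
    v_j = v - v_wM                       # observable residual error (jitter)
    v_w = v_wM - v_true_causal           # true function error (wander); needs the true v_c
    return v_j, v_w

rng = np.random.default_rng(0)
N, T = 1000, 1.0
t = np.linspace(0.0, T, N)
v_c = 1.0 + 2.0*t - 3.0*t**2             # illustrative causal behavior (assumed)
v_p = 0.1*rng.standard_normal(N)         # white additive noise (p = 0)
v = v_c + v_p

v_j, v_w = lsqf_decompose(t, v, v_c, M=3)
print("sigma_v-j^2 =", np.mean(v_j**2))  # mean-square observable residual error
print("sigma_v-w^2 =", np.mean(v_w**2))  # mean-square true function error
```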

Page 8: Why Residual Error Is an Important Measure
- It directly relates lower-level error to primary performance measures in many systems: SNR, BER, MNR, NPR, ENOB.
- Must use the observable residual error for verification with data over T; for the true errors (over T) one must measure over intervals >> T when neg-p noise is present (an exception to be discussed).
[Figure: same picture as the previous slide: v_j(t_n), v_w(t), v_w,M(t,A), and v_c(t) for the N data samples over T = N·τ_o.]

Page 9: HP Filtering of Noise Due to Information Extraction
- Proved in the paper; explained graphically here.
- For white noise the LSQF behaves in the classical manner: σ_v-w → 0 as N → ∞ (note that T is held fixed as N varies).
- But for neg-p noise, σ_v-w does not → 0 as N → ∞, because the fitted v_w,M(t,A) tracks the highly correlated low-frequency noise components in the data.
[Figure (long term error-1.xls): LSQF over T for various p: f^0, f^-2, and f^-4 noise, showing v(t_n), v_c, v_a,M, v_w, and v_j.]

Page 10: HP Filtering of Noise Due to Information Extraction (continued)
- This happens because the LSQF cannot separate highly correlated LF noise, at Fourier frequencies f ≤ 1/T, from the causal behavior.
- It is true for all noise (implicit in LSQF theory), but only apparent for neg-p noise because most of the power lies at f ≤ 1/T.
- This tracking causes HP filtering of the noise in v_j and σ_v-j.
[Figure (long term error-1.xls): LSQF over T for various p: f^0, f^-2, and f^-4 noise, showing v(t_n), v_c, v_a,M, v_w, and v_j.]
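A small simulation sketch of the claim on Pages 9-10 (my own construction, not the paper's): hold T fixed, increase N, fit a 1st-order polynomial (M = 2) to noise alone (v_c = 0), and compare the wander RMS for white (f^0) noise against random-walk (f^-2) noise. For white noise the wander shrinks with N; for f^-2 noise it does not, because the fit tracks the correlated low-frequency components.

```python
import numpy as np

rng = np.random.default_rng(1)
T, M = 1.0, 2                      # fixed observation interval, 1st-order polynomial model

def wander_rms(N, kind, trials=200):
    """RMS of v_w = v_w,M - v_c with v_c = 0, averaged over noise realizations."""
    t = np.linspace(0.0, T, N)
    out = []
    for _ in range(trials):
        if kind == "white":                       # f^0 (white) noise, unit variance
            v_p = rng.standard_normal(N)
        else:                                     # f^-2 noise: random walk, step variance T/N
            v_p = np.cumsum(np.sqrt(T / N) * rng.standard_normal(N))
        A = np.polyfit(t, v_p, M - 1)             # LSQF of the noise alone (v_c = 0)
        v_w = np.polyval(A, t)                    # true function error (wander)
        out.append(np.sqrt(np.mean(v_w**2)))
    return np.mean(out)

for N in (10, 100, 1000):
    print(N, wander_rms(N, "white"), wander_rms(N, "rw"))
# Expected trend: the white-noise column falls roughly as 1/sqrt(N),
# while the random-walk column stays roughly constant as N grows (T fixed).
```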

Page 11: HP Filtering of Noise in the Spectral Integral Representation of σ_v-j²
- H_s(f) = system response function (described in Reinhardt, FCS 2006); |H_s(f)|² is used in place of an upper cut-off frequency f_h.
- Can show that H_s(f) often HP filters L_v(f) (as well as LP filtering it), which helps σ_v-j² converge.
- Example of H_s(f): a delay-and-mix operation forming φ(t) − φ(t − τ_d) gives |H_s(f)|² ∝ f² at low f.
[Figure: block diagram of the effect of delay & mix on the system.]
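A sketch of the spectral-integral form this and the next slide refer to, written in the obvious way from the quantities named here (the exact normalization and limits are given in the paper and in Reinhardt, FCS 2006, not reproduced here), together with the standard delay-and-mix response used as the example:

```latex
% Spectral-integral representation (schematic form)
\sigma_{v\text{-}j}^{2} \;\approx\; \int_{0}^{\infty} K_{v\text{-}j}(f)\,\lvert H_s(f)\rvert^{2}\,L_v(f)\,df
% Delay-and-mix example: the system forms phi(t) - phi(t - tau_d)
\lvert H_s(f)\rvert^{2} \;=\; \bigl\lvert 1 - e^{-i 2\pi f \tau_d} \bigr\rvert^{2}
 \;=\; 4\sin^{2}(\pi f \tau_d) \;\propto\; f^{2} \quad (f \ll 1/\tau_d)
```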

Page 12: HP Filtering of Noise in the Spectral Integral Representation of σ_v-j² (continued)
- The spectral kernel K_v-j(f) also HP filters the noise.
- The paper proves K_v-j(f) ∝ f^2M at low f (f << 1/T) when v_a,M(t,A) is an (M-1)th-order polynomial, and that K_v-j(f) is at least ∝ f² at low f for any v_a,M(t,A) with a DC component.
- Thus the convergence of σ_v-j² depends on the complexity of the model function v_a,M(t,A); convergence is guaranteed if one is free to choose the model function (see the short derivation below).
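To spell out the convergence reasoning this bullet relies on (a short derivation from the two scalings just quoted, not an additional result from the paper, and taking |H_s(f)|² finite at low f; any extra HP filtering from H_s only helps): with L_v(f) ∝ f^p and K_v-j(f) ∝ f^2M at low f, the low-frequency end of the spectral integral behaves as

```latex
\int_{0} K_{v\text{-}j}(f)\,\lvert H_s(f)\rvert^{2}\,L_v(f)\,df \;\sim\; \int_{0} f^{\,2M+p}\,df ,
```

which is finite at f → 0 whenever 2M + p > -1. Raising the polynomial order M (a more complex model) can therefore overcome any finite negative p, which is the sense in which convergence is guaranteed when one is free to choose the model.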

Page 13: HP Filtering of Noise in the Spectral Integral Representation of σ_v-j² (continued)
- σ_v-c² = contribution due to model error. Model error occurs when the model function cannot follow the variations in v_c(t) over T.
- If model error is present, σ_v-j² and σ_v-w² will increase.
- Thus, if the model function cannot track the causal behavior over T, the causal behavior will contaminate σ_v-j² (and σ_v-w²).

Page 14: Simulation Verifying K_v-j(f) ∝ f^2M (f << 1/T) for Polynomial v_a,M(t,A)
[Figure: simulation results (N = 1000, unweighted LSQF) plotting K_v-j(f) in dB against Log10(fT) for polynomial models such as a_o and a_o + a_1·t; the curves follow f², f⁴, f⁶, f⁸, f¹⁰ below the HP knee at f_T ≈ 1/T, with simulated and predicted slopes agreeing to within ΔM < 1E-3.]
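A sketch of how such a kernel can be computed numerically (my own construction of the standard projection argument, not the paper's code): for each Fourier frequency f, pass a sinusoid through the "fit an (M-1)th-order polynomial and keep the residual" operation and record the surviving mean-square power relative to the input, averaged over phase. Well below f ≈ 1/T the result should fall as f^2M.

```python
import numpy as np

def kernel_Kvj(f, N=1000, T=1.0, M=2):
    """Phase-averaged ratio of residual to input mean-square power for a sinusoid at
    frequency f, after an unweighted LSQF of an (M-1)th-order polynomial over T."""
    t = np.linspace(0.0, T, N)
    X = np.vander(t, M, increasing=True)          # polynomial design matrix (N x M)
    P = X @ np.linalg.pinv(X)                     # projection onto the polynomial space
    R = np.eye(N) - P                             # residual (jitter) operator
    c, s = np.cos(2*np.pi*f*t), np.sin(2*np.pi*f*t)
    out_ms = (np.mean((R @ c)**2) + np.mean((R @ s)**2)) / 2.0   # phase-averaged output MS
    in_ms = (np.mean(c**2) + np.mean(s**2)) / 2.0                # input MS (about 1/2)
    return out_ms / in_ms

for fT in (0.01, 0.02, 0.04):
    print(fT, kernel_Kvj(fT / 1.0, M=2))          # T = 1, so f = fT
# Doubling f should raise the value by roughly 2^(2M) = 16 for M = 2,
# i.e. the f^4 low-frequency slope of the M = 2 curve on this slide.
```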

Page 15: Jitter and Wander
- Defined by the ITU, IEEE BTS, and SMPTE as the filtered residual x-error after removing the causal (time &) frequency offset and frequency drift: jitter is HP filtered (> f_c), wander is LP filtered (< f_c).
- Problem: relating f_c to user-system parameters.
  - ITU: f_c = 10 Hz. This standardizes hardware producers but is not related to parameters in user systems.
  - IEEE BTS, SMPTE: f_c = the PLL bandwidth in the system. But what about systems without PLLs?
[Figure: jitter as the HP-filtered band above f_c and wander as the LP-filtered band below f_c.]

Page 16: HP Filtering of Residual Error Resolves the f_c Relationship Problem
- Jitter ≡ observable residual error: v_j(t_n) = v(t_n) − v_w,M(t_n,A).
- Wander ≡ true causal function error: v_w(t_n) = v_w,M(t_n,A) − v_c(t_n).
- These have the property v_j(t_n) + v_w(t_n) = v_p(t_n).
- The definitions apply to any type of causal function removal and to any variable; the HP & LP properties are generated by the system itself.
[Figure: v(t_n), v_w,M(t_n,A), v_j(t_n), v_c(t), v_w(t), and v_p(t).]

Page 17: HP Filtering of Residual Error Resolves the f_c Relationship Problem (continued)
- The LSQF HP filters v_j and LP filters v_w, with a knee at f_T ≈ 1/T.
- H_s(f) filters both the same way: f_l sets the HP corner and f_h the LP corner.
- As T → ∞ (f_T << f_l), the wander shrinks to zero if H_s(f) alone can overcome the pole in L_v(f); so the wander can also converge for neg-p noise (the exception noted on Page 8).
- A jitter variance with brickwall f_c and f_h is just a bandpass approximation of σ_v-j².
[Figure: kernels K(f) for wander and jitter versus fT, and |H_s(f)|² versus f with corners at f_l and f_h.]

Page 18: Δ-Variances as a Measure of σ_v-j² When v_a,M(t,A) Is a Polynomial
- σ_v,M(τ)² = (normalization constant) × the mean square, over the N samples in T, of Δ(τ)^M v(t_n), where Δ(τ)v(t) = v(t+τ) − v(t), Δ(τ)²v(t) = v(t+2τ) − 2v(t+τ) + v(t), and so on.
- The normalization constant (1/2 for M = 1, the Allan case; 1/6 for M = 2, the Hadamard-Picinbono case) is such that all σ_v,M(τ)² are equal for white noise (a numerical sketch follows).
[Figure: first and second differences Δ(τ)v(t_n), Δ(τ)v(t_n+τ), Δ(τ)²v(t_n) built from the samples; the mean square is taken over the N samples in T.]
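A sketch of the Δ-variance computation as defined on this slide, assuming (as the white-noise equality suggests) that the normalization is 1/C(2M, M), i.e. 1/2 for M = 1 and 1/6 for M = 2; the helper name is mine:

```python
import numpy as np
from math import comb

def delta_variance(v, M=1, m=1):
    """sigma_{v,M}(tau)^2 with tau = m * tau_o: normalized mean square of the
    M-th finite difference of the samples v(t_n).  The 1/C(2M, M) normalization
    (1/2 -> Allan-type, 1/6 -> Hadamard-type) makes all orders agree for white noise."""
    d = v.copy()
    for _ in range(M):
        d = d[m:] - d[:-m]                    # apply Delta(tau) once per order
    return np.mean(d**2) / comb(2*M, M)

rng = np.random.default_rng(2)
v = rng.standard_normal(100_000)              # white noise, unit variance
print(delta_variance(v, M=1), delta_variance(v, M=2), delta_variance(v, M=3))
# All three should be close to 1.0, the white-noise variance, as the slide states.
```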

Page 19: Δ-Variances as a Measure of σ_v-j² When v_a,M(t,A) Is a Polynomial (continued)
- The paper shows that σ_v-j²(N = M+1) = σ_v,M(τ)² when τ = T/M, for the "unbiased" mean square ("unbiased" = divide the sum of squares by N − M).
- This is well known for the Allan (2-sample) variance: it is the 2-sample MS residual of y(t) with a 0th-order polynomial (frequency offset) removed.
- The Hadamard-Picinbono variance is the 3-sample residual variance of y(t) for M = 2, i.e. with a 1st-order polynomial (frequency offset & drift) removed.
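A quick numerical check of the N = M + 1 identity (again with the 1/C(2M, M) normalization assumed above; the data values are arbitrary): fit an (M-1)th-order polynomial to M + 1 equally spaced samples, form the unbiased mean-square residual (divide by N − M = 1), and compare it with the M-th difference variance at τ equal to the sample spacing (the τ = T/M of the slide).

```python
import numpy as np
from math import comb

def unbiased_ms_residual(t, v, M):
    """Sum of squared LSQF residuals of an (M-1)th-order polynomial, divided by N - M."""
    A = np.polyfit(t, v, M - 1)
    r = v - np.polyval(A, t)
    return np.sum(r**2) / (len(v) - M)

rng = np.random.default_rng(3)
for M in (1, 2, 3):
    N = M + 1
    t = np.arange(N, dtype=float)             # unit sample spacing; tau = sample spacing
    v = rng.standard_normal(N)                # arbitrary sample values
    d = v.copy()
    for _ in range(M):
        d = d[1:] - d[:-1]                    # M-th difference at tau = 1
    delta_var = np.mean(d**2) / comb(2*M, M)
    print(M, unbiased_ms_residual(t, v, M), delta_var)   # the two columns should match
```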

Page 20: Can Extend the Equivalence to Any N
- Can show that the biased RMS{v_j} doesn't vary much with N ("biased" = divide the sum of squares by N; note that T is held fixed as N is varied).
- Thus for any N the "unbiased" σ_v-j² can be written as σ_v-j²(N) ≈ σ_v,M(T/M)², the biased and unbiased mean squares differing only by the factor (N − M)/N.
- This is approximately true for any p; exact relationships can be generated for each p, like the Allan-Barnes bias functions.
[Figure (Errors vs N, T fixed, M = 2): RMS{v_j} and σ_v-w versus the number of samples N, from N = M upward, for f^0, f^-2, and f^-4 noise.]
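A Monte Carlo sketch of this extension for the simplest case only, white noise with M = 1 (my own check, not the paper's simulation): with T fixed, the unbiased mean-square residual stays close to the first-difference (Allan-type) variance at τ = T as N grows.

```python
import numpy as np

rng = np.random.default_rng(4)
M, trials = 1, 500

for N in (2, 10, 100, 1000):
    ms_res, dvar = [], []
    for _ in range(trials):
        v = rng.standard_normal(N)                 # white noise over the fixed interval T
        r = v - v.mean()                           # residual of 0th-order polynomial (M = 1)
        ms_res.append(np.sum(r**2) / (N - M))      # "unbiased" mean-square residual
        m = N - 1                                  # lag for tau = T/M = T when M = 1
        d = v[m:] - v[:-m]                         # single first difference spanning T
        dvar.append(np.mean(d**2) / 2.0)           # Allan-type first-difference variance
    print(N, np.mean(ms_res), np.mean(dvar))       # both columns should stay near 1.0
```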

Page 21: Consequences of σ_v,M(τ) as an Approximate Measure of σ_v-j
- Justifies using σ_v,M(T/M) in residual-error problems when v_w,M(t,A) is an (M-1)th-order polynomial.
- Don't have to perform an LSQF on the data if σ_v,M(τ) is used to estimate σ_v-j, because Δ(τ)^M of an (M-1)th-order polynomial = 0; this is the well-known insensitivity of σ_v,M(τ) to (M-1)th-order polynomial causal behavior.
- This holds even if there is model error (the effect on σ_v,M(τ) and σ_v-j is the same).

Page 22: Consequences of σ_v,M(τ) as an Approximate Measure of σ_v-j (continued)
- Provides guidance in determining the order of σ_v,M(τ) to use as a stability measure when v_w,M(t,A) is a polynomial aging function: the aging is strictly fitted over T = Mτ, and for τ decoupled from T/M, σ_v,M(τ) ↔ σ_v-j remains approximately true.
- Causal terms not modeled in v_a,M(t,A) become part of the instability measured by σ_v,M(τ). This explains the sensitivity of the Allan variance to frequency drift (only the frequency offset is modeled) and the insensitivity of the Hadamard variance to such drift (both frequency offset and drift are modeled).

Page 23: x & y Difference Variances Interpreted as Aging-Removed σ_v-j²

M | Δ-Var of y           | Aging excluded   | Application                              | Δ-Var of x            | Aging excluded        | Application
0 | MS{y}                | None             | Synthesizers & rel. time dist. equip.    | MS{x}                 | None                  | Abs. time dist. equip.
1 | Allan y              | y offset         | Oscillators (y drift in instability)     | TIErms²/2 = MS{TIE}/2 | x offset              | Synth. & rel. time dist.
2 | Hadamard-Picinbono y | y offset & drift | Oscillators (y drift not in instability) | Allan x (= jitter²)   | x & y offset          | Osc. (y drift in instab.)
3 |                      |                  |                                          | Hadamard-Picinbono x  | x & y offset, y drift | Osc. (y drift not in instab.)

Note: σ_x,3²(τ) is equivalent to the MS of the x-jitter with time & frequency offsets & frequency drift removed.

Page 24: x & y Difference Variances Interpreted as Aging-Removed σ_v-j² (table as on Page 23)
- Explains why low-order σ_v,M(τ) is used for synthesizers & distribution equipment: uncontrolled (but fixed) x & y offsets are part of the "random" error for that equipment, whereas they are not considered part of the "random" error for oscillators.

Page 25: x & y Difference Variances Interpreted as Aging-Removed σ_v-j² (table as on Page 23)
- The Hadamard-Picinbono variance, or σ_x,3²(τ), is equivalent to the MS of the x-jitter with time & frequency offsets & frequency drift removed.

Page 26: What to Do When σ_v-j Does Diverge
- σ_v-j or σ_v,M(τ) can diverge for neg-p noise when v_w,M(t,A) is fixed by a system spec or by the physical problem being addressed. This has been considered a mathematical nuisance to be heuristically patched.
- Such a divergence indicates a real problem in the system design, specification, or analysis: v_w,M(t,A) is fixed in the spec for a reason and is not free to be changed without changing the problem.
- Thus a divergence is a diagnostic indication of a system problem to be fixed.

Page 27: Divergence Example: 1st-Order PLL in the Presence of f^-3 Noise
- It is well known that a 1st-order PLL will cycle slip when f^-3 noise is present; this is indicated by a divergence in σ_φ-j for M = 0.
- Changing to an M > 0 σ_φ-j eliminates the divergence, but this will not stop the cycle slipping.
[Figure: phase-locked loop block diagram: reference oscillator φ_ref, phase detector forming φ_det = φ_out − φ_ref, loop filter, VCO output φ_out; cycle slips appear in the detected phase.]
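A sketch of why the M = 0 measure diverges here, using the textbook first-order-PLL error response rather than anything taken from the paper (f_L denotes the loop bandwidth): the closed-loop phase-error response and the resulting low-frequency behavior of the M = 0 integral are

```latex
\lvert H_e(f)\rvert^{2} = \frac{f^{2}}{f^{2}+f_L^{2}},
\qquad
\sigma_{\phi\text{-}j}^{2}\big|_{M=0} \;\sim\; \int_{0} \lvert H_e(f)\rvert^{2}\,L_\phi(f)\,df
\;\sim\; \int_{0} \frac{f^{2}}{f_L^{2}}\, f^{-3}\,df \;=\; \int_{0} \frac{df}{f_L^{2}\,f},
```

which diverges logarithmically at low f for L_φ(f) ∝ f^-3, so the phase error grows without bound and the loop eventually slips cycles. Raising M only hides this in the statistic, not in the hardware, which is the point of the next slide.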

Page 28: Can Only Be Fixed by Changing the System Design or Spec
- Fix the design → eliminate the slips, by changing to a 2nd-order PLL.
- Fix the spec → tolerate cycle slips, but then the σ_φ-j spec must be changed to exclude cycle-slipped data (which effectively changes K_v-j(f)); a mean time to cycle slip should also be specified.
- Non-essential divergences: the system is OK but the wrong H_s(f) or K_v-j(f) was used due to faulty analysis. Example: failure to recognize the HP filtering of the residual error.

Page 29: Final Summary & Conclusions
- Jitter, residual error, and Δ-variances can be viewed as equivalent measures when a polynomial is used for the causal model.
- Residual error is guaranteed to converge if one is free to choose the model for the causal function, because the order of HP filtering increases with the complexity of the model function used to estimate the causal behavior.
- Residual error often converges even when the model function is fixed by a spec or by the problem; when it doesn't, that is an indication of a real problem in the design, spec, or analysis of the system in question.

Page 30: Final Summary & Conclusions (continued)
- Jitter should be defined as the residual error without ad hoc HP filtering, because the causal extraction itself provides the HP filtering.
- The true causal error (wander) is only accessible by determining the true noise in the system.
- The paper is a generalization & consolidation of previous work by many authors: Allan, Barnes, Gagnepain, Vernotte, Greenhall, Riley, Howe, & many others.
- Preprints: