1 EE 616 Computer Aided Analysis of Electronic Networks Lecture 9 Instructor: Dr. J. A. Starzyk, Professor School of EECS Ohio University Athens, OH, 45701.


2 Outline
Sensitivities
-- Network function sensitivity
-- Zero and pole sensitivity
-- Q and ω0 sensitivity
Multiparameter Sensitivity
Sensitivities to Parasitics and Operational Amplifiers

3 Sensitivities
The normalized sensitivity of a function F with respect to a parameter h is defined as
$S_h^F = \frac{h}{F}\,\frac{\partial F}{\partial h} = \frac{\partial(\ln F)}{\partial(\ln h)}$.
Two semi-normalized sensitivities, $h\,\frac{\partial F}{\partial h}$ and $\frac{1}{F}\,\frac{\partial F}{\partial h}$, are used when either F or h is zero.
F can be a network function, its pole or zero, Q, ω0, etc., while h can be a component value, the complex frequency s, operating temperature, humidity, etc.
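The definition can be checked numerically; the following minimal Python sketch (function and variable names are illustrative, not from the course materials) estimates $S_h^F$ with a central difference.

```python
import numpy as np

def normalized_sensitivity(F, h0, rel_step=1e-6):
    """Central-difference estimate of S_h^F = (h/F) * dF/dh at h = h0
    (illustrative helper, not course code)."""
    dh = rel_step * h0
    dF_dh = (F(h0 + dh) - F(h0 - dh)) / (2.0 * dh)
    return h0 / F(h0) * dF_dh

# Example: resonant frequency w0 = 1/sqrt(L*C) of an LC tank, sensitivity w.r.t. L
C = 1e-6
w0 = lambda L: 1.0 / np.sqrt(L * C)
print(normalized_sensitivity(w0, 1e-3))   # approximately -0.5
```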

4 SENSITIVITIES - Example
Resonant circuit: the resonant frequency is $\omega_0 = 1/\sqrt{LC}$, so
$S_L^{\omega_0} = S_C^{\omega_0} = -\tfrac{1}{2}$,
and the quality factor Q of the circuit likewise has normalized sensitivities of magnitude 1/2 with respect to L and C.

5 SENSITIVITIES
The use of sensitivities can be demonstrated by replacing differentials with increments:
$\frac{\Delta Q}{Q} \approx S_h^{Q}\,\frac{\Delta h}{h}$.
Using the above example, a normalized sensitivity of $-\tfrac{1}{2}$ means that a 1% increase of the corresponding element value gives $\Delta Q/Q \approx -0.5\%$, so we can expect Q to decrease by 0.5%.
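As a cross-check, here is a SymPy sketch assuming a series RLC resonator, so that $Q = \frac{1}{R}\sqrt{L/C}$ (the lecture's circuit may differ); it reproduces the magnitude-1/2 sensitivities and the "1% in, 0.5% out" rule of thumb used above.

```python
import sympy as sp

R, L, C = sp.symbols('R L C', positive=True)

w0 = 1 / sp.sqrt(L * C)        # resonant frequency
Q = sp.sqrt(L / C) / R         # quality factor of the assumed series RLC

# Normalized sensitivities S_h^F = (h/F) * dF/dh
S_w0_L = sp.simplify(L / w0 * sp.diff(w0, L))   # -1/2
S_Q_L  = sp.simplify(L / Q * sp.diff(Q, L))     #  1/2
S_Q_C  = sp.simplify(C / Q * sp.diff(Q, C))     # -1/2
print(S_w0_L, S_Q_L, S_Q_C)

# Replacing differentials by increments: dQ/Q ~= S_C^Q * dC/C,
# so a 1% increase in C lowers Q by about 0.5%.
print(float(S_Q_C) * 0.01)     # -0.005
```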

6 Network function sensitivity
If the network function is $T(s,h) = \frac{N(s,h)}{D(s,h)}$, then
$S_h^{T} = \frac{h}{T}\frac{\partial T}{\partial h} = \frac{h}{N}\frac{\partial N}{\partial h} - \frac{h}{D}\frac{\partial D}{\partial h}$,
so if the parameter h appears only in the denominator ($\partial N/\partial h = 0$), then
$S_h^{T} = -\frac{h}{D}\frac{\partial D}{\partial h}$.

7 Example
We have
KCL at node v1: (v1 - E)G1 + (v1 - vout)G2 + (v1 - v2)sC2 + v1 sC1 = 0
KCL at node v2: (v2 - v1)sC2 + v2 G3 = 0
These two node equations can be written in matrix form, and from them the transfer function follows.

8 Example (cont'd)
For C1 = C2 = 1, G1 = G2 = G3 = 1, and A = 2 we have
$T(s) = \frac{v_{out}}{E} = \frac{2s}{s^2 + 2s + 2}$.
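A SymPy sketch can reproduce this result directly from the two KCL equations; it assumes the amplifier relation $v_{out} = A\,v_2$, which is consistent with the pole $s_p = -1 + j$ used later in the lecture.

```python
import sympy as sp

s, A = sp.symbols('s A')
G1, G2, G3, C1, C2, E = sp.symbols('G1 G2 G3 C1 C2 E')
v1, v2 = sp.symbols('v1 v2')

vout = A * v2                      # assumed amplifier relation (gain A)

kcl1 = sp.Eq((v1 - E)*G1 + (v1 - vout)*G2 + (v1 - v2)*s*C2 + v1*s*C1, 0)
kcl2 = sp.Eq((v2 - v1)*s*C2 + v2*G3, 0)

sol = sp.solve([kcl1, kcl2], [v1, v2], dict=True)[0]
T = sp.simplify(A * sol[v2] / E)   # transfer function v_out / E

nom = {C1: 1, C2: 1, G1: 1, G2: 1, G3: 1, A: 2}
print(sp.simplify(T.subs(nom)))    # 2*s/(s**2 + 2*s + 2)

# Normalized network-function sensitivity S_A^T = (A/T) * dT/dA
S_T_A = sp.simplify(A / T * sp.diff(T, A))
print(sp.simplify(S_T_A.subs(nom)))
```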

9 Zero and pole sensitivity
Zeros and poles give a good characterization of the network response at different frequencies. The sensitivity of a zero of a polynomial is obtained by expressing the zero as a function of the parameter h. Since a zero of the polynomial is not known analytically (it can be obtained by nonlinear iterations), the problem to be solved is how to find the derivative needed for its sensitivity without explicit knowledge of the zero or of its functional dependence on the parameter h.

10 Zero and pole sensitivity (cont'd)
At a zero z(h) of the polynomial, P(z(h), h) = 0. Differentiating P w.r.t. h gives
$\frac{\partial P}{\partial s}\frac{dz}{dh} + \frac{\partial P}{\partial h} = 0 \;\Rightarrow\; \frac{dz}{dh} = -\left.\frac{\partial P/\partial h}{\partial P/\partial s}\right|_{s=z}$.
This expression is valid for simple zeros and can be used to get the semi-normalized sensitivity $h\,\frac{dz}{dh}$; if z = a + jb we obtain $\frac{da}{dh} = \mathrm{Re}\!\left(\frac{dz}{dh}\right)$ and $\frac{db}{dh} = \mathrm{Im}\!\left(\frac{dz}{dh}\right)$.

11 Zero and pole sensitivity - example
Suppose the transfer function of the network is (compare with the previous example)
$T(s) = \frac{A G_1 C_2 s}{C_1 C_2 s^2 + (C_2 G_1 + C_2 G_2 + C_1 G_3 + C_2 G_3 - A G_2 C_2)s + G_3(G_1 + G_2)}$.
Find the sensitivity of the pole $s_p = -1 + j$ w.r.t. A.

12 Zero and pole sensitivity - example
Using the derived formula we have
$\frac{ds_p}{dA} = -\left.\frac{\partial P/\partial A}{\partial P/\partial s}\right|_{s=s_p} = \left.\frac{G_2 C_2 s}{2C_1C_2 s + C_2G_1 + C_2G_2 + C_1G_3 + C_2G_3 - A G_2 C_2}\right|_{s=s_p}$.
For C1 = C2 = 1, G1 = G2 = G3 = 1, and A = 2 this becomes $\frac{ds_p}{dA} = \frac{s}{2s+2}$, so for $s_p = a + jb = -1 + j$ the pole sensitivity w.r.t. A is $\frac{ds_p}{dA} = \frac{1}{2} + j\frac{1}{2}$, i.e. $\frac{da}{dA} = \frac{db}{dA} = \frac{1}{2}$, and the semi-normalized sensitivity is $A\,\frac{ds_p}{dA} = 1 + j$.
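A short SymPy check of this computation, using the denominator polynomial reconstructed above (an assumption consistent with slides 8 and 11):

```python
import sympy as sp

s, A = sp.symbols('s A')
G1, G2, G3, C1, C2 = sp.symbols('G1 G2 G3 C1 C2')

# Denominator polynomial P(s) of T(s), reconstructed from the previous example
P = C1*C2*s**2 + (C2*G1 + C2*G2 + C1*G3 + C2*G3 - A*G2*C2)*s + G3*(G1 + G2)

# dz/dh = -(dP/dh)/(dP/ds), valid at a simple zero s = z
dz_dA = -sp.diff(P, A) / sp.diff(P, s)

nom = {C1: 1, C2: 1, G1: 1, G2: 1, G3: 1, A: 2}
s_p = -1 + sp.I                          # pole of s**2 + 2*s + 2

val = sp.simplify(dz_dA.subs(nom).subs(s, s_p))
print(val)                               # 1/2 + I/2
print(sp.simplify(2 * val))              # semi-normalized A*dz/dA = 1 + I
```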

13 Q and ω0 sensitivity
In filter design, Q and ω0 are easier to work with. For a pair of complex zeros $z = a \pm jb$ we have
$\omega_0 = \sqrt{a^2 + b^2}$ and $Q = -\frac{\omega_0}{2a}$,
or, for high-Q circuits, $a \approx -\frac{\omega_0}{2Q}$ and $b \approx \omega_0$.
Using the zero's sensitivity $dz/dh$ we obtain the sensitivities of ω0 and Q; for high-Q circuits these expressions simplify (see the sketch after the next slide).

14 Q and ω0 sensitivity (cont'd) Derivation
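A sketch of the standard derivation for the pair $z = a \pm jb$, assuming the definitions $\omega_0^2 = a^2 + b^2$ and $Q = -\omega_0/(2a)$ given above:
Differentiating $\omega_0^2 = a^2 + b^2$ with respect to h gives
$$\frac{d\omega_0}{dh} = \frac{1}{\omega_0}\left(a\,\frac{da}{dh} + b\,\frac{db}{dh}\right),$$
and differentiating $Q = -\frac{\omega_0}{2a}$ gives
$$\frac{dQ}{dh} = -\frac{1}{2a}\,\frac{d\omega_0}{dh} + \frac{\omega_0}{2a^2}\,\frac{da}{dh}.$$
For high-Q circuits $|a| \ll b$, so $b \approx \omega_0$, $\frac{d\omega_0}{dh} \approx \frac{db}{dh}$, and $\frac{dQ}{dh}$ is dominated by the $\frac{\omega_0}{2a^2}\frac{da}{dh}$ term.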

15 Example
In the case of the transfer function from the previous example we have $z = a + jb = -1 + j$, so $\omega_0 = \sqrt{2}$ and $Q = \frac{1}{\sqrt{2}}$. Using these values and the pole sensitivity $dz/dA$ we can evaluate the exact sensitivities of ω0 and Q and compare them with the high-Q approximations. In this case Q was low, so the approximation did not hold.
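Plugging the numbers of this example into the relations above, using the value $dz/dA = \tfrac{1}{2} + j\tfrac{1}{2}$ computed in the sketch after slide 12, gives a quick numerical check:

```python
import numpy as np

z = complex(-1.0, 1.0)                 # pole from the example
dz_dA = complex(0.5, 0.5)              # dz/dA from the slide-12 sketch
A = 2.0

a, b = z.real, z.imag
da, db = dz_dA.real, dz_dA.imag

w0 = np.hypot(a, b)                    # sqrt(2)
Q = -w0 / (2.0 * a)                    # 1/sqrt(2) -- a low-Q pole

dw0_dA = (a * da + b * db) / w0        # 0: w0 is insensitive to A here
dQ_dA = -dw0_dA / (2.0 * a) + w0 / (2.0 * a**2) * da

print(w0, Q)
print(dw0_dA, dQ_dA, A / Q * dQ_dA)    # normalized S_A^Q
```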

16 Example 2
Derive the transfer function of the network shown in the figure. Find the transfer function sensitivity w.r.t. the capacitors and the amplifier gain.
KCL at v1:
KCL at v2:

17 Example 2 (cont'd)
Solving the node equations gives the transfer function. Using the formula for transfer function sensitivity, the sensitivities w.r.t. the capacitors and the amplifier gain can then be evaluated.

18 Multiparameter Sensitivity
The function F generally depends on several parameters, $F = F(h_1, h_2, \ldots, h_n)$. The change in F due to infinitesimally small changes in the parameters is expressed by the total differential
$dF = \sum_{i=1}^{n} \frac{\partial F}{\partial h_i}\,dh_i$, or, in relative terms, $\frac{dF}{F} = \sum_{i=1}^{n} S_{h_i}^{F}\,\frac{dh_i}{h_i}$.
To compare different designs we introduce multiparameter sensitivity measures.

19 Multiparameter Sensitivity (cont'd)
The worst case multiparameter sensitivity is the sum of the magnitudes of the individual sensitivities, $\sum_i |S_{h_i}^{F}|$. For incremental changes of the parameters within their tolerances, $|\Delta h_i/h_i| \le t_i$, we have
$\left|\frac{\Delta F}{F}\right| \le \sum_{i} |S_{h_i}^{F}|\,t_i$,
or, in case all $t_i$ are equal to t,
$\left|\frac{\Delta F}{F}\right| \le t \sum_{i} |S_{h_i}^{F}|$.
This is a very pessimistic estimate of the function deviation from its nominal value.
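A minimal Python sketch of the worst-case bound $\left|\Delta F/F\right| \le \sum_i |S_{h_i}^{F}|\,t_i$; the sensitivity values in the example are made up for illustration.

```python
def worst_case_deviation(sensitivities, tolerances):
    """Upper bound on |dF/F| when |dh_i/h_i| <= t_i for every parameter.

    `sensitivities` holds the normalized sensitivities S_{h_i}^F and
    `tolerances` the relative tolerances t_i (both illustrative names).
    """
    return sum(abs(S) * t for S, t in zip(sensitivities, tolerances))

# Example: four elements, all with 1% tolerance
S = [1.0, -0.5, -0.5, 2.0]
print(worst_case_deviation(S, [0.01] * len(S)))   # 0.04, i.e. 4% worst case
```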

20 Multiparameter Sensitivity (cont'd)
In IC fabrication, design parameters like resistor or capacitor values track each other, i.e. changes in their values are strongly correlated. So, to design these circuits we use the multiparameter tracking sensitivity. Since all elements of the same kind (e.g. capacitors) have essentially the same relative change, their sensitivities are summed before taking the magnitude; then, over all types of elements, the worst case variation with tracking is given by
$\left|\frac{\Delta F}{F}\right| \le \sum_{k} t_k \left|\sum_{i \in k} S_{h_i}^{F}\right|$,
where the inner sum runs over the elements of kind k.

21 Multiparameter Sensitivity (cont'd)
However, the worst case situation is very unlikely to happen in practice. Fabricated device parameter deviations follow a statistical distribution. Two distributions commonly used to model parameter deviations are the uniform and the normal distribution. For a uniform distribution on the tolerance interval [-t, t], the variance of the relative deviation is $\sigma^2 = t^2/3$.

22 Multiparameter Sensitivity (cont'd)
For a normal distribution the tolerance is usually interpreted as $t = 3\sigma$, so $\sigma^2 = t^2/9$. The function deviation $\Delta F/F$ then becomes a random variable with its own distribution. For large circuits $\Delta F/F$ has an approximately normal distribution with zero mean and variance
$\sigma_F^2 = \sum_i \left(S_{h_i}^{F}\right)^2 \sigma_i^2$,
provided that the component variations are statistically independent, where $\sigma_i$ is the standard deviation of $\Delta h_i/h_i$.

23 Multiparameter Sensitivity (cont'd)
If all the tolerances are equal and have the same distribution, then the standard deviation of the function can be calculated from
$\sigma_F = \sigma \cdot MSS$, where $MSS = \sqrt{\sum_i \left(S_{h_i}^{F}\right)^2}$
is the multiparameter statistical sensitivity. The actual variation will lie in the interval $\pm\sigma_F$ 68% of the time, $\pm 2\sigma_F$ 95% of the time, and $\pm 3\sigma_F$ 99.7% of the time.
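A Monte Carlo sketch can confirm this statistical formula. The example function $F = h_1 h_2 / h_3$ and all numbers below are purely illustrative (not the lecture's circuit), and the tolerance is interpreted as $3\sigma$ as in the normal-distribution case above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative function F(h1, h2, h3) = h1*h2/h3; its normalized
# sensitivities are S1 = 1, S2 = 1, S3 = -1.
h_nom = np.array([1.0, 2.0, 4.0])
S = np.array([1.0, 1.0, -1.0])

t = 0.01                 # 1% tolerance
sigma = t / 3.0          # normal case: tolerance interpreted as 3*sigma

# Predicted spread of dF/F: sigma_F = sigma * MSS, with MSS = sqrt(sum(S_i**2))
mss = np.sqrt(np.sum(S**2))
print("predicted sigma_F:", sigma * mss)

# Monte Carlo check with independent normal component variations
h = h_nom * (1.0 + sigma * rng.standard_normal((100_000, 3)))
F = h[:, 0] * h[:, 1] / h[:, 2]
F_nom = h_nom[0] * h_nom[1] / h_nom[2]
print("simulated sigma_F:", np.std(F / F_nom - 1.0))
```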

24 Example
We have
KCL1:
KCL2:
and from these node equations the transfer function T is obtained.

25 Example (cont'd)
Let us assume that all elements have tolerances t = 1% and that the transfer function is evaluated at s = 1. Let us calculate the various multiparameter sensitivities and use them to predict deviations of the transfer function T from its nominal value.

26 Example (cont'd)
For the nominal values the transfer function can be evaluated at s = 1. Let us discuss the effect of 1% changes, assuming
G1 = 0.99, C2 = 1.01, G3 = 1.01, G4 = 3.96.
The actual transfer function value can then be calculated and compared with the nominal one.

27 Example (cont'd)
The estimates for such a change using the different multiparameter sensitivities are as follows:
Worst case analysis (too pessimistic)
Worst case analysis with tracking (still too big)

28 Statistical analysis
If the tolerances are distributed uniformly, the standard deviation is $\sigma_F = \frac{t}{\sqrt{3}}\,MSS$, and if they are distributed normally (with t = 3σ), the standard deviation is $\sigma_F = \frac{t}{3}\,MSS$, indicating that 95% of the time the deviation will be less than $2\sigma_F$ in the uniform case and less than $2\sigma_F$ in the normal case. Since the true deviation of 1.6% exceeded the 95% limit obtained from the normal-distribution standard deviation, our case behaved more like the uniform distribution than the normal one.

29 Sensitivities to Parasitics and Operational Amplifiers
Since parasitics have nominal values equal to zero, we cannot calculate sensitivities to these elements in the regular way. Denote the parasitics by $\alpha_i$. We have
$\frac{\Delta F}{F} \approx \sum_i \frac{1}{F}\frac{\partial F}{\partial \alpha_i}\,\Delta\alpha_i$,
or, equivalently, the semi-normalized sensitivity $\frac{1}{F}\frac{\partial F}{\partial \alpha_i}$ is used. The parasitic values $\Delta\alpha_i$ are fixed for a specific technology, so the only way to reduce the variation of F is to have a design with small parasitic sensitivities $\partial F/\partial \alpha_i$.

30 Sensitivities to Parasitics and Operational Amplifiers (cont'd)
To evaluate $\partial F/\partial \alpha_i$ we analyze the network in the regular way, calculate the derivative, and finally substitute $\alpha_i = 0$ into the final result. In the case of an Op Amp we may consider the inverse of its amplification as a parasitic: instead of $v_{out} = A(v_+ - v_-)$ we write $B\,v_{out} = v_+ - v_-$, where $B = 1/A$ is the parasitic. If $B \to 0$ then we obtain the ideal Op Amp.
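As an illustration of the procedure (using an inverting amplifier with finite gain $A = 1/B$ as the assumed circuit, not the one on the slide), the parasitic sensitivity is obtained by differentiating with respect to B and then setting B = 0:

```python
import sympy as sp

B, Z1, Z2 = sp.symbols('B Z1 Z2')

# Inverting amplifier with finite gain A = 1/B (illustrative circuit):
# T(B) -> -Z2/Z1 as B -> 0, i.e. the ideal op-amp result.
T = -(Z2 / Z1) / (1 + (1 + Z2 / Z1) * B)

# Semi-normalized parasitic sensitivity: differentiate, then set B = 0
dT_dB_at_0 = sp.simplify(sp.diff(T, B).subs(B, 0))
print(dT_dB_at_0)                      # (Z2/Z1)*(1 + Z2/Z1)

# Relative deviation of T caused by a small, fixed parasitic B
rel = sp.simplify(dT_dB_at_0 / T.subs(B, 0))
print(rel)                             # -(1 + Z2/Z1)
```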

31 Example
Find the sensitivity of the transfer function T of the network shown with respect to the op-amp parasitic B = 1/A, where T is the transfer function from the previous example.