Chapter 4a Stochastic Modeling

Chapter 4a Stochastic Modeling
Prof. Lei He, Electrical Engineering Department, University of California, Los Angeles
URL: eda.ee.ucla.edu  Email: lhe@ee.ucla.edu

Outline
- Introduction to Stochastic Modeling
- Monte Carlo Simulation
- Example: SRAM cell yield estimation with MC and QMC

Review of Probability
- A random variable (R.V.) X takes values from its domain randomly
- The domain can be continuous or discrete, finite or infinite
- PDF vs. CDF: the PDF f(x) gives the probability density at x; the CDF F(x) gives the probability that X <= x
(Figure: PDF f(x) and CDF F(x) of a random variable)

Review of Probability
- Mean and variance
- Normal distribution
(Figure: Gaussian PDF f(x) with mean μx and standard deviation σx; the standard formulas are restated below.)
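The mean, variance, and normal-distribution formulas on this slide were rendered as images in the original deck and did not survive the transcript. For reference, the standard definitions (stated here, not recovered from the slides) are:

\mu_X = E[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx, \qquad
\sigma_X^2 = \mathrm{var}(X) = E\!\left[(X - \mu_X)^2\right], \qquad
F(x) = \int_{-\infty}^{x} f(t)\, dt

and the normal (Gaussian) PDF with mean \mu_X and standard deviation \sigma_X is

f(x) = \frac{1}{\sigma_X \sqrt{2\pi}} \exp\!\left( -\frac{(x - \mu_X)^2}{2\sigma_X^2} \right).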

Multivariate Distribution
- Similar definitions extend to the multivariate case: joint PDF (JPDF) and covariance
- The analysis becomes much more complicated
- Correlation MATTERS!

Independent Random Variables
- Two events A and B are independent ⟺ P(A ∩ B) = P(A)P(B)
- Two random variables X and Y are independent ⟺ the events A = {X <= a} and B = {Y <= b} are independent for all a, b
- For two independent random variables X and Y:
  E[XY] = E[X] E[Y]
  var(X + Y) = var(X) + var(Y)

Correlation Coefficient
- Normalized covariance: ρ(X, Y) = cov(X, Y) / (σX σY)
- Always lies between -1 and 1
- A correlation of 1 means X and Y are perfectly positively (linearly) related; a correlation of -1 means they are perfectly negatively related

Principal Component Analysis
- PCA helps to compress and classify data
- It reduces the dimensionality of a data set (sample) by finding a new, smaller set of variables that nonetheless retains most of the original information
- By information we mean the variation present in the sample, given by the correlations between the original variables
- The new variables, called principal components (PCs), are uncorrelated and are ordered from high to low by the fraction of the total information each retains

Geometric Interpretation of Principal Components
- Consider a sample of n observations in the 2-D space x = (x1, x2)
- Goal: to account for the variation in the sample with as few variables as possible, to some accuracy

Geometric Interpretation of Principal Components
- The 1st PC z1 is a minimum-distance fit to a line in x space
- The 2nd PC z2 is a minimum-distance fit to a line in the plane perpendicular to the 1st PC
- PCs are a series of linear least-squares fits to a sample, each orthogonal to all the previous ones
(A minimal numerical sketch is given below.)
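The following is a minimal sketch (not part of the original slides) of PCA via eigen-decomposition of the sample covariance matrix; the data set and variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative 2-D sample: n correlated Gaussian observations (n x 2)
rng = np.random.default_rng(0)
n = 1000
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + 0.3 * rng.standard_normal(n)
X = np.column_stack([x1, x2])

# Center the data and form the sample covariance matrix
Xc = X - X.mean(axis=0)
C = np.cov(Xc, rowvar=False)

# Eigen-decomposition: eigenvectors are the principal directions,
# eigenvalues are the variances captured by each PC
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]            # order PCs from high to low variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project the sample onto the principal components (uncorrelated coordinates)
Z = Xc @ eigvecs
print("fraction of total variance per PC:", eigvals / eigvals.sum())
```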

Outline
- Introduction to Stochastic Modeling
- Monte Carlo Simulation

Monte Carlo Simulation
Problem formulation
- Given a set of random variables X = (X1, X2, ..., Xn)^T and a function of X, Y = f(X), estimate the distribution of Y
Method
- Generate N samples of X = (X1, X2, ..., Xn)^T
- For each sample of X, calculate the corresponding sample of Y = f(X)
- Obtain the distribution of Y from the samples of Y
(A minimal code sketch of this loop follows below.)
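Below is a minimal, illustrative Python sketch of this procedure (not from the original slides); the helper name monte_carlo and its arguments are assumptions made for illustration.

```python
import numpy as np

def monte_carlo(f, sample_x, N, rng=None):
    """Estimate the distribution of Y = f(X) from N random samples of X.

    f        : function mapping one sample of X (1-D array) to a scalar Y
    sample_x : function rng -> one sample of X as a 1-D array
    """
    rng = rng or np.random.default_rng()
    samples_y = np.array([f(sample_x(rng)) for _ in range(N)])
    # The histogram, mean, and std of samples_y approximate the distribution of Y
    return samples_y
```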

Advantages and Disadvantages of MC Simulation
Pros:
- Accurate: the error → 0 as N → ∞
- Flexible: works for any arbitrary distribution of X and any arbitrary function f
- Simple: easy to implement; usually used as the golden reference in statistical analysis
Cons:
- Not efficient: a large N (many iterations) is needed to obtain high accuracy
- Not suitable for statistical optimization

Example
Given that X1 and X2 are independent standard Gaussian random variables, estimate the distribution of max(X1, X2). (A code sketch follows below.)
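A minimal sketch of this example (not part of the original slides), assuming numpy. As a sanity check, the exact mean of max(X1, X2) for independent standard Gaussians is 1/sqrt(pi) ≈ 0.564.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# N samples of the two independent standard Gaussian variables
x1 = rng.standard_normal(N)
x2 = rng.standard_normal(N)

# One sample of Y = max(X1, X2) per sample of X
y = np.maximum(x1, x2)

# Empirical distribution of Y; the mean should be close to 1/sqrt(pi) ~= 0.564
print("mean ~", y.mean(), " std ~", y.std())
hist, edges = np.histogram(y, bins=50, density=True)
```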

Quasi-Monte Carlo Simulation
Basic idea
- Use deterministic samples instead of purely random samples
- Select the deterministic samples so that they cover the whole sample space evenly

Discrepancy
- Discrepancy measures how evenly the samples are distributed in the sample space
- Definition: D_N(P) = sup over axis-aligned boxes B of | A(B, P)/N − λ_s(B) |, where N is the total number of samples, A(B, P) is the number of points of the sample P that fall in the box B, and λ_s(B) is the volume of B

Low-Discrepancy Sequences
- A sample sequence with low discrepancy
- Common low-discrepancy sequence generation algorithms:
  - Faure sequence
  - Niederreiter sequence
  - Sobol sequence
  - Halton sequence

Example: Halton Sequence
Basic idea
- Choose a prime number as the base (say, 2)
- Write the natural numbers 1, 2, 3, ... in that base
- Reverse (mirror) the digits about the radix point
- Convert back to base 10:
  1 = 1.0   => 0.1   = 1/2
  2 = 10.0  => 0.01  = 1/4
  3 = 11.0  => 0.11  = 3/4
  4 = 100.0 => 0.001 = 1/8
  5 = 101.0 => 0.101 = 5/8
  6 = 110.0 => 0.011 = 3/8
  7 = 111.0 => 0.111 = 7/8
High-dimensional points
- Use a different base for each dimension
- Example: 2-D points, x in base 2, y in base 3
  1 => x = 1/2, y = 1/3
  2 => x = 1/4, y = 2/3
  3 => x = 3/4, y = 1/9
  4 => x = 1/8, y = 4/9
  5 => x = 5/8, y = 7/9
  6 => x = 3/8, y = 2/9
  7 => x = 7/8, y = 5/9
(A short generator sketch follows below.)
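A minimal sketch (not from the original slides) of the radical-inverse construction described above; it reproduces the base-2/base-3 pairs listed on the slide.

```python
def radical_inverse(n, base):
    """Mirror the digits of n (written in 'base') about the radix point."""
    inv, f = 0.0, 1.0 / base
    while n > 0:
        n, digit = divmod(n, base)
        inv += digit * f
        f /= base
    return inv

def halton_2d(num_points, bases=(2, 3)):
    """First num_points 2-D Halton points, one base per dimension."""
    return [(radical_inverse(i, bases[0]), radical_inverse(i, bases[1]))
            for i in range(1, num_points + 1)]

# Reproduces the slide's table: (1/2, 1/3), (1/4, 2/3), (3/4, 1/9), ...
for i, (x, y) in enumerate(halton_2d(7), start=1):
    print(i, x, y)
```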

Advantages and Disadvantages of QMC Simulation
Advantage
- Efficient: uses fewer samples than random Monte Carlo simulation for the same accuracy
Disadvantages
- Only works well in low-dimensional cases; becomes very slow when the number of random variables grows large
- Not very common in statistical analysis

Comparison of MC and QMC
- QMC converges faster than MC: the MC error decreases roughly as O(N^(-1/2)), while the QMC error decreases roughly as O((log N)^s / N) in s dimensions
(A small numerical comparison is sketched below.)
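An illustrative comparison sketch (not from the original deck), assuming numpy and scipy are available. Both methods estimate E[max(X1, X2)] (exact value 1/sqrt(pi) ≈ 0.564); the QMC Halton points are mapped to Gaussians through the inverse normal CDF.

```python
import numpy as np
from scipy.stats import norm

def radical_inverse(n, base):
    """Radical inverse of n in the given base (Halton construction)."""
    inv, f = 0.0, 1.0 / base
    while n > 0:
        n, digit = divmod(n, base)
        inv += digit * f
        f /= base
    return inv

exact = 1.0 / np.sqrt(np.pi)            # E[max(X1, X2)] for iid standard Gaussians
rng = np.random.default_rng(0)

for N in (100, 1000, 10000):
    # Plain Monte Carlo estimate
    mc = np.maximum(rng.standard_normal(N), rng.standard_normal(N)).mean()

    # Quasi-Monte Carlo: 2-D Halton points mapped through the inverse normal CDF
    u = np.array([[radical_inverse(i, 2), radical_inverse(i, 3)]
                  for i in range(1, N + 1)])
    g = norm.ppf(u)                      # uniform (0, 1) -> standard Gaussian
    qmc = np.maximum(g[:, 0], g[:, 1]).mean()

    print(N, "MC error:", abs(mc - exact), "QMC error:", abs(qmc - exact))
```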

References
- L. I. Smith, "A Tutorial on Principal Components Analysis," Cornell University, USA, 2002.
- A. Singhee and R. Rutenbar, "From Finance to Flip Flops: A Study of Fast Quasi-Monte Carlo Methods from Computational Finance Applied to Statistical Circuit Analysis."

Homework 5: Yield Estimation Using the Monte Carlo Method
- Consider "access time failure": the access time is the time at which the voltage difference between BL_B and BL becomes larger than a certain value; the access fails if the required difference is not reached in time. The schematic is shown below.
- Initial values: BL_B = 1; Q_B = 0; Q = 1; BL = 1
- Variation sources: Vth (threshold voltage) of Mn1 and Mn2; Leff of Mn1 and Mn2
- Device model: use the BSIM3 model for all MOSFETs

Netlist for the 6-T SRAM cell (all MOSFETs should use the BSIM3 model):
* SRAM netlist
Vdd dd 0 5
Mn1 3 2 0 0 nmos
Mn2 3 5 4 4 nmos
Mn3 2 3 0 0 nmos
Mn4 2 5 1 1 nmos
Mp5 3 2 dd dd pmos
Mp6 2 3 dd dd pmos

Detailed Steps
Performance constraint: the voltage difference between BL_B and BL should be larger than ∆v at the time step tthresh.
Use Monte Carlo and quasi-Monte Carlo to calculate the yield Y, defined as the percentage of circuits that satisfy the performance constraint.
Steps:
(1) Use MC and QMC to generate sample sequences for the two variable parameters (Matlab code).
(2) Perform transient simulations with these sequences and compare the resulting performance with the constraint.
(3) Calculate the yield rate from its definition. (A small sketch of this step is given below.)
Nominal values, the performance constraint, and Matlab code will be provided soon.
Due Feb 20th
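A minimal, hypothetical sketch (not from the assignment handout) of step (3), assuming the transient simulations report, for each sample, the BL_B - BL voltage difference at tthresh; the names estimate_yield, run_transient, and delta_v are placeholders for whatever the provided code actually uses.

```python
import numpy as np

def estimate_yield(param_samples, run_transient, delta_v):
    """Fraction of sampled circuits that meet the performance constraint.

    param_samples : (N, d) array of sampled variation parameters
                    (e.g., Vth and Leff of Mn1/Mn2), from MC or QMC
    run_transient : placeholder for the circuit simulation; returns the
                    BL_B - BL voltage difference at tthresh for one sample
    delta_v       : required minimum voltage difference at tthresh
    """
    passed = 0
    for sample in param_samples:
        v_diff = run_transient(sample)   # one transient simulation per sample
        if v_diff > delta_v:             # performance constraint satisfied
            passed += 1
    return passed / len(param_samples)   # yield Y
```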