Outlier Detection in Capacitive Open Test Data Using Principal Component Analysis
Xin He, Yashwant Malaiya, Anura P. Jayasumana, Kenneth P. Parker and Stephen Hird

Outline
Introduction
 --- Capacitive Lead Frame Testing
 --- PCA Based Outlier Detection
Test Data Analysis & Results
 --- Global Analysis
 --- Localized Analysis
Summary and Future Work

Capacitive Lead Frame Testing
A capacitance is formed between the tested pin and the sense plate. An open defect on the tested pin affects the normal signal level. (Figure: comparison of a good pin and an open pin.)
Ref: Parker, Hird ITC 2007

Principal Component Analysis (PCA)
The 1st principal component is the axis capturing the largest variance of the projected data. The 2nd principal component is an axis orthogonal to the 1st, capturing as much of the remaining projected variance as possible. PCA is well suited to analyzing multi-dimensional, interrelated data. (Figure: Measurement1 vs. Measurement2 with the two PC axes drawn.)
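As an aside not on the slides, a tiny NumPy example makes the variance claim concrete: for a correlated 2-D point cloud, the 1st principal component carries nearly all of the projected variance. The data here is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D measurements: Measurement2 ~ Measurement1 + noise
m1 = rng.normal(size=500)
m2 = m1 + 0.3 * rng.normal(size=500)
X = np.column_stack([m1, m2])
Xc = X - X.mean(axis=0)                  # center the data

# Eigen-decomposition of the covariance matrix gives the PC axes
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues

var_ratio = eigvals[-1] / eigvals.sum()  # share of variance on the 1st PC
print(f"1st PC captures {var_ratio:.0%} of the variance")
```

The eigenvector paired with the largest eigenvalue is the 1st PC axis; the other, orthogonal to it, is the 2nd.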

PCA for PCB Outlier Detection
Let M be the m x n matrix of Capacitive Lead Frame Testing measurements, where m is the number of boards and n is the number of tested pins per board. Centering M (subtracting each pin's mean across boards) yields the centered matrix Mc.
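As a sketch, the centering step is a one-liner in NumPy; the matrix values below are made up for illustration.

```python
import numpy as np

M = np.array([[2.0, 4.0, 6.0],    # board 1
              [4.0, 6.0, 8.0],    # board 2
              [6.0, 8.0, 10.0]])  # board 3  (m = 3 boards, n = 3 pins)

Mc = M - M.mean(axis=0)           # subtract each pin's mean across boards

print(Mc)                         # every column of Mc now has zero mean
```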

PCA for PCB Testing
Using singular value decomposition, Mc = U S V^T, where:
 --- U (m x n) gives a scaled version of the PC scores,
 --- S (n x n) is a diagonal matrix whose squared diagonal values are the eigenvalues, conventionally arranged in descending order,
 --- V^T (n x n) contains the eigenvectors (PCs); V is the transformation matrix.
The matrix Z = McV gives the z-score values of the boards. The z-score value of a board is a linear combination of all the corresponding measurement values for that board.
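The decomposition and z-score computation can be sketched with NumPy on a synthetic matrix (data and sizes are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(10, 4))          # 10 boards, 4 pins (synthetic data)
Mc = M - M.mean(axis=0)               # centered measurement matrix

U, s, Vt = np.linalg.svd(Mc, full_matrices=False)
# s is returned in descending order; s**2 / (m - 1) are the PC variances
V = Vt.T                              # columns of V are the PCs

Z = Mc @ V                            # z-scores of the boards
# Since Mc = U S V^T, Z = Mc V = U S: each board's z-score row is a
# linear combination of that board's measurement values
print(np.allclose(Z, U * s))
```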

Test Statistic for Outlier Detection
z_ik: the value of the kth PC score for the ith board. E: a subset of PCs (the most significant PCs are used here). Sort the boards with respect to d1 and plot the cumulative distribution function (CDF) curve of d1. Outliers are clearly identifiable on the right side of the plot, and are typically separated from the other devices by a clear margin.
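The transcript does not reproduce the formula for d1. A common choice for this kind of PCA test statistic is d1_i = sum over k in E of z_ik^2 / lambda_k (each z-score normalized by its PC's eigenvalue); the sketch below assumes that form, on made-up data with one planted outlier.

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(30, 8))           # 30 boards, 8 pins (synthetic data)
M[5] += 6.0                            # plant one outlier board

Mc = M - M.mean(axis=0)                # center each pin across boards
U, s, Vt = np.linalg.svd(Mc, full_matrices=False)
Z = Mc @ Vt.T                          # board z-scores
lam = s**2 / (M.shape[0] - 1)          # eigenvalues (PC sample variances)

E = range(5)                           # first 5 PCs, as on the slides
d1 = sum(Z[:, k]**2 / lam[k] for k in E)   # assumed statistic

order = np.argsort(d1)                 # ascending: outliers land on the right
print(order[-1])                       # board with the largest d1
```

Plotting the sorted d1 values (the empirical CDF) makes the outlier's separation from the rest of the population visible.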

Test Data
Data Name   Board Runs   Tested Pins
Data_j27    15           147
Data_j24    83           145
Courtesy of Agilent Technologies

Global Analysis - (Data_j27)
Total 15 board runs; the first 5 PCs are used. Clear outliers: boards 2, 3 and 1. (Figure: d1 CDF with the outlier boards labeled.)

Global Analysis - (Data_j24)
Total 83 board runs; the first 5 PCs are used. Clear outliers: boards 17, 18, 19, 20, 21, 22. (Figure: d1 CDF with the outlier boards labeled.)

Localized Analysis
(Figure: pin map of the board, Pin 1 through Pin 120 and Pin 121 through Pin 240.) Pins in white color are VDD/grounded pins.

PCA Applied on Test Windows (Data_j24)
The pins are divided into 10 test windows (Test Window 1 through Test Window 10), and PCA is applied within each window.

Comparison of Global and Localized Methods
(i) A sharp peak signal may exist on one pin while the other measurements match well. (ii) Slight variations exist across multiple pins' measurements. (Figure: pass/fail boundaries for cases (i) and (ii).)

Comparison of Global and Localized Methods (Example)
(Figure: example measurements, boards 6 and 4 labeled.)

PCA Flow Chart
1. Measurement matrix M (m boards x n tested pins)
2. Sort the matrix according to pin location
3. Divide into w test windows (w = 1 for Global Analysis)
4. Center the matrix in each group
5. Derive z-scores for each group
6. Evaluate d1 values in each group
7. Sort the d1 values
8. Compare the maximum d1 value in different groups
9. Pass/Fail decision & outlier location
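Under the same assumption about the d1 formula as above, the flow (windowing, centering, z-scores, d1, candidate location) might be sketched as follows; the function name and synthetic data are illustrative, not from the slides.

```python
import numpy as np

def window_d1(M, w=1, n_pcs=5):
    """Return a (w, m) array of per-board d1 statistics, one row per test window."""
    m, n = M.shape
    stats = []
    for cols in np.array_split(np.arange(n), w):   # divide pins into w windows
        Mc = M[:, cols] - M[:, cols].mean(axis=0)  # center within the window
        U, s, Vt = np.linalg.svd(Mc, full_matrices=False)
        Z = Mc @ Vt.T                              # board z-scores
        lam = s**2 / (m - 1)                       # PC sample variances
        k = min(n_pcs, len(s))
        d1 = (Z[:, :k]**2 / lam[:k]).sum(axis=1)   # assumed d1 statistic
        stats.append(d1)
    return np.array(stats)

rng = np.random.default_rng(3)
M = rng.normal(size=(40, 100))                     # 40 boards, 100 pins
M[7, 10:20] += 8.0                                 # defect localized to window 1

d1 = window_d1(M, w=10)
win, board = np.unravel_index(d1.argmax(), d1.shape)
print(win, board)                                  # window and board flagged
```

Comparing the maximum d1 across windows both flags the board and narrows the defect down to a pin group, which is the resolution gain the localized method is after.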

PCA vs. Standard Deviation (STDev)
The baseline screen flags a measurement as abnormal when it falls outside the band Average ± N*STDev. (Figure: the band drawn around the per-pin average.)
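A minimal sketch of the per-pin Average ± N*STDev screen, on made-up data with one abnormal measurement:

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.normal(size=(20, 6))            # 20 boards, 6 pins (synthetic data)
M[3, 2] = 9.0                           # one abnormal measurement

mu = M.mean(axis=0)                     # per-pin average across boards
sd = M.std(axis=0, ddof=1)              # per-pin sample standard deviation
N = 3
flags = np.abs(M - mu) > N * sd         # True where a pin leaves the band

abnormal_boards = np.unique(np.nonzero(flags)[0])
print(abnormal_boards)
```

Note that a large outlier inflates the per-pin standard deviation it is judged against, which is one reason the choice of N is so sensitive in the comparison table below.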

PCA vs. STDev
N        Abnormal PCBs Detected   Board Run Nos.
6, 5.5   1                        17
5, 4.5   2                        14, 17
4        5                        3, 11, 14, 17, 83
3.5      11                       3, 4, 5, 6, 8, 11, 14, 15, 17, 59, 83
3        18                       3, 4, 5, 6, 8, 9, 11, 14, 15, 16, 17, 18, 19, 20, 21, 51, 59, 83
2.5      35                       3, 4, 5, 6, 7, 8, 9, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 34, 36, 47, 48, 51, 53, 57, 58, 59, 60, 63, 68, 73, 80, 81, 83

PCA vs. STDev
PCA detected outliers: board runs 17, 18, 19, 20, 21, 22. (Figure: STDev results with boards 14, 3, 83 and 11 labeled.)

Summary
PCA can effectively identify outlier boards. Localized analysis can increase test resolution and assist in locating outliers. The global and localized analyses can be combined to filter the outliers. The method can also be applied to other kinds of PCB test data besides Capacitive Lead Frame test data.

Future Work
On-line testing techniques to enhance detection efficiency. Investigate data variation caused by measurement errors and by mechanical and electrical tolerances. Techniques to compensate for the effects of mechanical variations, parameter variations, etc.