An Autonomous System View to Apply Machine Learning
Li-C. Wang, University of California, Santa Barbara
ITC 2018, Session AI 4.1
What Does “Autonomous System” Remind Us Of?
Self-Driving Car: Sensing, Perception, Reasoning & Control
- Sensing: short-/long-range radars, LiDAR, vision, stereo vision, ultrasonic, etc.
- Perception: neuro-processors (lane detection, object recognition)
- Reasoning & control: free-space calculation, path planning, speed/brake/rotation controls
Operate in a “World View” (source: Google self-driving car pictures from the public domain)
Through sensing, perception, and reasoning & control, the system maintains a “world view” of the environment and reacts to it.
Why Does “Autonomous System” Give a View to “Applying Machine Learning”?
Autonomous System View to Applying ML
To deploy an ML solution in an application, it is not just about the ML tool – very often, we need a system around the tool to apply ML. This is especially the case when the ML solution is deployed in design/test processes.
What does an autonomous system look like in our context?
Tasks Performed by a Yield Engineer
[Diagram: manufacturing and production test flow (class probe, wafer probe, packaging, burn-in, final test), with parts shipped to customers and customer returns fed back. Facing a yield issue, the yield engineer interfaces with the production test data, carries out an analytic workflow, and summarizes the findings in a PPT presentation.]
Autonomous System to Execute the Tasks
[Same flow as the previous slide, but with an autonomous system in place of the yield engineer: it interfaces with the production test data, executes the analytic workflow for the yield issue, and reports the findings in a PPT presentation.]
The IEA project (Intelligence Engineering Assistant)
The IEA research lab: https://iea.ece.ucsb.edu/
<IEA Demo>
What’s IEA?
IEA, pronounced “Ai-Ya” (Artificial Intelligence – Your Assistant)
Concept-Based Workflow Programming
  IF (observe a yield excursion) {           % enter the particular yield excursion context
    FOR (selected low-yield lots) {          % enter the low-yield context
      SELECT a test bin with the largest yield loss;
      PERFORM e-test correlation analysis;
      IF (observe a correlation trend) {     % enter the correlation context
        REPORT the trend plot;
        GENERATE stack-wafer plots;
        IF (observe a grid pattern) -> REPORT plots;
        IF (observe an edge failing pattern) -> REPORT plots;
  }}} …
To implement this workflow, we need several data-driven concept recognizers: yield excursion, low-yield, correlation trend, grid pattern, and edge failing. A Python rendering of the workflow is sketched below.
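The following is a minimal Python sketch of the workflow above. The recognizer and analysis helpers are hypothetical stubs standing in for IEA’s data-driven concept recognizers and data tools; only the control structure mirrors the slide.

def observe_yield_excursion(lots): return bool(lots)             # stub concept recognizer
def is_low_yield(lot): return lot.get("yield", 1.0) < 0.8        # stub concept recognizer
def worst_yield_loss_bin(lot): return "BIN7"                     # stub analysis step
def etest_correlation_analysis(lot, bin_): return {"r": 0.72}    # stub analysis step
def detect_correlation_trend(corr): return abs(corr["r"]) > 0.6  # stub concept recognizer
def stack_wafer_plot(lot, bin_): return f"stack_{lot['id']}_{bin_}.png"
def detect_grid_pattern(plot): return False                      # stub concept recognizer
def detect_edge_failing(plot): return True                       # stub concept recognizer

def run_yield_workflow(lots, report):
    if not observe_yield_excursion(lots):                # concept: yield excursion
        return report
    for lot in (l for l in lots if is_low_yield(l)):     # concept: low-yield lot
        bin_ = worst_yield_loss_bin(lot)                 # SELECT the worst test bin
        corr = etest_correlation_analysis(lot, bin_)     # PERFORM e-test correlation
        if detect_correlation_trend(corr):               # concept: correlation trend
            report.append(("trend plot", lot["id"], corr))
            plot = stack_wafer_plot(lot, bin_)           # GENERATE stack-wafer plots
            if detect_grid_pattern(plot):                # concept: grid pattern
                report.append(("grid pattern", plot))
            if detect_edge_failing(plot):                # concept: edge failing pattern
                report.append(("edge failing", plot))
    return report

print(run_yield_workflow([{"id": "LOT01", "yield": 0.72}], []))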
Concept-Based Workflow Programming
IEA is a concept-based workflow programming environment. An IEA system comprises three components:
- API for workflow construction (the executable workflow)
- Library of concept recognizers
- Data processing and analysis tools
A sketch of how these components could fit together follows.
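A minimal sketch of how the three components could fit together. This is not IEA’s actual API; all class and method names are invented for illustration.

from typing import Callable, Dict, List

class ConceptLibrary:
    """Library of data-driven concept recognizers, keyed by concept name."""
    def __init__(self) -> None:
        self._recognizers: Dict[str, Callable] = {}

    def register(self, concept: str, recognizer: Callable) -> None:
        self._recognizers[concept] = recognizer

    def observe(self, concept: str, data) -> bool:
        return bool(self._recognizers[concept](data))

class Workflow:
    """Workflow-construction API: registered steps run in order over shared data tools."""
    def __init__(self, library: ConceptLibrary, data_tools) -> None:
        self.library = library
        self.tools = data_tools
        self._steps: List[Callable] = []

    def step(self, fn: Callable) -> Callable:
        self._steps.append(fn)          # usable as a decorator on step functions
        return fn

    def run(self, context: dict) -> dict:
        for fn in self._steps:
            fn(self.library, self.tools, context)
        return context

A workflow built with this API would register recognizers such as those in the previous sketch and then run its steps over the data tools.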
The Analogy
IEA (executable workflow / concept recognizers / data processing and analysis tools) maps onto the autonomous system (reasoning & control / perception / sensing):
- Data tools: collect all relevant “data” (sensing)
- Concept recognition: what the “data” mean (perception)
- Workflow: what to do next (reasoning & control)
The Journey To IEA
“Applying Machine Learning”: My Start … 2004 DAC (Design Automation Conference)
L.-C. Wang et al., “On path-based learning and its applications in delay test and diagnosis”
Quote: “The primary purpose of this work is twofold: to demonstrate that path-based learning is indeed feasible and to show how it can be applied in test and diagnosis applications”
Feasibility + Applicability
Applying ML in Design/Test (2003-2013)
ML techniques: supervised learning (classification, regression) and unsupervised learning (transformation, clustering, outlier analysis, rule learning)
Applications: layout verification, Fmax/speed test, test cost reduction, design-silicon timing verification, functional verification, post-silicon validation, yield, zero DPPM
Data scope: pre-silicon (design automation, e.g. simulation data), post-silicon (manufacturing automation, e.g. test data), in-field (system/in-field test, e.g. customer returns)
IEEE Trans. on CAD paper (Oct. 2016): “Experience of Data Analytics in EDA and Test – Principles, Promises, and Challenges”
This journey went through multiple stages:
Stage 1: Algorithmic focus
Stage 2: Methodology focus
Stage 3: Application focus
Stage 4: Added-value focus
Added Value: Yield Improvement (2013)
[Plot: yield density distributions before adjustment, after ADJ #1, after ADJ #2, and after both]
Yield fluctuated for the SoC product line, and the product/design teams could not solve the problem for months, despite several design and test revisions and several process tuning recipes.
By applying ML tools to silicon data, we found 5 process parameters to be tuned. The foundry accepted them and implemented them as two adjustments. A significant yield improvement was observed in production.
An ITC 2014 paper documents the story. A hypothetical sketch of the kind of parameter-ranking analysis involved follows.
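For illustration only, here is a hypothetical sketch of ranking process/e-test parameters against yield with a tree ensemble. The file name, column names, and data are invented, and this is not the methodology documented in the ITC 2014 paper; it only conveys the flavor of applying an ML tool to silicon data.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical wafer-level summary: one row per wafer, yield plus process/e-test parameters.
df = pd.read_csv("wafer_summary.csv")
X = df.drop(columns=["wafer_id", "yield"])
y = df["yield"]

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
ranking = pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False)
print(ranking.head(5))   # candidate parameters to review with process experts before any tuning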
An Important Question Next …
In the yield example, we and the product team had access to the same set of ML tools. So why did we succeed while they did not?
Because we had the knowledge enabling us to conduct a more effective analytic process to apply the ML tools. It was that piece of knowledge that made the difference, not the tools in use.
An Important Deployment Question
To deploy a solution, I cannot just package the ML tools and hand them to the product team. I needed to “package the knowledge” as well: how am I going to do that?
We Need a System View to Apply ML (Since 2016)
Knowledge drives three components: input preparation, tool invocation, and result evaluation. We need automation of all three components (a minimal sketch follows).
So, in short, why the system view? Because we need the knowledge.
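A minimal sketch, under an assumed structure rather than IEA’s actual code, of automating the three components around one ML tool run; the three callables are where the knowledge lives.

def apply_ml_tool(raw_data, prepare_input, invoke_tool, evaluate_result):
    dataset = prepare_input(raw_data)      # input preparation (knowledge-driven)
    result = invoke_tool(dataset)          # tool invocation
    accepted = evaluate_result(result)     # result evaluation (knowledge-driven)
    return result if accepted else None    # a rejected result sends the flow back to preparation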
We Need a System View to Apply ML
The expert supplies the knowledge behind input preparation, tool invocation, and result evaluation. Why do we need the knowledge? Mainly because we have limited data.
We Need a System View to Apply ML
What is so special about applying ML in view of “limited data”? There are theoretical assumptions made to achieve ML, and with limited data those assumptions are hard to meet in practice; we therefore need domain knowledge to compensate (an illustrative sketch follows).
“Learning from Limited Data in VLSI CAD”, an upcoming book chapter; preview at our lab web site: https://iea.ece.ucsb.edu/
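An illustrative sketch, not from the tutorial, of why limited data calls for knowledge: with only a handful of synthetic samples, an unconstrained model tends to fit noise, while a model constrained by domain knowledge (here, knowing which single feature matters) typically generalizes better.

import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_train = 12                                            # the "limited data" regime
X = rng.uniform(-1, 1, size=(n_train, 5))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n_train)      # only feature 0 actually matters

X_test = rng.uniform(-1, 1, size=(500, 5))
y_test = 2.0 * X_test[:, 0]

flexible = DecisionTreeRegressor(random_state=0).fit(X, y)   # no prior knowledge used
informed = Ridge(alpha=1.0).fit(X[:, :1], y)                 # domain knowledge: restrict to feature 0

print("flexible model R^2 on new data:", flexible.score(X_test, y_test))
print("informed model R^2 on new data:", informed.score(X_test[:, :1], y_test))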
In Summary, Four Barriers to Consider …
Within the knowledge-driven loop of input preparation, tool invocation, and result evaluation, a result is obtained only after considering four barriers:
- Data barrier
- Theoretical barrier
- Application environment constraints
- Deployment need (providing added value)
See paper.
The Yield Context
1. What ML tools are useful and required in yield engineering (ITC 2014)
2. Learning the process of how a yield expert applies those ML tools (VTS 2017)
3. GAN-based result recognizer (ITC 2018)
A simplified stand-in for such a result recognizer is sketched below.
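The sketch below is only a simplified stand-in for a result recognizer; the ITC 2018 work uses a GAN-based recognizer, whereas this illustration merely thresholds a correlation coefficient on hypothetical inputs.

import numpy as np

def recognize_correlation_trend(etest_values, bin_yield_loss, threshold=0.6):
    """Return True if an e-test parameter trends with a bin's yield loss (toy recognizer)."""
    r = np.corrcoef(etest_values, bin_yield_loss)[0, 1]
    return abs(r) >= threshold

# Example with made-up numbers:
print(recognize_correlation_trend([1.0, 1.2, 1.4, 1.6], [0.05, 0.09, 0.12, 0.18]))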
The AI System
The AI system integrates IEA (input preparation, tool invocation, result evaluation, executed autonomously) with the speech recognition and NLP component.
The core of this AI system view is the autonomous execution of the workflow.
We Need a System View to Apply ML
The knowledge-driven loop of input preparation, tool invocation, and result evaluation needs automation of all three components; the IEA autonomous system provides it through the executable workflow, the concept recognizers, and the data processing and analysis tools.
Recall: The Analogy
IEA system (executable workflow / concept recognizers / data processing and analysis tools) vs. autonomous system (reasoning & control / perception / sensing):
- Data tools: collect all relevant “data” (sensing)
- Concept recognition: what the “data” mean (perception)
- Workflow: what to do next (reasoning & control)
“Intelligence” in IEA
[Diagram: levels of manifestation of intelligence between a person and IEA: Level 1, the natural language interface; Level 2, the autonomous system.]
The language interface is mostly used for queries of results after the IEA autonomous execution is completed; a toy query-dispatch sketch follows.
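A toy sketch of the query side, assuming the workflow has already run and stored its findings; the keyword routing here is invented for illustration and is not IEA’s speech recognition or NLP component.

def answer_query(query: str, results: dict) -> str:
    q = query.lower()
    if "trend" in q:
        return str(results.get("trend plot", "No correlation trend was reported."))
    if "grid" in q or "edge" in q or "wafer" in q:
        return str(results.get("wafer pattern", "No spatial wafer pattern was reported."))
    return "Try asking about the correlation trend or the wafer-map patterns."

# Example with hypothetical stored results:
findings = {"trend plot": "trend_LOT01.png", "wafer pattern": "stack_LOT01_BIN7.png (edge failing)"}
print(answer_query("Show me the yield trend", findings))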
Future Plan of IEA
Tutorial and courses: IEA courses (Fall and Winter quarters)
IEA systems in progress: IEA for production yield engineer; IEA for verification engineer
THANK YOU