MACHINE LEARNING IN ASIC BACKEND DESIGN

Presentation transcript:

MACHINE LEARNING IN ASIC BACKEND DESIGN
Bowen Li, Paul Franzon

Objectives
Build models for physical design.
Use machine learning methods to predict circuit performance after physical design.
Speed up design time.

Surrogate Modeling for Global Route
Many surrogate model types and adaptive model builders are available; in total, 16 models were tested. Root relative squared error (RRSE) is used to measure model accuracy: it normalizes a model's squared error by the squared error of a simple predictor that always outputs the mean of the observed values. The target is RRSE < 0.5. Results for several models are shown below; the ANN-Genetic model performs best.

Model           area_trial  TNS    violating_path  WNS    x_neg_1_4  x_neg_5_8  x_pos_0_10  x_pos_11_20  hold_slack_trial  power_trial
anngenetic      0.000       0.242  0.383           0.076  0.496      0.295      0.136       0.139        1.001             0.922
ann             0.003       0.270  0.364           0.079  0.517      0.302      0.193       0.227        1.000             0.939
kriginggenetic  0.001       0.311  0.459           0.072  0.591      0.382      0.269       0.275        1.101             0.942
rational        0.004       0.262  0.394           0.096  0.542      0.428      0.362       0.371        1.016             0.961

For the remaining models only some of the outputs were reported: annfixed (0.267, 0.411, 0.090, 0.536, 0.201, 0.181, 1.002, 0.948), gpmlgenetic (0.310, 0.392, 0.521, 0.439, 0.366, 0.376, 0.929), lssvmgenetic (0.309, 0.403, 0.088, 0.522, 0.437, 0.377, 0.378, 0.930), elm (0.322, 0.400, 0.093, 0.523, 0.448, 0.373, 0.936), and kriging (0.338, 0.461, 0.176, 0.612, 0.405, 0.297, 0.307, 1.003).

Framework of Physical Design Simulation
In this research, physical design modeling is separated into two stages. In Stage 1, the Surrogate Modeling (SUMO) tool runs physical design thousands of times to collect enough inputs and outputs after Global Route (GR); during this process, SUMO builds a model for each output so that GR results can be predicted in the future. In Stage 2, thousands of GR results and Detailed Route (DR) results are used, respectively, as the inputs and outputs of machine learning models.

Machine Learning for Detailed Route
Linear regression, neural network, and decision tree models are used in this stage. Model accuracies for the different outputs:
Power: linear regression, 99% accuracy.
Area: linear regression, 99% accuracy.
Hold slack: neural network, 93% accuracy; decision tree (recast as a classification problem), 99% accuracy.
Number of DRC violations: decision tree (recast as a classification problem), 82.82% accuracy.

Inputs and Outputs of Model
We selected six important input features and four circuit performance outputs. Ten global route results are also chosen as inputs, because they are related to the four circuit performances.

Conclusion and Future Work
Achieving a high-quality physical design is difficult and time consuming: there are many input "knobs" to set, and evaluating each knob setting requires 40 minutes. The key to success was picking the correct model for each step.
Future work: improve the machine learning algorithms to get better performance, and build a classifier so that a separate model does not have to be built for each design.
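
For reference, the RRSE metric used in the Surrogate Modeling for Global Route section can be computed as in the minimal sketch below. This is illustrative NumPy code, not code from the original flow, and the example numbers are arbitrary.

import numpy as np

def rrse(y_true, y_pred):
    """Root relative squared error: the model's squared error normalized by
    the squared error of a simple predictor that always outputs the mean of
    the observed values. Values below 1 beat the mean predictor; the target
    here is RRSE < 0.5."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    num = np.sum((y_pred - y_true) ** 2)         # model squared error
    den = np.sum((y_true.mean() - y_true) ** 2)  # mean-predictor squared error
    return np.sqrt(num / den)

# Arbitrary example: predictions close to the truth give a small RRSE.
print(rrse([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))  # ~0.14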
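
To make the Stage 1 loop in the Framework of Physical Design Simulation section concrete, here is a rough sketch of the sample-then-fit procedure. The knob names, the synthetic stand-in for the global-route run, the run count, and the use of scikit-learn's MLPRegressor in place of the SUMO model builders are all assumptions made for illustration only.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for one pass of the P&R flow up to Global Route.
# The real flow takes ~40 minutes per run and is driven by SUMO; a fake
# function is used here only so the sketch executes end to end.
def run_flow_to_global_route(knobs):
    util, aspect, fanout = knobs
    return {
        "TNS": -50.0 * util + rng.normal(0, 1),
        "WNS": -0.5 * util * aspect + rng.normal(0, 0.01),
        "power_trial": 2.0 * util + 0.01 * fanout + rng.normal(0, 0.05),
    }

# Stage 1: sample knob settings, run the flow, and fit one surrogate per GR
# output (200 synthetic runs here; the poster uses thousands of real runs).
X = np.column_stack([rng.uniform(0.5, 0.9, 200),    # placeholder knob: utilization
                     rng.uniform(0.8, 1.2, 200),    # placeholder knob: aspect ratio
                     rng.integers(16, 64, 200)])    # placeholder knob: max fanout
gr_runs = [run_flow_to_global_route(x) for x in X]

surrogates = {}
for name in gr_runs[0]:                              # "TNS", "WNS", "power_trial"
    y = np.array([run[name] for run in gr_runs])
    surrogates[name] = MLPRegressor(hidden_layer_sizes=(32, 32),
                                    max_iter=3000, random_state=0).fit(X, y)

# The surrogate can now predict GR results for new knob settings without
# paying the cost of a real run.
print(surrogates["TNS"].predict([[0.7, 1.0, 32]]))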
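
The Stage 2 pairing of model types with outputs (Machine Learning for Detailed Route section) can be sketched with scikit-learn as below. The synthetic data, the ten-feature layout, and the DRC-count class buckets are placeholders, not values from the poster.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

# Synthetic stand-in for the Stage 2 training set: each row holds global-route
# results (the model inputs); each target is a detailed-route outcome.
X_gr = rng.normal(size=(500, 10))                    # 10 GR results per run
power_dr = X_gr @ rng.normal(size=10) + rng.normal(0, 0.1, 500)
area_dr = X_gr @ rng.normal(size=10) + rng.normal(0, 0.1, 500)
hold_slack_dr = np.tanh(X_gr[:, 0]) + rng.normal(0, 0.05, 500)
drc_count = rng.integers(0, 200, 500)

# Power and area after detailed route: linear regression (99% accuracy on the poster).
power_model = LinearRegression().fit(X_gr, power_dr)
area_model = LinearRegression().fit(X_gr, area_dr)

# Hold slack: a small neural network (93% accuracy on the poster).
hold_model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                          random_state=0).fit(X_gr, hold_slack_dr)

# DRC count recast as a classification problem by bucketing the count
# (the poster does the same for hold slack); a decision tree predicts the bucket.
drc_class = np.digitize(drc_count, bins=[10, 50, 100])   # placeholder buckets
drc_model = DecisionTreeClassifier(max_depth=5).fit(X_gr, drc_class)

print(power_model.predict(X_gr[:1]), drc_model.predict(X_gr[:1]))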