Scalable Synthesis Brandon Lucia and Todd Mytkowicz Microsoft Research.


Synthesizing Circuits
- The inference problem is undecidable in general: these are hard problems to solve!
- Can we leverage existing work to make this scale?
- Goal: there exist parameters w such that for all inputs x, the circuit meets the specification: ∃w. ∀x. spec(x, f(x, w))
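The "there exist parameters w such that for all inputs x" condition can be made concrete with a toy example. This is a minimal sketch, not the talk's method: it exhaustively searches a small weight grid for a single threshold unit that meets an AND specification on all inputs. The names `unit` and `synthesize`, the grid, and the AND spec are all illustrative choices.

```python
from itertools import product

def unit(x, w):
    # Threshold unit: fires when the weighted sum meets the bias.
    w0, w1, b = w
    return int(w0 * x[0] + w1 * x[1] >= b)

def synthesize(spec, grid=(-1, 0, 1, 2)):
    # Return the first parameters w such that unit(x, w) == spec(x)
    # for ALL inputs x (the "exists w, forall x" condition).
    for w in product(grid, repeat=3):
        if all(unit(x, w) == spec(x) for x in product((0, 1), repeat=2)):
            return w
    return None

w = synthesize(lambda x: x[0] & x[1])  # spec given as logical AND
```

Exhaustive search obviously does not scale; it only illustrates the shape of the problem that the rest of the talk attacks with learning.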

Neural Networks

A truth table as a target function (here the output equals the second input):

  A B | out
  F F |  F
  F T |  T
  T F |  F
  T T |  T
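The truth table on this slide (output equals the second input) is linearly separable, so a single threshold neuron can implement it. A minimal sketch with hand-picked weights, which are illustrative and not from the slides:

```python
def neuron(a, b):
    # Fires iff 0*a + 1*b >= 0.5, i.e. the output equals input b.
    return int(0 * a + 1 * b >= 0.5)

# Enumerate the whole truth table computed by the neuron.
table = {(a, b): neuron(a, b) for a in (0, 1) for b in (0, 1)}
```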

Neural Networks

A two-layer neural network can approximate any continuous function! But this truth table is XOR, which no single neuron can compute:

  A B | out
  F F |  F
  F T |  T
  T F |  T
  T T |  F
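XOR is not linearly separable, so one threshold neuron cannot compute it, but a two-layer network can. A minimal sketch with hand-picked weights (illustrative, not from the slides): the hidden layer computes OR and AND, and the output neuron fires on "OR and not AND", which is XOR.

```python
def step(z):
    # Hard threshold activation.
    return int(z >= 0.5)

def xor_net(a, b):
    h1 = step(a + b)        # hidden unit 1: OR
    h2 = step(a + b - 1.0)  # hidden unit 2: AND (fires only when both are 1)
    return step(h1 - h2)    # output: OR and not AND == XOR
```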

Learning Abstractions with ML
- The first layer learns low-level features.
- Each subsequent layer learns higher-level features.
- Training is unsupervised, layer by layer.
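The layer-by-layer scheme can be sketched as a loop: train one layer on the current representation, freeze it, and feed its outputs forward as the next layer's training data. In this minimal sketch, `fit_layer` is a stand-in for real unsupervised training (e.g. an RBM or autoencoder update); here it merely fixes a random projection, so only the control flow is meaningful.

```python
import random

def fit_layer(data, width, rng):
    # Stand-in for unsupervised layer training: fix a random linear map
    # from the current feature dimension to `width` outputs.
    dim = len(data[0])
    w = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(width)]
    return lambda x: [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def greedy_pretrain(data, widths, seed=0):
    rng = random.Random(seed)
    layers = []
    for width in widths:
        layer = fit_layer(data, width, rng)  # train on current features
        layers.append(layer)                 # freeze this layer's weights
        data = [layer(x) for x in data]      # next layer sees its outputs
    return layers

layers = greedy_pretrain([[0.0, 1.0], [1.0, 0.0]], widths=[3, 2])
```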

Duality of Synthesis and ML
- Machine learning: the specification is implicit in input/output pairs.
- Synthesis: the specification is a vector of logical formulas.

Synthesizing Sudoku Recognizer

[Slide figure: a grid of cells (A0, A1, A2, B0, B1, B2, C0, C1, C2, …) and an output K0]

Constraints for cell A0 and digit 0:
- If A0 is 0, then no other cell in A0's column is 0, no other cell in A0's row is 0, and no other cell in A0's unit is 0.
- If A0 is not 0, then one cell in A0's column must be 0, one cell in A0's row must be 0, and one cell in A0's unit must be 0.
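Generalized over all cells and digits, the constraints on this slide say that in a valid grid each digit appears exactly once per row, column, and unit. A minimal sketch of that check, shown for a 4x4 Sudoku with 2x2 units for brevity (the function names and the example grid are illustrative):

```python
def groups(n=4, box=2):
    # All cell groups that must each contain every digit exactly once.
    rows = [[(r, c) for c in range(n)] for r in range(n)]
    cols = [[(r, c) for r in range(n)] for c in range(n)]
    units = [[(br + r, bc + c) for r in range(box) for c in range(box)]
             for br in range(0, n, box) for bc in range(0, n, box)]
    return rows + cols + units

def is_valid(grid):
    # Each digit 1..4 occurs exactly once in every row, column, and unit.
    return all(sorted(grid[r][c] for r, c in g) == [1, 2, 3, 4]
               for g in groups())

solved = [[1, 2, 3, 4],
          [3, 4, 1, 2],
          [2, 1, 4, 3],
          [4, 3, 2, 1]]
```

The point of the slide is that writing this specification out as logic for a 9x9 grid is large and repetitive, which is what the layer-wise learning on the next slide factorizes.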

Learning Abstractions in Sudoku
- The first layer learns local implications.
- Each subsequent level combines prior levels, learning a "factorization" of a potentially exponential specification!
- Training is unsupervised, layer by layer.

Future Directions & Questions
- Flesh out the duality into formal details.
- The duality goes both ways: can we help ML methods with our understanding of synthesis / formal verification?
- CEGIS: Program = Data Structure + Algorithm. Learn the structure (depth) and the algorithm (connectivity).
- Approximation / probabilistic need not mean incorrect, but it may help scale inference.