
Synthesis of First-Order Dynamic Programming Algorithms
Yewen (Evan) Pu, Rastislav Bodik, Saurabh Srivastava
University of California, Berkeley

Do you feel lucky? Bob has work to do; Bob hand-codes it; Bob lucks out because his problem is regular expressions, which Ken Thompson already solved for him. Other such domains are possible too, but compared to the total number of algorithms they are islands on a vast ocean. Let us build more islands! (natural transition) In the ocean of algorithms / most are solved by hand / but once in a while / we reach solid land / a group of algorithms we understand well / and a thing that compiles / from high-level specification / to low-level detail.

A new way to build an island? A conventional domain-specific compiler requires deep domain theories and takes a long time to implement. A constraint-based synthesizer: write a template for the desired algorithm and let a constraint solver fill in the template. Use the visual notation from the previous slide; draw the same island as before. This compares domain theories with SKETCH (a constraint-based synthesizer). You write a template of the solution, which the constraint-based synthesizer fills in so that it behaves like the spec; the template is just a syntactic description. Make templates, not theories. Flow to next slide: to illustrate that point, here is an example where we build a template for a single problem using SKETCH.

Make Templates not Theories Suppose we want to optimize x+x+x+x. With domain-specific rewrite rules, you need a lot of theory to perform the transformation. With a template in SKETCH [Solar-Lezama], you just complete a template:
spec(x): return x + x + x + x
sketch(x): return x << ??
Flow: but how does SKETCH find the correct completion? Program equivalence is established using bounded model checking.
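For intuition, here is a minimal Python sketch of that idea (not the actual SKETCH tool): enumerate values for the hole ?? and accept one that agrees with the spec on a bounded set of inputs. The names synthesize, max_hole, and bound are illustrative assumptions, not from the talk.

def spec(x):
    return x + x + x + x

def sketch(x, hole):
    # 'hole' stands in for SKETCH's ?? placeholder
    return x << hole

def synthesize(max_hole=8, bound=64):
    # Bounded checking: compare spec and sketch on a finite set of inputs.
    for hole in range(max_hole):
        if all(sketch(x, hole) == spec(x) for x in range(bound)):
            return hole
    return None

print(synthesize())  # prints 2, since x << 2 == 4 * x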

A Search for a Correct Program The synthesizer finds, in a space of candidate programs, a correct one (one that matches the specification). SKETCH uses constraint solving to implement the search. Show the space of all possible algorithms and finding one in it. Admit that we use bounded model checking (we check equivalence between the spec and the sketched algorithm over a finite number of inputs). This is justified because equivalence of two programs is undecidable in general, and given enough inputs the user can be reasonably confident in the final algorithm.

Research Question in This Talk Can we write a synthesizer for an entire problem domain using a single template (sketch)? Self-explanatory; a transition is probably not even needed.

The Challenge How do we write a template that covers all of our domain, yet induces constraints that can be solved efficiently? Transition to the next slide via the next slide's title.

Our Approach Define a general template that contains the entire domain. Optimize the search by reducing the search space. Make sure we say the general template CONTAINS the entire domain: it is not necessarily the domain exactly, but possibly larger; it is, however, easier to define than the domain itself.

Dynamic Programming Algorithms A well-defined domain, yet we have no "DSL compiler" for it (it is taught as an art). Difficulties: inventing sub-problems and inventing recurrences. We focus on a first-order sub-class, FORDP, which captures many O(n) DP algorithms. Transition in: let's try our approach on this nice class of algorithms.

An Easy Problem Fibonacci sequence: fib(n) = fib(n-1) + fib(n-2). Here we get lucky: the recurrence is given to us, and the sub-problems do indeed overlap. In general, though, we are not so lucky.
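As an aside not on the slides, the recurrence translates directly into a bottom-up dynamic program; a small Python rendering:

def fib(n):
    # Bottom-up DP: each step reuses the two previous (overlapping) sub-problems.
    if n < 2:
        return n
    prev2, prev1 = 0, 1
    for _ in range(2, n + 1):
        prev2, prev1 = prev1, prev1 + prev2
    return prev1

assert fib(10) == 55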

A Harder Problem Maximal Independent Sum (MIS). Input: an array of positive integers. Output: the maximum sum of a non-consecutive selection of its elements. For example, for A = [3, 5, 4] the answer is 7 (select 3 and 4). For this problem, you do not immediately know what the overlapping sub-problems or the recurrence are.

What does the user do? The user writes a brute-force specification:
mis(A):
  best = 0
  forall selections:
    if non_consec(selection):
      best = max(best, value(A[selection]))
  return best
Self-explanatory; a transition is probably not even needed.
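A runnable rendering of that specification in Python, assuming non_consec means that no two selected indices are adjacent (mis_spec and the inline check are my own naming, mirroring the pseudocode):

from itertools import combinations

def mis_spec(A):
    # Brute-force specification: try every subset of indices, keep only
    # those with no two adjacent indices, and take the best sum.
    best = 0
    n = len(A)
    for k in range(n + 1):
        for selection in combinations(range(n), k):
            non_consec = all(b - a > 1 for a, b in zip(selection, selection[1:]))
            if non_consec:
                best = max(best, sum(A[i] for i in selection))
    return best

assert mis_spec([3, 5, 4]) == 7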

What does the template do? It defines a general template that contains the entire domain. Self-explanatory; a transition is probably not even needed.

A General Template Covers every FORDP algorithm:
for_dpa(a):
  p1 = array(); p2 = array()
  p1[0] = init1(); p2[0] = init2()
  for i from 1 to n:
    p1[i] = update1(p1[i-1], p2[i-1], a[i])
    p2[i] = update2(p1[i-1], p2[i-1], a[i])
  return term(p1[n], p2[n])
Explain that the underlined functions are themselves general templates of what they can be: init covers all possible ways of initializing the base case for the sub-problems, update covers all the ways to update the sub-problem values, and term wraps things up to yield a single output. Transition out: the update template is the crux, as it defines the recurrence relation for the DP.
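To make the template concrete, here is one possible completion for MIS (my own instantiation, using the recurrence discussed later in the talk), with p1 holding "best sum if the latest element is picked" and p2 holding "best sum if it is omitted":

def for_dpa_mis(a):
    # An illustrative completion of the FORDP template for MIS.
    n = len(a)
    p1 = [0] * (n + 1)  # best sum for a prefix whose last element is picked
    p2 = [0] * (n + 1)  # best sum for a prefix whose last element is omitted
    p1[0] = 0           # init1
    p2[0] = 0           # init2
    for i in range(1, n + 1):
        p1[i] = p2[i - 1] + a[i - 1]       # update1: pick element i, so element i-1 was omitted
        p2[i] = max(p1[i - 1], p2[i - 1])  # update2: omit element i, keep the better predecessor
    return max(p1[n], p2[n])               # term: best of the two final sub-problems

assert for_dpa_mis([3, 5, 4]) == 7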

General Template for update All possible compositions of user-provided operators. To approach the update function we use the same idea: first define a general space. Transition out: this space is too large to be solved efficiently; we'll now show how to prune it.
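For intuition about the size of that space (my own illustration, not the paper's encoding), think of every expression built from the sub-problem values, the current input, and the user's operators up to a bounded depth; even a tiny grammar grows quickly:

def candidate_updates(depth, leaves=("p1", "p2", "a[i]"), ops=("+", "max")):
    # Enumerate all expressions over the leaves using the binary operators,
    # up to the given nesting depth; a caricature of the update search space.
    exprs = set(leaves)
    for _ in range(depth):
        new = set(exprs)
        for op in ops:
            for l in exprs:
                for r in exprs:
                    new.add(f"max({l},{r})" if op == "max" else f"({l} {op} {r})")
        exprs = new
    return exprs

print(len(candidate_updates(1)))  # 21 candidates at depth 1
print(len(candidate_updates(2)))  # hundreds of candidates at depth 2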

Space Reduction: Optimality All FORDP recurrences have this syntactic form: take the sub-problems at iteration i-1 and extend them to candidate solutions at iteration i using the array input; here we extend each sub-problem in two possible ways. Mention what ext is, and that it's not too bad. Only a subset of all the candidates is used by the sub-problems; since we don't know in advance which candidates these are, we let the synthesizer discover a wiring that selects the correct candidates for each sub-problem. Remembering that sub-problems are produced from optimal previous sub-problems, we optimize over these candidates.

Space Reduction: Optimality Recurrence for MIS: for our example, MIS, the template resolves the recurrence as follows. There are two sub-problems, pick and omit, and we can extend both: if we were to pick at i, we see the candidate is only… If we were to omit at i, we see the candidates are…
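For reference, the standard MIS recurrence these notes describe (the one used in the instantiation sketched earlier) is:
pick[i] = omit[i-1] + A[i]
omit[i] = max(pick[i-1], omit[i-1])
mis(A) = max(pick[n], omit[n])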

Space Reduction: Symmetry Many operators are commutative, so the same update can be written in many syntactically different but equivalent ways; we pick a canonical representative syntactically (for example, keep max(a, b) and discard max(b, a)).

Space Reduction: Recap From all possible compositions down to the syntactically reduced space, which is about 2% of the original.

DEMO We need to explain why a naïve sketch generates too large a space for MIS.

Benchmarks Here are some synthesized recurrences.

Experiments Synthesizer solving time, in seconds (y axis); the shorter the better. Some configurations ran out of memory.

Conclusion It is possible to build a domain-specific synthesizer for FORDP. The synthesizer developer only needs to find a syntactic structure of the domain. The lessons learned in building this synthesizer may be general; if so, we can build more islands with constraint-based synthesis.