Simplifying Dynamic Programming via Tabling
Hai-Feng Guo, University of Nebraska at Omaha, USA
Gopal Gupta, University of Texas at Dallas, USA

Tabled Logic Programming
A tabled logic programming system:
- terminates more often, by computing fixed points;
- avoids redundant computation, by memoing the computed answers;
- keeps the declarative and procedural semantics consistent for any definite logic program.
Tabled resolution schemes: OLDT, SLG, SLS, SLDT, DRA.
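
As a minimal sketch of the memoing effect (an illustration not taken from the slides), tabling the usual doubly recursive Fibonacci predicate turns its exponential number of recomputed calls into a linear number of tabled subgoals:

  :- table fib/2.

  % Each answer fib(N, F) is recorded in the table, so every subgoal
  % fib(N, F) is computed at most once.
  fib(0, 0).
  fib(1, 1).
  fib(N, F) :-
      N > 1,
      N1 is N - 1, N2 is N - 2,
      fib(N1, F1), fib(N2, F2),
      F is F1 + F2.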

Reachability

  :- table reach/2.
  reach(X, Y) :- reach(X, Z), arc(Z, Y).
  reach(X, Y) :- arc(X, Y).
  arc(a, b).  arc(b, a).  arc(b, c).

  :- table reach/3.
  reach(X, Y, E) :- reach(X, Z, E1), arc(Z, Y, E2), append(E1, E2, E).
  reach(X, Y, E) :- arc(X, Y, E).
  arc(a, b, [(a, b)]).  arc(b, a, [(b, a)]).  arc(b, c, [(b, c)]).

(Example graph with nodes a, b, c.)
Table space increases dramatically in the reach/3 version because of the extra argument that records the path.
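
For example (a sketch of the intended behaviour, not verbatim output of a particular tabling system; answer order may vary), the tabled reach/2 program terminates on this cyclic graph and tables each reachable pair exactly once, even though its first clause is left-recursive:

  ?- reach(a, Y).
  Y = b ;
  Y = a ;
  Y = c.

With reach/3, by contrast, the path argument makes otherwise identical answers distinct, which is exactly why the table grows.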

How are tabled answers collected?
- When an answer to a tabled call is generated, variant checking is used to determine whether it has already been tabled.
- Observation: to collect paths for the reachability problem, we need only one simple path for each pair of nodes. A second path between the same pair of nodes can be regarded as a variant answer.

Indexed / Non-indexed Arguments
- The arguments of each tabled predicate are divided into indexed and non-indexed ones.
- Only indexed arguments are used in variant checking when collecting tabled answers.
- Non-indexed arguments are not distinguished: answers that differ only in non-indexed arguments are treated as variants.

Mode Declaration for Tabled Predicates

  :- table_mode p(a1, ..., an).

p/n is a predicate name, n > 0; each ai has one of the following forms:
- + denotes an indexed argument, used for variant checking;
- - denotes a non-indexed argument, not used for variant checking;
- * denotes an argument that is always bound before a call to the tabled predicate p/n is invoked.

  :- table_mode reach(+, +, -).
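
Under this declaration only the two node arguments take part in variant checking, so (a hedged sketch of the intended behaviour) the path-collecting program from the Reachability slide tables a single witness path per reachable pair:

  :- table reach/3.
  :- table_mode reach(+, +, -).

  reach(X, Y, E) :- reach(X, Z, E1), arc(Z, Y, E2), append(E1, E2, E).
  reach(X, Y, E) :- arc(X, Y, E).

  arc(a, b, [(a, b)]).  arc(b, a, [(b, a)]).  arc(b, c, [(b, c)]).

  % ?- reach(a, Y, E).
  % One answer per reachable Y is tabled, e.g. Y = c, E = [(a, b), (b, c)];
  % a longer path to the same Y is discarded as a variant answer.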

Aggregate Declaration
- Associates a non-indexed argument of a tabled predicate with an optimum constraint, e.g. minimum or maximum.
- The argument modes also include:
  - 0 denotes that this argument is to be minimized;
  - 9 denotes that this argument is to be maximized.
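
As an illustration (a sketch with the invented predicate name sp/3 and invented edge costs, not taken from the slides), marking a cost argument with the aggregate mode 0 turns the reachability recursion into a shortest-path program: the table keeps only the minimum-cost answer for each pair of nodes.

  :- table sp/3.
  :- table_mode sp(+, +, 0).          % minimize the cost argument

  sp(X, Y, C) :- arc(X, Y, C).
  sp(X, Y, C) :- sp(X, Z, C1), arc(Z, Y, C2), C is C1 + C2.

  arc(a, b, 3).  arc(b, c, 2).  arc(a, c, 10).

  % ?- sp(a, c, C).
  % Expected to table only C = 5 (via b), replacing the direct arc's cost 10.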

Dynamic Programming
- Dynamic programming is typically used for solving optimization problems.
- A recursive strategy: the value of an optimal solution is recursively defined in terms of optimal solutions to sub-problems.

Dynamic Programming with Mode Declaration
- Optimization = Problem + Aggregation
- With a mode declaration, defining a general solution to the problem suffices.

Matrix-Chain Multiplication
[Slide compares the program without mode declaration with the program with mode declaration.]

Matrix-Chain Multiplication

  :- table scalar_cost/4.
  :- table_mode scalar_cost(+, 0, -, -).

  scalar_cost([P1, P2], 0, P1, P2).
  scalar_cost([P1, P2, P3 | Pr], V, P1, Pn) :-
      break([P1, P2, P3 | Pr], PL1, PL2, Pk),
      scalar_cost(PL1, V1, P1, Pk),
      scalar_cost(PL2, V2, Pk, Pn),
      V is V1 + V2 + P1 * Pk * Pn.
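
break/4 is not defined on the slide; a possible reading (an assumed helper, not the authors' code) splits the dimension list at every inner dimension Pk, so that both sub-chains describe at least one matrix:

  % Hypothetical helper: break(Ps, PL1, PL2, Pk) splits Ps into a prefix
  % PL1 ending at Pk and a suffix PL2 starting at Pk.
  break([P1, Pk | Rest], [P1, Pk], [Pk | Rest], Pk) :-
      Rest = [_ | _].                % the suffix keeps at least two dimensions
  break([P1 | Ps], [P1 | PL1], PL2, Pk) :-
      break(Ps, PL1, PL2, Pk).

  % Under this reading, for matrices of sizes 10x100, 100x5 and 5x50:
  % ?- scalar_cost([10, 100, 5, 50], V, _, _).
  % the minimum V = 7500 (computing (A1 x A2) first: 10*100*5 + 10*5*50)
  % is tabled, rather than 75000 for the other parenthesization.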

Running Time Comparison (Seconds)
[Two tables on the slide, without and with evidence construction: running times for the benchmarks matrix, lcs, obst, apsp, and knap, with and without mode declaration.]
The programs with mode declaration run 1.34 to 16.0 times faster than those without mode declaration.

Scalability
[Plots: scalability without evidence construction and with evidence construction.]

Running Space Comparison (Megabytes), Without Evidence Construction
[Table: space consumption for the benchmarks matrix, lcs, obst, apsp, and knap, with and without mode declaration.]
Without evidence construction, the programs with mode declaration consume 1.4 to 15.0 times less space than those without mode declaration.

Running Space Comparison (Megabytes), With Evidence Construction
[Table: space consumption for the benchmarks matrix, lcs, obst, apsp, and knap, with and without mode declaration.]
With evidence construction, space performance can be better or worse depending on the program:
- the programs without mode declaration explicitly generate all possible answers and then table the optimal one;
- the programs with mode declaration implicitly generate all possible answers, selectively tabling the better answers until the optimal one is found.
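
For comparison, a sketch of the "without mode" style for matrix-chain multiplication (invented names chain_cost/4 and best_cost/2, reusing the break/4 helper sketched earlier; min_list/2 as in SWI-Prolog's list library, which is an assumption about the host system): every parenthesization cost is tabled, and the minimum is selected explicitly afterwards.

  :- table chain_cost/4.

  % Same recursion as scalar_cost/4, but with no aggregate mode: every
  % distinct cost V becomes a separate tabled answer.
  chain_cost([P1, P2], 0, P1, P2).
  chain_cost([P1, P2, P3 | Pr], V, P1, Pn) :-
      break([P1, P2, P3 | Pr], PL1, PL2, Pk),
      chain_cost(PL1, V1, P1, Pk),
      chain_cost(PL2, V2, Pk, Pn),
      V is V1 + V2 + P1 * Pk * Pn.

  % Collect all tabled costs and pick the minimum explicitly.
  best_cost(PList, Min) :-
      findall(V, chain_cost(PList, V, _, _), Vs),
      min_list(Vs, Min).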

Conclusion
- A new mode declaration for tabled predicates is introduced to aggregate information dynamically recorded in the table.
- A tabled predicate can be regarded as a function in which the non-indexed arguments (outputs) are uniquely defined by the indexed arguments (inputs).
- The new mode declaration scheme, coupled with recursion, provides an elegant method for solving optimization problems.
- The efficiency of tabled resolution is improved, since only indexed arguments are involved in variant checking.