COSC 3101N (J. Elder)
Announcements
–Midterms are marked
–Assignment 2: still analyzing

Loop Invariants (Revisited)
Question 5(a): Design an iterative algorithm for Parity. Loop invariant?
LI: After i iterations are performed, p = Parity(s[1…i]).
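A minimal Python sketch of an iterative Parity that maintains exactly this loop invariant (the function name and the bit-string input format are assumptions for illustration):

```python
def parity(s):
    """Return 1 if the bit string s contains an odd number of 1s, else 0."""
    p = 0
    # Loop invariant: after i iterations, p == Parity(s[1..i]).
    for bit in s:
        if bit == '1':
            p ^= 1  # toggling p preserves the invariant for the extended prefix
    return p

print(parity("1011"))  # prints 1 (three 1s, so odd parity)
```

When the loop exits, i = n, so the invariant gives p = Parity(s[1…n]), which is exactly the postcondition.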

What is a loop invariant?
An assertion about the state of one or more variables used in the loop. When the exit condition is met, the LI leads naturally to the postcondition (the goal of the algorithm). Thus the LI must be a statement about the variables that store the results of the computation.

Know What an LI Is Not
The LI is NOT:
–code
–the steps taken by the algorithm
–a statement about the range of values assumed by the loop index

Dynamic Programming: Recurrence

Dynamic programming
Step 1: Describe an array of values you want to compute.
Step 2: Give a recurrence for computing later values from earlier ones (bottom-up).
Step 3: Give a high-level program.
Step 4: Show how to use values in the array to compute an optimal solution.

Example 1. Rock Climbing
At every step our climber can reach exactly three handholds: directly above, above and to the right, and above and to the left. A table of “danger ratings” is provided. The “danger” of a path is the sum of the danger ratings of all handholds on the path.

For every handhold there is only one optimal “path” rating: once we have reached a hold, we don’t need to know how we got there to move to the next level. This is called an “optimal substructure” property: once we know optimal solutions to the subproblems, we can compute an optimal solution to the problem itself.

Step 2. Define a Recurrence
Let C(i,j) represent the danger of hold (i,j), and let A(i,j) represent the cumulative danger of the safest path from the bottom to hold (i,j). Then
A(i,j) = C(i,j) + min{A(i-1,j-1), A(i-1,j), A(i-1,j+1)},
i.e., the safest path to hold (i,j) subsumes the safest path to the holds at level i-1 from which hold (i,j) can be reached.
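This recurrence fills one row of A per level, bottom-up. A minimal Python sketch (the function name and the sample danger table are illustrative assumptions, not from the slides):

```python
def safest_path_danger(C):
    """C[i][j] = danger rating of hold j at level i (0-indexed).
    Returns the minimum total danger of any bottom-to-top path."""
    n, m = len(C), len(C[0])
    A = [row[:] for row in C]          # base case: A[0][j] = C[0][j]
    for i in range(1, n):
        for j in range(m):
            # Holds reachable at level i-1: j-1, j, j+1 (clamped at the walls)
            best = min(A[i-1][k] for k in range(max(0, j-1), min(m, j+2)))
            A[i][j] = C[i][j] + best
    return min(A[-1])                  # safest path may end at any top hold

danger = [[2, 8, 9, 5],
          [4, 4, 6, 2],
          [5, 7, 5, 6]]
print(safest_path_danger(danger))  # prints 11
```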

Example 2. Activity Scheduling with Profits

Step 2. Provide a Recurrent Solution
With activities sorted by finish time, let P(i) be the optimal profit obtainable from the first i activities. Then
P(i) = max{ P(i-1), p_i + P(H(i)) },
where P(i-1) is the result of deciding not to schedule activity i, p_i is the profit from scheduling activity i, and P(H(i)) is the optimal profit from scheduling activities that end before activity i begins.
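A minimal Python sketch of this recurrence (weighted interval scheduling; the function name and the sample data are assumptions for illustration):

```python
import bisect

def max_profit(activities):
    """activities: list of (start, finish, profit) tuples.
    Returns the maximum total profit of a set of non-overlapping activities."""
    acts = sorted(activities, key=lambda a: a[1])   # sort by finish time
    finishes = [f for _, f, _ in acts]
    P = [0] * (len(acts) + 1)                       # P[i]: best over first i
    for i, (s, f, p) in enumerate(acts, 1):
        # H(i) = number of activities finishing by time s (compatible prefix)
        h = bisect.bisect_right(finishes, s)
        P[i] = max(P[i-1],          # decide not to schedule activity i
                   p + P[h])        # schedule it, plus best compatible prefix
    return P[-1]

print(max_profit([(1, 4, 5), (3, 5, 1), (0, 6, 8), (4, 7, 4), (5, 9, 6)]))  # prints 11
```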

Example 3: Scheduling Jobs with Deadlines, Profits and Durations

Dynamic Programming Solution

Step 2. Provide a Recurrent Solution
The recurrence again takes the better of two options: decide not to schedule job i, or take the profit from job i plus the optimal profit from scheduling jobs that end before job i begins.

Step 2 (cntd…). Proving the Recurrent Solution
We effectively schedule job i at the latest possible time. This leaves the largest and earliest contiguous block of time for scheduling jobs with earlier deadlines.
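One standard way to realize this as a table, consistent with the latest-possible-time observation (this concrete formulation is an assumption; the slides do not spell it out): sort jobs by deadline and let A[i][t] be the best profit from the first i jobs when all chosen work must finish by time t.

```python
def max_profit_jobs(jobs, T):
    """jobs: list of (duration, deadline, profit); T: time horizon.
    A[i][t] = best profit from the first i jobs (sorted by deadline)
    when all chosen jobs must finish by time t."""
    jobs = sorted(jobs, key=lambda j: j[1])          # sort by deadline
    n = len(jobs)
    A = [[0] * (T + 1) for _ in range(n + 1)]
    for i, (dur, dl, p) in enumerate(jobs, 1):
        for t in range(T + 1):
            A[i][t] = A[i-1][t]                      # decide not to schedule job i
            latest = min(t, dl) - dur                # start job i as late as possible
            if latest >= 0:
                A[i][t] = max(A[i][t], p + A[i-1][latest])
    return A[n][T]

print(max_profit_jobs([(2, 3, 10), (1, 2, 7), (2, 4, 8)], 4))  # prints 18
```

Here the optimum 18 schedules the (2, 3, 10) job on [0, 2] and the (2, 4, 8) job on [2, 4].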

[Figure: timelines illustrating the two cases for scheduling event i: Case 1 and Case 2.]

Example 4. Longest Common Subsequence

Optimal Substructure
Input: 2 sequences, X = x_1, …, x_m and Y = y_1, …, y_n.

Proof of Optimal Substructure Part 1

Proof of Optimal Substructure Part 2

Proof of Optimal Substructure Part 3

Step 2. Provide a Recurrent Solution
c[i,j] = 0, if i = 0 or j = 0 (an input sequence is empty);
c[i,j] = c[i-1,j-1] + 1, if x_i = y_j (last elements match: they must be part of an LCS);
c[i,j] = max(c[i-1,j], c[i,j-1]), if x_i ≠ y_j (last elements don’t match: at most one of them is part of an LCS).
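The standard bottom-up table for this recurrence, as a minimal Python sketch (the function name and test strings are assumptions):

```python
def lcs_length(X, Y):
    """Length of the longest common subsequence of X and Y,
    via the table c[i][j] = LCS length of X[:i] and Y[:j]."""
    m, n = len(X), len(Y)
    c = [[0] * (n + 1) for _ in range(m + 1)]   # row/col 0: empty sequence
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i-1] == Y[j-1]:                 # last elements match
                c[i][j] = c[i-1][j-1] + 1
            else:                                # at most one is in the LCS
                c[i][j] = max(c[i-1][j], c[i][j-1])
    return c[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # prints 4 (e.g. "BCBA")
```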

Example 5: Longest Increasing Subsequence
Input: 1 sequence, X = x_1, …, x_n. Output: the longest increasing subsequence of X. Note: a subsequence doesn’t have to be consecutive, but it has to be in order.

Step 1. Define an array of values to compute

Step 2. Provide a Recurrent Solution
A(i) = 1 + max{ A(j) : 1 ≤ j < i and x_j < x_i }, with the max taken to be 0 when no such j exists: the longest increasing subsequence ending at x_i extends the longest one ending at an earlier, smaller element.

Step 3. Provide an Algorithm
Running time? O(n^2)

function A = LIS(X)
for i = 1:length(X)
    m = 0;
    for j = 1:i-1
        if X(j) < X(i) & A(j) > m
            m = A(j);
        end
    end
    A(i) = m + 1;
end

Step 4. Compute Optimal Solution
Running time? O(n)

function lis = printLIS(X, A)
[m, mi] = max(A);
lis = printLISm(X, A, mi, 'LIS: ');
lis = [lis, sprintf('%d', X(mi))];

function lis = printLISm(X, A, mi, lis)
if A(mi) > 1
    i = mi - 1;
    while ~(X(i) < X(mi) & A(i) == A(mi)-1)
        i = i - 1;
    end
    lis = printLISm(X, A, i, lis);
    lis = [lis, sprintf('%d ', X(i))];
end

LIS Example
X = …
A = …
> printLIS(X, A)
> LIS: …

Example 6: Optimal Binary Search Trees

Expected Search Cost
Which BST is more efficient?

Observations

Optimal Substructure

Optimal Substructure (cntd…)

Recursive Solution

Recursive Solution (cntd…)

Step 2. Provide a Recurrent Solution
e[i,j] = min over roots r in {i, …, j} of { e[i,r-1] + e[r+1,j] + w(i,j) },
where e[i,r-1] is the expected cost of search for the left subtree, e[r+1,j] is the expected cost of search for the right subtree, and w(i,j) is the added cost when the subtrees are embedded under the root.
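A minimal Python sketch of the resulting table computation, using key probabilities only (the function name and sample probabilities are assumptions; half-open index convention differs from the slide's):

```python
def obst_cost(p):
    """p[i] = search probability of key i (keys given in sorted order).
    Returns the minimum expected search cost of a BST on these keys."""
    n = len(p)
    # e[i][j] = optimal cost for keys i..j-1; w[i][j] = their probability mass
    e = [[0.0] * (n + 1) for _ in range(n + 1)]
    w = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n):                           # single-key subtrees
        w[i][i+1] = p[i]
        e[i][i+1] = p[i]
    for length in range(2, n + 1):               # subtrees of increasing size
        for i in range(n - length + 1):
            j = i + length
            w[i][j] = w[i][j-1] + p[j-1]
            # embedding both subtrees one level deeper adds w[i][j] once
            e[i][j] = min(e[i][r] + e[r+1][j] for r in range(i, j)) + w[i][j]
    return e[0][n]

print(obst_cost([0.25, 0.2, 0.05, 0.2, 0.3]))
```

Three nested loops over (length, i, r) give the O(n^3) running time quoted on the next slide.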

Step 3. Provide an Algorithm
Running time? O(n^3): work on subtrees of increasing size l.

Example

Example (cntd…)

Step 4. Compute Optimal Solution
Running time? O(n)

Elements of Dynamic Programming
Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems.

Elements of Dynamic Programming
Cut and paste: prove optimal substructure by contradiction:
–Assume an optimal solution to a problem with a suboptimal solution to a subproblem.
–Cut out the suboptimal solution to the subproblem.
–Paste in the optimal solution to the subproblem.
–Show that this results in a better solution to the original problem.
–This contradicts our assertion that the original solution was optimal.

Elements of Dynamic Programming
Dynamic programming uses optimal substructure from the bottom up:
–First find optimal solutions to subproblems.
–Then choose which to use in an optimal solution to the problem.

Section V. Graph Algorithms

Directed and Undirected Graphs
(a) A directed graph G = (V, E), where V = {1,2,3,4,5,6} and E = {(1,2), (2,2), (2,4), (2,5), (4,1), (4,5), (5,4), (6,3)}. The edge (2,2) is a self-loop.
(b) An undirected graph G = (V, E), where V = {1,2,3,4,5,6} and E = {(1,2), (1,5), (2,5), (3,6)}. The vertex 4 is isolated.
(c) The subgraph of the graph in part (a) induced by the vertex set {1,2,3,6}.

Graph Isomorphism

Trees

Representations: Undirected Graphs
Adjacency List / Adjacency Matrix

Representations: Directed Graphs
Adjacency List / Adjacency Matrix
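Both representations can be sketched in a few lines of Python (helper names are assumptions; the sample edges are the undirected graph from the earlier slide, shifted to 0-based vertices):

```python
def to_adjacency_list(n, edges, directed=False):
    """Build an adjacency list for vertices 0..n-1."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        if not directed and u != v:   # undirected: store both directions
            adj[v].append(u)
    return adj

def to_adjacency_matrix(n, edges, directed=False):
    """Build an n x n 0/1 adjacency matrix."""
    M = [[0] * n for _ in range(n)]
    for u, v in edges:
        M[u][v] = 1
        if not directed:
            M[v][u] = 1
    return M

edges = [(0, 1), (0, 4), (1, 4), (2, 5)]   # slide's undirected graph, 0-based
print(to_adjacency_list(6, edges))
print(to_adjacency_matrix(6, edges))
```

The list uses O(V + E) space and suits sparse graphs; the matrix uses O(V^2) space but answers "is (u, v) an edge?" in O(1).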