CSE 326: Data Structures Lecture #24 The Algorhythmics


CSE 326: Data Structures Lecture #24 The Algorhythmics Steve Wolfman Winter Quarter 2000 Today we’ll survey the major algorithm design techniques: greedy, divide & conquer, dynamic programming, randomized algorithms, and backtracking. Along the way we’ll see how algorithms we’ve already studied, such as Dijkstra’s and Kruskal’s, fit into these categories and how they make good use of data structures.

Today’s Outline Greedy Divide & Conquer Dynamic Programming Randomized Backtracking Since I’m getting filmed today, I need you to sign a release. The form just says that you don’t mind being filmed. Then we’ll work through each of these design techniques in turn; it’s important to understand what these terms mean!

Greedy Algorithms Repeat until problem is solved: Measure options according to marginal value Commit to maximum Greedy algorithms are normally fast and simple. Sometimes appropriate as a heuristic solution or to approximate the optimal solution.

Hill-Climbing [diagram: a search landscape with a local maximum, where greedy hill-climbing can get stuck, and the global maximum]

Greed in Action Best First Search A* Search Huffman Encodings Kruskal’s Algorithm Dijkstra’s Algorithm Prim’s Algorithm Scheduling

Scheduling Problem Given: a group of tasks {T1, …, Tn}, each with a duration {d1, …, dn}, and a single processor without interrupts. Select an order for the tasks that minimizes average completion time. Example durations: T1 = 1, T2 = 1.5, T3 = 0.5, T4 = 0.3, T5 = 2, T6 = 1.4. Average time to completion:

Greedy Solution
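
The greedy rule here is shortest job first. Below is a minimal C++ sketch of that rule, assuming the durations from the problem slide; the function name shortestJobFirst is my own, not from the lecture.

#include <algorithm>
#include <vector>
using namespace std;

// Greedy scheduling: always run the shortest remaining task next.
// Returns the average completion time of the resulting schedule.
double shortestJobFirst(vector<double> durations) {
  if (durations.empty()) return 0.0;
  sort(durations.begin(), durations.end());    // commit to the smallest option at each step
  double elapsed = 0.0, totalCompletion = 0.0;
  for (double d : durations) {
    elapsed += d;                              // this task finishes at time 'elapsed'
    totalCompletion += elapsed;
  }
  return totalCompletion / durations.size();
}

// Example with the durations from the problem slide:
//   shortestJobFirst({1, 1.5, 0.5, 0.3, 2, 1.4});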

Proof of Correctness A common technique for proving a greedy algorithm correct is proof by contradiction: assume there is a better, non-greedy solution; show that making that solution more like the greedy one only improves it: contradiction! Proof sketch for greedy scheduling (shortest job first): if a schedule runs a longer task immediately before a shorter one, swapping the two lowers the average completion time, so an optimal schedule must run tasks in nondecreasing order of duration, which is exactly the greedy schedule.

Divide & Conquer Divide problem into multiple smaller parts Solve smaller parts Solve base cases directly Otherwise, solve subproblems recursively Merge solutions together (Conquer!) Often leads to elegant and simple recursive implementations.

Divide & Conquer in Action Mergesort Quicksort buildHeap buildTree Closest points
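
As a concrete instance of the divide / conquer / merge pattern, here is a minimal mergesort sketch in C++ (a sketch of my own, not code from the lecture):

#include <algorithm>
#include <vector>
using namespace std;

// Merge two sorted halves a[lo..mid) and a[mid..hi) into one sorted range.
void mergeHalves(vector<int>& a, int lo, int mid, int hi) {
  vector<int> merged;
  merged.reserve(hi - lo);
  int i = lo, j = mid;
  while (i < mid && j < hi)
    merged.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
  while (i < mid) merged.push_back(a[i++]);
  while (j < hi)  merged.push_back(a[j++]);
  copy(merged.begin(), merged.end(), a.begin() + lo);
}

// Divide: split the range in half.  Conquer: sort each half recursively
// (base cases of 0 or 1 elements are solved directly).  Merge the results.
void mergesort(vector<int>& a, int lo, int hi) {
  if (hi - lo <= 1) return;
  int mid = lo + (hi - lo) / 2;
  mergesort(a, lo, mid);
  mergesort(a, mid, hi);
  mergeHalves(a, lo, mid, hi);
}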

Closest Points Problem Given: a group of points {(x1, y1), …, (xn, yn)}. Return the distance between the closest pair of points. [diagram: example point set; the closest pair is at distance 0.75]

Closest Points Algorithm Closest pair is: closest pair on left, or closest pair on right, or closest pair spanning the middle. Runtime: checking every pair that spans the middle can take O(n²) time, so T(n) = 2T(n/2) + O(n²) = O(n²).

Closest Points Algorithm Refined Closest pair is: closest pair on left, or closest pair on right, or closest pair in the middle strip, comparing only points within one δ of each other vertically (δ is the smaller of the two recursive answers). Runtime: with the points kept sorted by y, the strip work is O(n), so T(n) = 2T(n/2) + O(n) = O(n log n).
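
To make the divide and combine steps concrete, here is a C++ sketch of the closest-pair recursion (my own illustration; names like closestPair and closestRec are not from the lecture). For simplicity it re-sorts the strip at every level, which gives O(n log² n) time; the fully refined O(n log n) version keeps the points sorted by y throughout the recursion.

#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>
using namespace std;

struct Point { double x, y; };

double pointDist(const Point& a, const Point& b) {
  return hypot(a.x - b.x, a.y - b.y);
}

// pts must be sorted by x before the first call; [lo, hi) is the current range.
double closestRec(vector<Point>& pts, int lo, int hi) {
  if (hi - lo < 2) return numeric_limits<double>::infinity();
  if (hi - lo == 2) return pointDist(pts[lo], pts[lo + 1]);
  int mid = (lo + hi) / 2;
  double midX = pts[mid].x;
  // Closest pair is on the left, on the right, or spans the middle.
  double d = min(closestRec(pts, lo, mid), closestRec(pts, mid, hi));

  // Only points within d of the dividing line can span the middle; within that
  // strip, only points within d of each other vertically need to be compared.
  vector<Point> strip;
  for (int i = lo; i < hi; ++i)
    if (fabs(pts[i].x - midX) < d) strip.push_back(pts[i]);
  sort(strip.begin(), strip.end(),
       [](const Point& a, const Point& b) { return a.y < b.y; });
  for (size_t i = 0; i < strip.size(); ++i)
    for (size_t j = i + 1; j < strip.size() && strip[j].y - strip[i].y < d; ++j)
      d = min(d, pointDist(strip[i], strip[j]));
  return d;
}

double closestPair(vector<Point> pts) {
  sort(pts.begin(), pts.end(),
       [](const Point& a, const Point& b) { return a.x < b.x; });
  return closestRec(pts, 0, (int)pts.size());
}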

Memoizing / Dynamic Programming Define problem in terms of smaller subproblems Solve and record solution for base cases Build solutions for subproblems up from solutions to smaller subproblems Can improve runtime of divide & conquer algorithms that have shared subproblems with optimal substructure. Usually involves a table of subproblem solutions.

Dynamic Programming in Action Sequence Alignment Fibonacci numbers All pairs shortest path Optimal Binary Search Tree

Fibonacci Numbers F(n) = F(n - 1) + F(n - 2), F(0) = 1, F(1) = 1

“Divide” & Conquer:

int fib(int n) {
  if (n <= 1) return 1;
  else return fib(n - 1) + fib(n - 2);
}

Memoized:

#include <vector>
using std::vector;

int fib(int n) {
  static vector<int> fibs;                            // table of already-computed values
  if (n <= 1) return 1;
  if ((int)fibs.size() <= n) fibs.resize(n + 1, 0);   // grow the table on demand; 0 means "not yet computed"
  if (fibs[n] == 0)
    fibs[n] = fib(n - 1) + fib(n - 2);                // compute once, then reuse
  return fibs[n];
}
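
A bottom-up variant (my own sketch, not from the slide) fills the table from small n upward instead of recursing, which is the typical dynamic-programming formulation:

#include <vector>
using namespace std;

// Bottom-up dynamic programming: build solutions up from the base cases.
int fibDP(int n) {
  if (n <= 1) return 1;                       // base cases F(0) = F(1) = 1
  vector<int> table(n + 1, 1);                // table[0] = table[1] = 1
  for (int i = 2; i <= n; ++i)
    table[i] = table[i - 1] + table[i - 2];   // each entry uses only smaller subproblems
  return table[n];
}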

Optimal Binary Search Tree Problem Given: a set of words {w1, …, wn} and the probability of each word’s occurrence {p1, …, pn}. Produce a Binary Search Tree which includes all the words and has the lowest expected cost: Expected cost = Σi pi · (di + 1) (where di is the depth of word i in the tree, counting the root as depth 0)

The Optimal BST Optimal Substructure Shared Subproblems [diagram: example BSTs over sample words (And, Ever, Falcon, First, Meet, Millenium, to, Zaphod) illustrating optimal subtrees and shared subproblems] Can an optimal solution possibly have suboptimal subtrees?

Optimal BST Cost Let C(Left, Right) be the cost of the optimal subtree built from the words wLeft through wRight, with C(Left, Right) = 0 when Left > Right. Then C(Left, Right) = min over Left ≤ root ≤ Right of [ C(Left, root - 1) + C(root + 1, Right) ] + Σ(i = Left to Right) pi. Let’s maintain a 2-D table to store the values of C(i, j)

Optimal BST Algorithm [table: 2-D table of subproblem costs C(i, j) over the words a, am, and, egg, if, the, two; entries are filled in order of increasing range size, from single words up to the full range a…two]
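
Here is a minimal C++ sketch of the table-filling algorithm, using the recurrence above (my own illustration; the function name optimalBSTCost and the indexing scheme are assumptions, not from the lecture). It computes only the optimal cost; recording the chosen roots as well would let us rebuild the tree.

#include <algorithm>
#include <limits>
#include <vector>
using namespace std;

// Returns the minimum expected cost of a BST over words 0..n-1 with
// occurrence probabilities p[0..n-1], using
//   C(L, R) = min over L <= root <= R of [ C(L, root-1) + C(root+1, R) ] + sum(p[L..R]),
// with C(L, R) = 0 for empty ranges.  Subproblems are solved in order of
// increasing range size.
double optimalBSTCost(const vector<double>& p) {
  int n = (int)p.size();
  // cost[L][R + 1] holds C(L, R); entries with R < L stay 0 (empty range).
  vector<vector<double>> cost(n + 1, vector<double>(n + 1, 0.0));
  for (int len = 1; len <= n; ++len) {              // range size
    for (int L = 0; L + len <= n; ++L) {
      int R = L + len - 1;
      double weight = 0.0;                          // sum of p[L..R]
      for (int i = L; i <= R; ++i) weight += p[i];
      double best = numeric_limits<double>::infinity();
      for (int root = L; root <= R; ++root)         // try each word as the root
        best = min(best, cost[L][root] + cost[root + 1][R + 1]);
      cost[L][R + 1] = best + weight;
    }
  }
  return n > 0 ? cost[0][n] : 0.0;
}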

To Do Finish Project IV!!!!!! Read Chapter 10 (algorithmic techniques) Work on final review (remember you can and should work with others) That’s it. You should finish off project IV this weekend. Tell us if you’re having trouble! Read chapter 9 on graphs and start reading chapter 10 on algorithmic techniques. We’ll talk about chapter 10 Monday and Wednesday of next week. Finally, start working on that final review. It’s not homework, but it will help you get ready for the final. We’ll have the official final review next week.

Coming Up Course Wrap-up Project IV due (March 7th; that’s tomorrow!) Final Exam (March 13th)