CSE 326: Data Structures Lecture #19 Approaches to Graph Exploration Bart Niswonger Summer Quarter 2001.


Today’s Outline Stuff Bart didn’t finish Friday Algorithm Design –Divide & Conquer –Greedy –Dynamic Programming –Randomized –Backtracking

Divide & Conquer Divide problem into multiple smaller parts Solve smaller parts –Solve base cases directly –Otherwise, solve subproblems recursively Merge solutions together (Conquer!) Often leads to elegant and simple recursive implementations.
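As a concrete sketch of this pattern (my addition, not from the slides; assumes a sorted vector), binary search divides the range in half, solves the empty-range base case directly, and recurses into one half:

```cpp
#include <vector>

// Divide & conquer: split the sorted range in half, solve the base case
// (empty range) directly, otherwise recurse into the half that could
// contain the target. Returns the index of target, or -1 if absent.
int binarySearch(const std::vector<int>& v, int target, int lo, int hi) {
    if (lo > hi) return -1;                    // base case: empty range
    int mid = lo + (hi - lo) / 2;
    if (v[mid] == target) return mid;
    if (v[mid] < target)                       // conquer the right half
        return binarySearch(v, target, mid + 1, hi);
    return binarySearch(v, target, lo, mid - 1);  // or the left half
}

int binarySearch(const std::vector<int>& v, int target) {
    return binarySearch(v, target, 0, (int)v.size() - 1);
}
```

Here the "merge" step is trivial, only one subproblem survives; mergesort is the case where combining the two solved halves does real work.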

Divide & Conquer in Action Binary Search Mergesort Quicksort buildHeap buildTree Closest points

Closest Points Problem Given: –a group of points {(x1, y1), …, (xn, yn)} Return the distance between the closest pair of points
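For reference, a brute-force baseline (my own sketch, not the slides') compares all O(n²) pairs; the divide & conquer algorithm on the next slides improves on this:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <utility>
#include <vector>

// Brute force for the closest-points problem: compare every pair and
// keep the smallest Euclidean distance. O(n^2) comparisons.
double closestPairDistance(const std::vector<std::pair<double, double>>& pts) {
    double best = std::numeric_limits<double>::infinity();
    for (size_t i = 0; i < pts.size(); ++i)
        for (size_t j = i + 1; j < pts.size(); ++j) {
            double dx = pts[i].first - pts[j].first;
            double dy = pts[i].second - pts[j].second;
            best = std::min(best, std::sqrt(dx * dx + dy * dy));
        }
    return best;
}
```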

Closest Points Algorithm Closest pair is: –closest pair on left or –closest pair on right or –closest pair spanning the middle runtime:

Closest Points Algorithm Closest pair is: –closest pair on left or –closest pair on right or –closest pair in middle strip, within δ of each other horizontally and vertically, where δ = min(dL, dR) runtime:

Greedy Algorithms Repeat until problem is solved: –Measure options according to marginal value –Commit to maximum Greedy algorithms are normally fast and simple. Sometimes appropriate as a heuristic solution or to approximate the optimal solution.
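A minimal sketch of this "measure options, commit to the maximum" loop (illustrative; the coin system and the minCoins name are my own, not the slides'): making change with US coins by always committing to the largest coin that fits. Greedy happens to be optimal for this coin system, though not for arbitrary denominations:

```cpp
#include <vector>

// Greedy change-making: repeatedly commit to the largest coin value
// that still fits, until no cents remain. Returns the number of coins.
int minCoins(int cents) {
    const std::vector<int> coins = {25, 10, 5, 1};  // US denominations
    int count = 0;
    for (int c : coins)
        while (cents >= c) {  // commit to this coin as long as it fits
            cents -= c;
            ++count;
        }
    return count;
}
```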

Hill-Climbing Global Maximum Local Maximum

Greed in Action Kruskal’s Algorithm Dijkstra’s Algorithm Prim’s Algorithm Huffman Encodings Scheduling Best First Search A * Search

Huge Graphs Consider some really huge graphs… –All cities and towns in the World Atlas –All stars in the Galaxy –All ways 10 blocks can be stacked Huh???

Implicitly Generated Graphs A huge graph may be implicitly specified by rules for generating it on-the-fly Blocks world: –vertex = relative positions of all blocks –edge = robot arm could stack block a on block b stack(blue,red) stack(green,red) stack(green,blue)

Blocks World source: initial state of the blocks goal: desired state of the blocks path from source to goal = sequence of actions (program) for robot arm n blocks ⇒ n^n states 10 blocks ⇒ 10 billion states

Problem: Branching Factor Dijkstra’s algorithm is basically breadth-first search (modulo arc weights) –Visits all nodes (exhaustive search) Suppose we know that the goal is only d steps away. If the out-degree of each node is 10, it potentially visits 10^d vertices –10 step plan => 10 billion vertices! Cannot search such huge graphs exhaustively!

An Easier Case Suppose you live in Manhattan; what do you do? [Figure: street grid from 50th to 52nd St and from 2nd to 10th Ave, with start S and goal G]

Best-First Search The Manhattan distance (Δx + Δy) is an estimate of the distance to the goal –a heuristic value heuristic: involving or serving as an aid to learning, discovery, or problem-solving by experimental and especially trial-and-error methods (Merriam-Webster) Best-First Search –Order nodes in priority to minimize estimated distance to the goal Compare: Dijkstra –Order nodes in priority to minimize distance from the start
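A sketch of best-first search on an open grid (my own code, not the lecture's; assumes 4-way moves and no obstacles, and bestFirstSteps is a made-up name): the priority queue is ordered purely by the Manhattan heuristic h(n), unlike Dijkstra, which orders by distance from the start:

```cpp
#include <cstdlib>
#include <functional>
#include <queue>
#include <set>
#include <tuple>
#include <utility>
#include <vector>

// Best-first search: always expand the node with the smallest estimated
// distance to the goal. Returns the number of steps taken when the goal
// is reached (no optimality guarantee in general).
int bestFirstSteps(std::pair<int, int> start, std::pair<int, int> goal) {
    auto h = [&](int x, int y) {  // Manhattan-distance heuristic
        return std::abs(x - goal.first) + std::abs(y - goal.second);
    };
    using Entry = std::tuple<int, int, int, int>;  // (h, x, y, steps so far)
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> pq;
    std::set<std::pair<int, int>> visited;
    pq.push({h(start.first, start.second), start.first, start.second, 0});
    while (!pq.empty()) {
        auto [hv, x, y, steps] = pq.top();
        pq.pop();
        if (std::make_pair(x, y) == goal) return steps;
        if (!visited.insert({x, y}).second) continue;  // already expanded
        const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
        for (int d = 0; d < 4; ++d)
            pq.push({h(x + dx[d], y + dy[d]), x + dx[d], y + dy[d], steps + 1});
    }
    return -1;  // unreachable on an open grid
}
```

On an obstacle-free grid the search marches straight toward the goal; the slides' point is that obstacles (or a misleading heuristic) can send it down a wrong path first.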

Best First in Action Suppose you live in Manhattan; what do you do? [Figure: the same street grid, with best-first search moving from S toward G]

Being Misled Will get back on track, but not quickly enough [Figure: the street grid showing three routes: Best First – mistaken path, Best First – correct path, and Dijkstra]

Optimality Does Best-First Search find the shortest path –when the goal is first seen? –when the goal is removed from priority queue?

Sub-Optimal Solution Goal is by definition at distance 0: will be removed from priority queue immediately, even if a shorter path exists! [Figure: grid with start S and goal G, 5 blocks apart]

Synergy? Dijkstra / Breadth First guaranteed to find optimal solution Best First often visits far fewer vertices, but may not provide optimal solution –Can we get the best of both?

A* A* - Order vertices in priority queue to minimize (distance from start) + (estimated distance to goal) f(n) = g(n) + h(n) f(n) = priority of a node g(n) = true distance from start h(n) = heuristic distance to goal
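A sketch of A* on a small grid (my own code, not the lecture's; assumes 4-way moves, unit edge costs, and the Manhattan heuristic, which is an admissible lower bound here):

```cpp
#include <cstdlib>
#include <functional>
#include <queue>
#include <set>
#include <tuple>
#include <utility>
#include <vector>

// A*: priority f(n) = g(n) + h(n), with g the true distance from the
// start and h the Manhattan-distance heuristic. grid[y][x] == 1 marks a
// wall. Returns the shortest path length, or -1 if the goal is unreachable.
int aStar(const std::vector<std::vector<int>>& grid,
          std::pair<int, int> start, std::pair<int, int> goal) {
    int H = grid.size(), W = grid[0].size();
    auto h = [&](int x, int y) {
        return std::abs(x - goal.first) + std::abs(y - goal.second);
    };
    using Entry = std::tuple<int, int, int, int>;  // (f, g, x, y)
    std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> pq;
    std::set<std::pair<int, int>> done;
    pq.push({h(start.first, start.second), 0, start.first, start.second});
    while (!pq.empty()) {
        auto [f, g, x, y] = pq.top();
        pq.pop();
        if (std::make_pair(x, y) == goal) return g;  // admissible h => optimal
        if (!done.insert({x, y}).second) continue;
        const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
        for (int d = 0; d < 4; ++d) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx < 0 || ny < 0 || nx >= W || ny >= H || grid[ny][nx]) continue;
            pq.push({g + 1 + h(nx, ny), g + 1, nx, ny});
        }
    }
    return -1;
}
```

With h ≡ 0 this degenerates into Dijkstra; with g dropped it becomes the best-first search above.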

Optimality Suppose the estimated distance (h) is ≤ the true distance to the goal –(heuristic is a lower bound) Then: when the goal is removed from the priority queue, we are guaranteed to have found a shortest path!

Optimality Revisited [Figure: the street grid, with two frontier nodes labeled 5+2=6 and 1+4=5; Dijkstra would have visited these guys!]

Revised Cloud Proof Suppose we have found a path of cost c to G which is not optimal –priority(G) = f(G) = g(G) + h(G) = c + 0 = c Say N is the last vertex on an optimal path P to G which has been added to the queue but not yet dequeued. –There must be such an N, otherwise the optimal path would have been found. –priority(N) = f(N) = g(N) + h(N) ≤ g(N) + actual cost from N to G = cost of path P < c So N will be dequeued before G is dequeued Repeat argument to show entire optimal path will be expanded before G is dequeued. [Figure: start S, vertex N on the optimal path, and goal G reached by the non-optimal cost-c path]

A Little History A* invented by Nils Nilsson & colleagues in 1968 –or maybe some guy in Operations Research? Cornerstone of artificial intelligence –still a hot research topic! –iterative deepening A*, automatically generating heuristic functions, … Method of choice for searching large (even infinite) graphs when a good heuristic function can be found

What About Those Blocks? “Distance to goal” is not always physical distance Blocks world: –distance = number of stacks to perform –heuristic lower bound = number of blocks out of place # out of place = 2, true distance to goal = 3

Other Examples Simplifying Integrals –vertex = formula –goal = closed form formula without integrals –arcs = mathematical transformations –heuristic = number of integrals remaining in formula

DNA Sequencing Problem: given chopped up DNA, reassemble Vertex = set of pieces Arc = stick two pieces together Goal = only one piece left Heuristic = number of pieces remaining - 1

Solving Simultaneous Equations Input: set of equations Vertex = assignment of values to some of the variables Edge = Assign a value to one more variable Goal = Assignment that simultaneously satisfies all the equations Heuristic = Number of equations not yet satisfied

What ISN’T A*? essentially, nothing.

Greedy Summary Greedy algorithms are not always optimal –Some greedy algorithms give provably optimal solutions (Dijkstra) –Others do not Notion of minimizing some function –Dijkstra – minimizes distance from start –Best First – minimizes distance to finish –Kruskal – minimizes edge costs –Hill Climbing – minimizes distance to the sky

Dynamic Programming (Memoizing) Define problem in terms of smaller subproblems Solve and record solution for base cases Build solutions for subproblems up from solutions to smaller subproblems Can improve runtime of divide & conquer algorithms that have shared subproblems with optimal substructure. Usually involves a table of subproblem solutions.

Dynamic Programming in Action Sequence Alignment Optimal Binary Search Tree All pairs shortest path Many, many optimization problems –Databases: finding the optimal way to answer a query –Workflow: the optimal order of operations to construct some complex object Fibonacci numbers
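As one concrete instance, a sketch of the sequence-alignment idea via edit distance (my own example, not the slides'; the editDistance name is hypothetical): solutions to subproblems, prefixes of the two strings, are recorded in a table and built up:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Dynamic programming: table[i][j] = minimum number of edits (insert,
// delete, substitute) to turn the first i chars of a into the first j
// chars of b, built up from smaller subproblems.
int editDistance(const std::string& a, const std::string& b) {
    std::vector<std::vector<int>> table(a.size() + 1,
                                        std::vector<int>(b.size() + 1));
    for (size_t i = 0; i <= a.size(); ++i) table[i][0] = i;  // delete prefix of a
    for (size_t j = 0; j <= b.size(); ++j) table[0][j] = j;  // insert prefix of b
    for (size_t i = 1; i <= a.size(); ++i)
        for (size_t j = 1; j <= b.size(); ++j) {
            int change = table[i - 1][j - 1] + (a[i - 1] != b[j - 1]);  // match/substitute
            int remove = table[i - 1][j] + 1;                          // delete from a
            int insert = table[i][j - 1] + 1;                          // insert from b
            table[i][j] = std::min(change, std::min(remove, insert));
        }
    return table[a.size()][b.size()];
}
```

The table is exactly the "record solutions to subproblems" idea from the previous slide; each cell is computed once, giving O(|a|·|b|) time.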

Fibonacci Numbers F(n) = F(n - 1) + F(n - 2) F(0) = 1 F(1) = 1 int fib(int n) { if (n <= 1) return 1; else return fib(n - 1) + fib(n - 2); } runtime:

Fibonacci Numbers Observation: every Fibonacci number depends on the previous two int fib(int n) { static vector&lt;int&gt; fibs; if (n &lt;= 1) return 1; if ((int)fibs.size() &lt;= n) fibs.resize(n + 1, 0); if (fibs[n] == 0) fibs[n] = fib(n - 1) + fib(n - 2); return fibs[n]; } recurrence: runtime:
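The same table can also be filled bottom-up instead of recursively (a sketch, keeping the slides' convention F(0) = F(1) = 1; the fibIter name is my own):

```cpp
#include <vector>

// Bottom-up dynamic programming: fill the table from the base cases
// upward, so every subproblem is solved exactly once. O(n) time.
int fibIter(int n) {
    if (n <= 1) return 1;
    std::vector<int> table(n + 1);
    table[0] = table[1] = 1;           // base cases
    for (int i = 2; i <= n; ++i)
        table[i] = table[i - 1] + table[i - 2];
    return table[n];
}
```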

To Do Project IV –Write an algorithm over your Graph Data Structure! Finish reading Chapter 10 Come to the movie Friday

Coming Up Other Data Structures No Quiz tomorrow Movie!! (& pizza)