Optimization Problems Minimum Spanning Tree Behavioral Abstraction.

Optimization Problems

Optimization Algorithms Many real-world problems involve maximizing or minimizing a value: How can a car manufacturer get the most parts out of a piece of sheet metal? How can a moving company fit the most furniture into a truck of a certain size? How can the phone company route calls to get the best use of its lines and connections? How can a university schedule its classes to make the best use of classrooms without conflicts?

Optimal vs. Approximate Solutions Often, we can make a choice: Do we want the guaranteed optimal solution to the problem, even though it might take a lot of work/time to find? Do we want to spend less time/work and find an approximate solution (near optimal)?

Approximate Solutions

Greedy Algorithms
–A greedy algorithm makes a series of "short-sighted" decisions and does not "look ahead"
–Approximates the optimal solution; may or may not find it
–Provides "quick and dirty" estimates
–Spends less time

A Greedy Algorithm Consider the weight and value of some foreign coins:
–foo: $6.00, 500 grams
–bar: $4.00, 450 grams
–baz: $3.00, 410 grams
–qux: $0.50, 300 grams
If we can only fit 860 grams in our pockets...
A greedy algorithm would choose: 1 foo (500 grams, $6.00) + 1 qux (300 grams, $0.50), a total of $6.50
The optimal solution is: 1 bar (450 grams, $4.00) + 1 baz (410 grams, $3.00), a total of $7.00
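The slide's per-coin dollar amounts were partly lost in transcription; the sketch below reconstructs plausible values from the totals that survived ($6.50 greedy vs. $7.00 optimal, 860-gram limit) and contrasts the greedy pick with a brute-force optimum:

```python
from itertools import combinations

# Values/weights reconstructed from the slide's totals (coin names are the
# slide's own placeholder names): capacity 860 g, greedy $6.50, optimal $7.00.
coins = {"foo": (6.00, 500), "bar": (4.00, 450),
         "baz": (3.00, 410), "qux": (0.50, 300)}
CAPACITY = 860

def greedy_pick(coins, capacity):
    """Short-sighted: repeatedly take the most valuable coin that still fits."""
    chosen, weight = [], 0
    for name, (value, grams) in sorted(coins.items(),
                                       key=lambda kv: -kv[1][0]):
        if weight + grams <= capacity:
            chosen.append(name)
            weight += grams
    return chosen

def optimal_pick(coins, capacity):
    """Brute force: try every subset, keep the most valuable feasible one."""
    best, best_value = [], 0.0
    names = list(coins)
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            weight = sum(coins[n][1] for n in subset)
            value = sum(coins[n][0] for n in subset)
            if weight <= capacity and value > best_value:
                best, best_value = list(subset), value
    return best

print(greedy_pick(coins, CAPACITY))   # greedy grabs foo first, then qux
print(optimal_pick(coins, CAPACITY))  # bar + baz is worth more
```

The brute-force search is exponential in the number of items, which is exactly the time/quality trade-off the slides describe.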

Short-Sighted Decisions (diagram: a small weighted graph from Start to End, edge weights 1, 12, 1, 2)

The Shortest Path Problem Given a directed, acyclic, weighted graph… Start at some vertex A What is the shortest path from start vertex A to some end vertex B?

A Greedy Algorithm (animation: the greedy algorithm repeatedly takes the cheapest next edge from Start toward End) Greedy Path = 15 Actual Shortest Path = 13

Dynamic Planning (more commonly called dynamic programming)

Calculates all of the possible solution options, then chooses the best one. Implemented recursively. Produces an optimal solution. Spends more time.

Bellman’s Principle of Optimality Regardless of how you reach a particular state (graph node), the optimal strategy for reaching the goal state is always the same. This greatly simplifies the strategy for searching for an optimal solution.

The Shortest Path Problem Given a directed, acyclic, weighted graph What is the shortest path from the start vertex to some end vertex? Minimize the sum of the edge weights

Dynamic Planning (diagram: a weighted DAG with vertices a through g; Start = a, End = b)

Dynamic Planning b = min(6+g, 5+e, 7+f) Notation: ‘x’ means “shortest path to x”

Dynamic Planning g = min(6+d, 14) e = min(3+c, 7+d, 7+g) f = 2+c

Dynamic Planning c = min(5, 11+d) d = 3

b = min(6+g, 5+e, 7+f) g = min(6+d, 14) e = min(3+c, 7+d, 7+g) f = 2+c c = min(5, 11+d) d = 3 via “a to d”

b = min(6+g, 5+e, 7+f) g = min(6+3, 14) e = min(3+c, 7+3, 7+g) f = 2+c c = min(5, 11+3) d = 3 via “a to d”

b = min(6+g, 5+e, 7+f) g = min(9, 14) e = min(3+c, 10, 7+g) f = 2+c c = min(5, 14) d = 3 via “a to d”

b = min(6+g, 5+e, 7+f) g = 9 via “a to d to g” e = min(3+c, 10, 7+g) f = 2+c c = 5 via “a to c” d = 3 via “a to d”

b = min(6+9, 5+e, 7+f) g = 9 via “a to d to g” e = min(3+5, 10, 7+9) f = 2+5 c = 5 via “a to c” d = 3 via “a to d”

b = min(15, 5+e, 7+f) g = 9 via “a to d to g” e = min(8, 10, 16) f = 7 via “a to c to f” c = 5 via “a to c” d = 3 via “a to d”

b = min(15, 5+e, 7+f) g = 9 via “a to d to g” e = 8 via “a to c to e” f = 7 via “a to c to f” c = 5 via “a to c” d = 3 via “a to d”

b = min(15, 5+8, 7+7) g = 9 via “a to d to g” e = 8 via “a to c to e” f = 7 via “a to c to f” c = 5 via “a to c” d = 3 via “a to d”

b = min(15, 13, 14) g = 9 via “a to d to g” e = 8 via “a to c to e” f = 7 via “a to c to f” c = 5 via “a to c” d = 3 via “a to d”

b = 13 via “a to c to e to b” g = 9 via “a to d to g” e = 8 via “a to c to e” f = 7 via “a to c to f” c = 5 via “a to c” d = 3 via “a to d”

Dynamic Planning (diagram: the weighted DAG with the optimal path a to c to e to b highlighted) Shortest Path = 13
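The backward substitution above can be written as a short memoized recursion. The edge list below is read directly off the slides' recurrences (e.g. b = min(6+g, 5+e, 7+f) means b has incoming edges from g, e, f with weights 6, 5, 7):

```python
from functools import lru_cache

# Incoming edges of each vertex, taken from the slides' recurrences.
incoming = {
    "b": [("g", 6), ("e", 5), ("f", 7)],
    "g": [("d", 6), ("a", 14)],
    "e": [("c", 3), ("d", 7), ("g", 7)],
    "f": [("c", 2)],
    "c": [("a", 5), ("d", 11)],
    "d": [("a", 3)],
}

@lru_cache(maxsize=None)
def shortest(node):
    """Length of the shortest path from the start vertex 'a' to node.

    Bellman's principle: the best way to reach `node` only depends on the
    best ways to reach its predecessors, so each value is computed once.
    """
    if node == "a":
        return 0
    return min(w + shortest(prev) for prev, w in incoming[node])

for v in "dcfgeb":
    print(v, "=", shortest(v))
```

Running this reproduces the slides' values: d = 3, c = 5, f = 7, g = 9, e = 8, and finally b = 13.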

Summary Greedy algorithms –Make short-sighted, “best guess” decisions –Require less time/work –Provide approximate solutions Dynamic planning –Examines all possible solutions –Requires more time/work –Guarantees an optimal solution

Questions?

Minimum Spanning Tree Prim’s Algorithm Kruskal’s Algorithm

The Scenario Construct a telephone network… We’ve got to connect many cities together –Each city must be connected –We want to minimize the total cable used

Minimum Spanning Tree Required: Connect all nodes at minimum cost. Cost is sum of edge weights

Minimum Spanning Tree Required: Connect all nodes at minimum cost. Cost is sum of edge weights (diagram: three spanning trees of the same graph with costs S.T. = 3, S.T. = 5, S.T. = 4; the first one, M.S.T. = 3, is the minimum spanning tree)

Minimum Spanning Tree Required: Connect all nodes at minimum cost. Cost is sum of edge weights Can start at any node

Minimum Spanning Tree Required: Connect all nodes at minimum cost. Cost is sum of edge weights Can start at any node The solution cost is unique –But it can come from different sets of edges (diagram: three different spanning trees, each with total cost 4)

Minimum Spanning Tree Required: Connect all nodes at minimum cost. Cost is sum of edge weights Can start at any node Unique solution. –Can come from different sets of edges Pick any 2

Minimum Spanning Tree Required: Connect all nodes at minimum cost. Cost is sum of edge weights Can start at any node Unique solution. –Can come from different sets of edges Two algorithms –Prim’s –Kruskal’s

Prim’s Algorithm Start with any vertex; all its edges are candidates. Stop looking when all vertices are picked. Otherwise repeat… –Pick the minimum edge (no cycles) and connect the adjacent vertex –Add all edges from that new vertex to our “candidates”

Prim’s Algorithm
Pick any vertex and add it to “vertices” list
Loop
    Exitif (“vertices” contains all the vertices in graph)
    Select shortest, unmarked edge coming from “vertices” list
    If (shortest edge does NOT create a cycle) then
        Add that edge to your “edges”
        Add the adjoining vertex to the “vertices” list
    Endif
    Mark that edge as having been considered
Endloop

(animation: Prim’s algorithm run step by step; candidate edges that would form a cycle are rejected, and the loop continues until all vertices are connected)

Prim’s: MST = 66
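As a sketch, Prim's algorithm in Python using a priority queue of candidate edges. The graph here is a made-up example, since the slides' 66-cost graph is not recoverable from the transcript:

```python
import heapq

def prim_mst(graph, start):
    """Grow a tree from `start`: always take the cheapest candidate edge
    that reaches a vertex not yet in the tree (cycle-forming edges are
    skipped, mirroring the pseudocode's cycle check)."""
    visited = {start}
    candidates = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(candidates)
    total, edges = 0, []
    while candidates and len(visited) < len(graph):
        w, u, v = heapq.heappop(candidates)
        if v in visited:          # would create a cycle: skip this edge
            continue
        visited.add(v)
        total += w
        edges.append((u, v, w))
        for nxt, nw in graph[v]:  # add the new vertex's edges as candidates
            if nxt not in visited:
                heapq.heappush(candidates, (nw, v, nxt))
    return total, edges

# Hypothetical example graph (adjacency list of (neighbor, weight) pairs)
graph = {
    "A": [("B", 2), ("C", 3)],
    "B": [("A", 2), ("C", 1), ("D", 4)],
    "C": [("A", 3), ("B", 1), ("D", 5)],
    "D": [("B", 4), ("C", 5)],
}
total, edges = prim_mst(graph, "A")
print(total, edges)
```

Starting from any vertex gives the same total cost, as the slides note.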

An Alternate Algorithm: Kruskal’s Algorithm

Kruskal’s Algorithm
Sort edges in graph in increasing order
Select the shortest edge
Loop
    Exitif (all edges examined)
    Select next shortest edge (if no cycle created)
    Mark this edge as examined
Endloop
This guarantees an MST, but as it is built, edges do not have to be connected

Sort the Edges 2, 3, 4, 5, 6, 7, 8, 9, 9, 10, 11, 12, 13, 14, 15, 17, 18, 19, 20, 21, 24

(animation: Kruskal’s run on the same graph; an edge that would close a cycle is still rejected, and equal-weight edges allow alternative choices)

Kruskal’s: MST = 66
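A matching sketch of Kruskal's algorithm, using a simple union-find structure for the cycle check. The graph is again a hypothetical stand-in for the slides' example:

```python
def kruskal_mst(vertices, edges):
    """Take edges in increasing weight order, skipping any edge whose two
    endpoints are already connected (that edge would form a cycle)."""
    parent = {v: v for v in vertices}

    def find(v):                        # root of v's component
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    total, chosen = 0, []
    for w, u, v in sorted(edges):       # sort by weight first
        ru, rv = find(u), find(v)
        if ru != rv:                    # different components: no cycle
            parent[ru] = rv             # union the two components
            total += w
            chosen.append((u, v, w))
    return total, chosen

# Hypothetical example graph (same one used for the Prim's sketch)
vertices = ["A", "B", "C", "D"]
edges = [(2, "A", "B"), (3, "A", "C"), (1, "B", "C"),
         (4, "B", "D"), (5, "C", "D")]
total, chosen = kruskal_mst(vertices, edges)
print(total, chosen)
```

Note how the chosen edges need not stay connected while the tree is being built, exactly as the pseudocode slide points out; the union-find bookkeeping is what still guarantees a spanning tree at the end.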

Prim’s: MST = 66

Summary Minimum spanning trees –Connect all vertices –With no cycles –And minimize the total edge cost Prim’s and Kruskal’s algorithms –Generate minimum spanning trees –Give the same total cost, but may give different trees (if the graph has edges with the same weight)

Questions?

Introduction to the Object-Oriented Paradigm

The Scenario Recall the concept of a Queue: –Defined by a set of behaviors –Enqueue to the end –Dequeue from the front –First in, first out (diagram: items flowing through the queue)

Where is the Queue? The logical idea of the queue consisted of: –Data stored in some structure (list, array, etc.) –Modules to act on that data But where is the queue? We’d like some way of representing such a structure in our programs.

Issues As is, there is no way to “protect” against violating the specified behaviors. (diagram: Enqueue and Dequeue at the ends of the item list, with an arrow sneaking into the middle)

Issues We’d like a way to put a “wrapper” around our structure to protect against this. (diagram: the same queue, now enclosed so nothing can sneak into the middle)

Motivation We need ways to manage the growing complexity and size of programs. We can better model our algorithmic solutions to real-world phenomena. Contracts of responsibility can establish clearer communication and more protected manipulation.

Procedural Abstraction Procedural abstractions organize instructions. (diagram: a Power function; give it two numbers, base and exponent, and it returns base to the power exponent; the implementation itself is hidden: ??? Implementation ???)

Data Abstraction Data Abstractions organize data. Name (string) GPA (num) Letter Grade (char) Student Number (num) StudentType
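The StudentType record on the slide might be sketched as a Python dataclass (field names adapted from the slide; the sample values are invented):

```python
from dataclasses import dataclass

@dataclass
class StudentType:
    """A record grouping the logically related fields from the slide:
    the data abstraction organizes data, but carries no behavior."""
    name: str            # Name (string)
    gpa: float           # GPA (num)
    letter_grade: str    # Letter Grade (char)
    student_number: int  # Student Number (num)

s = StudentType("Ada", 3.9, "A", 10123)
print(s.name, s.gpa)
```

The record answers "what data belongs together?" but, unlike the behavioral abstractions that follow, says nothing about what operations are allowed on it.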

Behavioral Abstraction Behavioral Abstractions combine procedural and data abstractions. Data State Enqueue Is Full Is Empty Dequeue Initialize Queue Object

The Object-Oriented Paradigm Instances of behavioral abstractions are known as objects. Objects have a clear interface by which they send and receive messages (communicate). OO is a design and approach issue. Just because a language offers object-oriented features doesn’t mean you’re doing OO programming.

Benefits of Object-Oriented Paradigm Encapsulation – information hiding, control, and design Reuse – create libraries of tested, optimized classes Adaptability – controlled interactions with minimal coupling Better model real world and larger problems Break down tasks better

Information Hiding Information Hiding means that the user has enough information to use the interface, and no more. Example: stick shift We don’t care about the inner workings… We just want to get the car going! (diagram: a stick-shift gear pattern, including R for reverse)

Encapsulation Encapsulation is the grouping of related things together within some structure (diagram: Item 1, Item 2, Item 3 grouped inside one structure)

Encapsulation via Algorithms Algorithms encapsulate modules, data, and instructions. (diagram: an Algorithm containing Instructions, Procedures/Functions, and Data)

Encapsulating Instructions Modules encapsulate instructions. (diagram: a module call invoking a Procedure/Function that contains instructions)

Encapsulating Data Records allow us to do this with data. (diagram: a Record containing Field 1 and Field 2)

Revisiting the “Where is it?” Question Notice we still have no way of identifying the idea we’re discussing… –The Queue is still in the “ether.” We’d like to encapsulate the data (regardless of its actual representation) with the behavior (modules). Once we do this, we’ve got a logical entity to which we can point and say, “there it is!”

Encapsulating Data with Methods Abstract data types (ADTs) allow us to do this with logically related data and modules. (diagram: a Queue bundling Data with Enqueue and Dequeue)

Abstract Data Types An idea, a concept, an abstraction of the “big picture.” It encapsulates the abstract behaviors and data implied by the thing being modeled. (diagram: a Queue with Data State and the operations Initialize, Enqueue, Dequeue, Is Full, Is Empty)

Achieving Behavioral Abstraction Abstract data types (ADTs) are concepts. We require some way to implement these common abstractions so we can write them once, verify that they are correct, and reuse them. This would save us from having to re-do work. For example, every time we create a queue we did:
    List_Node definesa...
    q_front isoftype...
    q_tail isoftype...
    procedure Enqueue(...)
    procedure Dequeue(...)
We need an algorithmic construct that will allow us to bundle these things together… the class.
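A minimal sketch of such a class in Python, bundling the queue's data with its behaviors (the method names follow the slides' Enqueue/Dequeue/Is Empty operations; the list-based storage is one possible representation, hidden from callers):

```python
class Queue:
    """Bundles the queue's data state with its behaviors. The backing list
    is private by convention (leading underscore), so callers interact only
    through the interface and cannot sneak items into the middle."""

    def __init__(self):          # Initialize
        self._items = []

    def is_empty(self):          # Is Empty
        return len(self._items) == 0

    def enqueue(self, item):     # add to the end
        self._items.append(item)

    def dequeue(self):           # remove from the front: first in, first out
        if self.is_empty():
            raise IndexError("dequeue from empty queue")
        return self._items.pop(0)

q = Queue()
q.enqueue("first")
q.enqueue("second")
print(q.dequeue())  # -> first
```

Now the "where is the queue?" question has an answer: `q` is a logical entity we can point at, and its specified behaviors are the only way to manipulate it.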

Summary Behavioral abstraction combines data abstraction with procedural abstraction. The object-oriented paradigm focuses on the interaction and manipulation of objects. An Abstract Data Type (ADT) allows us to think of what we’re representing as a thing regardless of its actual implementation.

Questions?