1
Optimization Problems
Minimum Spanning Tree
Behavioral Abstraction
2
Optimization Problems
3
Optimization Algorithms
Many real-world problems involve maximizing or minimizing a value:
–How can a car manufacturer get the most parts out of a piece of sheet metal?
–How can a moving company fit the most furniture into a truck of a certain size?
–How can the phone company route calls to get the best use of its lines and connections?
–How can a university schedule its classes to make the best use of classrooms without conflicts?
4
Optimal vs. Approximate Solutions
Often, we can make a choice:
–Do we want the guaranteed optimal solution to the problem, even though it might take a lot of work/time to find?
–Do we want to spend less time/work and find an approximate solution (near optimal)?
5
Approximate Solutions
6
Greedy Algorithms
–Approximate the optimal solution
–May or may not find the optimal solution
–Provide “quick and dirty” estimates
–Spend less time
A greedy algorithm makes a series of “short-sighted” decisions and does not “look ahead.”
7
A Greedy Algorithm
Consider the weight and value of some foreign coins:
    foo    $6.00    500 grams
    bar    $4.00    450 grams
    baz    $3.00    410 grams
    qux    $0.50    300 grams
If we can only fit 860 grams in our pockets...
A greedy algorithm would choose:
    1 foo    500 grams = $6.00
    1 qux    300 grams = $0.50
    Total of $6.50
The optimal solution is:
    1 bar    450 grams = $4.00
    1 baz    410 grams = $3.00
    Total of $7.00
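A minimal sketch of this greedy choice in Python (an illustration, not from the slides), assuming we always take the most valuable coin that still fits:

    # Greedy knapsack sketch: repeatedly take the most valuable coin that still fits.
    # Coin names, values, and weights come from the slide; one of each is available.
    coins = [("foo", 6.00, 500), ("bar", 4.00, 450),
             ("baz", 3.00, 410), ("qux", 0.50, 300)]

    def greedy_fill(coins, capacity):
        chosen, remaining = [], capacity
        # "Short-sighted" choice: most valuable coin first, no look-ahead.
        for name, value, weight in sorted(coins, key=lambda c: c[1], reverse=True):
            if weight <= remaining:
                chosen.append(name)
                remaining -= weight
        return chosen

    print(greedy_fill(coins, 860))   # ['foo', 'qux'] -> $6.50, not the optimal $7.00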
8
Short-Sighted Decisions
(diagram: a small weighted graph from Start to End with edge weights 1, 12, 2, and 1; taking the cheapest first edge leads onto the expensive edge)
9
The Shortest Path Problem
Given a directed, acyclic, weighted graph…
–Start at some vertex A
–What is the shortest path from start vertex A to some end vertex B?
10
A Greedy Algorithm
(series of diagrams: a directed, weighted graph from Start to End; following the smallest-weight edge at each step yields a path of total weight 15, while the actual shortest path has weight 13)
Greedy path = 15; Shortest Path = 13
15
Dynamic Planning
16
–Calculates all of the possible solution options, then chooses the best one
–Implemented recursively
–Produces an optimal solution
–Spends more time
17
Bellman’s Principle of Optimality
Regardless of how you reach a particular state (graph node), the optimal strategy for getting from that state to the goal state is always the same. This greatly simplifies the search for an optimal solution.
18
The Shortest Path Problem
Given a directed, acyclic, weighted graph:
–What is the shortest path from the start vertex to some end vertex?
–Minimize the sum of the edge weights
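In symbols (a standard formulation, not shown on the slides): writing dist(v) for the shortest path from the start vertex to v, and w(u, v) for the weight of edge u → v,

    dist(start) = 0
    dist(v) = min over all edges (u, v) of [ dist(u) + w(u, v) ]

Each vertex's value depends only on the values of its predecessors, which is exactly what the worked example on the next slides computes.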
19
Dynamic Planning
(diagram: the same weighted graph, now with its vertices labeled; a is Start, b is End, and c, d, e, f, g lie in between)
20
Dynamic Planning
Notation: ‘x’ means “shortest path to x”
b = min(6+g, 5+e, 7+f)
(diagram: the three edges into End vertex b, from g, e, and f, with weights 6, 5, and 7)
21
Dynamic Planning
g = min(6+d, 14)
e = min(3+c, 7+d, 7+g)
f = 2+c
(diagram: the edges into g, e, and f)
22
Dynamic Planning
c = min(5, 11+d)
d = 3
(diagram: the edges from Start vertex a into c and d)
23
b = min(6+g, 5+e, 7+f)
g = min(6+d, 14)
e = min(3+c, 7+d, 7+g)
f = 2+c
c = min(5, 11+d)
d = 3 via “a to d”
25
b = min(6+g, 5+e, 7+f)
g = min(6+3, 14)
e = min(3+c, 7+3, 7+g)
f = 2+c
c = min(5, 11+3)
d = 3 via “a to d”
26
b = min(6+g, 5+e, 7+f)
g = min(9, 14)
e = min(3+c, 10, 7+g)
f = 2+c
c = min(5, 14)
d = 3 via “a to d”
28
b = min(6+g, 5+e, 7+f)
g = 9 via “a to d to g”
e = min(3+c, 10, 7+g)
f = 2+c
c = 5 via “a to c”
d = 3 via “a to d”
30
b = min(6+9, 5+e, 7+f)
g = 9 via “a to d to g”
e = min(3+5, 10, 7+9)
f = 2+5
c = 5 via “a to c”
d = 3 via “a to d”
31
b = min(15, 5+e, 7+f)
g = 9 via “a to d to g”
e = min(8, 10, 16)
f = 7 via “a to c to f”
c = 5 via “a to c”
d = 3 via “a to d”
33
b = min(15, 5+e, 7+f)
g = 9 via “a to d to g”
e = 8 via “a to c to e”
f = 7 via “a to c to f”
c = 5 via “a to c”
d = 3 via “a to d”
35
b = min(15, 5+8, 7+7)
g = 9 via “a to d to g”
e = 8 via “a to c to e”
f = 7 via “a to c to f”
c = 5 via “a to c”
d = 3 via “a to d”
36
b = min(15, 13, 14)
g = 9 via “a to d to g”
e = 8 via “a to c to e”
f = 7 via “a to c to f”
c = 5 via “a to c”
d = 3 via “a to d”
38
b = 13 via “a to c to e to b”
g = 9 via “a to d to g”
e = 8 via “a to c to e”
f = 7 via “a to c to f”
c = 5 via “a to c”
d = 3 via “a to d”
39
Dynamic Planning
(diagram: the full graph again, with the path a → c → e → b highlighted)
Shortest Path = 13
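A minimal Python sketch of the same calculation (illustrative, not from the slides), with the edge weights read off the diagram; dist(x) plays the role of "shortest path to x" in the notation above:

    from functools import lru_cache

    # incoming[x] lists (predecessor, edge weight) pairs for each vertex.
    incoming = {
        "d": [("a", 3)],
        "c": [("a", 5), ("d", 11)],
        "g": [("d", 6), ("a", 14)],
        "e": [("c", 3), ("d", 7), ("g", 7)],
        "f": [("c", 2)],
        "b": [("g", 6), ("e", 5), ("f", 7)],
    }

    @lru_cache(maxsize=None)
    def dist(x):
        # Bellman's principle: the best way to x depends only on the best way
        # to each of x's predecessors.
        if x == "a":                     # "a" is the Start vertex
            return 0
        return min(dist(prev) + weight for prev, weight in incoming[x])

    print(dist("b"))                     # 13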
40
Summary
Greedy algorithms
–Make short-sighted, “best guess” decisions
–Require less time/work
–Provide approximate solutions
Dynamic planning
–Examines all possible solutions
–Requires more time/work
–Guarantees an optimal solution
41
Questions?
42
Minimum Spanning Tree
Prim’s Algorithm
Kruskal’s Algorithm
43
The Scenario
Construct a telephone network…
We’ve got to connect many cities together
–Each city must be connected
–We want to minimize the total cable used
45
Minimum Spanning Tree
Required: Connect all nodes at minimum cost.
–Cost is the sum of the edge weights
(diagram: a triangle of three nodes with edge weights 1, 2, and 3; its three possible spanning trees cost S.T. = 3, S.T. = 5, and S.T. = 4, so the tree of cost 3 is the M.S.T.)
50
Minimum Spanning Tree
Required: Connect all nodes at minimum cost.
–Cost is the sum of the edge weights
–Can start at any node
–The minimum cost is unique, but it can come from different sets of edges
(diagram: three nodes joined by three edges of weight 4 each; pick any 2 of them and you get a minimum spanning tree)
Two algorithms
–Prim’s
–Kruskal’s
54
Prim’s Algorithm
Start with any vertex. All its edges are candidates.
Stop looking when all vertices are picked.
Otherwise repeat…
–Pick the minimum candidate edge (no cycles) and connect the adjacent vertex
–Add all edges from that new vertex to our “candidates”
55
Prim’s Algorithm
Pick any vertex and add it to “vertices” list
Loop
    Exitif (“vertices” contains all the vertices in graph)
    Select shortest, unmarked edge coming from “vertices” list
    If (shortest edge does NOT create a cycle) then
        Add that edge to your “edges”
        Add the adjoining vertex to the “vertices” list
    Endif
    Mark that edge as having been considered
Endloop
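A minimal runnable sketch of the same procedure in Python (illustrative names, not from the slides; the graph is assumed to be given as an undirected adjacency map of edge weights):

    import heapq

    def prim_mst(graph, start):
        # graph: dict mapping vertex -> {neighbor: edge weight}
        visited = {start}
        # candidate edges leaving the visited set, ordered by weight
        candidates = [(w, start, v) for v, w in graph[start].items()]
        heapq.heapify(candidates)
        mst_edges, total = [], 0
        while candidates and len(visited) < len(graph):
            w, u, v = heapq.heappop(candidates)   # shortest unmarked candidate edge
            if v in visited:                      # would create a cycle: skip it
                continue
            visited.add(v)                        # connect the adjoining vertex
            mst_edges.append((u, v, w))
            total += w
            for nxt, nw in graph[v].items():      # its edges become new candidates
                if nxt not in visited:
                    heapq.heappush(candidates, (nw, v, nxt))
        return mst_edges, total

The heap simply keeps the candidate edges sorted by weight so that "select shortest, unmarked edge" is one pop.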
56
Prim’s Algorithm walkthrough
(series of diagrams: a graph with edge weights ranging from 2 to 24; starting from one vertex, the shortest candidate edge is added at each step, and an edge that would close a loop is rejected ("NO!", "LOOP"), until every vertex is connected)
Prim’s: MST = 66 (the chosen edges have weights 2, 3, 4, 5, 6, 7, 9, 9, 10, 11)
71
An Alternate Algorithm: Kruskal’s Algorithm
72
Kruskal’s Algorithm
Sort edges in graph in increasing order
Select the shortest edge
Loop
    Exitif (all edges examined)
    Select next shortest edge (if no cycle created)
    Mark this edge as examined
Endloop
This guarantees an MST, but as it is built, the chosen edges do not have to be connected to each other.
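A minimal runnable sketch in Python (illustrative; the cycle test uses a simple union-find over numbered vertices, which the slides do not specify):

    def kruskal_mst(num_vertices, edges):
        # edges: list of (weight, u, v); vertices are numbered 0 .. num_vertices-1
        parent = list(range(num_vertices))

        def find(x):
            # representative of x's connected component, with path compression
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst_edges, total = [], 0
        for w, u, v in sorted(edges):          # examine edges in increasing order
            ru, rv = find(u), find(v)
            if ru != rv:                       # different pieces: no cycle created
                parent[ru] = rv                # the pieces need not be connected yet
                mst_edges.append((u, v, w))
                total += w
        return mst_edges, total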
73
(diagram: the same graph as before, with edge weights from 2 to 24)
74
Sort the Edges 2, 3, 4, 5, 6, 7, 8, 9, 9, 10, 11, 12, 13, 14, 15, 17, 18, 19, 20, 21, 24
75
(series of diagrams: Kruskal’s algorithm adds edges in the sorted order wherever no cycle is created; early on the chosen edges are not all connected to each other, an edge that would close a loop is rejected ("Still NO!"), and at one point two equally valid choices are shown ("Or"))
Kruskal’s: MST = 66
89
Prim’s: MST = 66
(diagram: the spanning tree Prim’s algorithm produced, shown again for comparison)
90
Summary
Minimum spanning trees
–Connect all vertices
–With no cycles
–And minimize the total edge cost
Prim’s and Kruskal’s algorithms
–Generate minimum spanning trees
–Give the same total cost, but may give different trees (if the graph has edges with the same weight)
91
Questions?
92
Introduction to the Object-Oriented Paradigm
93
The Scenario
Recall the concept of a Queue:
–Defined by a set of behaviors
–Enqueue to the end
–Dequeue from the front
–First in, first out
(diagram: a row of Items)
94
Where is the Queue?
The logical idea of the queue consisted of:
–Data stored in some structure (list, array, etc.)
–Modules to act on that data
But where is the queue? We’d like some way of representing such a structure in our programs.
95
Issues
As is, there is no way to “protect” against violating the specified behaviors.
(diagram: a row of Items with Enqueue at one end, Dequeue at the other, and an arrow labeled “Sneak into Middle”)
96
Issues
We’d like a way to put a “wrapper” around our structure to protect against this.
(diagram: the same picture, now with a wrapper drawn around the Items)
97
Motivation
–We need ways to manage the growing complexity and size of programs.
–We can better model our algorithmic solutions to real-world phenomena.
–Contracts of responsibility can establish clearer communication and more protected manipulation.
98
Procedural Abstraction
Procedural abstractions organize instructions.
(diagram: a box for the function Power: “Give me two numbers (base & exponent), I’ll return to you base^exponent”; the body is hidden: ??? Implementation ???)
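As a sketch in Python (assuming a non-negative integer exponent; the loop is just one possible hidden implementation):

    def power(base, exponent):
        # Procedural abstraction: callers use the interface, not the implementation.
        result = 1
        for _ in range(exponent):
            result *= base
        return result

    print(power(2, 10))   # 1024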
99
Data Abstraction
Data abstractions organize data.
(diagram: a StudentType record containing Name (string), GPA (num), Letter Grade (char), and Student Number (num))
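As a sketch, the same record could be written as a Python dataclass (field names from the diagram; the types are the nearest Python equivalents):

    from dataclasses import dataclass

    @dataclass
    class StudentType:
        # A data abstraction: related fields grouped into one record.
        name: str            # Name (string)
        gpa: float           # GPA (num)
        letter_grade: str    # Letter Grade (char)
        student_number: int  # Student Number (num)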
100
Behavioral Abstraction
Behavioral abstractions combine procedural and data abstractions.
(diagram: a Queue Object whose Data State is surrounded by the operations Initialize, Enqueue, Dequeue, Is Full, and Is Empty)
101
The Object-Oriented Paradigm
–Instances of behavioral abstractions are known as objects.
–Objects have a clear interface by which they send and receive messages (communicate).
–OO is a design and approach issue. Just because a language offers object-oriented features doesn’t mean you’re doing OO programming.
102
Benefits of the Object-Oriented Paradigm
–Encapsulation – information hiding, control, and design
–Reuse – create libraries of tested, optimized classes
–Adaptability – controlled interactions with minimal coupling
–Better modeling of the real world and larger problems
–Better breakdown of tasks
103
Information Hiding
Information hiding means that the user has enough information to use the interface, and no more.
Example: a stick shift. We don’t care about the inner workings… we just want to get the car going!
(diagram: a gear shift knob labeled 1 2 3 4 5 R)
104
Encapsulation
Encapsulation is the grouping of related things together within some structure.
(diagram: Item 1, Item 2, and Item 3 grouped inside one boundary)
105
Encapsulation via Algorithms
Algorithms encapsulate modules, data, and instructions.
(diagram: an Algorithm containing Data, a Procedure, a Function, and Instructions)
106
Encapsulating Instructions
Modules encapsulate instructions.
(diagram: a module call from one set of instructions invoking a Procedure/Function that contains its own instructions)
107
Encapsulating Data
Records allow us to do this with data.
(diagram: a Record containing Field 1 and Field 2)
108
Revisiting the “Where is it?” Question
Notice we still have no way of identifying the idea we’re discussing…
–The Queue is still in the “ether.”
We’d like to encapsulate the data (regardless of its actual representation) with the behavior (modules).
Once we do this, we’ve got a logical entity to which we can point and say, “there it is!”
109
Encapsulating Data with Methods
Abstract data types (ADTs) allow us to do this with logically related data and modules.
(diagram: a Queue encapsulating its Data along with Enqueue and Dequeue)
110
Abstract Data Types
An idea, a concept, an abstraction of the “big picture.”
It encapsulates the abstract behaviors and data implied by the thing being modeled.
(diagram: the Queue again, with Data State and the operations Initialize, Enqueue, Dequeue, Is Full, and Is Empty)
111
Achieving Behavioral Abstraction
Abstract data types (ADTs) are concepts. We require some way to implement these common abstractions so we can write them once, verify that they are correct, and reuse them. This would save us from having to re-do work.
For example, every time we created a queue we did:
    List_Node definesa...
    q_front isoftype...
    q_tail isoftype...
    procedure Enqueue(...)
    procedure Dequeue(...)
We need an algorithmic construct that will allow us to bundle these things together… the class.
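A minimal sketch of such a bundle as a Python class (illustrative; the fixed capacity and the list-backed storage are assumptions, not from the slides):

    class Queue:
        # Bundles the data state with Initialize, Enqueue, Dequeue,
        # Is Empty, and Is Full into one logical entity.
        def __init__(self, capacity=10):        # Initialize
            self._items = []
            self._capacity = capacity

        def is_empty(self):                     # Is Empty
            return len(self._items) == 0

        def is_full(self):                      # Is Full
            return len(self._items) >= self._capacity

        def enqueue(self, item):                # Enqueue to the end
            if self.is_full():
                raise OverflowError("queue is full")
            self._items.append(item)

        def dequeue(self):                      # Dequeue from the front (FIFO)
            if self.is_empty():
                raise IndexError("queue is empty")
            return self._items.pop(0)

The wrapper is the point: callers can only Enqueue and Dequeue, so nothing can “sneak into the middle.”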
112
Summary
–Behavioral abstraction combines data abstraction with procedural abstraction.
–The object-oriented paradigm focuses on the interaction and manipulation of objects.
–An Abstract Data Type (ADT) allows us to think of what we’re representing as a thing, regardless of its actual implementation.
113
Questions?