Algorithm & Application

Algorithm & Application
Algorithm: a step-by-step procedure for solving a problem.
Prof. Hyunchul Shin (shin@hanyang.ac.kr), Hanyang University

Foundations of Algorithms, Richard Neapolitan and Kumarss Naimipour, 3rd Edition, Jones and Bartlett Computer Science, 2004.
Time: CPU cycles
Storage: memory
Instance: each specific assignment of values to the problem's parameters

Problem: Is the number x in the list S of n numbers? The answer is "yes" if x is in S and "no" if it is not.
(ex) S = {10, 7, 11, 5, 13, 8}, n = 6, and x = 5. Solution: "yes"
Algorithm:
search(S, n, x) {
  for (i = 1; i <= n; i++)
    if (S[i] == x) return "yes";
  return "no";
}  /* cf. text p. 5 */

Exchange Sort
Problem: Sort n keys in nondecreasing order.
Inputs: n, S[1], …, S[n]
Outputs: the keys in the array S, sorted.
Algorithm: Exchange Sort
{
  for (i = 1; i <= n; i++)
    for (j = i+1; j <= n; j++)
      if (S[j] < S[i]) exchange S[i] and S[j];
}

Algorithm: Exchange Sort
(ex) n = 4, S = [4 3 1 5]
{
  for (i = 1; i <= n; i++)
    for (j = i+1; j <= n; j++)
      if (S[j] < S[i]) exchange S[i] and S[j];
}
Trace, pass by pass (the slide's full i, j, S table is garbled in this transcript): after the i = 1 pass S = [1 4 3 5], after i = 2 S = [1 3 4 5], after i = 3 S = [1 3 4 5] (sorted).
Homework: Show i, j, and S for exchange sort of S = [3 8 5 9 7]. Due in 1 week.

Matrix Multiplication
C(n×n) = A(n×n) · B(n×n), where cij = Σ(k=1..n) aik · bkj, for i <= n, j <= n.
(ex) a numeric example appears on the slide.
Algorithm { /* Matrix multiplication */
  for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++) {
      C[i][j] = 0;
      for (k = 1; k <= n; k++)
        C[i][j] = C[i][j] + A[i][k] × B[k][j];
    }
}

Fibonacci Sequence
f0 = 0, f1 = 1, and fn = fn-1 + fn-2 for n >= 2.
(ex) f2 = f1 + f0 = 1 + 0 = 1
f3 = f2 + f1 = 1 + 1 = 2
f4 = f3 + f2 = 2 + 1 = 3
f5 = f4 + f3 = 3 + 2 = 5
…

Fibonacci (Recursive)
int fib(int n) { /* divide-and-conquer: chap. 2 */
  if (n <= 1) return n;
  else return fib(n-1) + fib(n-2);
}
(ex) fib(5) computation (shown on the slide)

Fibonacci (Iterative)
int fib_iter(int n) { /* dynamic programming: chap. 3 */
  index i;
  int f[0..n];
  f[0] = 0;
  if (n > 0) {
    f[1] = 1;
    for (i = 2; i <= n; i++)
      f[i] = f[i-1] + f[i-2];
  }
  return f[n];
}
Complexity (cf. text p. 16): fib(100) takes over 13 days, while fib_iter(100) takes 101 ns (assuming one term is computed per nanosecond).

Complexity: Exchange Sort
Algorithm: Exchange Sort
{
  for (i = 1; i <= n; i++)
    for (j = i+1; j <= n; j++)
      if (S[j] < S[i]) exchange S[i] and S[j];
}
Basic operation: comparison of S[j] with S[i]
Input size: n, the number of items to be sorted
Complexity: the number of basic operations
T(n) = (n-1) + (n-2) + (n-3) + … + 1 = n(n-1)/2 ∈ O(n²)

Complexity: Matrix Multiplication
Algorithm { /* Matrix multiplication */
  for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++) {
      C[i][j] = 0;
      for (k = 1; k <= n; k++)
        C[i][j] = C[i][j] + A[i][k] × B[k][j];
    }
}
Basic operation: multiplication (innermost for loop)
Input size: n, the number of rows and columns
Complexity: T(n) = n × n × n = n³ ∈ O(n³)

Memory Complexity Analysis of algorithm efficiency in terms of memory. Time complexity is usually used. Memory complexity is occasionally useful.

Order: Big O
Definition: For a given complexity function f(n), O(f(n)) is the set of complexity functions g(n) for which there exist some positive real constant c and some nonnegative integer N such that for all n ≥ N, g(n) ≤ c × f(n).
(ex) T1(n) = (n-1)·n/2 ∈ O(n²)
T2(n) = n³ ∈ O(n³)
T3(n) = 10000n² + 10⁶n + 1000 ∈ O(n²)  (cf. p. 29)
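For example, to verify T3(n) ∈ O(n²) directly from the definition, one possible choice of constants (many others work) is c = 1,011,000 and N = 1: for all n ≥ 1,
10000n² + 10⁶n + 1000 ≤ 10000n² + 10⁶n² + 1000n² = 1,011,000 · n².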

Divide and Conquer (Top-Down Approach, p. 47)
Divide the problem into subproblems.
Conquer the subproblems.
Obtain the solution to the problem from the solutions of the subproblems.
Binary search
Problem: Is x in the sorted array S of size n?
Inputs: a sorted array S and a key x.
Outputs: the location of x in S (0 if x is not in S).

Binary Search
locationout = location(1, n);
index location(index low, index high)
{
  index mid;
  if (low > high) return 0;
  else {
    mid = ⌊(low + high) / 2⌋;
    if (x == S[mid]) return mid;
    else if (x < S[mid]) return location(low, mid-1);
    else return location(mid+1, high);
  }
}
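A self-contained C version of the same recursive search is sketched below, assuming 0-based indexing and returning -1 for "not found" instead of 0 (the names and the sample array are illustrative):

#include <stdio.h>

/* Recursive binary search: index of x in the sorted array S[low..high], or -1. */
int location(const int S[], int low, int high, int x)
{
    if (low > high) return -1;               /* empty range: x is not present */
    int mid = (low + high) / 2;
    if (x == S[mid]) return mid;
    else if (x < S[mid]) return location(S, low, mid - 1, x);
    else return location(S, mid + 1, high, x);
}

int main(void)
{
    int S[] = {5, 7, 8, 10, 11, 13};         /* sorted array */
    printf("%d\n", location(S, 0, 5, 10));   /* prints 3 */
    return 0;
}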

Worst-Case Complexity: Binary Search
locationout = location(1, n);
index location(index low, index high)
{
  index mid;
  if (low > high) return 0;
  else {
    mid = ⌊(low + high) / 2⌋;
    if (x == S[mid]) return mid;
    else if (x < S[mid]) return location(low, mid-1);
    else return location(mid+1, high);
  }
}
Basic operation: comparison of x with S[mid]
Input size: n (the number of items in the array S)
W(n) = W(n/2) + 1   (W(n/2) for the recursive call, +1 for the comparison at the top level)

Complexity: Binary Search
W(n) = W(n/2) + 1, for n > 1, n a power of 2
W(1) = 1
It appears that W(n) = log n + 1.
(Induction base) For n = 1, W(1) = 1 = log 1 + 1.
(Induction hypothesis) Assume that W(n) = log n + 1.
(Induction step) We must show W(2n) = log(2n) + 1. By the recurrence, W(2n) = W(n) + 1 = (log n + 1) + 1 = log n + log 2 + 1 = log(2n) + 1.

Merge Sort (O(nlogn))
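The merge-sort figure is not reproduced in this transcript; the following minimal C sketch shows the divide-and-conquer idea (the temporary-array merge and the function names are illustrative choices, not the textbook's exact pseudocode):

#include <stdio.h>

/* Merge the two sorted halves S[low..mid] and S[mid+1..high]. */
void merge(int S[], int low, int mid, int high)
{
    int tmp[high - low + 1];
    int i = low, j = mid + 1, k = 0;
    while (i <= mid && j <= high)
        tmp[k++] = (S[i] <= S[j]) ? S[i++] : S[j++];
    while (i <= mid)  tmp[k++] = S[i++];
    while (j <= high) tmp[k++] = S[j++];
    for (k = 0; k < high - low + 1; k++)
        S[low + k] = tmp[k];
}

/* Divide, conquer each half recursively, then combine. */
void mergesort(int S[], int low, int high)
{
    if (low < high) {
        int mid = (low + high) / 2;
        mergesort(S, low, mid);
        mergesort(S, mid + 1, high);
        merge(S, low, mid, high);
    }
}

int main(void)
{
    int S[] = {27, 10, 12, 20, 25, 13, 15, 22};
    mergesort(S, 0, 7);
    for (int i = 0; i < 8; i++) printf("%d ", S[i]);   /* 10 12 13 15 20 22 25 27 */
    return 0;
}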

Quick Sort
Sort by dividing the array into two partitions, using a pivot item (ex: the first item).
quicksort(index low, index high)
{
  index pivot;  /* index of the pivot */
  if (high > low) {
    partition(low, high, pivot);
    quicksort(low, pivot - 1);
    quicksort(pivot + 1, high);
  }
}
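The partition step is not shown on this slide; below is a self-contained C sketch that uses the first item as the pivot (this in-place swapping scheme is one common choice and is assumed here):

/* Partition S[low..high] around S[low]; returns the pivot's final index. */
int partition(int S[], int low, int high)
{
    int pivotitem = S[low];
    int j = low;                        /* last index known to hold an item < pivotitem */
    for (int i = low + 1; i <= high; i++)
        if (S[i] < pivotitem) {
            j++;
            int t = S[i]; S[i] = S[j]; S[j] = t;
        }
    int t = S[low]; S[low] = S[j]; S[j] = t;   /* put the pivot in its final place */
    return j;
}

void quicksort(int S[], int low, int high)
{
    if (high > low) {
        int pivotpoint = partition(S, low, high);
        quicksort(S, low, pivotpoint - 1);
        quicksort(S, pivotpoint + 1, high);
    }
}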

Homework
Given 20 15 25 22 11 20 30 27 (n = 8):
1. Mergesort as in Fig. 2.2, p. 54
2. Quicksort as in Fig. 2.3, p. 61
3. Partition as in Table 2.2, p. 62
Due in 1 week

Worst-Case Complexity: Quick Sort
Worst case: when the array is already sorted.
Time to partition: Tp(n) = n - 1
Time to sort the left subarray: T(0)
Time to sort the right subarray: T(n-1)
T(n) = T(0) + T(n-1) + (n-1) for n > 0
T(n) = T(n-1) + (n-1), since T(0) = 0
T(n) = n(n-1)/2 ∈ O(n²)
Average-case complexity: O(n log n)

Dynamic Programming (Bottom-Up)
Establish a recursive property.
Solve in bottom-up fashion by solving smaller instances first.
(ex) Fibonacci (iterative)
Divide-and-conquer:
Divide a problem into smaller instances.
Solve these smaller instances (blindly).
Examples: Fibonacci (recursive), where the smaller instances are related, so work is repeated; merge sort, where the instances are unrelated, so divide-and-conquer works well.

Binomial Coefficient
C(n, k) = n! / (k! (n-k)!); frequently, n! is too large to compute directly.
Recursive property: C(n, k) = C(n-1, k-1) + C(n-1, k) for 0 < k < n, and C(n, k) = 1 when k = 0 or k = n.
Proof: given on the slide (omitted in this transcript).

Binomial Coefficients: Divide-and-Conquer
Algorithm /* Inefficient */
int bin(int n, int k)
{
  if (k == 0 || n == k) return 1;
  else return bin(n-1, k-1) + bin(n-1, k);
}

Binomial Coefficients
Figure 3.1: The array B used to compute the binomial coefficient.
Complexity: O(nk)

Example 3.1: Compute B[4][2] = C(4, 2).
Compute row 0: {This is done only to mimic the algorithm exactly; the value B[0][0] is not needed in a later computation.}
  B[0][0] = 1
Compute row 1:
  B[1][0] = 1,  B[1][1] = 1
Compute row 2:
  B[2][0] = 1,  B[2][1] = B[1][0] + B[1][1] = 1 + 1 = 2,  B[2][2] = 1
Compute row 3:
  B[3][0] = 1,  B[3][1] = B[2][0] + B[2][1] = 1 + 2 = 3,  B[3][2] = B[2][1] + B[2][2] = 2 + 1 = 3
Compute row 4:
  B[4][0] = 1,  B[4][1] = B[3][0] + B[3][1] = 1 + 3 = 4,  B[4][2] = B[3][1] + B[3][2] = 3 + 3 = 6

Binomial Coefficient: Dynamic Programming
Establish a recursive property; solve in bottom-up fashion.
Algorithm:
int bin2(int n, int k)
{
  index i, j;
  int B[0..n][0..k];
  for (i = 0; i <= n; i++)
    for (j = 0; j <= minimum(i, k); j++)
      if (j == 0 || j == i)
        B[i][j] = 1;
      else
        B[i][j] = B[i-1][j-1] + B[i-1][j];
  return B[n][k];
}
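As a check, bin2(4, 2) fills exactly the entries computed row by row in Example 3.1 above and returns B[4][2] = 6.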

HOMEWORK
Use the dynamic programming approach to compute B[5][3]. Draw a diagram like Figure 3.1 (p. 94). Due in 1 week.

Binary Search Tree
Definition: each node contains one key, and for a given node n,
  key(node in the left subtree of n) <= key(n)
  key(n) <= key(node in the right subtree of n)
Which tree is optimal depends on the probabilities with which the keys are searched for.

Binary Search Tree
depth(n): the number of edges in the unique path from the root to n (depth = level). The root has a depth of 0.
Search time = depth(key) + 1

Binary Search (Tree) Algorithm
struct nodetype {
  keytype key;
  nodetype* left;
  nodetype* right;
};
typedef nodetype* node_pointer;

void search(node_pointer tree, keytype keyin, node_pointer& p)
{
  bool found = false;
  p = tree;
  while (!found)                        // assumes keyin is present in the tree
    if (p->key == keyin) found = true;
    else if (keyin < p->key) p = p->left;
    else p = p->right;
}

Example

Greedy Approach Start with an empty set and add items to the set until the set represents a solution. Each iteration consists of the following components: A selection procedure A feasibility check A solution check

Spanning Tree A connected subgraph that contains all the vertices and is a tree.

Graph
G = (V, E), where V is a finite set of vertices and E is a set of edges (pairs of vertices in V).
(ex) V = {v1, v2, v3, v4, v5}
E = {(v1, v2), (v1, v3), (v2, v3), (v2, v4), (v3, v4), (v3, v5), (v4, v5)}

Weight of a Graph

Prim’s Algorithm
Figure 4.4: A weighted graph (in the upper-left corner) and the steps in Prim's algorithm for that graph. The vertices in Y and the edges in F are shaded at each step.

Prim’s Algorithm
F = Ø;
for (i = 2; i <= n; i++) {            // Initialize
  nearest[i] = 1;                     // v1 is the nearest vertex in Y
  distance[i] = W[1][i];              // distance to Y is the weight of the edge to v1
}
repeat (n - 1 times) {                // Add all n - 1 vertices
  min = ∞;
  for (i = 2; i <= n; i++)            // find the vertex nearest to Y
    if (0 ≤ distance[i] < min) {
      min = distance[i];
      vnear = i;
    }
  e = edge connecting vnear and nearest[vnear];
  F = F ∪ {e};                        // add e to F
  distance[vnear] = -1;               // vnear joins Y
  for (i = 2; i <= n; i++)            // update distances to Y
    if (W[i][vnear] < distance[i]) {
      distance[i] = W[i][vnear];
      nearest[i] = vnear;
    }
}
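A compilable C rendering of the same procedure is sketched below. It assumes vertices are numbered 0..N-1, the graph is an N×N weight matrix with a large INF value for missing edges, and it simply prints the chosen edges; the weights in main are illustrative stand-ins for the figure's example graph, not values taken from the slide.

#include <stdio.h>

#define INF 1000000000
#define N 5                               /* number of vertices (illustrative) */

/* Prim's algorithm on an N x N weight matrix; prints the MST edges. */
void prim(int W[N][N])
{
    int nearest[N], distance[N];
    for (int i = 1; i < N; i++) {         /* vertex 0 starts in Y */
        nearest[i] = 0;
        distance[i] = W[0][i];
    }
    for (int step = 1; step <= N - 1; step++) {
        int min = INF, vnear = 1;
        for (int i = 1; i < N; i++)       /* find the vertex nearest to Y */
            if (distance[i] >= 0 && distance[i] < min) {
                min = distance[i];
                vnear = i;
            }
        printf("edge (v%d, v%d), weight %d\n",
               nearest[vnear] + 1, vnear + 1, min);
        distance[vnear] = -1;             /* vnear joins Y */
        for (int i = 1; i < N; i++)       /* update distances to Y */
            if (distance[i] >= 0 && W[i][vnear] < distance[i]) {
                distance[i] = W[i][vnear];
                nearest[i] = vnear;
            }
    }
}

int main(void)
{
    /* hypothetical weights for the 5-vertex example graph of the earlier slide */
    int W[N][N] = {
        {  0,   1,   3, INF, INF},
        {  1,   0,   3,   6, INF},
        {  3,   3,   0,   4,   2},
        {INF,   6,   4,   0,   5},
        {INF, INF,   2,   5,   0},
    };
    prim(W);
    return 0;
}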

Prim’s Algorithm: trace
(The slide steps through the distance[] and nearest[] values for vertices 2 to 5, starting from ∞ for vertices not adjacent to v1 and marking a vertex with distance = -1 once it joins Y; the numeric tables are garbled in this transcript.)

Prim’s Spanning Tree
Complexity: O(n²)
There are (n-1) iterations of the repeat loop, each with (n-1) iterations in the two inner for loops: T(n) = 2(n-1)(n-1) ∈ O(n²).
Theorem: Prim's algorithm always produces a minimum spanning tree.

Dijkstra’s Shortest Paths Figure 4.8: A weighted, directed graph (in upper-left corner) and the steps in Dijkstra's algorithm for that graph. The vertices in Y and the edges in F are shaded in color at each step.

Dijkstra’s Algorithm
F = Ø;
for (i = 2; i <= n; i++) {            // Initialize
  touch[i] = 1;                       // current shortest paths from v1 leave directly from v1
  length[i] = W[1][i];
}
repeat (n - 1 times) {
  min = ∞;
  for (i = 2; i <= n; i++)            // find the vertex nearest to v1 (using only vertices in Y)
    if (0 ≤ length[i] < min) {
      min = length[i];
      vnear = i;
    }
  e = edge from touch[vnear] to vnear;
  F = F ∪ {e};                        // add e to F
  for (i = 2; i <= n; i++)            // can a path through vnear be shorter?
    if (length[vnear] + W[vnear][i] < length[i]) {
      length[i] = length[vnear] + W[vnear][i];
      touch[i] = vnear;
    }
  length[vnear] = -1;                 // vnear joins Y
}
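As with Prim's algorithm, here is a compilable C sketch, assuming vertices 0..N-1 and an N×N matrix of directed edge weights with INF for missing edges; the weights in main are illustrative, since the figure's values are not in this transcript.

#include <stdio.h>

#define INF 1000000000
#define N 5                               /* number of vertices (illustrative) */

/* Dijkstra's algorithm: shortest paths from vertex 0 on a directed weight matrix. */
void dijkstra(int W[N][N], int length[N], int touch[N])
{
    int in_Y[N] = {1};                    /* in_Y[i] marks chosen vertices; only v1 at first */
    for (int i = 1; i < N; i++) {         /* initialize with the one-edge paths from v1 */
        touch[i] = 0;
        length[i] = W[0][i];
    }
    for (int step = 1; step <= N - 1; step++) {
        int min = INF, vnear = 1;
        for (int i = 1; i < N; i++)       /* nearest remaining vertex */
            if (!in_Y[i] && length[i] < min) {
                min = length[i];
                vnear = i;
            }
        in_Y[vnear] = 1;
        for (int i = 1; i < N; i++)       /* can a path through vnear be shorter? */
            if (!in_Y[i] && length[vnear] + W[vnear][i] < length[i]) {
                length[i] = length[vnear] + W[vnear][i];
                touch[i] = vnear;
            }
    }
}

int main(void)
{
    /* hypothetical directed weights; INF means "no edge" */
    int W[N][N] = {
        {0,     7,   4,   6,   1},
        {INF,   0, INF, INF, INF},
        {INF,   2,   0,   5, INF},
        {INF,   3, INF,   0, INF},
        {INF, INF, INF,   1,   0},
    };
    int length[N], touch[N];
    dijkstra(W, length, touch);
    for (int i = 1; i < N; i++)
        printf("shortest path to v%d: length %d, last edge from v%d\n",
               i + 1, length[i], touch[i] + 1);
    return 0;
}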

Complexity
Prim's and Dijkstra's algorithms: O(n²)
Heap implementation: O(m log n)
Fibonacci heap implementation: O(m + n log n)
(n: number of vertices, m: number of edges)
Homework:
1. Find a minimum spanning tree for the following graph.
2. Find the shortest paths from v4 to all the other vertices.

Scheduling
Minimize the total time in the system (waiting + service).
(ex) Three jobs: t1 = 5, t2 = 10, t3 = 4.
Schedule     Total time in the system
[1, 2, 3]    5 + (5+10) + (5+10+4) = 39
[1, 3, 2]    5 + (5+4) + (5+4+10) = 33
[2, 1, 3]    10 + (10+5) + (10+5+4) = 44
. . .
[3, 1, 2]    4 + (4+5) + (4+5+10) = 32
There are 3! possible schedules.

Optimal Scheduling for Total Time
Smallest service time first:
// Sort the jobs in nondecreasing order of service time.
// Schedule them in the sorted order.
Complexity (dominated by the sort): W(n) ∈ O(n log n)
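A small C sketch of this rule on the three-job example above (the use of qsort and the variable names are implementation choices):

#include <stdio.h>
#include <stdlib.h>

static int cmp(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;   /* nondecreasing service time */
}

int main(void)
{
    int t[] = {5, 10, 4};                       /* service times t1, t2, t3 */
    int n = 3;
    qsort(t, n, sizeof t[0], cmp);              /* smallest service time first */

    int elapsed = 0, total = 0;
    for (int i = 0; i < n; i++) {
        elapsed += t[i];                        /* finish time of the i-th scheduled job */
        total += elapsed;                       /* each job's time in the system */
    }
    printf("minimum total time = %d\n", total); /* prints 32, matching schedule [3, 1, 2] */
    return 0;
}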

Schedule with Deadlines
Schedule to maximize the total profit; each job takes one unit of time to finish.
(ex)
Job   Deadline   Profit
1     2          30
2     1          35
3     2          25
4     1          40
[1, 3]: TP = 30 + 25 = 55
[2, 1]: TP = 35 + 30 = 65
…
[4, 1]: TP = 40 + 30 = 70 (optimal)
Is highest profit first optimal?

Schedule with Deadlines
Both profit and deadline must be considered.
Problem: maximize the total profit.
Inputs: n jobs with deadline[1..n], sorted by profit in nonincreasing order.
Outputs: an optimal sequence J for the jobs.
Algorithm: Schedule (O(n²))
J = [1];
for (i = 2; i <= n; i++) {
  K = J with job i added according to nondecreasing values of deadline[i];
  if (K is feasible)
    J = K;
}
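Below is a self-contained C sketch of this greedy scheme, including the feasibility check the slide leaves implicit (with unit-time jobs, a sequence in nondecreasing deadline order is feasible when the t-th job has a deadline of at least t). The job data come from the example above, already sorted by profit; all names are illustrative.

#include <stdio.h>

/* Jobs from the example, sorted by nonincreasing profit:
   job 4 (profit 40, d 1), job 2 (35, 1), job 1 (30, 2), job 3 (25, 2). */
static int job[]      = {4, 2, 1, 3};
static int deadline[] = {1, 1, 2, 2};
static int profit[]   = {40, 35, 30, 25};

/* Feasible iff, in deadline order, the t-th job (1-based) has deadline >= t. */
static int feasible(const int K[], int k)
{
    for (int t = 0; t < k; t++)
        if (deadline[K[t]] < t + 1) return 0;
    return 1;
}

int main(void)
{
    int n = 4, J[4], k = 0, total = 0;
    for (int i = 0; i < n; i++) {
        /* Tentatively insert job i into J keeping nondecreasing deadlines (as K). */
        int K[4], pos = k;
        while (pos > 0 && deadline[J[pos - 1]] > deadline[i]) pos--;
        for (int t = 0; t < pos; t++) K[t] = J[t];
        K[pos] = i;
        for (int t = pos; t < k; t++) K[t + 1] = J[t];
        if (feasible(K, k + 1)) {                 /* keep K only if it is feasible */
            for (int t = 0; t <= k; t++) J[t] = K[t];
            k++;
            total += profit[i];
        }
    }
    printf("J = [");
    for (int t = 0; t < k; t++) printf(" %d", job[J[t]]);
    printf(" ], total profit = %d\n", total);     /* prints J = [ 4 1 ], total profit = 70 */
    return 0;
}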

Example 4.4: Suppose we have jobs with the following deadlines and profits:
Job   Deadline   Profit
1     3          40
2     1          35
3     1          30
4     3          25
5     1          20
6     3          15
7     2          10
Algorithm 4.4 does the following:
1. J is set to [1].
2. K is set to [2, 1] and is determined to be feasible. J is set to [2, 1] because K is feasible.
3. K is set to [2, 3, 1] and is rejected because it is not feasible.
4. K is set to [2, 1, 4] and is determined to be feasible. J is set to [2, 1, 4] because K is feasible.
5. K is set to [2, 5, 1, 4] and is rejected because it is not feasible.
6. K is set to [2, 1, 6, 4] and is rejected because it is not feasible.
7. K is set to [2, 7, 1, 4] and is rejected because it is not feasible.
The final value of J is [2, 1, 4].
Homework: scheduling
1. Schedule to minimize the total time:
Job   Service time
1     7
2     3
3     10
4     5
2. Schedule with deadlines for maximum profit:
Job   Deadline   Profit
1     2          30
2     1          35
3     2          25
4     1          40
5     3          50

Huffman Code
Variable-length binary code for data compression.
Prefix code: no codeword constitutes the beginning of another codeword.
(ex) If 01 is the code for 'a', then 011 cannot be a code (for 'b').

Prefix Code Figure 4.10: The binary character code for Code C2 in Example 4.7 appears in (a), while the one for Code C3 (Huffman) appears in (b).

Variable-Length (Prefix) Code
bits(C1) = 16(3) + 5(3) + 12(3) + 17(3) + 10(3) + 25(3) = 255
bits(C2) = 16(2) + 5(5) + 12(4) + 17(3) + 10(5) + 25(1) = 231
bits(C3) = 16(2) + 5(4) + 12(3) + 17(2) + 10(4) + 25(2) = 212  (Huffman code)

Huffman Code (Optimal) Figure 4.11: Given the file whose frequencies are shown in Table 4.1, this shows the state of the subtrees, constructed by Huffman's algorithm, after each pass through the for-i loop. The first tree is the state before the loop is entered

Huffman Algorithm
Priority queue: the highest-priority (lowest-frequency) element is removed first.
for (i = 1; i <= n-1; i++) {
  remove(PQ, p);
  remove(PQ, q);
  r = new nodetype;
  r->left = p;
  r->right = q;
  r->frequency = p->frequency + q->frequency;
  insert(PQ, r);
}
remove(PQ, r);
return r;
Priority queue (heap): initialization O(n), each heap operation O(log n), so the Huffman algorithm is O(n log n).
Homework (stated on the slide).

Knapsack Problem
Let S = {item1, item2, …, itemn}
wi = weight of itemi
pi = profit of itemi
W = maximum weight the knapsack can hold
Determine a subset A of S that maximizes Σ(itemi ∈ A) pi subject to Σ(itemi ∈ A) wi ≤ W.
(ex) item1: $50, 5 kg ($50/5 = 10)
item2: $60, 10 kg ($60/10 = 6)
item3: $140, 20 kg ($140/20 = 7)
W = 30 kg

Example :0-1 Knapsack Figure 4.13: A greedy solution and an optimal solution to the 0-1 Knapsack problem.

Dynamic Programming: 0-1 Knapsack
For i > 0 and w > 0, let P[i][w] be the optimal profit obtained when choosing items only from the first i items under the restriction that the total weight cannot exceed w:
P[i][w] = max(P[i-1][w], pi + P[i-1][w - wi])   if wi <= w
P[i][w] = P[i-1][w]                             otherwise
Maximum profit = P[n][W], which can be computed from a 2-D array P with rows 0 to n and columns 0 to W.
P[0][w] = 0 and P[i][0] = 0.

Example: Dynamic Programming (Knapsack)
(ex) item1: $50, 5 kg ($50/5 = 10); item2: $60, 10 kg ($60/10 = 6); item3: $140, 20 kg ($140/20 = 7); W = 30 kg
Entries needed (working top-down from P[3][30]):
P[3][30]   (w3 = 20)
  needs P[2][30] and P[2][10]   (w2 = 10)
    P[2][30] needs P[1][30] and P[1][20]
    P[2][10] needs P[1][10] and P[1][0]
P[1][30] = $50, P[1][20] = $50, P[1][10] = $50, P[1][0] = $0

Example: Dynamic Programming
(ex) item1: $50, 5 kg; item2: $60, 10 kg; item3: $140, 20 kg; W = 30 kg
P[1][30] = $50, P[1][20] = $50, P[1][10] = $50, P[1][0] = $0
P[2][30] = max(P[1][30], $60 + P[1][20]) = $110    (p2 = $60)
P[2][10] = max(P[1][10], $60 + P[1][0]) = $60      (p2 = $60)
P[3][30] = max(P[2][30], $140 + P[2][10]) = $200   (p3 = $140)
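A short C sketch of this computation, written as a direct recursion on the recurrence above so that only the needed P[i][w] entries are evaluated (data are from the example; names are illustrative):

#include <stdio.h>

static int p[] = {0, 50, 60, 140};   /* profits of item1..item3 (index 0 unused) */
static int w[] = {0, 5, 10, 20};     /* weights */

static int max(int a, int b) { return a > b ? a : b; }

/* P(i, cap): best profit using items 1..i with total weight <= cap. */
static int P(int i, int cap)
{
    if (i == 0 || cap == 0) return 0;            /* P[0][w] = P[i][0] = 0 */
    if (w[i] > cap) return P(i - 1, cap);        /* item i cannot fit */
    return max(P(i - 1, cap),                    /* leave item i out */
               p[i] + P(i - 1, cap - w[i]));     /* or take item i */
}

int main(void)
{
    printf("max profit = $%d\n", P(3, 30));      /* prints 200, as in the example */
    return 0;
}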

Complexity: Dynamic Programming (Knapsack)
In the (n-i)th row, up to 2^i entries are computed.
Total number of entries = 1 + 2 + 2^2 + … + 2^(n-1) = 2^n - 1
Complexity: O(2^n)

Backtracking
◆ Path finding in a maze
● If a dead end is reached, pursue another path.
● If a sign were positioned near the beginning of the path, the time savings could be enormous.
◆ Backtracking
● After determining that a node can lead to nothing but dead ends, we go back (backtrack) to the parent node and proceed with the search on the next child.
● Pruning the nonpromising subtrees.

4 Queens Problem Figure 5.5: The actual chessboard positions that are tried when backtracking is used to solve the instance of the n-Queens problem in which n = 4. Each nonpromising position is marked with a cross.

n-Queens Problem
void queens(index i)
{
  index j;
  if (promising(i))
    if (i == n)
      cout << col[1] through col[n];
    else
      for (j = 1; j <= n; j++) {     // See if the queen in the (i+1)st row
        col[i+1] = j;                // can be positioned in each of
        queens(i+1);                 // the n columns.
      }
}

bool promising(index i)
{
  index k;
  bool switch;
  k = 1;
  switch = true;                     // Check if any queen threatens
  while (k < i && switch) {          // the queen in the ith row.
    if (col[i] == col[k] || abs(col[i] - col[k]) == i - k)
      switch = false;
    k++;
  }
  return switch;
}
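A self-contained C version that compiles and prints the solutions for the 4-Queens instance (the board size, array names, and use of a global col[] are illustrative choices):

#include <stdio.h>
#include <stdlib.h>

#define N 4                      /* board size from the 4-Queens example */

static int col[N + 1];           /* col[i] = column of the queen in row i */

/* True if the queen in row i clashes with no queen in rows 1..i-1. */
static int promising(int i)
{
    for (int k = 1; k < i; k++)
        if (col[i] == col[k] || abs(col[i] - col[k]) == i - k)
            return 0;
    return 1;
}

static void queens(int i)
{
    if (!promising(i)) return;               /* prune the nonpromising subtree */
    if (i == N) {                            /* all N queens placed */
        for (int r = 1; r <= N; r++) printf("%d ", col[r]);
        printf("\n");
        return;
    }
    for (int j = 1; j <= N; j++) {           /* try each column in row i+1 */
        col[i + 1] = j;
        queens(i + 1);
    }
}

int main(void)
{
    queens(0);                               /* prints 2 4 1 3 and 3 1 4 2 */
    return 0;
}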

Branch-and-Bound
◆ Exponential-time complexity in the worst case:
● dynamic programming
● backtracking
◆ Branch-and-bound algorithms
● are an improvement on the backtracking algorithm;
● place no limit on the way the tree is traversed (best-first or breadth-first);
● are used only for optimization problems (a bound determines whether a node is promising).

Branch-and-Bound
◆ A node is nonpromising if its (upper) bound is less than or equal to maxprofit (the value of the best solution found up to that point).
(ex) 0-1 Knapsack: each node stores weight and profit, the sums of the weights and profits of the items included up to that node. Is the node promising? (The bound must be computed to decide.)
Items are sorted by pi/wi.

Branch-and-Bound: 0-1 Knapsack
◆ Promising? If a node is at level i, and level k is the level at which the total weight would first exceed W, then
totweight = weight + Σ(j = i+1 .. k-1) wj
bound = (profit + Σ(j = i+1 .. k-1) pj) + (W - totweight) · pk / wk
◆ A node is nonpromising if bound <= maxprofit or weight >= W.

B&B Example
0-1 Knapsack problem (W = 16), items ordered according to pi/wi:
i    pi     wi    pi/wi
1    $40    2     $20
2    $30    5     $6
3    $50    10    $5
4    $10    5     $2
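For instance, at the root node (level 0, profit = 0, weight = 0), items 1 and 2 fit (2 + 5 = 7) but adding item 3 would exceed W = 16, so k = 3 and, by the bound formula above, bound = (0 + $40 + $30) + (16 - 7) × $50/10 = $70 + $45 = $115.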

B&B Knapsack : Breath-First ( =16) Figure 6.2: The pruned state space tree produced using breadth-first search with branch-and-bound pruning in Example 6.1. Stored at each node from top to bottom are the total profit of the items stolen up to that node, their total weight, and the bound on the total profit that could be obtained by expanding beyond the node. The node shaded in color is the one at which an optimal solution is found.

B&B Knapsack: Best-First Figure 6.3: The pruned state space tree produced using best-first search with branch-and-bound pruning in Example 6.2. Stored at each node from top to bottom are the total profit of the items stolen up to the node, their total weight, and the bound on the total profit that could be obtained by expanding beyond the node. The node shaded in color is the one at which an optimal solution is found.

Problem-Solving Approaches
◆ Behavioral approach
● Relationship between a stimulus (input) and a response (output),
● without speculating about the intervening process.
◆ Information-processing approach
● Based on the process that intervenes between input and output and leads to a desired goal from an initial state.
● Thinking to achieve a desired goal.
Rubinstein & Firstenberg, Patterns of Problem Solving, Prentice Hall, 1995

Model of Memory
Sensory register → short-term memory → long-term memory, with forgetting possible at each stage; the sensory register holds information for only about 0.1-0.5 sec.
◆ Sensory register
● Passes important information to higher-order systems.
● The rest quickly fades.
◆ Short-term memory (working memory)
● Limited capacity (the bottleneck).
● About 7±2 unrelated items (e.g., a 7-digit phone number).
◆ Long-term memory
● A network of interconnecting ideas, concepts, and facts.

Short-Term Memory (STM)
◆ Limited working memory
● 3×4 is easy.
● 5 + 3×144 is hard, since STM cannot retain all the subcalculations.
● You cannot remember a long sentence.
◆ Information in STM is replaced with competing information.
◆ Information is either transferred to LTM or lost.

Long-Term Memory (LTM)
◆ A network of interconnecting ideas
● Learning new information means integrating that information within the structure.
● The richer the cognitive structure already set up in LTM, the easier it is to learn new information.
● A familiar topic is easy to learn.
◆ Multiple relationships among the stored pieces of information (=> creative thinking)
● The richness and complexity lead to the easiest types of retrieval from memory.
● LTM cannot "fill up".

Forgetting ◆ Two theories of forgetting ● Changes during storage causing the information to decay. ● Failure to retrieve the information. ◆ Effective forgetting to update memories ● Need to know where we parked the car today (not yesterday). ◆ Difficult or impossible ● When a friend tells you a secret and adds “forget I ever said anything”.

Heap & Priority Queue
◆ Priority queue: the highest-priority element is always removed first. A PQ can be implemented as a linked list, but more efficiently as a heap.
◆ Heap: an essentially complete binary tree such that
● the values come from an ordered set, and
● the heap property is satisfied: value(parent node) >= value(child node).

A Heap
Essentially complete binary tree (of depth d):
● a complete binary tree down to a depth of d-1;
● nodes with depth d are as far to the left as possible.
Heap property: value(parent) >= value(child)
(The slide shows an example heap.)

siftdown
Input: a tree with the heap property at every node except possibly the root.
Output: a heap.
void siftdown(heap& H)                 // H starts out having the heap property
{                                      // for all nodes except the root;
  node parent, largerchild;            // H ends up a heap.
  parent = root of H;
  largerchild = parent's child containing the larger key;
  while (key at parent is smaller than key at largerchild) {
    exchange the keys at parent and largerchild;
    parent = largerchild;
    largerchild = parent's child containing the larger key;
  }
}

siftdown
Procedure siftdown sifts 6 down until the heap property is restored.

Remove Root
Remove the key at the root and restore the heap property.
keytype root(heap& H)
{
  keytype keyout;
  keyout = key at the root;
  move the key at the bottom node to the root;   // Bottom node is the far-right leaf.
  delete the bottom node;
  siftdown(H);                                   // Restore the heap property.
  return keyout;
}
Given a heap of n keys, place the keys in a sorted array S:
void removekeys(int n, heap H, keytype S[])      // O(n log n)
{
  index i;
  for (i = n; i >= 1; i--)
    S[i] = root(H);
}

Make Heap
Transform all subtrees whose roots have depth d-i into heaps, for i = 1, 2, …, d. Complexity: O(n).
void makeheap(int n, heap& H)     // H ends up a heap.
{
  index i;
  heap Hsub;                      // Hsub ends up a heap.
  for (i = d - 1; i >= 0; i--)    // Tree has depth d.
    for (all subtrees Hsub whose roots have depth i)
      siftdown(Hsub);
}

Make Heap Using siftdown to make a heap from an essentially complete binary tree. After the steps shown, the right subtree, whose root has depth d-2, must be made into a heap, and finally the entire tree must be made into a heap.

Make Heap (continued)
More subtrees with roots at depth d-2 are made into heaps, then those at depth d-3, and so on.

Heapsort
void heapsort(int n, heap H, keytype S[])   // H ends up a heap.
{
  makeheap(n, H);
  removekeys(n, H, S);
}
(The slide shows a heap and the array representation of the heap.)
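The pseudocode above works on an abstract heap; the sketch below is a compact array-based C version of the same three pieces (siftdown, makeheap, removekeys), using the standard "children of node i are 2i and 2i+1" indexing that the array representation suggests. The 1-based indexing and the sample keys are illustrative.

#include <stdio.h>

/* Sift the key at position i down until the subtree rooted at i is a heap.
   H[1..n] holds the keys; the children of i are 2i and 2i+1. */
void siftdown(int H[], int n, int i)
{
    while (2 * i <= n) {
        int larger = 2 * i;                         /* left child */
        if (larger + 1 <= n && H[larger + 1] > H[larger])
            larger++;                               /* right child is larger */
        if (H[i] >= H[larger]) break;               /* heap property holds */
        int t = H[i]; H[i] = H[larger]; H[larger] = t;
        i = larger;
    }
}

void makeheap(int H[], int n)                       /* O(n) overall */
{
    for (int i = n / 2; i >= 1; i--)                /* deepest internal nodes first */
        siftdown(H, n, i);
}

void removekeys(int H[], int n, int S[])            /* O(n log n) */
{
    for (int i = n; i >= 1; i--) {
        S[i] = H[1];                                /* remove the root (largest key) */
        H[1] = H[n--];                              /* move the bottom node to the root */
        siftdown(H, n, 1);                          /* restore the heap property */
    }
}

int main(void)
{
    int H[] = {0, 6, 30, 25, 10, 20, 15, 5};        /* keys in H[1..7]; H[0] unused */
    int S[8];
    makeheap(H, 7);
    removekeys(H, 7, S);
    for (int i = 1; i <= 7; i++) printf("%d ", S[i]);   /* 5 6 10 15 20 25 30 */
    return 0;
}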