Trees

Trees Definition: A tree is a connected undirected graph with no simple circuits. Since a tree cannot have a simple circuit, a tree cannot contain multiple edges or loops. Therefore, any tree must be a simple graph. Theorem: An undirected graph is a tree if and only if there is a unique simple path between any two of its vertices.

Trees Example: Are the following graphs trees? (For the four graphs shown on the slide: yes, no, no, yes.)

Trees Definition: An undirected graph that does not contain simple circuits and is not necessarily connected is called a forest. In general, we use trees to represent hierarchical structures. We often designate a particular vertex of a tree as the root. Since there is a unique path from the root to each vertex of the graph, we direct each edge away from the root. Thus, a tree together with its root produces a directed graph called a rooted tree.

Tree Terminology If v is a vertex in a rooted tree other than the root, the parent of v is the unique vertex u such that there is a directed edge from u to v. When u is the parent of v, v is called the child of u. Vertices with the same parent are called siblings. The ancestors of a vertex other than the root are the vertices in the path from the root to this vertex, excluding the vertex itself and including the root.

Tree Terminology The descendants of a vertex v are those vertices that have v as an ancestor. A vertex of a tree is called a leaf if it has no children. Vertices that have children are called internal vertices. If a is a vertex in a tree, then the subtree with a as its root is the subgraph of the tree consisting of a and its descendants and all edges incident to these descendants.

Tree Terminology The level of a vertex v in a rooted tree is the length of the unique path from the root to this vertex. The level of the root is defined to be zero. The height of a rooted tree is the maximum of the levels of vertices.

Trees Example I: A family tree (vertices: James, Christine, Bob, Frank, Joyce, Petra).

Trees Example II: The Linux file system (vertices: the root /, usr, bin, temp, bin, spool, ls).

Trees Example III: Arithmetic expressions. This tree represents the expression (y + z)*(x - y): the root is *, its left child is + (with children y and z), and its right child is - (with children x and y).
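
To make the correspondence concrete, here is a minimal Python sketch (our illustration; the Node class and evaluate function are assumed names, not part of the slides) that builds this expression tree and evaluates it for given values of x, y, and z.

class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value          # an operator ('+', '-', '*') or a variable name
        self.left = left
        self.right = right

def evaluate(node, env):
    """Recursively evaluate an expression tree, looking up variables in env."""
    if node.left is None and node.right is None:   # a leaf holds a variable
        return env[node.value]
    a = evaluate(node.left, env)
    b = evaluate(node.right, env)
    if node.value == '+':
        return a + b
    if node.value == '-':
        return a - b
    if node.value == '*':
        return a * b
    raise ValueError("unknown operator: " + node.value)

# The tree for (y + z)*(x - y): * at the root, + and - as its children.
expr = Node('*', Node('+', Node('y'), Node('z')),
                 Node('-', Node('x'), Node('y')))
print(evaluate(expr, {'x': 5, 'y': 2, 'z': 3}))    # (2 + 3)*(5 - 2) = 15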

Properties of Trees Theorem: A tree with n vertices has n - 1 edges. (Can be proven using induction.) Basis step: Let n = 1. A tree with 1 vertex has no edges, so the basis step holds. Inductive step: Suppose every tree with k vertices has k - 1 edges. Let T be a tree with k + 1 vertices, and let v be a leaf of T (which must exist because the tree is finite). Let w be the parent of v.

Properties of Trees Removing the vertex v and the edge connecting w to v produces a graph T' with k vertices. T' is a tree because it is still connected and has no simple circuits. By the inductive hypothesis, T' has k - 1 edges. It follows that T has k edges, because it has one more edge than T' (the edge connecting w and v). This completes the inductive step.
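
As an illustration of this property and of the definition above, here is a small Python sketch, assuming the graph is stored as an adjacency-list dict; the function name is_tree is ours, not from the slides. It tests whether an undirected graph is a tree by checking connectedness and the edge count n - 1.

from collections import deque

def is_tree(adj):
    """adj maps each vertex to the set of its neighbors (undirected graph).
    A graph is a tree iff it is connected and has exactly n - 1 edges."""
    n = len(adj)
    if n == 0:
        return False
    edge_count = sum(len(nbrs) for nbrs in adj.values()) // 2
    if edge_count != n - 1:
        return False
    # Check connectedness with a breadth-first search from an arbitrary vertex.
    start = next(iter(adj))
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u] - seen:
            seen.add(w)
            queue.append(w)
    return len(seen) == n

print(is_tree({'a': {'b', 'c'}, 'b': {'a'}, 'c': {'a'}}))             # True: a star on 3 vertices
print(is_tree({'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b'}}))   # False: a triangle (a simple circuit)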

Properties of Trees Definition: A forest is a graph that has no cycles and that is not necessarily connected. Example: This is a forest with three connected components

Trees Definition: A rooted tree is called an m-ary tree if every internal vertex has no more than m children. The tree is called a full m-ary tree if every internal vertex has exactly m children. An m-ary tree with m = 2 is called a binary tree. Theorem: A full m-ary tree with k internal vertices contains n = mk + 1 vertices.

Trees Theorem: There are at most m^h leaves in an m-ary tree of height h. (Proof is by induction.) Corollary: If an m-ary tree of height h has l leaves, then h ≥ ⌈log_m l⌉. If the m-ary tree is full and balanced, then h = ⌈log_m l⌉. (Recall that ⌈ ⌉ is the ceiling function.) (The first part follows from the theorem; the second part uses the definition of a balanced tree.)

Binary Search Trees If we want to perform a large number of searches in a particular list of items, it can be worthwhile to arrange these items in a binary search tree to facilitate the subsequent searches.

Binary Search Trees A binary search tree is a binary tree in which each child of a vertex is designated as a right or left child, and each vertex is labeled with a key, which is one of the items. When we construct the tree, vertices are assigned keys so that the key of a vertex is both larger than the keys of all vertices in its left subtree and smaller than the keys of all vertices in its right subtree.

Binary Search Trees Example: Construct a binary search tree for the strings math, computer, power, north, zoo, dentist, book (inserted in that order). The string math becomes the root. computer < math, so it becomes the left child of math. power > math, so it becomes the right child of math. north > math but < power, so it becomes the left child of power. zoo > math and > power, so it becomes the right child of power. dentist < math but > computer, so it becomes the right child of computer. Finally, book < math and < computer, so it becomes the left child of computer.

Binary Search Trees To perform a search in such a tree for an item x, we can start at the root and compare its key to x. If x is less than the key, we proceed to the left child of the current vertex, and if x is greater than the key, we proceed to the right one. This procedure is repeated until we either find the item we were looking for or we cannot proceed any further. In a balanced tree representing a list of n items, search can be performed with a maximum of ⌈log₂(n + 1)⌉ steps (compare with binary search).

Locating or adding an item to a Binary Search Tree

procedure insertion(T: binary search tree, x: item)
v := root of T   {a vertex not present in T has the value null}
while v != null and label(v) != x
    if x < label(v) then
        if left child of v != null then v := left child of v
        else add new vertex as a left child of v and set v := null
    else
        if right child of v != null then v := right child of v
        else add new vertex as a right child of v and set v := null
if root of T = null then add a vertex v to the tree and label it with x
else if v is null or label(v) != x then label new vertex with x and let v be this new vertex
return v   {v = location of x}
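
For readers who prefer running code, here is a minimal Python sketch of the same idea; the Node class and the insert function (which returns the vertex holding x, adding it first if necessary) are our names, not part of the pseudocode above.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, x):
    """Return (root, vertex whose key is x), adding x to the tree if it is absent."""
    if root is None:                  # empty tree: x becomes the root
        root = Node(x)
        return root, root
    v = root
    while True:
        if x == v.key:
            return root, v            # x is already in the tree
        elif x < v.key:
            if v.left is None:
                v.left = Node(x)      # add x as a new left child
                return root, v.left
            v = v.left
        else:
            if v.right is None:
                v.right = Node(x)     # add x as a new right child
                return root, v.right
            v = v.right

# Build the example tree from the slides:
root = None
for word in ["math", "computer", "power", "north", "zoo", "dentist", "book"]:
    root, _ = insert(root, word)
print(root.key, root.left.key, root.right.key)   # math computer power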

Decision Trees Definition: A decision tree represents a decision-making process. Each possible “decision point” or situation is represented by a node. Each possible choice that could be made at that decision point is represented by an edge to a child node. In the extended decision trees used in decision analysis, we also include nodes that represent random events and their outcomes.

Counterfeit Coin Problem Imagine you have 8 coins, one of which is a lighter counterfeit, and a free-beam balance. No scale or weight markings are required for this problem! How many weighings are needed to guarantee that the counterfeit coin will be found?

As a Decision-Tree Problem In each situation, we pick two disjoint and equal-size subsets of coins to put on the scale. The balance then “decides” whether to tip left, tip right, or stay balanced, so a given sequence of weighings yields a decision tree with branching factor 3.

Applying the Tree Height Theorem The decision tree must have at least 8 leaf nodes, since there are 8 possible outcomes (one for each coin that could be the counterfeit). Recall the tree-height corollary, h ≥ ⌈log_m l⌉. Thus the decision tree must have height h ≥ ⌈log_3 8⌉ = ⌈1.893…⌉ = 2. Let’s see if we can solve the problem with only 2 weighings…

General Solution Strategy The problem is an example of searching for one unique item among a list of n otherwise identical items, somewhat analogous to the proverbial search for a needle in a haystack. Armed with our balance, we can attack the problem using a divide-and-conquer strategy, like what’s done in binary search: we want to narrow the set of possible locations of the desired item (coin) from n down to just 1, in a logarithmic fashion.

General Solution Strategy Each weighing has 3 possible outcomes. Thus, we should use it to partition the search space into 3 pieces that are as close to equal-sized as possible. This strategy will lead to the minimum possible worst-case number of weighings required.

General Balance Strategy On each step, put ⌈n/3⌉ of the n coins to be searched on each side of the scale. If the scale tips to the left, then the lightweight fake is in the right set of ⌈n/3⌉ ≈ n/3 coins. If the scale tips to the right, then the lightweight fake is in the left set of ⌈n/3⌉ ≈ n/3 coins. If the scale stays balanced, then the fake is in the remaining set of n − 2⌈n/3⌉ ≈ n/3 coins that were not weighed! Except if n mod 3 = 1, then we can do a little better by weighing ⌊n/3⌋ of the coins on each side. You can prove that this strategy always leads to a balanced 3-ary tree.
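
The following Python sketch simulates this strategy (our illustration; the coin ids, the weights, and the function name find_light are made up, and it assumes the ⌈n/3⌉-per-pan reading of the strategy above). It repeatedly weighs one third of the coins against another third and keeps only the group that must contain the lighter counterfeit, counting weighings as it goes.

def find_light(coins, weight):
    """coins: list of coin ids; weight: dict id -> weight, with exactly one coin lighter.
    Returns (id of the light coin, number of weighings used)."""
    weighings = 0
    while len(coins) > 1:
        n = len(coins)
        k = (n + 2) // 3 if n % 3 != 1 else n // 3     # ceil(n/3), except floor when n mod 3 = 1
        left, right, rest = coins[:k], coins[k:2 * k], coins[2 * k:]
        weighings += 1
        wl = sum(weight[c] for c in left)
        wr = sum(weight[c] for c in right)
        if wl < wr:
            coins = left      # left pan is lighter, so the fake is on the left
        elif wr < wl:
            coins = right     # right pan is lighter, so the fake is on the right
        else:
            coins = rest      # the pans balance, so the fake was not weighed
    return coins[0], weighings

weights = {c: 10 for c in range(8)}
weights[5] = 9                # coin 5 is the lighter counterfeit
print(find_light(list(range(8)), weights))   # (5, 2): found with 2 weighings, as predicted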

Coin Balancing Decision Tree Here’s what the tree looks like in our case. The root weighing is coins 1,2,3 vs. 4,5,6, with three branches: the fake is among 1,2,3; the pans balance, so the fake is among 7,8; or the fake is among 4,5,6. The corresponding second weighings are 1 vs. 2, 7 vs. 8, and 4 vs. 5, and their outcomes (left, right, or balanced) identify the counterfeit as 1, 2, or 3; 7 or 8; and 4, 5, or 6, respectively.

Tree Traversal Unlike linked lists, one-dimensional arrays and other linear data structures, which are canonically traversed in linear order, trees may be traversed in multiple ways. They may be traversed in depth-first or breadth-first order. There are three common ways to traverse them in depth-first order: pre-order, in-order and post-order.

Tree Traversal: Pre-Order
1. Check if the current node is empty/null.
2. Display the data part of the node.
3. Traverse the left subtree by recursively calling the pre-order function.
4. Traverse the right subtree by recursively calling the pre-order function.
Pre-order: F, B, A, D, C, E, G, I, H

Tree Traversal: In-Order
1. Check if the current node is empty/null.
2. Traverse the left subtree by recursively calling the in-order function.
3. Display the data part of the node.
4. Traverse the right subtree by recursively calling the in-order function.
In-order: A, B, C, D, E, F, G, H, I

Tree Traversal: Post-Order
1. Check if the current node is empty/null.
2. Traverse the left subtree by recursively calling the post-order function.
3. Traverse the right subtree by recursively calling the post-order function.
4. Display the data part of the node.
Post-order: A, C, E, D, B, H, I, G, F

Tree Traversal: Breadth-First (Level-Order)
Visit every node on a level before going to a lower level.
Level-order: F, B, G, A, D, I, C, E, H
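
The figure for the example tree is not reproduced here, but it can be reconstructed from the orders above: F is the root, with children B and G; B has children A and D; D has children C and E; G has right child I; and I has left child H. The following Python sketch (with a Node class and function names of our choosing) rebuilds that tree and prints the four traversals.

from collections import deque

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def preorder(n):
    return [] if n is None else [n.value] + preorder(n.left) + preorder(n.right)

def inorder(n):
    return [] if n is None else inorder(n.left) + [n.value] + inorder(n.right)

def postorder(n):
    return [] if n is None else postorder(n.left) + postorder(n.right) + [n.value]

def levelorder(root):
    order, queue = [], deque([root])
    while queue:
        n = queue.popleft()
        if n is not None:
            order.append(n.value)
            queue.extend([n.left, n.right])
    return order

tree = Node('F',
            Node('B', Node('A'), Node('D', Node('C'), Node('E'))),
            Node('G', None, Node('I', Node('H'))))
print(preorder(tree))    # ['F', 'B', 'A', 'D', 'C', 'E', 'G', 'I', 'H']
print(inorder(tree))     # ['A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I']
print(postorder(tree))   # ['A', 'C', 'E', 'D', 'B', 'H', 'I', 'G', 'F']
print(levelorder(tree))  # ['F', 'B', 'G', 'A', 'D', 'I', 'C', 'E', 'H']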

Use of a stack It is very common to use a stack to keep track of: nodes to be visited next, or nodes that we have already visited. Typically, use of a stack leads to a depth-first visit order. Depth-first visit order is “aggressive” in the sense that it examines complete paths.

Use of a queue It is very common to use a queue to keep track of: nodes to be visited next, or nodes that we have already visited. Typically, use of a queue leads to a breadth-first visit order. Breadth-first visit order is “cautious” in the sense that it examines every path of length i before going on to paths of length i+1.
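
As a sketch of the contrast (our illustration; the node names, the children helper, and visit_order are assumptions), the same loop visits nodes depth-first when its pending collection is used as a stack and breadth-first when it is used as a queue.

from collections import deque

def visit_order(root, children, use_queue):
    """Visit all nodes reachable from root; children(n) returns a list of n's children.
    Popping from the right end (a stack) gives a depth-first order;
    popping from the left end (a queue) gives a breadth-first order."""
    pending = deque([root])
    order = []
    while pending:
        n = pending.popleft() if use_queue else pending.pop()
        order.append(n)
        pending.extend(children(n))
    return order

# The example tree as a dict: each node maps to the list of its children.
tree = {'F': ['B', 'G'], 'B': ['A', 'D'], 'G': ['I'],
        'A': [], 'D': ['C', 'E'], 'I': ['H'], 'C': [], 'E': [], 'H': []}
print(visit_order('F', lambda n: tree[n], use_queue=False))  # depth-first, e.g. F, G, I, H, B, D, ...
print(visit_order('F', lambda n: tree[n], use_queue=True))   # breadth-first: F, B, G, A, D, I, C, E, H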

Spanning trees Suppose you have a connected undirected graph (connected: every node is reachable from every other node; undirected: edges do not have an associated direction). Then a spanning tree of the graph is a subgraph that contains all the nodes, is connected, and has no cycles. (Figures: a connected, undirected graph, and four of the spanning trees of that graph.)

Finding a spanning tree To find a spanning tree of a graph, pick an initial node and call it part of the spanning tree, then do a search from the initial node: each time you find a node that is not in the spanning tree, add to the spanning tree both the new node and the edge you followed to get to it. (Figures: an undirected graph, one possible result of a BFS starting from the top, and one possible result of a DFS starting from the top.)
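
Here is a minimal Python sketch of that procedure, assuming the graph is given as an adjacency-list dict; spanning_tree_bfs is our name, and a DFS variant would simply pop from the other end of the deque. It returns the list of tree edges.

from collections import deque

def spanning_tree_bfs(adj, start):
    """Return the edges of a spanning tree of the connected undirected graph adj,
    found by a breadth-first search from start."""
    in_tree = {start}
    tree_edges = []
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in in_tree:              # new node: add it and the edge used to reach it
                in_tree.add(w)
                tree_edges.append((u, w))
                queue.append(w)
    return tree_edges

graph = {'a': ['b', 'c'], 'b': ['a', 'c', 'd'], 'c': ['a', 'b'], 'd': ['b']}
print(spanning_tree_bfs(graph, 'a'))          # [('a', 'b'), ('a', 'c'), ('b', 'd')]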

Minimizing costs Suppose you want to supply a set of houses (say, in a new subdivision) with electric power, water, sewage lines, and telephone lines. To keep costs down, you could connect these houses with a spanning tree (of, for example, power lines). However, the houses are not all equal distances apart, so to reduce costs even further, you could connect the houses with a minimum-cost spanning tree.

Minimum-cost spanning trees Suppose you have a connected undirected graph with a weight (or cost) associated with each edge. The cost of a spanning tree is the sum of the costs of its edges, and a minimum-cost spanning tree is a spanning tree that has the lowest cost. (Figures: a connected, undirected graph on the vertices A–F with edge weights 5, 6, 10, 11, 14, 16, 18, 19, 21, and 33, and a minimum-cost spanning tree of it using the edges of weight 5, 6, 11, 16, and 18.)

Finding spanning trees There are two basic algorithms for finding minimum-cost spanning trees, and both are greedy algorithms. Kruskal’s algorithm: start with no nodes or edges in the spanning tree, and repeatedly add the cheapest edge that does not create a cycle; here, we consider the spanning tree to consist of edges only. Prim’s algorithm: start with any one node in the spanning tree, and repeatedly add the cheapest edge, and the node it leads to, for which the node is not already in the spanning tree; here, we consider the spanning tree to consist of both nodes and edges.

Kruskal’s algorithm

T = empty spanning tree;
E = set of edges;
N = number of nodes in graph;
while T has fewer than N - 1 edges {
    remove an edge (v, w) of lowest cost from E
    if adding (v, w) to T would create a cycle
        then discard (v, w)
        else add (v, w) to T
}

Finding an edge of lowest cost can be done just by sorting the edges. Efficient testing for a cycle requires a fairly complex algorithm (UNION-FIND), which we don’t cover in this course.
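
For reference, here is a compact Python sketch of Kruskal's algorithm with a simple UNION-FIND structure (our illustration; the small example graph at the bottom is made up and is not the weighted graph from the earlier figure).

def kruskal(nodes, edges):
    """edges: list of (cost, v, w) triples. Returns the edges of a minimum-cost spanning tree."""
    parent = {v: v for v in nodes}

    def find(v):                          # find the representative of v's component
        while parent[v] != v:
            parent[v] = parent[parent[v]] # path halving keeps the trees shallow
            v = parent[v]
        return v

    tree = []
    for cost, v, w in sorted(edges):      # consider edges in order of increasing cost
        rv, rw = find(v), find(w)
        if rv != rw:                      # different components, so no cycle is created
            parent[rv] = rw               # union the two components
            tree.append((cost, v, w))
            if len(tree) == len(nodes) - 1:
                break
    return tree

example_edges = [(1, 'A', 'B'), (4, 'A', 'C'), (3, 'B', 'C'), (2, 'B', 'D'), (5, 'C', 'D')]
print(kruskal('ABCD', example_edges))     # [(1, 'A', 'B'), (2, 'B', 'D'), (3, 'B', 'C')]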

Prim’s algorithm

T = a spanning tree containing a single node s;
E = set of edges adjacent to s;
while T does not contain all the nodes {
    remove an edge (v, w) of lowest cost from E
    if w is already in T then discard edge (v, w)
    else {
        add edge (v, w) and node w to T
        add to E the edges adjacent to w
    }
}

An edge of lowest cost can be found with a priority queue. Testing for a cycle is automatic. Hence, Prim’s algorithm is far simpler to implement than Kruskal’s algorithm.
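
A matching Python sketch using the standard heapq module as the priority queue (again our illustration, on the same made-up example graph as above, stored as an adjacency list of (cost, neighbor) pairs):

import heapq

def prim(adj, s):
    """adj: dict mapping each node to a list of (cost, neighbor) pairs.
    Returns the edges of a minimum-cost spanning tree grown from s."""
    in_tree = {s}
    tree = []
    frontier = [(cost, s, w) for cost, w in adj[s]]   # edges adjacent to s
    heapq.heapify(frontier)
    while frontier and len(in_tree) < len(adj):
        cost, v, w = heapq.heappop(frontier)          # cheapest edge leaving the tree so far
        if w in in_tree:
            continue                                  # would create a cycle: discard it
        in_tree.add(w)
        tree.append((cost, v, w))
        for c2, x in adj[w]:                          # add the edges adjacent to w
            if x not in in_tree:
                heapq.heappush(frontier, (c2, w, x))
    return tree

adj = {'A': [(1, 'B'), (4, 'C')],
       'B': [(1, 'A'), (3, 'C'), (2, 'D')],
       'C': [(4, 'A'), (3, 'B'), (5, 'D')],
       'D': [(2, 'B'), (5, 'C')]}
print(prim(adj, 'A'))                                 # [(1, 'A', 'B'), (2, 'B', 'D'), (3, 'B', 'C')]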

Mazes Typically, every location in a maze is reachable from the starting location, and there is only one path from start to finish. If the cells are “vertices” and the open doors between cells are “edges,” this describes a spanning tree. Since there is exactly one path between any pair of cells, any cells can be used as “start” and “finish.”

Mazes as spanning trees While not every maze is a spanning tree, most can be represented as such: the nodes are “places” within the maze, and there is exactly one cycle-free path from any node to any other node.

Building a maze I This algorithm requires two sets of cells: the set of cells already in the spanning tree, IN, and the set of cells adjacent to the cells in the spanning tree (but not in it themselves), called the FRONTIER. Start with all walls present. Pick any cell and put it into IN (shown in red). Put all adjacent cells that aren’t in IN into FRONTIER (shown in blue).

Building a maze II Repeatedly do the following:
1. Remove any one cell C from FRONTIER and put it in IN.
2. Erase the wall between C and some one adjacent cell in IN.
3. Add to FRONTIER all the cells adjacent to C that aren’t in IN (or in FRONTIER already).
Continue until there are no more cells in FRONTIER. When the maze is complete (or at any time), choose the start and finish cells.
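
A small Python sketch of this frontier algorithm on a grid (our illustration; the cell coordinates, the random choices, and the name build_maze are assumptions, not part of the slides). It returns the set of opened doors, which form the edges of a spanning tree of the grid of cells.

import random

def build_maze(rows, cols):
    """Carve a maze on a rows x cols grid; return the set of open doors (tree edges)."""
    def neighbors(cell):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= r + dr < rows and 0 <= c + dc < cols:
                yield (r + dr, c + dc)

    start = (random.randrange(rows), random.randrange(cols))
    in_set = {start}                                   # IN: cells already in the spanning tree
    frontier = set(neighbors(start))                   # FRONTIER: cells adjacent to IN
    doors = set()
    while frontier:
        cell = random.choice(sorted(frontier))         # remove any one cell C from FRONTIER
        frontier.remove(cell)
        in_set.add(cell)                               # ...and put it in IN
        into = random.choice([n for n in neighbors(cell) if n in in_set])
        doors.add((cell, into))                        # erase the wall between C and a cell in IN
        frontier.update(n for n in neighbors(cell) if n not in in_set)
    return doors

maze = build_maze(4, 4)
print(len(maze))    # 15: a spanning tree of the 16 cells has 16 - 1 edges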