Parallel Parentheses Matching Plus Some Applications


1 Parallel Parentheses Matching Plus Some Applications

2 Parentheses Matching Problem Definition: Given a well-formed (balanced) sequence of parentheses stored in an array, determine the index of the mate of each parenthesis in the array.

3 Example ((())()) – the mates are the pairs (1,8), (2,5), (3,4), (6,7).

4 Sequential Solution
The traditional solution uses a stack: push each left parenthesis; on a right parenthesis, pop – the popped left parenthesis and the current right parenthesis form a matching pair.
Can this method be implemented in parallel? Why or why not?
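The stack method is easy to state in code. A minimal sequential sketch in Python (the function name match_sequential and the 0-based indexing are illustrative choices, not from the slides):

def match_sequential(s):
    """Match a balanced parenthesis string with a stack; mate[i] is the partner of index i."""
    mate, stack = [None] * len(s), []
    for i, c in enumerate(s):
        if c == '(':
            stack.append(i)            # wait here for this parenthesis's partner
        else:
            j = stack.pop()            # the most recent unmatched '(' is the mate
            mate[j], mate[i] = i, j
    return mate

print(match_sequential("((())())"))    # [7, 4, 3, 2, 1, 6, 5, 0]

The pop that pairs each ")" depends on every earlier push, which is why this sequential method does not parallelize directly.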

5 Parallel Solution: Divide & Conquer
Lemma 1: The mate of a parenthesis at an odd position in a balanced string lies at an even position (and vice versa).
Lemma 2: If a balanced string has no left parenthesis at an even position (or, equivalently, no right parenthesis at an odd position), then the mate of each left parenthesis lies immediately to its right.

6 Lemma 2
Any string that satisfies the hypothesis of Lemma 2 has the form ( )( )( )…( ) and is referred to as form F.
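Checking form F is a one-line test of the Lemma 2 condition; a small Python sketch (is_form_F is an illustrative name):

def is_form_F(s):
    """True iff s reads ()()()...(), i.e. no '(' sits at an even (1-based) position."""
    return all((c == '(') == (i % 2 == 1) for i, c in enumerate(s, start=1))

print(is_form_F("()()()"))   # True
print(is_form_F("(())"))     # False: a '(' occurs at position 2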

7 Algorithm Overview
Each left parenthesis at an odd position and each right parenthesis at an even position is marked 0; all others are marked 1.
These two disjoint sets are copied (packed) into a new array, the 0-marked items first.
Repeat for each of the two sets.
Stop when every substring is of form F.

8 Algorithm Match
For i = 1 to log n – 1 do
  Mark each parenthesis: 0 if it is a "(" at an odd position or a ")" at an even position within its substring, 1 otherwise
  Use segmented prefix sums to compute the new index of each parenthesis
  Move the parentheses to their new locations
Determine whether every substring is now of form F; if not, terminate – the string is unbalanced
Match the parentheses and store the results in the original array

9 Example
((())())   original string
01001110   marks: 0 for a "(" at an odd position or a ")" at an even position
(())()()   after packing the 0-marked items followed by the 1-marked items
0110       marks for the substring (()); the substring ()() is already of form F
()()()()   after the second packing every substring is of form F

10 Example – Keep Original Index
(1 (2 (3 )4 )5 (6 )7 )8    the string with each parenthesis subscripted by its original index
(1 (3 )4 )8 (2 )5 (6 )7    after the first packing
(1 )8 (3 )4 (2 )5 (6 )7    after the second packing: everything is of form F, so adjacent parentheses are mates, giving the pairs (1,8), (3,4), (2,5), (6,7)
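Putting slides 7-10 together, the following Python sketch simulates the whole divide & conquer procedure sequentially (match_parens and the list-based group bookkeeping are illustrative; a PRAM implementation would do the packing with segmented prefix sums rather than list comprehensions):

def match_parens(s):
    """Simulate Algorithm Match: repeatedly mark each group and pack it into its
    0-set and 1-set; once a group reaches form F, adjacent entries are mates."""
    mate = [None] * len(s)
    groups = [list(range(len(s)))]            # groups of original indices
    while groups:
        pending = []
        for g in groups:
            # Marking rule of slide 7, with positions counted inside the group:
            # '(' at an odd position or ')' at an even position goes to set 0.
            zeros = [idx for k, idx in enumerate(g) if (s[idx] == '(') == (k % 2 == 0)]
            ones = [idx for k, idx in enumerate(g) if (s[idx] == '(') != (k % 2 == 0)]
            if not ones:
                # Every entry is 0-marked, so the group is of form F (Lemma 2)
                # and adjacent parentheses are mates.
                for k in range(0, len(g), 2):
                    mate[g[k]], mate[g[k + 1]] = g[k + 1], g[k]
            else:
                # Lemma 1: mates never cross between the two sets, so recurse on both.
                pending += [zeros, ones]
        groups = pending
    return mate

print(match_parens("((())())"))   # the pairs 1-8, 2-5, 3-4, 6-7 of slide 10 (0-based output)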

11 Segmented Prefix Sum Problem Definition: Given an array of elements, each marked 0 or 1, compute the prefix sum within each subset. (For this application the values summed are all 1, so the prefix sums simply number the items within their subset.)

12 Segmented Prefix Sum - Example
((())()) with marks 01001110: the 0-marked items receive ranks 1, 2, 3, 4 and the 1-marked items receive ranks 1, 2, 3, 4.
How can this be accomplished with a single prefix-sums operation?
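One way to answer the question, sketched in Python below: prefix-sum the marks once; an element marked 1 at position i then has rank S[i] within the 1's, and an element marked 0 has rank i - S[i] within the 0's (segmented_ranks is an illustrative name, and the loop stands in for the parallel prefix-sums step):

def segmented_ranks(marks):
    """Rank every element within its own mark class using one prefix-sum pass."""
    ranks, ones_so_far = [], 0
    for i, m in enumerate(marks, start=1):
        ones_so_far += m                      # running prefix sum of the 1-marks
        ranks.append(ones_so_far if m else i - ones_so_far)
    return ranks

print(segmented_ranks([0, 1, 0, 0, 1, 1, 1, 0]))   # [1, 1, 2, 3, 2, 3, 4, 4]

The packed index then follows directly: a 0-marked item moves to its rank, and a 1-marked item moves to (number of 0-marked items) + its rank.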

13 Parentheses Matching on Hypercube
Use the divide & conquer strategy.
Consider 2 processors: each processor assigns the 0/1 marks to its entries; P0 sends its 1's to P1 and P1 sends its 0's to P0; each processor then solves its sub-problem.
Does the problem split evenly?
For a larger problem, P0 & P2 take the 0-marked items and P1 & P3 take the 1-marked items.

14 PPM - Hypercube
Overview of the algorithm
2-Cube: the special case of a 2-processor hypercube
4-Cube: used to partition large sub-problems across 4-processor subcubes
Match: the driving algorithm

15 Data Distribution
The INPUT array is divided into p equal partitions of size n/p.
The first n/p items are given to P0, the next n/p items to P1, and so on.
The final MATCH information for each item is stored on the processor that originally held it.
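The block distribution itself is simple; a short Python sketch assuming p divides n (distribute is an illustrative name):

def distribute(input_array, p):
    """Give the first n/p items to P0, the next n/p to P1, and so on."""
    block = len(input_array) // p
    return [input_array[r * block:(r + 1) * block] for r in range(p)]

print(distribute(list("(()(()))"), 2))   # [['(', '(', ')', '('], ['(', ')', ')', ')']]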

16 Data Distribution
Each array element consists of INPUT, which holds the parenthesis, and MATCH, which will hold the final matching information.
Local match: the match is determined by the processor on which the match information is to be stored.
Non-local match: otherwise.

17 Algorithm 2-Cube
Mark the left and right parentheses 0/1 as previously discussed
P0 & P1 exchange entries so that P0 contains the 0-marked items and P1 the 1-marked items
Each processor uses a stack to sequentially match its parentheses
Send the non-local match information to the appropriate processor

18 Algorithm 2-Cube – Step 1
Input (()(())) distributed over the two processors; mark 0/1:
P0: ( ( ) (   marks 0 1 1 1
P1: ( ) ) )   marks 0 0 1 0

19 Algorithm 2-Cube – Steps 2 & 3
Exchange & match: (())()() – P0 now holds the 0-marked items ( ( ) ) from original positions 1, 5, 6, 8 and P1 holds the 1-marked items ( ) ( ) from positions 2, 3, 4, 7; each matches its substring with a stack, giving the pairs (1,8), (5,6) on P0 and (2,3), (4,7) on P1.

20 Algorithm 2-Cube – Step 4
(()(())) Send the non-local match information: every pair involving a position that originally resided on the other processor is sent back, so the MATCH field of each position ends up on its original processor.
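A sequential Python simulation of the full 2-Cube scheme on this example (two_cube_match is an illustrative name; the index lists p0 and p1 stand in for the two processors, and step 4's communication is implicit because the results are written into one shared array):

def two_cube_match(s):
    """2-Cube sketch: mark, exchange so that P0 holds the 0's and P1 the 1's,
    match each half with a stack, and report the mate of every index."""
    # Step 1: marking rule of slide 7 ('(' at an odd or ')' at an even position -> 0).
    mark = [0 if (c == '(') == (i % 2 == 1) else 1 for i, c in enumerate(s, start=1)]
    # Step 2: exchange entries between the two processors.
    p0 = [i for i in range(len(s)) if mark[i] == 0]
    p1 = [i for i in range(len(s)) if mark[i] == 1]
    # Step 3: each processor matches its balanced substring sequentially.
    mate = [None] * len(s)
    for held in (p0, p1):
        stack = []
        for idx in held:
            if s[idx] == '(':
                stack.append(idx)
            else:
                j = stack.pop()
                mate[j], mate[idx] = idx, j
    return mate

print(two_cube_match("(()(()))"))   # the pairs 1-8, 2-3, 4-7, 5-6 (0-based output)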

21 Algorithm 4-Cube
The basis of Algorithm Match; ensures a near-equal data distribution.
Overview
Phase 1: local matches are determined sequentially & communicated
Phase 2: the unmatched parentheses are marked and redistributed; P0 & P1 receive the 0's and P2 & P3 the 1's (half each)

22 Algorithm 4-Cube
Phase 1: Sequential Processing
1. Each processor uses a stack to match within its block
2. Send the non-local match information to the appropriate processor
3. Count the unmatched parentheses; prefix sum to reindex them
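A Python sketch of the Phase 1 work on one processor (local_match and the offset argument are illustrative; the unmatched parentheses it returns are exactly the entries that step 3 counts and Phase 2 redistributes):

def local_match(block, offset):
    """Match whatever can be matched inside one processor's block with a stack,
    and report the global indices of the parentheses left unmatched."""
    mate, stack, unmatched_right = {}, [], []
    for i, c in enumerate(block, start=offset):
        if c == '(':
            stack.append(i)
        elif stack:
            j = stack.pop()
            mate[j], mate[i] = i, j          # matched locally
        else:
            unmatched_right.append(i)        # ')' whose mate lies in an earlier block
    # 'stack' still holds every '(' whose mate lies in a later block.
    return mate, unmatched_right, stack

print(local_match("(()(", 0))   # ({1: 2, 2: 1}, [], [0, 3])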

23 Algorithm 4-Cube
Phase 2
1. Mark the remaining parentheses 0/1
2. P0 & P2 exchange: P0 keeps the 0's and P2 the 1's; likewise P1 keeps the 0's and P3 the 1's
3. P0 & P1 exchange count information; likewise P2 & P3
4. P0 & P1 exchange entries so that P0 obtains the 1st half of the 0-marked items; likewise P2 & P3 for the 1-marked items

24 Algorithm 4-Cube
Distribution of Data – 64 entries
      Initially   After the P0-P2 / P1-P3 exchange   After the P0-P1 / P2-P3 exchange
P0    1-16        1-16, 33-48 (0)                    1st half of the (0) entries
P1    17-32       17-32, 49-64 (0)                   2nd half of the (0) entries
P2    33-48       1-16, 33-48 (1)                    1st half of the (1) entries
P3    49-64       17-32, 49-64 (1)                   2nd half of the (1) entries

25 Algorithm Hypercube Match
1. Each subcube of size 4 executes 4-Cube, logically yielding independent subproblems on subcubes of size p/2
2. Each subcube of size p/2 uses prefix sums to determine the new index of each parenthesis
3. Recursively repeat steps 1-3 on each subcube until the subcubes are of size 2
4. Execute 2-Cube to complete the solution

26 Algorithm Hypercube Match
Complexity Analysis
O(log² p + (n/p) log p)
For p = n / log n this simplifies to O(log² n).
Is the algorithm optimal? (The total cost is then p · O(log² n) = O(n log n), compared with the O(n) sequential stack algorithm.)