CS 345: Chapter 10 Parallelism, Concurrency, and Alternative Models Or, Getting Lots of Stuff Done at Once.

Parallelism is the use of many processors working together to solve a single problem. To ask whether parallelism can turn some intractable problems into tractable ones, we first need to distinguish between fixed parallelism and expanding parallelism.

Fixed Parallelism: the number of processors stays fixed as the problem size grows.
Expanding Parallelism: the number of processors available grows with the problem size.
Since the number of processors available affects the running time of a parallel algorithm, how do we balance time against hardware? (A rough comparison is sketched below.)
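
As an illustrative calculation (my own, not from the text): with a fixed number p of processors a parallel algorithm gains at most a constant-factor speedup, so an exponential-time problem stays exponential; only expanding parallelism can change the asymptotic picture.

    % Fixed parallelism: at best a factor-of-p speedup.
    T_{\text{fixed}}(n) \;\ge\; T_{\text{seq}}(n)/p,
    \qquad T_{\text{seq}}(n) = 2^{n} \;\Rightarrow\; T_{\text{fixed}}(n) \ge 2^{n}/p \ \text{(still exponential)}.
    % Expanding parallelism: the processor count may grow with n,
    % so the asymptotic time itself can drop (see the summation example below).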

Product Complexity: time × size (the number of processors). See Figure 10.2, page 262. Note that since any parallel algorithm can be simulated step by step on a single processor, the best product complexity cannot be lower than the problem's sequential lower bound.
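
A worked example of this trade-off (my own illustration, not from the book): summing n numbers with a balanced tree of additions.

    % n/2 processors add disjoint pairs each round, halving the problem:
    \text{time} = O(\log n), \qquad \text{size} = n/2 \ \text{processors},
    \qquad \text{product} = O(n \log n) \;\ge\; \Omega(n),
    % where \Omega(n) is the sequential lower bound for summation,
    % consistent with the rule that the product cannot beat the sequential bound.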

Among the difficulties of parallel algorithms is the question, "How should processors communicate?" There are two basic approaches:
– Shared memory
– Fixed networks

Shared Memory
Machines of this type are called multiprocessors. Is the shared memory used only for reading values, or can processors also write to it? If writing is allowed, there must be some method for resolving conflicts, for example when two processors write to the same location at the same time.
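
A minimal sketch of the write-conflict issue, simulated with Python threads (the names and the lock-based resolution are my own illustration, not the book's model):

    import threading

    shared_counter = 0          # one cell of "shared memory"
    lock = threading.Lock()     # one possible way to resolve write conflicts

    def worker(increments):
        global shared_counter
        for _ in range(increments):
            # Without the lock, two workers could read the same old value
            # and one update would be lost; the lock serializes the writes.
            with lock:
                shared_counter += 1

    threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(shared_counter)       # 40000 with the lock; possibly less without it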

Fixed-Network Machines
Computers of this type are called multicomputers. Each processor is connected to a fixed number of neighbors, and each processing element has its own private memory. There are many types of network designs: meshes and tori, fat trees, hypercubes, and others based on graph theory.
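
As a small illustration of one such design (my own sketch, not from the text): in a d-dimensional hypercube of 2^d processors, node i is connected to exactly the d nodes whose labels differ from i in one bit.

    def hypercube_neighbors(i, d):
        """Neighbors of node i in a d-dimensional hypercube (2**d nodes)."""
        return [i ^ (1 << bit) for bit in range(d)]

    # Example: in a 3-dimensional hypercube (8 nodes), node 0 has neighbors 1, 2, 4.
    print(hypercube_neighbors(0, 3))    # [1, 2, 4]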

Some networks are designed for specific problems; others are general-purpose but suit some algorithms better than others. Some issues with large multicomputers:
– Which communication model is used?
  Store-and-forward
  Wormhole routing
  Virtual cut-through
– Deadlock and livelock
– Fault tolerance
(A rough latency comparison of the routing models is sketched below.)
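
A back-of-the-envelope comparison of two of the routing models (the formulas are standard first-order approximations and the numbers are illustrative assumptions, not from the slides):

    def store_and_forward_latency(hops, msg_bytes, bandwidth):
        # The whole message is received and retransmitted at every hop.
        return hops * (msg_bytes / bandwidth)

    def wormhole_latency(hops, msg_bytes, bandwidth, flit_bytes=8):
        # Only the small header flit pays the per-hop cost; the body is pipelined behind it.
        return hops * (flit_bytes / bandwidth) + msg_bytes / bandwidth

    # Example: 10 hops, a 1 KB message, 1 GB/s links.
    print(store_and_forward_latency(10, 1024, 1e9))   # ~1.0e-5 seconds
    print(wormhole_latency(10, 1024, 1e9))            # ~1.1e-6 seconds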

Systolic Networks
One type of network for parallel processing is a systolic network (see Figure 10.5, page 266). It works like an assembly line: each processor performs a fixed task on some data and passes the result on to its neighbor. In the book's example there is an n × m matrix; a sequential algorithm takes O(nm) time, but a systolic network takes O(n + m).
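
A minimal lock-step simulation of the assembly-line idea (my own sketch, not the book's figure): pushing n items through m pipeline stages takes n + m - 1 steps, rather than roughly n·m steps for processing one item through all stages at a time.

    def systolic_pipeline(items, stages):
        """Simulate m pipeline stages firing simultaneously in lock-step."""
        n, m = len(items), len(stages)
        vals = [None] * m                  # value held by each stage after the current step
        results, steps, fed = [], 0, 0
        while len(results) < n:
            new_vals = [None] * m
            # Stage 0 consumes the next input; stage i consumes what stage i-1
            # produced in the previous step.
            new_vals[0] = stages[0](items[fed]) if fed < n else None
            for i in range(1, m):
                new_vals[i] = stages[i](vals[i - 1]) if vals[i - 1] is not None else None
            vals, fed, steps = new_vals, fed + 1, steps + 1
            if vals[-1] is not None:       # the last stage emits a finished result
                results.append(vals[-1])
        return results, steps

    stages = [lambda x: x + 1, lambda x: x * 2]       # two toy stages
    print(systolic_pipeline(list(range(5)), stages))  # ([2, 4, 6, 8, 10], 6) -> 5 + 2 - 1 steps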

Can a parallel algorithm solve an undecidable problem? No. Since any parallel algorithm can be serialized, a parallel algorithm cannot solve an undecidable or noncomputable problem.

Can a parallel algorithm turn an intractable problem into a tractable one? Many problems in NP, including some NP-complete problems, have been shown to have polynomial-time parallel solutions. However:
– The number of processors required for these solutions is exponential.
– NP-complete problems are not known to be intractable, so this does not necessarily mean that parallelism can remove inherent intractability.
– It is not clear that a parallel algorithm that uses only a polynomial number of instructions, but requires an exponential number of processors, can actually run in polynomial time on a real computer.
Thus, whether a parallel algorithm can remove inherent intractability is still an open question. (A sketch of the polynomial-time, exponential-processor idea follows.)
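
To make the "polynomial parallel time, exponential hardware" point concrete, here is a hedged sketch (my own illustration, not from the text): checking a single truth assignment of a CNF formula takes polynomial time, so 2^n processors, one per assignment, could decide satisfiability in parallel polynomial time, but the processor count is exponential.

    from itertools import product

    def check_assignment(clauses, assignment):
        """The polynomial-time work one 'processor' does for a single assignment.
        Each clause is a list of ints: +i means variable i, -i means its negation."""
        return all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
                   for clause in clauses)

    def brute_force_sat(clauses, n_vars):
        # Sequentially simulating what 2**n_vars processors would each do once, in parallel.
        return any(check_assignment(clauses, a)
                   for a in product([False, True], repeat=n_vars))

    # (x1 or not x2) and (x2 or x3)
    print(brute_force_sat([[1, -2], [2, 3]], 3))    # True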

The Parallel Computation Thesis
Part of this thesis is the claim that parallel time and sequential memory (space) are polynomially related. If a problem can be solved sequentially using space S for inputs of length N, then it can be solved in parallel time that is no worse than polynomial in S.

Conversely, if a problem can be solved in parallel time T for inputs of length N, it can be solved sequentially using memory bounded by a polynomial in T.

So any problem solvable by a sequential algorithm in polynomial space can be solved by a parallel algorithm in polynomial time, and vice versa. That is:
Sequential-PSpace = Parallel-PTime
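
Stated symbolically (my own rendering of the two directions above, in the notation of the slides):

    % Sequential space S translates into parallel time polynomial in S, and back:
    \mathrm{SeqSPACE}\bigl(S(N)\bigr) \subseteq \mathrm{ParTIME}\bigl(\mathrm{poly}(S(N))\bigr),
    \qquad
    \mathrm{ParTIME}\bigl(T(N)\bigr) \subseteq \mathrm{SeqSPACE}\bigl(\mathrm{poly}(T(N))\bigr).
    % Taking S and T to be polynomials in N gives
    \text{Sequential-PSpace} = \text{Parallel-PTime}.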

Thus the question of whether there are intractable problems that become tractable through parallelism comes down to the question: does PSpace contain inherently intractable problems (equivalently, is P a proper subset of PSpace)? This, like the question "Does P = NP?", is still open.

The Class NC (Nick's Class)
In general, a polynomial-time parallel algorithm cannot automatically be considered tractable, since it may require an exponential number of processors. A problem is in NC if it has a parallel algorithm that:
– runs in polylogarithmic time, i.e., O(log^k n) for some constant k, and
– uses only a polynomial number of processors.
(A classic example is sketched below.)
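
A standard example of an NC-style algorithm (my own sketch, simulated sequentially here): summing n numbers in O(log n) rounds, using at most n/2 "processors" per round.

    import math

    def parallel_sum_rounds(values):
        """Simulate tree summation: each round, about n/2 processors add disjoint pairs."""
        vals, rounds = list(values), 0
        while len(vals) > 1:
            pairs = [vals[i] + vals[i + 1] for i in range(0, len(vals) - 1, 2)]
            if len(vals) % 2:              # an odd leftover element carries over unchanged
                pairs.append(vals[-1])
            vals, rounds = pairs, rounds + 1
        return vals[0], rounds

    total, rounds = parallel_sum_rounds(range(16))
    print(total, rounds, math.ceil(math.log2(16)))   # 120 4 4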

All problems in NC are also in P, but it is not known if the converse is true. Most researchers believe that the two classes are not the same. Thus, we have:
NC ⊆ P ⊆ NP ⊆ PSpace

Conjecture 1
There are problems solvable in reasonable sequential space (equivalently, reasonable parallel time) that cannot be solved in reasonable sequential time, even using nondeterminism. (In the notation above: NP ≠ PSpace.)

Conjecture 2
There are problems solvable in reasonable sequential time only if nondeterminism is used. (In the notation above: P ≠ NP.)

Conjecture 3
There are problems solvable in reasonable sequential space, but not in very fast parallel time using a reasonable amount of hardware.