
Parallel Programming Models EECC 756 David D. McGann 18 May, 1999.



2 Topics
What is a programming model?
Why do we need programming models?
Goals of a good parallel programming model
Categories of parallel programming models
–Levels of abstraction
–Task creation and communication
Example parallel programming models

3 What is a programming model?
An abstract machine that provides an interface between developers and the system below.

4 Why do we need a programming model?
Separates software development issues from parallel execution issues.
Simplifies software development by eliminating dependencies on the system architecture.

5 Goals of a good parallel programming model
Easy to program
–Should hide execution details from the developer.
Software development methodology
–Needs a method for non-runtime debugging.
Architecture independent
–Portability.
Understandable
–Maintenance.
Efficiently implementable
–Performance should not suffer due to the level of abstraction and portability.
Low cost
–Maintenance, portability, performance, etc.

6 Categories of programming models (in terms of abstraction)
Listed in order of increasing developer control:
Nothing explicit
–Graph reduction, skeletons, communication-limited skeletons.
Explicit parallelism
–Dataflow, single-data-structure skeletons, data parallelism.
Explicit decomposition
–BSP.
Explicit mapping
–RPC, graphical languages, communication skeletons.
Explicit communication
–Process nets, internal object-oriented languages, systolic arrays.
Everything explicit
–PVM, MPI.

7 Nothing Explicit (Graph Reduction)
Functions are broken up into tree structures.
Sub-structures of the tree are selected and computed.
If sub-structures are independent, they can be computed concurrently.
The process continues until nothing is left to evaluate; what remains is the solution.
Tasks are created dynamically.

8 Graph Reduction
[Diagram: expression tree for a function F over inputs x, y, z, built from the operators +, *, ^3, ^2, -, sqrt, and /.]
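The reduction process above can be sketched in Python. The tree encoding and names here are illustrative (not from the slides): a node is either a plain number (a leaf, already reduced) or an ("op", left, right) tuple still waiting to be reduced, and independent subtrees are handed to a thread pool so they reduce concurrently.

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_graph(node, pool):
    """Reduce an expression tree to a value; independent subtrees
    are reduced concurrently (tasks are created dynamically)."""
    if not isinstance(node, tuple):
        return node                     # leaf: nothing left to evaluate
    op, left, right = node
    # The two subtrees share no data, so one is submitted to another
    # worker while the current thread reduces the other.
    future = pool.submit(reduce_graph, left, pool)
    r = reduce_graph(right, pool)
    l = future.result()
    return {"+": l + r, "-": l - r, "*": l * r, "/": l / r}[op]

# F = (x + y) * z with x = 2, y = 3, z = 4
tree = ("*", ("+", 2, 3), 4)
with ThreadPoolExecutor(max_workers=4) as pool:
    print(reduce_graph(tree, pool))   # 20
```

Reducing the second subtree on the current thread (rather than submitting both) keeps the number of blocked pool workers bounded by the tree depth.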

9 Explicit Parallelism (Dataflow)
Simple computations are represented as operations with specific inputs and outputs.
Order and time of execution depend on the arrival of data.
Each iteration of a loop can cause a new task to be created.

10 Dataflow (mapped on a mesh)
[Diagram: dataflow graph with operator nodes *, +, ^3, ^2, /, sqrt, and - mapped onto a mesh, with an input and an output.]
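A minimal Python sketch of the firing rule (all names are illustrative): each node fires as soon as all of its input tokens have arrived, so execution order is driven by data availability rather than program order.

```python
class Node:
    """A dataflow operator: fires when all input tokens are present."""
    def __init__(self, op, n_inputs, consumers):
        self.op = op
        self.inputs = [None] * n_inputs
        self.arrived = 0
        self.consumers = consumers    # list of (node, port) to forward to

    def receive(self, port, value, outputs):
        self.inputs[port] = value
        self.arrived += 1
        if self.arrived == len(self.inputs):   # all operands present: fire
            result = self.op(*self.inputs)
            if not self.consumers:             # sink node: emit the result
                outputs.append(result)
            for node, p in self.consumers:
                node.receive(p, result, outputs)

# Graph for (x + y) * z, driven by injecting input tokens.
outputs = []
mul = Node(lambda a, b: a * b, 2, [])
add = Node(lambda a, b: a + b, 2, [(mul, 0)])
add.receive(0, 2, outputs)   # token x = 2 arrives
add.receive(1, 3, outputs)   # token y = 3 arrives -> add fires, sends 5 to mul
mul.receive(1, 4, outputs)   # token z = 4 arrives -> mul fires
print(outputs)               # [20]
```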

11 Explicit Decomposition (BSP)
BSP: Bulk Synchronous Parallel.
The program is broken up into n threads.
Computations occur within a time frame called a superstep.
A superstep consists of computation, then global communication, then barrier synchronization.
All threads are synchronized at the barrier.

12 BSP
[Diagram: threads performing task computation, then global communication, then barrier synchronization at the end of each superstep.]
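One superstep can be sketched with Python threads (the computation and communication pattern here is my own illustration): each thread computes locally, writes into a shared exchange area, and then waits at a barrier, so no thread reads exchanged data until every thread has finished sending.

```python
import threading

N = 4
barrier = threading.Barrier(N)
inbox = [0] * N          # shared communication area, one slot per thread
totals = [0] * N

def worker(i):
    local = (i + 1) ** 2          # 1. local computation
    inbox[(i + 1) % N] = local    # 2. global communication: send to neighbor
    barrier.wait()                # 3. barrier synchronization ends the superstep
    totals[i] = inbox[i]          # the next superstep may now read safely

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(totals)   # [16, 1, 4, 9]: each thread sees its left neighbor's square
```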

13 Explicit Mapping (Coordination Languages)
Hides communication between tasks by replacing point-to-point communication with a data pool.
Sender and receiver are unknown to each other.
Data in the data pool is found associatively.
Three basic communication operations:
–in: remove from the data pool.
–read: copy from the data pool.
–out: place in the data pool.

14 Linda
[Diagram: tasks communicating indirectly through a shared data pool (tuple space).]
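A toy Python sketch of the three operations (the class and its matching rule are my own illustration, not Linda's actual API): tuples are matched associatively, here by treating None in a pattern as a wildcard, and in/read block until a matching tuple appears.

```python
import threading

class TupleSpace:
    """Toy data pool: senders and receivers never name each other."""
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):
        """Place a tuple in the data pool."""
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def _find(self, pattern):
        # Associative match: None fields are wildcards.
        for t in self._tuples:
            if len(t) == len(pattern) and all(
                    p is None or p == v for p, v in zip(pattern, t)):
                return t
        return None

    def read(self, pattern):
        """Copy a matching tuple from the pool (blocks until one exists)."""
        with self._cond:
            while (t := self._find(pattern)) is None:
                self._cond.wait()
            return t

    def in_(self, pattern):
        """Remove a matching tuple from the pool (blocks until one exists)."""
        with self._cond:
            while (t := self._find(pattern)) is None:
                self._cond.wait()
            self._tuples.remove(t)
            return t

space = TupleSpace()
space.out(("job", 1, "render"))
print(space.read(("job", 1, None)))    # copies the tuple, pool unchanged
print(space.in_(("job", None, None)))  # removes the tuple from the pool
```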

15 Explicit Communication (Process Nets)
Similar to dataflow.
Differs in that the entities have their own state,
–which determines their response to incoming messages.
Actor systems are a class of this model:
–Messages are handled sequentially.
–Messages can be sent and received asynchronously and out of order.
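A minimal actor sketch in Python (names are illustrative): the actor owns private state and a mailbox; send() is asynchronous and returns immediately, while the actor's own thread handles messages one at a time, its state determining the response.

```python
import queue
import threading

class CounterActor:
    """Stateful entity that processes its mailbox sequentially."""
    def __init__(self):
        self.count = 0                      # private state
        self.mailbox = queue.Queue()
        self.thread = threading.Thread(target=self._run)
        self.thread.start()

    def send(self, msg):
        """Asynchronous send: enqueue and return immediately."""
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()        # messages handled one at a time
            if msg == "stop":
                break
            elif msg == "inc":
                self.count += 1             # response depends on local state

actor = CounterActor()
for _ in range(5):
    actor.send("inc")
actor.send("stop")
actor.thread.join()      # wait for the mailbox to drain
print(actor.count)       # 5
```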

16 Everything Explicit (PVM)
Parallel Virtual Machine:
–Message-passing model.
–The developer determines process creation, granularity, mapping, communication, and synchronization.
–Supported via libraries for FORTRAN and C.
–Ideal for heterogeneous networks of workstations.
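PVM itself is a C/Fortran library; purely as an illustration of the same "everything explicit" style, this Python sketch makes the developer spell out process creation, communication, and synchronization using explicit point-to-point sends and receives over a pipe.

```python
from multiprocessing import Pipe, Process

def worker(conn):
    data = conn.recv()        # explicit receive from the parent
    conn.send(sum(data))      # explicit send of the result back
    conn.close()

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=worker, args=(child,))
    p.start()                     # explicit process creation
    parent.send([1, 2, 3, 4])     # explicit communication
    print(parent.recv())          # 10
    p.join()                      # explicit synchronization
```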

17 Conclusion

