Presentation transcript: "Chapter 11: Distributed Processing"

1 Chapter 11: Distributed Processing
Parallel programming
Principles of parallel programming languages
Concurrent execution
– Programming constructs
– Guarded commands
– Tasks
Persistent systems
Client-server computing

2 Parallel processing
The execution of more than one program/subprogram simultaneously. A subprogram that can execute concurrently with other subprograms is called a task or a process.
Hardware supported: multiprocessor systems, distributed computer systems
Software simulated: time-sharing

3 Principles of parallel programming languages
Variable definitions
– mutable: values may be assigned to the variables and changed during program execution (as in sequential languages)
– definitional: a variable may be assigned a value only once

4 Principles….
Parallel composition: a parallel statement, which causes additional threads of control to begin executing
Execution models (program structure)
– Transformational: the program transforms its input into a result, e.g. parallel matrix multiplication
– Reactive: the program responds to external events as they occur

5 Principles….
Communication
– shared memory, with common data objects accessed by each parallel program
– messages
Synchronization: parallel programs must be able to coordinate actions

6 Concurrent execution
Programming constructs
– Using parallel execution primitives of the operating system (e.g. a C program can invoke the fork operation of Unix, as sketched below)
– Using parallel constructs: a parallel construct of the programming language indicates parallel execution
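
A minimal sketch of the first approach: a C program creating a second process with the Unix fork primitive (the printf calls merely stand in for real work):

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();              /* create a second, concurrently executing process */
        if (pid < 0) {                   /* fork failed */
            perror("fork");
            exit(EXIT_FAILURE);
        }
        if (pid == 0) {                  /* child process */
            printf("child %d running\n", (int)getpid());
            exit(EXIT_SUCCESS);
        }
        printf("parent %d running\n", (int)getpid());
        wait(NULL);                      /* parent waits for its dependent to finish */
        return 0;
    }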

7 Example
AND statement (programming language level)
Syntax: statement1 and statement2 and … statementN
Semantics: all statements execute in parallel.
call ReadProcess and call WriteProcess and call ExecuteUserProgram;
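
C has no and construct; a rough equivalent can be sketched with POSIX threads. The three thread bodies below are trivial stand-ins for the subprograms named on the slide:

    #include <pthread.h>
    #include <stdio.h>

    /* trivial stand-ins for the three subprograms */
    void *ReadProcess(void *arg)        { puts("reading");   return NULL; }
    void *WriteProcess(void *arg)       { puts("writing");   return NULL; }
    void *ExecuteUserProgram(void *arg) { puts("executing"); return NULL; }

    int main(void) {
        pthread_t t1, t2, t3;
        /* start all three "statements" in parallel */
        pthread_create(&t1, NULL, ReadProcess, NULL);
        pthread_create(&t2, NULL, WriteProcess, NULL);
        pthread_create(&t3, NULL, ExecuteUserProgram, NULL);
        /* the construct completes only when every branch has finished */
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        pthread_join(t3, NULL);
        return 0;
    }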

8 Guarded commands
Guard: a condition that can be true or false
Guards are associated with statements
A statement is executed when its guard becomes true

9 Example
Guarded if:
if B1 → S1 | B2 → S2 | … | Bn → Sn fi
Guarded repetition statement:
do B1 → S1 | B2 → S2 | … | Bn → Sn od
Bi - guards, Si - statements
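
C has no direct equivalent, but the semantics can be sketched: at each step one statement whose guard is true is chosen (arbitrarily, mirroring the nondeterministic choice), and the guarded loop stops when every guard is false. The gcd computation below is a classic illustration, not taken from the slides:

    #include <stdio.h>
    #include <stdlib.h>

    /* Guarded repetition:  do x > y -> x := x - y | y > x -> y := y - x od */
    int gcd(int x, int y) {
        for (;;) {
            int open[2], count = 0;
            if (x > y) open[count++] = 0;      /* guard B1 */
            if (y > x) open[count++] = 1;      /* guard B2 */
            if (count == 0) break;             /* all guards false: loop ends */
            if (open[rand() % count] == 0)     /* arbitrary choice among true guards */
                x = x - y;                     /* S1 */
            else
                y = y - x;                     /* S2 */
        }
        return x;                              /* here x == y, the gcd */
    }

    int main(void) {
        printf("%d\n", gcd(12, 18));           /* prints 6 */
        return 0;
    }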

10 Tasks
Subprograms that run in parallel with the program that has initiated them
Dependent on the initiating program: the initiating program cannot terminate until all of its dependents terminate
A task may have multiple simultaneous activations

11 Task interaction
Tasks unaware of each other
Tasks indirectly aware of each other – use shared memory
Tasks directly aware of each other

12 Control Problems
Mutual exclusion
Deadlock: P1 waits for an event to be produced by P2, while P2 waits for an event to be produced by P1
Starvation: P1, P2, P3 need a non-shareable resource; P1 and P2 alternately use the resource, so P3 is denied access to it
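
A minimal sketch (not from the slide) of how the deadlock pattern arises in C: two threads each hold one POSIX mutex and wait for the other's, so neither can proceed when the locks are acquired in interleaved order:

    #include <pthread.h>

    pthread_mutex_t a = PTHREAD_MUTEX_INITIALIZER;
    pthread_mutex_t b = PTHREAD_MUTEX_INITIALIZER;

    void *p1(void *arg) {                  /* P1: takes a, then needs b */
        pthread_mutex_lock(&a);
        pthread_mutex_lock(&b);            /* blocks forever if P2 already holds b */
        pthread_mutex_unlock(&b);
        pthread_mutex_unlock(&a);
        return NULL;
    }

    void *p2(void *arg) {                  /* P2: takes b, then needs a */
        pthread_mutex_lock(&b);
        pthread_mutex_lock(&a);            /* blocks forever if P1 already holds a */
        pthread_mutex_unlock(&a);
        pthread_mutex_unlock(&b);
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, p1, NULL);
        pthread_create(&t2, NULL, p2, NULL);
        pthread_join(t1, NULL);            /* may never return: circular wait */
        pthread_join(t2, NULL);
        return 0;
    }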

13 Mutual exclusion
Two tasks require access to a single non-shareable resource.
Critical resource - the resource in question.
Critical section - the portion of the program that uses the resource.
The rule: only one program at a time can be allowed in its critical section.

14 Synchronization of Tasks
Interrupts - provided by the OS.
Semaphores - shared data objects, with two primitive operations: signal and wait.
Messages - information is sent from one task to another; the sending task may continue to execute.
Guarded commands - force synchronization by ensuring conditions are met before executing tasks.
Rendezvous - similar to messages, but the sending task waits for an answer.

15 Semaphores
A semaphore is a variable that has an integer value
May be initialized to a nonnegative number
The wait operation decrements the semaphore value
The signal operation increments the semaphore value
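
A conceptual sketch of the two operations (busy-waiting for brevity; real implementations such as POSIX sem_wait/sem_post block the waiting task instead, and the test and decrement must be performed atomically):

    typedef struct { volatile int value; } semaphore;

    void wait_op(semaphore *s) {       /* often called P */
        while (s->value <= 0)
            ;                          /* wait until the value becomes positive */
        s->value--;                    /* must be atomic with the test above */
    }

    void signal_op(semaphore *s) {     /* often called V */
        s->value++;                    /* may release a waiting task */
    }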

16 Mutual exclusion with semaphores
Each task performs:
wait(s);
/* critical section */
signal(s);
The Producer/Consumer problem with infinite buffer
[Figure: infinite buffer B[0] B[1] B[2] B[3] B[4] … with IN and OUT pointers]
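
A minimal POSIX sketch of this pattern: two threads increment a shared counter inside a critical section guarded by a semaphore initialized to 1 (the counter and iteration count are illustrative):

    #include <pthread.h>
    #include <semaphore.h>
    #include <stdio.h>

    sem_t s;                                   /* semaphore guarding the critical section */
    long counter = 0;                          /* the critical resource used by both tasks */

    void *task(void *arg) {
        for (int i = 0; i < 100000; i++) {
            sem_wait(&s);                      /* wait(s): enter the critical section */
            counter++;                         /* critical section */
            sem_post(&s);                      /* signal(s): leave the critical section */
        }
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        sem_init(&s, 0, 1);                    /* initial value 1: at most one task inside */
        pthread_create(&t1, NULL, task, NULL);
        pthread_create(&t2, NULL, task, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("counter = %ld\n", counter);    /* 200000 when mutual exclusion holds */
        sem_destroy(&s);
        return 0;
    }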

17 Solution:
s - semaphore for entering the critical section
delay - semaphore to ensure reading from a non-empty buffer

Producer:              Consumer:
  produce();             wait(delay);
  wait(s);               wait(s);
  append();              take();
  signal(delay);         signal(s);
  signal(s);             consume();
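
A runnable C approximation of this solution with POSIX semaphores; the infinite buffer is simulated by an array large enough for the fixed number of items produced in this demo (the names ITEMS, in, out, producer and consumer are illustrative):

    #include <pthread.h>
    #include <semaphore.h>
    #include <stdio.h>

    #define ITEMS 10
    int buffer[ITEMS];                 /* stands in for the infinite buffer */
    int in = 0, out = 0;               /* the IN and OUT positions of the figure */
    sem_t s;                           /* mutual exclusion on the buffer */
    sem_t delay;                       /* counts items available to the consumer */

    void *producer(void *arg) {
        for (int i = 0; i < ITEMS; i++) {
            int item = i;              /* produce() */
            sem_wait(&s);              /* wait(s) */
            buffer[in++] = item;       /* append() */
            sem_post(&delay);          /* signal(delay): one more item available */
            sem_post(&s);              /* signal(s) */
        }
        return NULL;
    }

    void *consumer(void *arg) {
        for (int i = 0; i < ITEMS; i++) {
            sem_wait(&delay);          /* wait(delay): block while the buffer is empty */
            sem_wait(&s);              /* wait(s) */
            int item = buffer[out++];  /* take() */
            sem_post(&s);              /* signal(s) */
            printf("consumed %d\n", item);   /* consume() */
        }
        return NULL;
    }

    int main(void) {
        pthread_t p, c;
        sem_init(&s, 0, 1);
        sem_init(&delay, 0, 0);
        pthread_create(&p, NULL, producer, NULL);
        pthread_create(&c, NULL, consumer, NULL);
        pthread_join(p, NULL);
        pthread_join(c, NULL);
        return 0;
    }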

18 Persistent systems
Traditional software:
– data stored outside of the program - persistent data
– data processed in main memory - transient data
Persistent languages do not make a distinction between persistent and transient data; changes are automatically reflected in the database

19 Design issues
A mechanism to indicate an object is persistent
A mechanism to address a persistent object
Simultaneous access to an individual persistent object - semaphores
Check type compatibility of persistent objects - structural equivalence

20 Client-server computing
Network models:
– centralized, where a single processor does the scheduling
– distributed or peer-to-peer, where each machine is an equal, and the process of scheduling is spread among all of the machines

21 Client-server mediator architecture
Client machine:
– interacts with the user
– has a protocol to communicate with the server
Server:
– provides services: retrieves data and/or programs
Issues:
– may be communicating with multiple clients simultaneously
– need to keep each such transaction separate
– multiple local address spaces in the server
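
A minimal sketch (not from the slides) of one way a server keeps concurrent client transactions separate: a C TCP server that forks a child process per connection, so each transaction runs in its own address space. The echo service and port number 5000 are illustrative:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <signal.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(void) {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(5000);             /* illustrative port */
        bind(srv, (struct sockaddr *)&addr, sizeof addr);
        listen(srv, 8);
        signal(SIGCHLD, SIG_IGN);                /* finished children are reaped automatically */
        for (;;) {
            int cli = accept(srv, NULL, NULL);   /* wait for the next client connection */
            if (cli < 0) continue;
            if (fork() == 0) {                   /* child: serves this one client */
                char buf[256];
                ssize_t n;
                while ((n = read(cli, buf, sizeof buf)) > 0)
                    write(cli, buf, n);          /* trivial echo stands in for a real service */
                close(cli);
                _exit(0);
            }
            close(cli);                          /* parent keeps accepting other clients */
        }
    }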

