Programming Languages: Parallel Programming Languages



2 Programming Languages

3 Parallel Programming Languages

4 Objectives This lecture discusses the concept of parallel, or concurrent, programming. The major reason for investigating concurrent programming is that it provides a distinct way of conceptualizing the solution to a problem. A second reason is to take advantage of parallelism in the underlying hardware to achieve a significant speedup.

5 Concepts A sequential program specifies the execution of a sequence of statements that comprise the program. A process is a program in execution. As such, each process has its own state, independent of the state of any other process or program. A process also has attached resources, such as files, memory, and so on. Part of the state of a process includes memory and the location of the current instruction being executed. Such an extended state is termed an execution context.

6 A parallel program is a program designed to have two or more execution contexts. Such a program is said to be multi-threaded, since it has more than one execution context. A parallel program is a concurrent program in which more than one execution context, or thread, is active simultaneously. Semantically, there is no difference between a concurrent program and a parallel one.

7 In a multiprocessing operating system the same program can be executed by multiple processes, each with its own state or execution context, separate from the other processes. This is distinctly different from a multi-threaded program, in which some of the data resides simultaneously in each execution context. In a multi-threaded program, part of the program state is shared among the threads, while part of the state, including the flow of control, is unique to each thread.

8 Concurrent execution of a program can either occur using separate processors or be logically interleaved on a single processor using time slicing. In both Java and Ada, separate threads are applied to functions or methods, rather than being at the operation or statement level.
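
As a concrete illustration of threads being applied to methods, a minimal Java sketch (class and method names are illustrative, not taken from the lecture):

    // Each thread executes one method; start() creates a new execution context.
    public class TwoThreads {
        static void sayHello(String name) {
            System.out.println("Hello from " + name);
        }

        public static void main(String[] args) throws InterruptedException {
            Thread a = new Thread(() -> sayHello("thread A"));
            Thread b = new Thread(() -> sayHello("thread B"));
            a.start();   // a and b now run concurrently with main
            b.start();
            a.join();    // wait for both threads to terminate
            b.join();
        }
    }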

9 A thread can be found in any one of the following states:
1. Created: the thread exists but is not yet ready to run.
2. Runnable (ready): the thread is ready to run, but is awaiting a processor to run on.
3. Running: the thread is actually executing on a processor.
4. Blocked (waiting): the thread is either waiting to gain access to a critical section or has voluntarily given up the processor.
5. Terminated: the thread has been stopped and will not execute again.
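
For comparison, Java exposes a similar life cycle through Thread.getState(); the mapping to the five states above is only approximate, since Java folds ready and running into a single RUNNABLE state (a minimal sketch, names illustrative):

    public class ThreadStates {
        public static void main(String[] args) throws InterruptedException {
            Thread t = new Thread(() -> {
                try { Thread.sleep(100); } catch (InterruptedException e) { }
            });
            System.out.println(t.getState()); // NEW: created, not yet ready to run
            t.start();
            System.out.println(t.getState()); // typically RUNNABLE: ready or running
            Thread.sleep(50);
            System.out.println(t.getState()); // typically TIMED_WAITING: blocked in sleep
            t.join();
            System.out.println(t.getState()); // TERMINATED: will not execute again
        }
    }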

10 These states and the transitions between them are pictured in the following figure: [Figure: thread life-cycle diagram showing transitions among the Created, Runnable, Running, Blocked, and Terminated states]

11 Communication: All concurrent programs involve inter-thread communication or interaction. This occurs for the following reasons:
1. Threads compete for exclusive access to shared resources, such as physical devices, files, or data.
2. Threads communicate to exchange data.

12 In both cases it is necessary for threads to synchronize their execution, either to avoid conflict when acquiring resources or to make contact when exchanging data. A thread can communicate with other threads through:
1. Non-local shared variables: the primary mechanism used by Java; it can also be used by Ada.
2. Message passing: the primary mechanism used by Ada.
3. Parameters: used by Ada in conjunction with message passing.
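
Ada's rendezvous is the message-passing mechanism the lecture has in mind; as a rough Java analogue (names are illustrative), two threads can exchange data through a shared blocking queue instead of a bare shared variable:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class MessagePassingSketch {
        public static void main(String[] args) {
            BlockingQueue<String> mailbox = new ArrayBlockingQueue<>(1);

            Thread sender = new Thread(() -> {
                try {
                    mailbox.put("data item");   // blocks until the receiver can accept it
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            Thread receiver = new Thread(() -> {
                try {
                    // take() blocks until a message arrives
                    System.out.println("received: " + mailbox.take());
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            sender.start();
            receiver.start();
        }
    }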

13 Threads normally cooperate with one another to solve a problem. Thus, even in the simplest cases, communication between threads is essential; it is unusual for a thread not to communicate with other threads. However, it is highly desirable to keep communication between threads to a minimum: this makes the code easier to understand and allows each thread to run at its own speed, without being slowed down by the coordination of communication.

14 The fundamental problem in sharing access to a variable is termed a race condition. This occurs when the function computed by a program depends on the order in which its operations occur. In the presence of such non-determinism, faults in a concurrent program may appear as transient errors: the error may or may not occur, even for the same data, depending on the execution paths of the various threads.
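
A minimal Java sketch of such a race (class and field names are illustrative): two threads increment an unsynchronized counter, and because count++ is a non-atomic read-modify-write, the final value usually falls short of 200000 and varies from run to run:

    public class RaceDemo {
        static int count = 0;                  // shared, unsynchronized variable

        public static void main(String[] args) throws InterruptedException {
            Runnable work = () -> {
                for (int i = 0; i < 100_000; i++) {
                    count++;                   // non-atomic read-modify-write
                }
            };
            Thread t1 = new Thread(work);
            Thread t2 = new Thread(work);
            t1.start(); t2.start();
            t1.join();  t2.join();
            System.out.println("count = " + count + " (expected 200000)");
        }
    }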

15 Thus, a great skill in designing a concurrent program is the ability to express it in a form that guarantees correct program behavior in the presence of non-determinism. If a thread is unable to acquire a resource, its execution is normally suspended until the resource becomes available. Resource acquisition should normally be administered so that no thread is unduly delayed.

16 Code that accesses a shared variable or other shared resource is termed a critical section. For a thread to safely execute a critical section, there needs to be a locking mechanism that lets it test and set a lock as a single atomic instruction. Such a mechanism is used to ensure that only a single thread executes a critical section at a time.
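
One way to picture such a mechanism is a spin lock built on an atomic compare-and-set, sketched below in Java (the class is illustrative; in practice Java programs would use the synchronized keyword or java.util.concurrent locks):

    import java.util.concurrent.atomic.AtomicBoolean;

    public class SpinLock {
        private final AtomicBoolean locked = new AtomicBoolean(false);

        public void lock() {
            // Test-and-set as one atomic step: only one thread can flip the flag
            // from false to true, so only one thread enters the critical section.
            while (!locked.compareAndSet(false, true)) {
                Thread.onSpinWait();           // busy-wait until the lock is released
            }
        }

        public void unlock() {
            locked.set(false);
        }
    }

A thread would call lock() before entering its critical section and unlock() on leaving it.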

17 Deadlock and Unfairness: A thread is said to be in a state of deadlock if it is waiting for an event that will never happen. Deadlock normally involves several threads, each waiting for resources held by the others. A deadlock can occur whenever two or more threads compete for resources.
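
A minimal Java sketch of the classic case (names are illustrative): two threads acquire the same two locks in opposite orders, so each ends up waiting for a resource the other holds, and the program may hang:

    public class DeadlockDemo {
        static final Object resourceA = new Object();
        static final Object resourceB = new Object();

        public static void main(String[] args) {
            Thread t1 = new Thread(() -> {
                synchronized (resourceA) {       // t1 holds A ...
                    pause();
                    synchronized (resourceB) {   // ... and waits for B
                        System.out.println("t1 acquired both");
                    }
                }
            });
            Thread t2 = new Thread(() -> {
                synchronized (resourceB) {       // t2 holds B ...
                    pause();
                    synchronized (resourceA) {   // ... and waits for A: deadlock
                        System.out.println("t2 acquired both");
                    }
                }
            });
            t1.start();
            t2.start();
        }

        static void pause() {
            try { Thread.sleep(100); } catch (InterruptedException e) { }
        }
    }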

18 Deadlock and Unfairness: A thread is said to be indefinitely postponed if it is delayed awaiting an event that may never occur. Such a situation can occur if the algorithm that allocates resources to requesting threads makes no allowance for the waiting time of a thread. Allocating resources on a first-in-first-out basis is a simple solution that eliminates this indefinite postponement.
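
In Java, for example, a ReentrantLock constructed in fair mode grants the lock to waiting threads in roughly first-in-first-out order; a minimal sketch (class name illustrative):

    import java.util.concurrent.locks.ReentrantLock;

    public class FifoResource {
        // true = fair mode: waiting threads acquire the lock roughly in arrival
        // order, so no thread is overtaken indefinitely by later arrivals.
        private final ReentrantLock lock = new ReentrantLock(true);

        public void use() {
            lock.lock();
            try {
                // ... use the shared resource ...
            } finally {
                lock.unlock();
            }
        }
    }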

19 Analogous to indefinite postponement is the concept of unfairness: in an unfair system, no attempt is made to ensure that threads of equal status make equal progress in acquiring resources. Neglecting fairness in designing a concurrent system may lead to indefinite postponement, thereby rendering the system incorrect. A simple fairness criterion is that when an open choice of action is to be made, any action should be equally likely.

20 Semaphores: Basically, a semaphore is an integer variable and an associated thread queuing mechanism.
P(s): if s > 0 then set s = s - 1, else the calling thread is blocked (enqueued).
V(s): if a thread T is blocked on the semaphore s, then wake up T, else set s = s + 1.
Binary semaphore: holds 0 or 1.
Counting semaphore: holds arbitrary nonnegative values.
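
Java offers the same abstraction directly as java.util.concurrent.Semaphore, where acquire() plays the role of P and release() the role of V; a minimal sketch:

    import java.util.concurrent.Semaphore;

    public class SemaphoreSketch {
        public static void main(String[] args) throws InterruptedException {
            Semaphore binary   = new Semaphore(1);  // binary semaphore: 0 or 1
            Semaphore counting = new Semaphore(4);  // counting semaphore: several permits

            binary.acquire();        // P: take the permit, or block until one is free
            try {
                // ... critical section ...
            } finally {
                binary.release();    // V: return the permit, waking a blocked thread if any
            }

            System.out.println("permits left: " + counting.availablePermits());
        }
    }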

21 Producer-Consumer Operation: A classic example occurs in the case of producer-consumer cooperation, where the single producer task produces information for the single consumer task to consume. The producer waits (via a P) for the buffer to be empty, deposits the product, then signals (via a V) that the buffer is full. The consumer waits (via a P) for the buffer to be full, removes the product from the buffer, then signals (via a V) that the buffer is empty.

22 Using Semaphores in Concurrent Pascal:
program SimpleProducerConsumer;
var
  buffer : string;
  full   : semaphore = 0;   { signalled when the buffer holds a product }
  empty  : semaphore = 1;   { signalled when the buffer is free }

procedure Producer;
var tmp : string;
begin
  while true do
  begin
    produce(tmp);
    P(empty);               { begin critical section }
    buffer := tmp;
    V(full);                { end critical section }
  end;
end;

23 Using Semaphores in Concurrent Pascal (continued):
procedure Consumer;
var tmp : string;
begin
  while true do
  begin
    P(full);                { begin critical section }
    tmp := buffer;
    V(empty);               { end critical section }
    consume(tmp);
  end;
end;

begin
  cobegin
    Producer;
    Consumer;
  coend;
end.

24 Monitors: Monitors provide the basis for synchronization in Java. A monitor's purpose is to encapsulate a shared variable and the operations on that variable. This encapsulation is combined with an automatic locking mechanism on the operations, so that at most one thread can be executing an operation at any one time.

25 Using a Monitor in the Producer/Consumer Operation:
monitor Buffer;
const size = 5;
var
  buffer : array[1..size] of string;
  in     : integer = 0;
  out    : integer = 0;
  count  : integer = 0;
  nonfull  : condition;
  nonempty : condition;

26 Using a Monitor in the Producer/Consumer Operation (continued):
procedure put(s : string);
begin
  if count = size then
    wait(nonfull);          { block until there is room in the buffer }
  in := in mod size + 1;
  buffer[in] := s;
  count := count + 1;
  signal(nonempty);         { wake a consumer waiting for data }
end;

27 Using a Monitor in the Producer/Consumer Operation (continued):
function get : string;
var tmp : string;
begin
  if count = 0 then
    wait(nonempty);         { block until the buffer holds data }
  out := out mod size + 1;
  tmp := buffer[out];
  count := count - 1;
  signal(nonfull);          { wake a producer waiting for room }
  get := tmp;
end;
end.  { monitor Buffer }
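
Since the lecture notes that monitors are the basis of synchronization in Java, here is a rough Java counterpart of the same bounded buffer (a sketch, not part of the slides): each synchronized method runs under the object's built-in monitor lock, and wait/notifyAll stand in for the condition variables nonfull and nonempty:

    public class Buffer {
        private final String[] slots = new String[5];
        private int in = 0, out = 0, count = 0;

        public synchronized void put(String s) throws InterruptedException {
            while (count == slots.length) {
                wait();                      // corresponds to wait(nonfull)
            }
            slots[in] = s;
            in = (in + 1) % slots.length;
            count++;
            notifyAll();                     // corresponds to signal(nonempty)
        }

        public synchronized String get() throws InterruptedException {
            while (count == 0) {
                wait();                      // corresponds to wait(nonempty)
            }
            String tmp = slots[out];
            out = (out + 1) % slots.length;
            count--;
            notifyAll();                     // corresponds to signal(nonfull)
            return tmp;
        }
    }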

28 Conclusion Avoiding deadlock and achieving fairness in a concurrent system should be considered at the design level. The purpose of synchronizing the execution of parallel programs is to avoid conflict when acquiring resources, or to make contact when exchanging data. Monitors and semaphores are equivalent mechanisms in power, in that a monitor can be implemented using semaphores and a semaphore can be implemented using a monitor.
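
To illustrate the equivalence claim, a counting semaphore can be written as a small Java monitor (a sketch, with illustrative names):

    public class MonitorSemaphore {
        private int s;

        public MonitorSemaphore(int initial) { s = initial; }

        public synchronized void P() throws InterruptedException {
            while (s == 0) {
                wait();      // block the caller until some V makes s positive
            }
            s--;
        }

        public synchronized void V() {
            s++;
            notify();        // wake one thread blocked in P, if any
        }
    }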

