
13-1 Chapter 13 Concurrency Topics Introduction Introduction to Subprogram-Level Concurrency Semaphores Monitors Message Passing Java Threads C# Threads Statement-Level Concurrency

13-2 Introduction Concurrency can occur at four levels: 1. Machine instruction level 2. High-level language statement level 3. Unit level 4. Program level –Because there are no language issues in instruction- and program-level concurrency, they are not addressed here

13-3 Introduction Def: A thread of control in a program is the sequence of program points reached as control flows through the program Categories of Concurrency: 1. Physical concurrency - Multiple independent processors (multiple threads of control) 2. Logical concurrency - The appearance of physical concurrency is presented by time-sharing one processor (software can be designed as if there were multiple threads of control) Coroutines provide only quasi-concurrency

13-4 Introduction Reasons to Study Concurrency 1. It involves a different way of designing software that can be very useful—many real- world situations involve concurrency 2. Computers capable of physical concurrency are now widely used

13-5 Introduction to Subprogram- Level Concurrency Def: A task or process is a program unit that can be in concurrent execution with other program units Tasks differ from ordinary subprograms: –1. A task may be implicitly started –2. When a program unit starts the execution of a task, it is not necessarily suspended –3. When a task’s execution is completed, control may not return to the caller Tasks usually work together

13-6 Introduction to Subprogram-Level Concurrency Two general categories of tasks –Heavyweight tasks execute in their own address space and have their own run-time stacks –Lightweight tasks all run in the same address space, each with its own run-time stack, which makes them cheaper to create and switch between

13-7 Introduction to Subprogram- Level Concurrency Def: A task is disjoint if it does not communicate with or affect the execution of any other task in the program in any way Task communication is necessary for synchronization –Task communication can be through: 1. Shared nonlocal variables 2. Parameters 3. Message passing

13-8 Introduction to Subprogram-Level Concurrency Kinds of synchronization: 1. Cooperation –Task A must wait for task B to complete some specific activity before task A can continue its execution, e.g., the producer-consumer problem 2. Competition –When two or more tasks must use some resource that cannot be used simultaneously, e.g., a shared counter –Competition synchronization is usually provided through mutually exclusive access to the shared resource
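
The shared-counter competition problem can be sketched in Java (class and method names are ours, not from the slides): two tasks each increment a counter 100,000 times, and without mutually exclusive access some increments are lost, because count++ is really three steps (fetch, add, store) that can interleave.

```java
// Sketch of competition on a shared counter and its repair with
// mutually exclusive access. All names here are illustrative.
public class CounterRace {
    private int count = 0;
    private final Object lock = new Object();

    void unsafeIncrement() { count++; }                      // fetch, add, store: interleavable
    void safeIncrement()   { synchronized (lock) { count++; } } // critical section

    static int run(boolean safe) throws InterruptedException {
        CounterRace c = new CounterRace();
        Runnable body = () -> {
            for (int i = 0; i < 100_000; i++) {
                if (safe) c.safeIncrement(); else c.unsafeIncrement();
            }
        };
        Thread a = new Thread(body), b = new Thread(body);
        a.start(); b.start();
        a.join();  b.join();      // join establishes happens-before for the final read
        return c.count;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("with mutual exclusion:    " + run(true));
        System.out.println("without mutual exclusion: " + run(false));
    }
}
```

With mutual exclusion the result is always 200,000; without it, the interleaving of the three steps typically loses updates, so the result is nondeterministic.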

13-9 Need for Competition Synchronization

13-10 Introduction to Subprogram- Level Concurrency Providing synchronization requires a mechanism for delaying task execution Task execution control is maintained by a program called the scheduler, which maps task execution onto available processors

13-11 Introduction to Subprogram- Level Concurrency Tasks can be in one of several different execution states: 1. New - created but not yet started 2. Runnable or ready - ready to run but not currently running (no available processor) 3. Running 4. Blocked - has been running, but cannot now continue (usually waiting for some event to occur) 5. Dead - no longer active in any sense

13-12 Introduction to Subprogram-Level Concurrency Liveness is a characteristic that a program unit may or may not have In sequential code, it means the unit will eventually complete its execution In a concurrent environment, a task can easily lose its liveness (e.g., through deadlock)

13-13 Introduction to Subprogram- Level Concurrency Design Issues for Concurrency: 1. How is cooperation synchronization provided? 2. How is competition synchronization provided? 3. How and when do tasks begin and end execution? 4. Are tasks statically or dynamically created?

13-14 Introduction to Subprogram- Level Concurrency Methods of Providing Synchronization: 1. Semaphores 2. Monitors 3. Message Passing

13-15 Semaphores (Dijkstra, 1965) A semaphore is a data structure consisting of a counter and a queue for storing task descriptors Semaphores can be used to implement guards on the code that accesses shared data structures Semaphores have only two operations, wait and release (originally called P and V by Dijkstra) Semaphores can be used to provide both competition and cooperation synchronization
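
As a sketch of the wait/release pair, Java's java.util.concurrent.Semaphore maps Dijkstra's P onto acquire and V onto release. Here a binary semaphore (counter initialized to 1) guards a shared sum for competition synchronization; the class and method names are ours.

```java
import java.util.concurrent.Semaphore;

// A binary semaphore guarding a shared variable: acquire plays the
// role of wait (P), release the role of V. Names are illustrative.
public class SemaphoreGuard {
    private static final Semaphore mutex = new Semaphore(1); // counter starts at 1
    private static int sum = 0;

    static int run() throws InterruptedException {
        Runnable adder = () -> {
            try {
                for (int i = 0; i < 50_000; i++) {
                    mutex.acquire();               // wait (P): decrement or block in the queue
                    try { sum++; }                 // critical section on the shared data
                    finally { mutex.release(); }   // release (V): increment, wake a waiter
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };
        Thread a = new Thread(adder), b = new Thread(adder);
        a.start(); b.start();
        a.join();  b.join();
        return sum;
    }
}
```

Nothing in the language forces the acquire/release pairing, which is exactly the misuse risk the next slide's evaluation points out.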

13-16 Semaphores Evaluation of Semaphores: Misuse of semaphores can cause failures in cooperation/competition synchronization. "The semaphore is an elegant synchronization tool for an ideal programmer who never makes mistakes." (Brinch Hansen, 1973)

13-17 Monitors (Concurrent Pascal, Modula) The idea: encapsulate the shared data and its operations to restrict access A monitor is an abstract data type for shared data

13-18 Monitor Buffer Operation

13-19 Monitors Example language: Concurrent Pascal –Concurrent Pascal is Pascal + classes, processes (tasks), monitors, and the queue data type (for semaphores) –Processes are types Instances are statically created by declarations (the declaration does not start its execution) An instance is “started” by init, which allocates its local data and begins its execution

13-20 Monitors Monitors are also types Form:
type some_name = monitor (formal parameters)
    shared variables
    local procedures
    exported procedures (have entry in definition)
    initialization code

13-21 Monitors Evaluation of monitors: –Support for competition synchronization is great –Support for cooperation synchronization is very similar to that with semaphores, so it has the same problems
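
A hedged Java approximation of a monitor (class name and buffer size are ours): the shared buffer is encapsulated with its operations, and the synchronized modifier provides the mutual exclusion a monitor guarantees. The explicit wait/notifyAll calls for cooperation show why the evaluation above says cooperation synchronization has the same problems as with semaphores: the programmer must still get the conditions right.

```java
// Monitor-style bounded buffer: competition synchronization comes for
// free from synchronized; cooperation is hand-coded with wait/notifyAll.
public class MonitorBuffer {
    private final int[] slots = new int[4];
    private int count = 0, in = 0, out = 0;

    public synchronized void deposit(int v) throws InterruptedException {
        while (count == slots.length) wait();   // buffer full: block the producer
        slots[in] = v;
        in = (in + 1) % slots.length;
        count++;
        notifyAll();                            // wake any waiting consumers
    }

    public synchronized int fetch() throws InterruptedException {
        while (count == 0) wait();              // buffer empty: block the consumer
        int v = slots[out];
        out = (out + 1) % slots.length;
        count--;
        notifyAll();                            // wake any waiting producers
        return v;
    }
}
```

Only one task can be inside the monitor at a time, so count, in, and out can never be corrupted by interleaving.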

13-22 Message Passing Message passing is a general model for concurrency –It can model both semaphores and monitors –It is not just for competition synchronization Central idea: task communication is like seeing a doctor: most of the time either the doctor waits for you or you wait for the doctor, but when you are both ready, you get together, or rendezvous (tasks do not interrupt each other)

13-23 Message Passing In terms of tasks, we need: a. A mechanism to allow a task to indicate when it is willing to accept messages b. Tasks need a way to remember who is waiting to have its message accepted and some “fair” way of choosing the next message Def: When a sender task’s message is accepted by a receiver task, the actual message transmission is called a rendezvous
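
The rendezvous can be sketched with Java's SynchronousQueue, a channel with no capacity: put blocks until a taker arrives and take blocks until a putter arrives, so the message transfer happens only when both tasks are ready. Class and method names are ours.

```java
import java.util.concurrent.SynchronousQueue;

// Rendezvous-style message passing: the transfer occurs only at the
// moment both sender and receiver are ready. Names are illustrative.
public class RendezvousDemo {
    public static String run() throws InterruptedException {
        SynchronousQueue<String> channel = new SynchronousQueue<>();
        Thread sender = new Thread(() -> {
            try {
                channel.put("hello");   // blocks here until the receiver takes
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        sender.start();
        String msg = channel.take();    // blocks until the sender puts
        sender.join();
        return msg;
    }
}
```

Whichever task reaches the channel first simply waits for the other, matching the doctor analogy above.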

13-24 Rendezvous Time Lines

13-25 Java Threads Competition Synchronization with Java Threads –A method that includes the synchronized modifier disallows any other synchronized method from running on the same object while it is in execution –If only a part of a method must run without interference, that part can be placed in a synchronized statement
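
A small sketch of the two granularities (names are ours): a fully synchronized method, and a method where only the part that touches shared state sits in a synchronized statement.

```java
// Whole-method vs. partial synchronization. All names are illustrative.
public class Account {
    private long balance = 0;

    // the entire method is a critical section on this Account object
    public synchronized void deposit(long amount) {
        balance += amount;
    }

    // only the shared-state update needs exclusion
    public void depositLogged(long amount) {
        String stamp = java.time.Instant.now().toString(); // no shared state: outside
        synchronized (this) {
            balance += amount;                             // shared state: inside
        }
        System.out.println(stamp + " deposited " + amount);
    }

    public synchronized long balance() { return balance; }
}
```

Keeping slow work such as the timestamp formatting outside the synchronized statement shortens the critical section, so competing tasks block for less time.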

13-26 Java Threads Cooperation Synchronization with Java Threads –The wait and notify methods are defined in Object, which is the root class in Java, so all objects inherit them –The wait method must be called in a loop
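
A minimal sketch of the wait-in-a-loop rule (names are ours): wait can return spuriously, or after another task has already consumed the condition, so the condition must be rechecked in a while loop rather than guarded by a single if.

```java
// One-shot gate: tasks wait until some other task opens it.
// Names are illustrative.
public class Gate {
    private boolean open = false;

    public synchronized void open() {
        open = true;
        notifyAll();                 // wake all waiters; each rechecks the condition
    }

    public synchronized void await() throws InterruptedException {
        while (!open) {              // never "if (!open)": recheck after every wakeup
            wait();                  // releases the lock, reacquires before returning
        }
    }
}
```

Both methods are synchronized, so the test of open and the call to wait happen atomically with respect to open(), avoiding a lost wakeup.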

13-27 C# Threads Basic thread operations –Any method can run in its own thread –A thread is created by creating a Thread object –Creating a thread does not start its concurrent execution; it must be requested through the Start method –A thread can be made to wait for another thread to finish with Join –A thread can be suspended with Sleep –A thread can be terminated with Abort

13-28 C# Threads Synchronizing threads –The Interlocked class –The lock statement –The Monitor class Evaluation –An advance over Java threads, e.g., any method can run in its own thread –Thread termination is cleaner than in Java –Synchronization is more sophisticated

13-29 Statement-Level Concurrency High-Performance FORTRAN (HPF) –Extensions that allow the programmer to provide information to the compiler to help it optimize code for multiprocessor computers –Homework: 9