1 Lecture #24: Shared Objects and Concurrent Programming. This material is not available in the textbook; the online PowerPoint presentations contain the text explanations given in class.

2 A Multiprocessor Machine. [Diagram: a uniprocessor, with a single processor P1 connected to its own memory, versus a multiprocessor, with processors P1, P2, ..., Pn all connected to a shared memory.]

3 Concurrent Programming. [Diagram: several processes accessing a shared object in shared memory.] Challenge: coordinating access to the shared object.

4 Persistent vs. Transient Communication. Persistent communication medium: sending information changes the state of the medium permanently. Example: a blackboard. Transient communication medium: the change of state lasts only for a limited time period. Example: talking.

5 Parallel Primality Testing. Task: print all primes from 1 to 10^10, in some order. Available: a machine with 10 processors. Solution: speed the work up 10 times, i.e., the new time to print all primes will be 1/10 of the time on a single processor.

6 Parallel Primality Testing. Split the work among the processors! Each processor P_i gets 10^9 numbers to test. [Diagram: the range 1..10^10 divided into ten consecutive blocks of 10^9 numbers, one block per processor P_1 ... P_10.]

7 Parallel Primality Testing

(define (P i)
  (let ((counter (+ 1 (* (- i 1) (power 10 9))))   ; first number in P_i's block
        (upto (* i (power 10 9))))                 ; last number in P_i's block
    (define (iter)
      (if (<= counter upto)
          (begin (if (prime? counter) (display counter) #f)
                 (set! counter (+ counter 1))
                 (iter))
          'done))
    (iter)))

(parallel-execute (P 1) (P 2) ... (P 10))

8 Problem: the work is split unevenly. Some processors have fewer primes to test… Some composite numbers are easier to test… [Diagram: the ten fixed blocks of the range, with some processors finishing much earlier than others.] Need to split the work range dynamically!

9 A Shared Counter Object

(define (make-shared-counter value)
  (define (fetch) value)
  (define (increment) (set! value (+ 1 value)))
  (define (dispatch m)
    (cond ((eq? m 'fetch) (fetch))
          ((eq? m 'increment) (increment))
          (else (error "unknown request" m))))
  dispatch)

(define shared-counter (make-shared-counter 1))

10 Using the Shared Counter

(define (P i)
  (define (iter)
    (let ((index (shared-counter 'fetch)))
      (if (< index (power 10 10))
          (begin (if (prime? index) (display index) #f)
                 (shared-counter 'increment)
                 (iter))
          'done)))
  (iter))

(parallel-execute (P 1) (P 2) ... (P 10))

11 This Solution Doesn’t Work. Increment is (set! value (+ 1 value)), a read followed by a write. One bad interleaving: P1 reads value 77; P2 then increments 10 times, bringing value to 87; P1 now does set! value 78 -- Error, ten increments are lost! The fetch in (let ((index (shared-counter 'fetch))) ...) has the same problem: P1 fetches 77 and P2 also fetches 77 before either increments, so both test the same number -- Error!
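A minimal sketch of the lost-update interleaving, assuming a SICP-style parallel-execute that takes thunks and may interleave them arbitrarily (the thunk-based interface is an assumption here, not something the slides specify):

(define shared-counter (make-shared-counter 77))

(parallel-execute
 (lambda () (shared-counter 'increment))      ; P1: a single increment
 (lambda ()                                   ; P2: ten increments
   (let loop ((k 10))
     (if (> k 0)
         (begin (shared-counter 'increment)
                (loop (- k 1)))))))

;; Under the unlucky interleaving above, P1 reads 77, P2 performs all ten
;; increments (the value reaches 87), and P1 then writes back 78.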

12 It Will Never Work!? Real-world example: walking in the street, where the Look / Move pair plays the role of the Fetch / Increment pair in computers accessing memory. Fischer, Lynch & Paterson: impossible to solve!!! Need to "glue" the Fetch / Increment pair into one indivisible operation: Fetch-and-Increment.

13 The Fetch-and-Increment Operation

(define (make-shared-counter value)
  (define (fetch-and-increment)
    (let ((old value))
      (set! value (+ old 1))
      old))
  (define (dispatch m)
    (cond ((eq? m 'fetch-and-increment) (fetch-and-increment))
          (else (error "unknown request -- counter" m))))
  dispatch)

[Diagram: the fetch and the increment execute as one instantaneous operation on the shared counter.]

14 A Correct Shared Counter

(define shared-counter (make-shared-counter 1))

(define (P i)
  (define (iter)
    (let ((index (shared-counter 'fetch-and-increment)))
      (if (< index (power 10 10))
          (begin (if (prime? index) (display index) #f)
                 (iter))
          'done)))
  (iter))

(parallel-execute (P 1) (P 2) ... (P 10))

15 Implementing Fetch-and-Inc. To make the program work we need an "instantaneous" implementation of fetch-and-increment. How can we do this? Special hardware: built-in synchronization instructions. Special software: use regular instructions -- the solution will involve waiting. Software: Mutual Exclusion.
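As a hardware-flavored illustration, here is a sketch of a mutex built from an atomic test-and-set! instruction, along the lines of SICP's make-mutex (the names make-mutex, test-and-set!, and the 'acquire / 'release interface are assumptions here, not part of the lecture):

(define (make-mutex)
  (let ((cell (list #f)))                    ; #f means the mutex is free
    (define (the-mutex m)
      (cond ((eq? m 'acquire)
             (if (test-and-set! cell)        ; if the cell was already taken,
                 (the-mutex 'acquire)))      ; spin and retry
            ((eq? m 'release) (set-car! cell #f))))
    the-mutex))

;; test-and-set! must be performed atomically, e.g., by the hardware:
(define (test-and-set! cell)
  (if (car cell)
      #t
      (begin (set-car! cell #t) #f)))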

16 Mutual Exclusion

(mutex 'start)
(let ((old value))
  (set! value (+ old 1))
  old)
(mutex 'end)

Only one process at a time can execute these instructions. [Diagram: P1, P2, ..., Pn compete for the mutex guarding the shared counter; the process holding the mutex, P2, performs the fetch-and-increment and returns 1.]
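Putting the pieces together, a sketch of the counter with its fetch-and-increment protected by a mutex, using the make-mutex sketched above (the 'acquire / 'release names are an assumption, standing in for the slide's 'start / 'end):

(define (make-protected-counter value)
  (let ((mutex (make-mutex)))
    (define (fetch-and-increment)
      (mutex 'acquire)                       ; enter the critical section
      (let ((old value))
        (set! value (+ old 1))
        (mutex 'release)                     ; leave the critical section
        old))
    (define (dispatch m)
      (cond ((eq? m 'fetch-and-increment) (fetch-and-increment))
            (else (error "unknown request -- counter" m))))
    dispatch))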

17 The Story of Alice and Bob (* as told by Leslie Lamport). [Diagram: Alice, Bob, and the yard shared by their dogs.]

18 The Mutual Exclusion Problem Requirements: Mutual Exclusion: there will never be two dogs simultaneously in the yard. No Deadlock: if only one dog wants to be in the yard it will succeed, and if both dogs want to go out, at least one of them will succeed.

19 Cell Phone Solution. [Diagram: Alice and Bob coordinate access to the yard by calling each other on cell phones, a transient communication medium.]

20 Coke Can Solution. [Diagram: Alice and Bob signal each other with cans, acting as interrupt bits.]

21 Flag Solution -- Alice

(define (Alice)
  (loop                                    ; "repeat forever"
    (set! Alice-flag 'up)                  ; Alice wants to enter
    (do ((= Bob-flag 'up)) (skip))         ; loop until Bob lowers his flag
    (Alice-dog-in-yard)                    ; dog can enter the yard
    (set! Alice-flag 'down)))              ; Alice is leaving

22 Flag Solution -- Bob

(define (Bob)
  (loop                                    ; "repeat forever"
    (set! Bob-flag 'up)                    ; Bob wants to enter
    (do ((= Alice-flag 'up))               ; if Alice wants to enter:
        (set! Bob-flag 'down)              ;   Bob is a gentleman
        (do ((= Alice-flag 'up)) (skip))   ;   loop (skip) till Alice leaves
        (set! Bob-flag 'up))               ;   raise the flag and go through the do again
    (Bob-dog-in-yard)                      ; dog can enter the yard
    (set! Bob-flag 'down)))                ; Bob is leaving

23 Flag Solution -- Both. (Alice's and Bob's procedures from the previous two slides, shown side by side and running concurrently.)
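The loop, do, and skip forms above are slide pseudocode. A minimal one-pass rendering of Alice's protocol in plain Scheme, assuming Alice-flag and Bob-flag are shared variables and that busy-waiting (spinning) is acceptable (the name Alice-enter-once is hypothetical):

(define (Alice-enter-once)
  (set! Alice-flag 'up)                         ; signal interest
  (let wait ()                                  ; spin until Bob's flag is down
    (if (eq? Bob-flag 'up) (wait)))
  (Alice-dog-in-yard)                           ; critical section: dog in the yard
  (set! Alice-flag 'down))                      ; leave: lower the flag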

24 Intuition: Why Mutual Exclusion is Preserved. Each performs the same protocol: first raise the flag, to signal interest; then look to see whether the other one has raised their flag. One can claim that the following flag principle holds: since Alice and Bob each raise their own flag and then look at the other's flag, the last one to start looking must notice that both flags are up.

25 Why is there no Deadlock? Alice has priority over Bob: if neither is in the critical section and both keep trying, Bob lowers his flag and gives Alice priority. Unfortunately, the algorithm is not fair, and Bob's dog might eventually grow very anxious :-)

26 The Morals of our Story. The Mutual Exclusion problem cannot be solved using transient communication (i.e., cell phones). The Mutual Exclusion problem cannot be solved using interrupts or interrupt bits (i.e., cans). The Mutual Exclusion problem can be solved with one-bit registers (i.e., flags): memory locations that can be read and written (set!-ed). We cheated a little: the arbiter problem…

27 The Solution and Conclusion

(define (Alice)
  (loop
    (mutex 'begin)
    (Alice-dog-in-yard)      ;; critical section
    (mutex 'end)))

Question: then why not execute all the code of the parallel prime-printing algorithm in a critical section?

28 Answer: Amdahl’s Law

Speedup_overall = Execution_time_old / Execution_time_new
                = 1 / ((1 - Fraction_enhanced) + Fraction_enhanced / Speedup_enhanced)

If 40% of the execution time can be sped up by a factor of 10 by using 10 processors, then Speedup_overall = 1 / (0.6 + (0.4/10)) = 1/0.64 ≈ 1.56.
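A small sketch of this calculation in Scheme (amdahl-speedup is a hypothetical helper, not part of the lecture code):

(define (amdahl-speedup fraction-enhanced speedup-enhanced)
  ;; overall speedup = 1 / ((1 - f) + f / s)
  (/ 1 (+ (- 1 fraction-enhanced)
          (/ fraction-enhanced speedup-enhanced))))

(amdahl-speedup 0.4 10)   ; => 1.5625, the ~1.56 figure from the slide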

29 Length of Critical Sections. We want linear speedup: 100 processors = 100 times faster. By Amdahl's law that means Fraction_enhanced = 1, i.e., no sequential parts!!!!! Even for a speedup of only 99 times, 1 - Fraction_enhanced ≈ 0.0001, i.e., only about 0.01% of the execution may be sequential. In other words, we must try to make the sequential parts (the critical sections) as short as possible!!!!!
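A sketch of where the 0.0001 figure comes from, solving Amdahl's law for the fraction that must be enhanced (required-fraction is a hypothetical helper, not from the lecture):

(define (required-fraction target-speedup n)
  ;; target = 1 / ((1 - f) + f/n)  =>  f = (1 - 1/target) / (1 - 1/n)
  (/ (- 1 (/ 1 target-speedup))
     (- 1 (/ 1 n))))

(required-fraction 99 100)         ; => 9800/9801, about 0.9999
(- 1 (required-fraction 99 100))   ; => 1/9801, about 0.0001 sequential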

30 Summarizing It All. Our world is asynchronous, and yet, with a bit of luck, we have ways of overcoming this asynchrony to create powerful concurrent algorithms. This was best summarized by a quote some attribute to Tommy Lasorda: "Life is the synchronicity of chance." Good Luck on Your Final Exam!