
1 Lecture 7 Processes and Threads

2 Process vs. Program A process is the basic unit of execution: an "instance of a running program".
The OS creates processes for programs, and it manages the processes and their resources.

3 Process resources To keep running, a process requires at least:
A program.
Blocks of memory: stack, heap, code segment.
Resource tables: file handles, socket handles, IO handles, window handles.
Control attributes: execution state, process relationships.
Processor state information: contents of registers, Program Counter (PC), Stack Pointer.

4 Process’s context All the information the OS needs to keep track of the state of a running process: A program counter. Registers’ values. Memory associated with the process.

5 The Scheduler The OS runs multiple processes "simultaneously" on a single CPU. The scheduler is the OS component responsible for sharing the CPU. The OS interleaves the execution of processes: it runs a process for a little while, interrupts it and saves its context, then runs another process for a little while.

6 The Scheduler We have a list of active processes, and the scheduler switches execution from one process to another. There are two paradigms for how the scheduler interacts with the active processes: non-preemptive scheduling and preemptive scheduling.

7 Non-preemptive scheduling
Non-preemptive: the process signals the OS when it is ready to relinquish the CPU, either via a special system call or by waiting for an external event (I/O). The scheduler then selects the next process to execute according to a scheduling policy, e.g., pick the highest-priority process. Used by DOS, Windows 3.1, and older versions of Mac OS.

8 Preemptive scheduling
Preemptive: the scheduler interrupts the executing process. A timer sends an interrupt at regular intervals, which suspends the currently executing process and starts the scheduler. The scheduler then selects the next process to execute, according to the scheduling policy. All modern OSs support preemptive scheduling.

9

10 Context Switch Context switch: the OS switches from one executing process to the next. It consumes several milliseconds of processing time (on the order of 10^4 simple CPU operations). It is transparent to processes (a process cannot tell it was preempted).

11 Context Switch - steps
Timer interrupt: suspend the currently executing process and start the scheduler. Save the process's context for later resumption. Select the next process using the scheduling policy. Retrieve the next process's context. Restore the state of the new process (registers, program counter). Flush the CPU cache (the process has a new memory map and cannot use the old process's cache). Resume the new process (start executing code from the instruction that was interrupted). The most costly operation (amortized over the lifetime of a process) is flushing the CPU cache, as it degrades the performance of the new process. As a result, we would like to lower the number of context switches our application requires. To our rescue comes the notion of the thread.

12 The video player example
The player should take the following steps to play the video: read video from disk/internet, decompress the video, decode the video, and display it on screen. Note that the video is larger than the available RAM, and the video is watched before the download completes.

13 Interleaving (sequential) solution
Read some video data (disk/network), decompress it, decode it, and display it on screen. Repeat until the video ends.

14

15 Multi-process solution
Playing the movie is decomposed into several independent tasks, each its own process: read movie, decompress, decode, display.
Pro: no need to control the interleaving of tasks.
Con: process communication - how will all of these processes communicate with each other?

16 Threads - definition Single process, multiple tasks
No inter-process communication, no full context switch… Multiple threads execute concurrently within a single process. They share all the resources allocated to the process: they share the memory space (while each thread has its own stack), and they share opened files and access rights. Threads can communicate with each other easily, and the CPU cache need not be flushed when switching between them.

17 Context switch between threads
Not all steps of a regular context switch need to be taken: there is no need to restore the process context (threads share the process's resources) and no need to flush the CPU cache (threads share the memory space). We do need to switch stacks.

18 Multi-threaded solution (concurrency)
Single process, several threads design: one thread reads the video stream and places chunks in a chunk queue. Another thread reads chunks from that queue, decompresses them and places them in a decompressed queue. Another thread decodes them into frames and places the frames in a frame queue… A final thread takes frames and displays them on screen. (The CPU cache need not be flushed between these threads.) A sketch of this pipeline appears below.
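A minimal sketch of such a pipeline, assuming hypothetical Chunk/Frame types and stub read/decode/display methods (none of which appear in the slides); for brevity the decompress and decode stages are merged into one thread. It only illustrates how threads hand work to each other through queues:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class VideoPipeline {
    // Hypothetical data carriers; a real player would hold video bytes/frames.
    static class Chunk {}
    static class Frame {}

    public static void main(String[] args) {
        BlockingQueue<Chunk> chunkQueue = new ArrayBlockingQueue<>(16);
        BlockingQueue<Frame> frameQueue = new ArrayBlockingQueue<>(16);

        // Reader thread: reads the stream and places chunks in the chunk queue.
        Thread reader = new Thread(() -> {
            try {
                while (true) chunkQueue.put(readChunk());
            } catch (InterruptedException e) { /* stop reading */ }
        });

        // Decoder thread: takes chunks and turns them into frames.
        Thread decoder = new Thread(() -> {
            try {
                while (true) frameQueue.put(decode(chunkQueue.take()));
            } catch (InterruptedException e) { /* stop decoding */ }
        });

        // Display thread: takes frames and displays them on screen.
        Thread display = new Thread(() -> {
            try {
                while (true) show(frameQueue.take());
            } catch (InterruptedException e) { /* stop displaying */ }
        });

        reader.start(); decoder.start(); display.start();
    }

    // Stubs standing in for the real I/O, decoding and rendering code.
    static Chunk readChunk() { return new Chunk(); }
    static Frame decode(Chunk c) { return new Frame(); }
    static void show(Frame f) { /* draw on screen */ }
}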

19 Concurrency advantages
Reactive programming: programs do multiple things at a time and respond reactively to input, e.g., GUI applications. The caller is freed from invoking a method and blocking until it returns. Availability: a common design pattern for service-provider programs is one thread acting as a gateway for incoming requests and additional threads handling the requests. An FTP server, for example, has a gateway thread to handle new clients connecting and a thread per client to deal with long file transfers.

20 Concurrency advantages
Controllability: a thread can be suspended, resumed or stopped by another thread. Simplified design: software objects usually model real objects, and real-life objects are independent and parallel; designing autonomous behavior is easier than designing the sequential interleaving between objects. Threads are also simpler to implement with than processes.

21 Concurrency advantages
Parallelization: on multiprocessor machines, threads execute truly concurrently, with up to one running thread per CPU. On a single CPU, the execution paths are interleaved. Services provided by the RTE operate in a concurrent manner.

22 Concurrency limitations
Safety: multiple threads share resources, so synchronization mechanisms are needed to guarantee a consistent state; otherwise failures look random and are very hard to debug. Liveness: in concurrent programming we often need to keep a thread alive even when it is not running. Non-determinism: executions of a concurrent program are not identical, which makes the program harder to predict, understand and debug. The benefits of concurrency should be weighed against its cost in resource consumption, efficiency and program complexity.

23 Concurrency limitations
Context-switching overhead: significant when the job performed in a thread is too small. Synchronization overhead: consumes execution time and adds complexity. Process alternative: when an activity is self-contained and heavy, encapsulate it in a separate standalone program that can be accessed via system services. The benefits of concurrency should be weighed against its cost in resource consumption, efficiency and program complexity.

24 Java Support for Threads
The JVM has first-class support for threads. The Runnable interface is implemented by a class whose instances are intended to be executed by a thread; it defines a single method, run().
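The MessagePrinter class referred to in the following slides is not included in the transcript; a minimal sketch, assuming it simply prints the message it was constructed with, would be:

// Sketch of the MessagePrinter used on the next slides (assumed behavior:
// print the constructor argument when run() is called).
public class MessagePrinter implements Runnable {
    private final String message;

    public MessagePrinter(String message) {
        this.message = message;
    }

    @Override
    public void run() {
        System.out.println(message);  // the task a thread will execute
    }
}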

25

26

27 SequentialPrinter in JVM
The JVM loads the SequentialPrinter class and calls its main() function, which creates two MessagePrinter objects and invokes each one's run() method directly. Result: the two words are printed one after the other, "Hello" then "Goodbye".
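The SequentialPrinter code itself is not in the transcript; a plausible sketch matching the description above (direct run() calls, no threads), reusing the MessagePrinter sketch:

// Sketch of SequentialPrinter: run() is called directly, so everything
// executes on the main thread and the printing order is deterministic.
public class SequentialPrinter {
    public static void main(String[] args) {
        MessagePrinter hello = new MessagePrinter("Hello");
        MessagePrinter goodbye = new MessagePrinter("Goodbye");
        hello.run();    // ordinary method call, not a new thread
        goodbye.run();  // runs only after hello.run() returns
    }
}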

28 Java Support for Threads
Next, two new objects are created, both of the MessagePrinter class, and each of them receives a unique id. Then two special, active objects are instantiated: threads. Each thread receives as an argument a Runnable object, r. When a thread is told to start, it sends a message to r; namely, the thread impersonates r and invokes r's run() method. In our example, the last thing the ConcurrentPrinter class does is send both threads the start() message.

29 ConcurrentPrinter in JVM
The JVM loads the ConcurrentPrinter class and calls its main() function, which creates two MessagePrinter objects and instantiates two Thread objects, each given one of the MessagePrinter objects as its Runnable argument r. When a thread's start() is invoked, the thread invokes r's run(). With two separate threads running run() of different MessagePrinter objects, we cannot know the printing order ahead of time.
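The ConcurrentPrinter code is likewise not in the transcript; a minimal sketch matching the description, again reusing the MessagePrinter sketch from above:

// Sketch of ConcurrentPrinter: each MessagePrinter is wrapped in a Thread,
// and start() launches it on its own call stack.
public class ConcurrentPrinter {
    public static void main(String[] args) {
        Thread t1 = new Thread(new MessagePrinter("Hello"));
        Thread t2 = new Thread(new MessagePrinter("Goodbye"));
        t1.start();  // runs the "Hello" printer on a new thread
        t2.start();  // runs the "Goodbye" printer on another new thread
        // The two outputs may appear in either order.
    }
}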

30 How to use Threads in Java
In order to run tasks concurrently: define the task, then ask "someone" to run it. Defining the task means implementing the Runnable interface, i.e., providing a method called run() which defines the task to be performed.

31 The Thread API You can obtain the currently executing thread using the static method Thread.currentThread(). The Thread class has a constructor that receives a Runnable; this sets the runnable's run() method as the entry point of the thread. It also has a start() method which starts the thread's execution: it first allocates a new stack, and then starts the thread at its entry point.
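A small usage sketch of these three API points (the task body and printed text are made up for illustration):

public class ThreadApiDemo {
    public static void main(String[] args) {
        // Constructor receiving a Runnable: its run() becomes the entry point.
        Thread worker = new Thread(() ->
                System.out.println("running in " + Thread.currentThread().getName()));

        worker.start();  // allocates a new stack and starts at the entry point

        // Thread.currentThread() returns the thread executing this line (main).
        System.out.println("started from " + Thread.currentThread().getName());
    }
}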

32 Thread Joining Whenever a thread wants to wait for another thread to complete, it can use that thread's join() method. Here: the main thread waits for barney to finish in line 10 of the slide's code.
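The slide's code is not in the transcript; a minimal sketch of the same idea, assuming barney is a thread that simply sleeps briefly:

public class JoinDemo {
    public static void main(String[] args) throws InterruptedException {
        // barney: a stand-in worker thread that just sleeps for a second.
        Thread barney = new Thread(() -> {
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) { /* ignore in this sketch */ }
        });

        barney.start();
        barney.join();  // main blocks here until barney has finished
        System.out.println("barney is done");
    }
}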

33 Thread Interruption Bad example:
How can we stop the stopper thread? Can it tell the time? One (bad) option is to kill it, but then it will end up in an unknown state.

34 Thread Interruption
In this example: stopper keeps running without a purpose.

35 Thread Interruption
In this example: stopper cannot know the time for up to 10 seconds, because it only checks after its sleep ends.
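The stopper code from these slides is not in the transcript; a plausible sketch of the sleeping variant described above, using a hypothetical shouldStop flag, shows why its reaction can be delayed by up to 10 seconds:

public class SleepingStopper {
    // Hypothetical flag another thread sets when it wants the stopper to stop.
    private static volatile boolean shouldStop = false;

    public static void main(String[] args) {
        Thread stopper = new Thread(() -> {
            while (!shouldStop) {              // the flag is only checked here...
                try {
                    Thread.sleep(10_000);      // ...so a stop request can go
                } catch (InterruptedException e) {  // unnoticed for up to 10 s
                    return;
                }
            }
        });
        stopper.start();
        shouldStop = true;  // takes effect only after the current sleep ends
    }
}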

36 Thread interrupt flag For the interrupt mechanism to work correctly, the interrupted thread must support its own interruption.

37 Thread interrupt flag In Java, many of the methods that block a thread's execution (like sleep()) throw an InterruptedException when the thread is interrupted.
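A short sketch of a thread supporting its own interruption: it sleeps in a loop and treats InterruptedException (or the interrupt flag) as the signal to finish.

public class InterruptDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    Thread.sleep(1000);  // blocking call; throws if interrupted
                } catch (InterruptedException e) {
                    // The exception clears the interrupt flag, so exit explicitly.
                    return;
                }
            }
        });

        worker.start();
        Thread.sleep(100);   // let the worker start sleeping
        worker.interrupt();  // wakes it from sleep with InterruptedException
        worker.join();
    }
}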

38 When to use threads? Example: given an n x m matrix, create a vector of length n which has in each cell i the sum of the elements in the ith row of the matrix.
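The code on the following slide is not included in the transcript; a straightforward sequential sketch of the row-sum task would be:

// Sequential row sums: result[i] = sum of the elements in row i.
public class RowSums {
    static long[] rowSums(int[][] matrix) {
        long[] sums = new long[matrix.length];
        for (int i = 0; i < matrix.length; i++) {
            long sum = 0;
            for (int value : matrix[i]) {
                sum += value;
            }
            sums[i] = sum;
        }
        return sums;
    }
}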

39

40 Threads complicate our code.
Threads will actually make our program slower here.
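The threaded code on the following slides is not in the transcript; a sketch of the kind of thread-per-row approach the slide warns about (one Thread created per row, then joined) illustrates where the overhead comes from:

// Thread-per-row version: creating and joining one thread per row adds the
// thread-creation and context-switch overhead measured on the next slides.
public class ParallelRowSums {
    static long[] rowSums(int[][] matrix) throws InterruptedException {
        long[] sums = new long[matrix.length];
        Thread[] workers = new Thread[matrix.length];
        for (int i = 0; i < matrix.length; i++) {
            final int row = i;
            workers[i] = new Thread(() -> {
                long sum = 0;
                for (int value : matrix[row]) sum += value;
                sums[row] = sum;   // each thread writes only its own cell
            });
            workers[i].start();
        }
        for (Thread worker : workers) worker.join();
        return sums;
    }
}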

41

42 Thread creation takes both time and memory, and many threads mean many context switches.
                     100x100 Matrix   1,000x1,000 Matrix   10,000x10,000 Matrix
Sequential Version   ~0 millis        ~4 millis            ~145 millis
Parallel Version     ~48 millis       ~100 millis          ~330 millis

43 Thread pool A pool of worker threads; it contains a queue that keeps tasks waiting to get executed. There is no need to create a new thread for every task. The Java library contains a ready-made thread-pool facility in the ExecutorService interface (obtained via the Executors factory class).
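A minimal sketch of using the library thread pool for the row-sum task (the pool size and one-task-per-row granularity are illustrative choices, not taken from the slides):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PooledRowSums {
    static long[] rowSums(int[][] matrix) throws InterruptedException {
        long[] sums = new long[matrix.length];
        // A fixed pool: threads are created once and reused for every task.
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());

        for (int i = 0; i < matrix.length; i++) {
            final int row = i;
            pool.execute(() -> {           // submit one task per row
                long sum = 0;
                for (int value : matrix[row]) sum += value;
                sums[row] = sum;
            });
        }

        pool.shutdown();                             // no new tasks accepted
        pool.awaitTermination(1, TimeUnit.MINUTES);  // wait for queued tasks
        return sums;
    }
}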

44

45
                     100x100 Matrix   1,000x1,000 Matrix   10,000x10,000 Matrix   1,000x200,000 Matrix
Sequential Version   ~0 millis        ~4 millis            ~145 millis            ~280 millis
Parallel Version     ~57 millis       ~63 millis           ~103 millis            ~150 millis


Download ppt "Lecture 7 Processes and Threads."

Similar presentations


Ads by Google