A Revolutionary Programming Pattern that Will Clean up your Code: Coroutines in C++ David Sackstein davids@codeprecise.com ACCU 2015


Agenda What's in it for me? Three Problems Solutions in C# Threads, Fibers and Coroutines Boost Coroutines Simplifying Asynchronous Code with Boost.Asio and Coroutines Summing Up Agenda

What’s in it for me? You will be able to write asynchronous code without state machines. Therefore Your code will be efficient and scalable Your business logic will be readable and maintainable. What’s in it for me?

Three Problems Parsers, Generators and Asynchronous Methods Three Problems

The Parser Problem A parser reads information from a document or stream, processes it, and produces tokens. It is convenient to pull from the source and push tokens to a consumer, but this doesn't work when the document is received from the network, or when the document is itself the result of another parse. Solution: rewrite the parser so that each call produces one token and maintains state between calls. Three Problems

The Generator Problem A generator produces a sequence of elements. Possible implementation: calculate all the elements, place them in a collection, and return the collection. But this doesn't work when the number of elements is large or unknown ahead of time, due to memory constraints and latency. Solution: rewrite the generator so that each call produces one element and maintains state between calls. Three Problems

The Asynchronous Method Problem How is it done? Call a void method with arguments and a completion callback. Callbacks are synchronized with the initiator's context. Subsequent operations are performed in the callback. Problems that arise: business logic is broken up and dispersed among callbacks, and exception handling is problematic because the methods that initiated the operation have already unwound from the stack by the time the callback runs. Three Problems

Solutions in C# A generator with yield return An asynchronous function call with Async-Await Solutions in C#

A generator with yield return See Sample 1.A Solutions in C#

How does this work? Caller: The compiler translates foreach into calls on the IEnumerable interface returned by Fibonacci(). Callee: The compiler generates a class that implements IEnumerable based on the implementation of Fibonacci(); the implementation of Fibonacci() instantiates an instance of that class and returns it. Solutions in C#

An asynchronous call with async-await Consider this asynchronous call with a callback Solutions in C#

An asynchronous call with async-await Which is called like so Solutions in C#

An asynchronous call with async-await Should be rewritten like so See Sample 1.B Solutions in C#

An asynchronous call with async-await Which can be called like so Solutions in C#

How does this work? The compiler creates a class which implements a state machine. The code before the await and the code after the await are compiled as the work to be done in different states. When the awaited task completes, the state machine's MoveNext is invoked and it executes the code for the next state. The generated method starts the state machine and returns. Solutions in C#

Threads, Fibers and Coroutines Threads, fibers and coroutines use the stack to store state, but there are important differences between them. Threads, Fibers and Coroutines

How Threads Can Help Pros: Each thread has its own stack which stores context. Cons: Synchronization is required to switch context, and threads are expensive. [Diagram: the two threads exchange Request Data 1 / Supply Data 1 / Request Data 2 / Supply Data 2.] Threads, Fibers and Coroutines

Fibers Might Be Better Fibers are like threads with cooperative multitasking Execution continues until a fiber yields explicitly Execution is serial - no protection of shared data is required No kernel mode overhead Context switching is immediate. No idle wait. Supported on Windows and Linux Threads, Fibers and Coroutines

A Generator with Fibers The generator: See Sample 2 Threads, Fibers and Coroutines

A Generator with Fibers The caller Threads, Fibers and Coroutines

The Fiber class The Fiber class wraps the following Windows APIs:
ConvertThreadToFiber - Enables the thread to create fibers
CreateFiber - Creates a child fiber
SwitchToFiber - Switches to a fiber by its handle
DeleteFiber - Deletes a child fiber created by CreateFiber
Threads, Fibers and Coroutines
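The fiber-based generator of Sample 2 is shown only as code images in this transcript, so here is a minimal sketch of the same idea built directly on the four Windows APIs above. It is an illustration, not the talk's sample; names such as FibonacciFiber, g_mainFiber and g_value are assumptions made for this sketch.

    // Minimal sketch: a Fibonacci generator running on its own fiber.
    // Assumed names (g_mainFiber, g_value, FibonacciFiber) are illustrative only.
    #include <windows.h>
    #include <iostream>

    static LPVOID g_mainFiber = nullptr;    // the fiber to yield back to
    static unsigned long long g_value = 0;  // slot used to hand each value to the caller

    static VOID CALLBACK FibonacciFiber(LPVOID /*param*/)
    {
        unsigned long long a = 0, b = 1;
        for (;;)
        {
            g_value = a;
            SwitchToFiber(g_mainFiber);   // "yield" the current value to the main fiber
            unsigned long long next = a + b;
            a = b;
            b = next;
        }
    }

    int main()
    {
        g_mainFiber = ConvertThreadToFiber(nullptr);          // enable fibers on this thread
        LPVOID generator = CreateFiber(0, FibonacciFiber, nullptr);

        for (int i = 0; i < 10; ++i)
        {
            SwitchToFiber(generator);     // resume the generator until it yields a value
            std::cout << g_value << '\n';
        }

        DeleteFiber(generator);           // the generator never returns, so delete it explicitly
        return 0;
    }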

Coroutines Coroutines are similar to fibers, though fibers are described in terms of threads and coroutines are described in terms of functions (subroutines). A coroutine is a routine that can be entered more than once: it suspends execution voluntarily by invoking a yield call, execution is resumed when another coroutine yields, and the stack is preserved between entries. Threads, Fibers and Coroutines

An Important Difference Both coroutines and fibers unwind themselves when an exception is thrown, but they behave differently when an exception is not caught in the coroutine/fiber: fibers behave like threads (an uncaught exception terminates the process), while coroutines behave like nested functions (an uncaught exception may be caught by the caller). Threads, Fibers and Coroutines

Boost Context (Oliver Kowalke) Managing stacks is a difficult problem that has been attempted in C using setjmp() and longjmp(), but these do not handle stack unwinding for objects that have non-trivial destructors. Boost.Context provides context management in a portable way. Boost.Coroutine uses Boost.Context and provides a higher level of abstraction for multitasking in one thread. Boost Coroutines

Boost Fiber (Oliver Kowalke) Boost Fiber is currently under review. It will provide a framework for micro-threads scheduled cooperatively (fibers). The API contains classes and functions to manage and synchronize fibers similarly to Boost.Thread Boost.Fiber uses Boost.Context to manage a stack per fiber. Boost Coroutines

Boost Coroutines (Oliver Kowalke) Boost.Coroutine uses Boost.Context to provide: Asymmetric coroutines An asymmetric coroutine knows its invoker, using a special operation to implicitly yield control specifically to its invoker Symmetric coroutines All symmetric coroutines are equivalent; one symmetric coroutine may pass control to any other symmetric coroutine Both types may or may not pass a result (in one direction) Boost Coroutines

Boost.Coroutine and Boost.Fiber [Diagram: Boost.Context underlies both Boost.Coroutine, which provides stackful coroutines (symmetric and asymmetric), and Boost.Fiber, which provides fibers.] Boost Coroutines

A Generator using Asymmetric Coroutines Use these definitions: See Sample 3.A Boost Coroutines

A Generator using Asymmetric Coroutines The generator Boost Coroutines

A Generator using Asymmetric Coroutines The consumer Boost Coroutines
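The asymmetric-coroutine generator and consumer of Sample 3.A are shown as code images. As a stand-in, here is a minimal sketch using Boost.Coroutine's asymmetric_coroutine; the coro_t typedef and the choice of a Fibonacci generator are assumptions of this sketch, not necessarily the talk's sample.

    // Minimal sketch: a Fibonacci generator as an asymmetric coroutine (Boost.Coroutine).
    #include <boost/coroutine/asymmetric_coroutine.hpp>
    #include <iostream>

    typedef boost::coroutines::asymmetric_coroutine<unsigned long long> coro_t;

    int main()
    {
        // The generator: runs inside the coroutine and yields each value to its invoker.
        coro_t::pull_type fibonacci(
            [](coro_t::push_type& yield)
            {
                unsigned long long a = 0, b = 1;
                for (int i = 0; i < 10; ++i)
                {
                    yield(a);                        // suspend and hand the value out
                    unsigned long long next = a + b;
                    a = b;
                    b = next;
                }
            });

        // The consumer: each iteration resumes the generator and pulls the next value.
        for (unsigned long long value : fibonacci)
            std::cout << value << '\n';

        return 0;
    }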

A Generator using Symmetric Coroutines Use these definitions: See Sample 3.B Boost Coroutines

A Generator using Symmetric Coroutines Generator and Consumer Boost Coroutines

A Generator using Symmetric Coroutines Usage of the generator and consumer Boost Coroutines
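Sample 3.B (also shown as images) structures both the generator and the consumer as symmetric coroutines. The minimal sketch below only illustrates the symmetric API itself: the consumer is a symmetric coroutine, the generator is plain code in main() that transfers each value in, and yield() with no arguments jumps back to the starting context. The names and structure are assumptions of this sketch.

    // Minimal sketch of the symmetric coroutine API (Boost.Coroutine).
    #include <boost/coroutine/symmetric_coroutine.hpp>
    #include <iostream>

    typedef boost::coroutines::symmetric_coroutine<unsigned long long> coro_t;

    int main()
    {
        // Consumer: prints whatever value was transferred in, then returns control
        // to the starting context by calling yield() with no arguments.
        coro_t::call_type consumer(
            [](coro_t::yield_type& yield)
            {
                for (;;)
                {
                    std::cout << yield.get() << '\n';
                    yield();   // jump back to the starting context
                }
            });

        // Generator: an ordinary loop that transfers ten Fibonacci numbers to the consumer.
        unsigned long long a = 0, b = 1;
        for (int i = 0; i < 10; ++i)
        {
            consumer(a);                         // transfer the value and control
            unsigned long long next = a + b;
            a = b;
            b = next;
        }
        return 0;   // the suspended consumer is unwound by its destructor
    }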

Back to Asynchronous Methods Introduction to Boost Asio Boost Asio and Coroutines Using boost::asio::yield_context with asynchronous I/O Extending yield_context for any asynchronous call. Boost Asio and Coroutines

Boost.Asio (Christopher Kohlhoff) A library that provides tools to manage long running operations without requiring threads and explicit locking. The central object exposed is the io_service. io_service encapsulates an event loop; to service the event loop, call run() in one or more threads. You can post any callable object to the io_service. It provides synchronous and asynchronous networking services. Boost Asio and Coroutines
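The io_service slide has no code in this transcript, so here is a minimal sketch (an assumption of this write-up, not a slide) of the core idea: post a callable to the io_service and service the event loop with run().

    // Minimal sketch: posting work to an io_service and servicing it with run().
    #include <boost/asio.hpp>
    #include <iostream>

    int main()
    {
        boost::asio::io_service io;

        // Any callable object can be posted; it will run inside io.run().
        io.post([] { std::cout << "work executed inside io_service::run()\n"; });

        io.run();   // blocks until the queue of posted work is empty
        return 0;
    }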

Asynchronous functions in Boost.Asio An "invoker" initiates an asynchronous function, passing in a callback handler; the function returns immediately. The invoker can now call io_service.run(), which blocks for as long as there is work in the queue. As soon as the asynchronous function completes, the callback handler is posted to the io_service queue. The invoker encounters the callback "work" and the callback is then invoked in the thread of the invoker. If you call io_service.run() from two threads, what happens? Boost Asio and Coroutines

Applying coroutines (1) The "invoker" creates a pull_type, passing in the "work" as a lambda. The work lambda receives the push_type (and the pull_type) by reference. It initiates an asynchronous function, passing in a special callback handler which captures a reference to the push_type object. The work lambda should then yield to the invoker. The invoker now calls io_service.run(), which services other clients. Boost Asio and Coroutines

Applying coroutines (2) When the asynchronous operation completes, the special callback handler is posted to the io_service queue. Eventually run() picks it up and executes the handler. The handler resumes the suspended work coroutine through the object it captured. The initiating coroutine "magically" wakes up after the asynchronous call has completed. Boost Asio and Coroutines

Applying coroutines (3) This is more or less how Boost.Asio wraps coroutines. It defines boost::asio::yield_context, which encapsulates pointers to a pull_type and a push_type asymmetric coroutine. The invoker calls the spawn function to create the coroutines and pass control to the work delegate. The work delegate receives the yield_context as an argument. yield_context is a type that will enable us to write asynchronous functions as if they were synchronous calls. Boost Asio and Coroutines

Applying coroutines (4) yield_context is passed as the callback handler to asynchronous functions. It is the invocation of the yield_context that switches context back to the work coroutine, which continues as if the call to the asynchronous function had completed synchronously. Therefore, yield_context is a type that enables us to write asynchronous functions as if they were synchronous calls. Boost Asio and Coroutines
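To make the mechanism of slides (1)-(4) concrete, here is a hand-rolled sketch that wraps a single async_wait in an asymmetric coroutine. It is one possible arrangement assumed for this write-up (here the completion handler resumes the coroutine through the pull_type; Asio's real yield_context packages the two coroutine ends for you).

    // Hand-rolled sketch: suspend a coroutine across an async_wait and resume it
    // from the completion handler. Names (work, self) are illustrative assumptions.
    #include <boost/asio.hpp>
    #include <boost/coroutine/asymmetric_coroutine.hpp>
    #include <boost/date_time/posix_time/posix_time.hpp>
    #include <iostream>

    typedef boost::coroutines::asymmetric_coroutine<void> coro_t;

    int main()
    {
        boost::asio::io_service io;
        boost::asio::deadline_timer timer(io);
        coro_t::pull_type* self = nullptr;   // set below, used by the handler to resume

        // The "work" runs inside the coroutine; pull_type enters it immediately.
        coro_t::pull_type work(
            [&](coro_t::push_type& yield)
            {
                timer.expires_from_now(boost::posix_time::seconds(1));
                timer.async_wait(
                    [&](const boost::system::error_code&)
                    {
                        (*self)();          // completion handler: resume the coroutine
                    });
                yield();                    // suspend; control returns to the invoker
                std::cout << "timer expired, work resumed\n";
            });

        self = &work;
        io.run();   // services the completion handler, which resumes the coroutine
        return 0;
    }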

Building the solution Presenting an asynchronous function Initiating an asynchronous chain of calls Calling the function repetitively (recursively) How we would prefer to call the function Revising initiation of the chain of calls How do we get it to work using coroutines? Boost Asio and Coroutines

The asynchronous function The callback handler Boost will post this delegate to the io_service when the timer expires The handler will be called after 1 second Boost Asio and Coroutines

Initiating an asynchronous chain of calls Post some work to the io_service Post our chain of calls Service all work on this thread

Calling repetitively (recursively) Provide a callback Call recursively from the callback Boost Asio and Coroutines
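The code behind the last three slides is shown as images, so here is a minimal sketch of the callback style they describe: an async_wait on a one-second timer, a chain initiated by registering the first wait, and repetition by having the handler re-register itself ("recursion"). The handler and counter names are assumptions of this sketch.

    // Minimal sketch: repeating a one-second timer with callbacks (the "before" picture).
    #include <boost/asio.hpp>
    #include <boost/date_time/posix_time/posix_time.hpp>
    #include <functional>
    #include <iostream>

    int main()
    {
        boost::asio::io_service io;
        boost::asio::deadline_timer timer(io);
        int count = 0;

        // std::function lets the handler refer to itself so it can re-register the wait.
        std::function<void(const boost::system::error_code&)> handler =
            [&](const boost::system::error_code& ec)
            {
                if (ec || ++count > 3)
                    return;                             // stop after three ticks or on error
                std::cout << "tick " << count << '\n';
                timer.expires_from_now(boost::posix_time::seconds(1));
                timer.async_wait(handler);              // call "recursively" from the callback
            };

        timer.expires_from_now(boost::posix_time::seconds(1));
        timer.async_wait(handler);   // initiate the chain of calls; returns immediately
        io.run();                    // service all the work on this thread
        return 0;
    }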

How we would prefer to call the function A new type by value Pass the yield context to the asynchronous function Iteration, not recursion Boost Asio and Coroutines

Revising initiation of the chain of calls Post some work to the io_service My_spawn sets up the yield_context and passes it to the delegate Boost Asio and Coroutines

How do we get it to work using coroutines? Boost Asio and Coroutines
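And here is a sketch of the coroutine version the slides build toward: spawn creates the coroutine, the yield_context is passed to async_wait as the completion handler, and the repetition becomes plain iteration. This uses boost::asio::spawn directly rather than the talk's my_spawn helper; the loop and names are assumptions of this sketch.

    // Minimal sketch: the same repeating timer written with spawn and yield_context.
    #include <boost/asio.hpp>
    #include <boost/asio/spawn.hpp>
    #include <boost/date_time/posix_time/posix_time.hpp>
    #include <iostream>

    int main()
    {
        boost::asio::io_service io;

        boost::asio::spawn(io,
            [&io](boost::asio::yield_context yield)
            {
                boost::asio::deadline_timer timer(io);
                for (int count = 1; count <= 3; ++count)   // iteration, not recursion
                {
                    timer.expires_from_now(boost::posix_time::seconds(1));
                    timer.async_wait(yield);               // suspends until the timer fires
                    std::cout << "tick " << count << '\n';
                }
            });

        io.run();   // services the timer completions and resumes the coroutine
        return 0;
    }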

Boost.Asio In Action Sample 4.A demonstrates the use of synchronous I/O for TCP communication. Sample 4.B demonstrates the use of asynchronous I/O in the same application. Sample 4.C demonstrates the use of yield_context in the same application. Boost Asio and Coroutines

Background: The Echo Application A server and one or more clients. Each client connects to the server using TCP/IP. The client sends a message to the server, the server sends it back, and the client sends back what it received, ad infinitum. Each message comprises a 4 byte size followed by the data. Boost Asio and Coroutines

Architecture [Diagram: a Client and a Server connected over TCP/IP. On the server, the Connection spawns an Echoer. Application layer: Echoer (echo logic). Message layer: Messenger (message format) - this is where the samples differ. Transport layer: Boost.Asio (TCP/IP).] Boost Asio and Coroutines

Synchronous I/O : Message Write See Sample 4.A Boost Asio and Coroutines

Synchronous I/O : Message Read Boost Asio and Coroutines
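Sample 4.A's message write and read code is shown as images. As a stand-in, here is a sketch of a synchronous Messenger for the 4-byte-size-plus-data format; the function names, the std::string payload type and the native-byte-order assumption are choices made for this sketch, not necessarily the talk's.

    // Sketch of the synchronous message layer: blocking write and read of
    // a 4-byte size header followed by the data.
    #include <boost/asio.hpp>
    #include <cstdint>
    #include <string>
    #include <vector>

    using boost::asio::ip::tcp;

    void write_message(tcp::socket& socket, const std::string& data)
    {
        uint32_t size = static_cast<uint32_t>(data.size());   // assumes native byte order on both ends
        boost::asio::write(socket, boost::asio::buffer(&size, sizeof(size)));
        boost::asio::write(socket, boost::asio::buffer(data));
    }

    std::string read_message(tcp::socket& socket)
    {
        uint32_t size = 0;
        boost::asio::read(socket, boost::asio::buffer(&size, sizeof(size)));
        std::vector<char> body(size);
        boost::asio::read(socket, boost::asio::buffer(body));
        return std::string(body.begin(), body.end());
    }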

Asynchronous I/O : Message Write (1) See Sample 4.B Boost Asio and Coroutines

Asynchronous I/O : Message Write (2) Boost Asio and Coroutines

Asynchronous I/O : Message Read (1) Boost Asio and Coroutines

Asynchronous I/O : Message Read (2) Boost Asio and Coroutines
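Sample 4.B's asynchronous read spans two slides because it needs two chained handlers: one for the 4-byte header and one for the body. The sketch below illustrates that shape with lambdas and shared buffers; the names and the use of std::shared_ptr are assumptions of this sketch.

    // Sketch of the callback-based read: the header handler issues the body read.
    #include <boost/asio.hpp>
    #include <cstdint>
    #include <iostream>
    #include <memory>
    #include <vector>

    using boost::asio::ip::tcp;

    // Note: the socket must outlive the pending asynchronous operations.
    void async_read_message(tcp::socket& socket)
    {
        auto size = std::make_shared<uint32_t>(0);
        boost::asio::async_read(socket, boost::asio::buffer(size.get(), sizeof(*size)),
            [&socket, size](const boost::system::error_code& ec, std::size_t)
            {
                if (ec) return;                                   // connection closed or error
                auto body = std::make_shared<std::vector<char>>(*size);
                boost::asio::async_read(socket, boost::asio::buffer(*body),
                    [body](const boost::system::error_code& ec, std::size_t)
                    {
                        if (ec) return;
                        // The complete message is now in *body; the echo logic continues here.
                        std::cout << "received " << body->size() << " bytes\n";
                    });
            });
    }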

yield_context: Message Write See Sample 4.C Boost Asio and Coroutines

yield_context: Message Read Boost Asio and Coroutines
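Sample 4.C's versions are again images, so here is a sketch of the same Messenger written against a yield_context: the calls read like the synchronous version, but they suspend the calling coroutine instead of blocking the thread. The echo_session function shows how the Echoer might drive it; all names are assumptions of this sketch.

    // Sketch of the message layer using yield_context as the completion handler.
    #include <boost/asio.hpp>
    #include <boost/asio/spawn.hpp>
    #include <cstdint>
    #include <string>
    #include <vector>

    using boost::asio::ip::tcp;

    void write_message(tcp::socket& socket, const std::string& data,
                       boost::asio::yield_context yield)
    {
        uint32_t size = static_cast<uint32_t>(data.size());   // assumes native byte order
        boost::asio::async_write(socket, boost::asio::buffer(&size, sizeof(size)), yield);
        boost::asio::async_write(socket, boost::asio::buffer(data), yield);
    }

    std::string read_message(tcp::socket& socket, boost::asio::yield_context yield)
    {
        uint32_t size = 0;
        boost::asio::async_read(socket, boost::asio::buffer(&size, sizeof(size)), yield);
        std::vector<char> body(size);
        boost::asio::async_read(socket, boost::asio::buffer(body), yield);
        return std::string(body.begin(), body.end());
    }

    // The echo logic stays a plain loop; errors surface as exceptions in the coroutine.
    void echo_session(tcp::socket& socket, boost::asio::yield_context yield)
    {
        try
        {
            for (;;)
                write_message(socket, read_message(socket, yield), yield);
        }
        catch (const std::exception&)
        {
            // The peer disconnected or an I/O error occurred; the session simply ends.
        }
    }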

Boost Coroutines [Recap diagram: Boost.Context underlies Boost.Coroutine (stackful symmetric and asymmetric coroutines) and Boost.Fiber.] Boost Asio and Coroutines

Boost Asio and Coroutines [Diagram: Boost.Context underlies Boost.Coroutine (stackful symmetric and asymmetric coroutines) and Boost.Fiber; Boost.Asio builds io_service, sockets, stackless coroutines and yield_context on top, with yield_context implemented over the asymmetric coroutine.] Boost Asio and Coroutines

Summing Up Coroutines allow a function to yield control voluntarily to another coroutine and resume when control is returned, preserving the state of the stack. Coroutines can be used to keep parsers and generators flow driven rather than state driven. spawn and yield_context from Boost.Asio encapsulate coroutines and allow you to write scalable, asynchronous code without callbacks.

What’s in it for me? You will be able to write asynchronous code without state machines. Therefore Your code will be efficient and scalable Your business logic will be readable and maintainable. A revolutionary pattern to clean up your code: Coroutines! What’s in it for me?