Parallel Programming in .NET Kevin Luty
History of Parallelism Benefits of Parallel Programming and Designs What to Consider Defining Types of Parallelism Design Patterns and Practices Tools Supporting Libraries Agenda
Hardware Parallel hardware first appears in supercomputers Early 1980’s – supercomputer built with 8086/8087 microprocessors Late 1980’s – cluster computing power Moore’s Law Amdahl’s Law Most recently – # of cores over clock speeds History of Parallelism
Moore’s Law [chart not reproduced in this extract]
Amdahl’s Law
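The slide's chart is not reproduced in this extract. For reference, here is the standard statement of Amdahl's Law (an illustrative addition, not taken from the slide), where p is the fraction of the program that can be parallelized and N is the number of processors:

```latex
% Amdahl's Law: speedup S on N processors when a fraction p of the
% program is parallelizable (standard formulation; illustrative
% addition, not from the slide's chart).
S(N) = \frac{1}{(1 - p) + \frac{p}{N}}
```

As N grows, S(N) approaches 1/(1 - p): the serial fraction, not the core count, caps the achievable speedup, which is why the later slides stress identifying dependencies before parallelizing.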
Software One processor means sequential programs Few APIs that promote/make use of parallel programming 1990’s – No standards were created for parallel programming By 2000 – Message Passing Interface (MPI), POSIX threads (pthreads), Open Multiprocessing (OpenMP) History of Parallelism
Task Parallel Library (TPL) for .NET Exploits hardware capabilities Easy to write (e.g., PLINQ; see the sketch below) You can use all the cores! Timing Tools available for debugging Cost-effective Benefits of Parallel Programming
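As a taste of how little code is needed, here is a minimal PLINQ sketch (an illustrative example, not from the slides): a single AsParallel() call spreads the query across all available cores.

```csharp
// Minimal PLINQ sketch: square a range of numbers and sum them,
// letting the runtime partition the work across cores.
using System;
using System.Linq;

class PlinqDemo
{
    static void Main()
    {
        long sumOfSquares = Enumerable.Range(1, 1000000)
                                      .AsParallel()           // opt in to PLINQ
                                      .Select(n => (long)n * n)
                                      .Sum();
        Console.WriteLine(sumOfSquares);
    }
}
```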
Define: What needs to be done? What data is being modified? What is the current state? What is the cost? Synchronous vs. asynchronous execution The answers determine which pattern fits best What To Consider
Parallelism – Programming with multiple threads, where it is expected that threads will execute at the same time on multiple processors. Its goal is to increase throughput. Defining Types of Parallelism
Data Parallelism – When there is a lot of data, and the same operations must be performed on each piece of data Task Parallelism – There are many different operations that can run simultaneously Defining Types of Parallelism
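A minimal sketch of the contrast (an assumed example, not from the slides): Parallel.ForEach expresses data parallelism, while Parallel.Invoke expresses task parallelism.

```csharp
// Data parallelism vs. task parallelism in the TPL.
using System;
using System.Threading.Tasks;

class TypesOfParallelism
{
    static void Main()
    {
        int[] data = { 1, 2, 3, 4 };

        // Data parallelism: the same operation on each piece of data.
        Parallel.ForEach(data, n => Console.WriteLine(n * n));

        // Task parallelism: different operations running simultaneously.
        Parallel.Invoke(
            () => Console.WriteLine("loading"),
            () => Console.WriteLine("transforming"));
    }
}
```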
Perform the same independent operation for each element The most common problem is not noticing dependencies How to notice dependencies: shared variables, use of properties of a shared object (see the sketch below) Parallel Loops
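A sketch of the shared-variable pitfall named above (an illustrative example, not from the slides): the loop body looks independent but every iteration writes the same variable.

```csharp
// Hidden dependency in a parallel loop, and one safe alternative.
using System;
using System.Threading;
using System.Threading.Tasks;

class LoopDependency
{
    static void Main()
    {
        long total = 0;

        // WRONG: every iteration updates the shared variable 'total',
        // so iterations race with each other and updates are lost.
        Parallel.For(0, 100000, i => { total += i; });

        // Safe alternative: make each shared update atomic.
        long safeTotal = 0;
        Parallel.For(0, 100000, i => Interlocked.Add(ref safeTotal, i));

        Console.WriteLine("{0} vs {1}", total, safeTotal);
    }
}
```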
Helpful properties of the TPL ParallelLoopState.Break and .Stop CancellationToken MaxDegreeOfParallelism Exception handling via AggregateException Parallel Loops
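A hedged sketch of those loop controls together (an assumed example, not the slides' code): stopping early via ParallelLoopState, capping parallelism, honoring a CancellationToken, and catching the AggregateException that wraps iteration failures.

```csharp
// Loop control in Parallel.For: Stop, cancellation, and a parallelism cap.
using System;
using System.Threading;
using System.Threading.Tasks;

class LoopControl
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        var options = new ParallelOptions
        {
            CancellationToken = cts.Token,
            MaxDegreeOfParallelism = 2   // cap the number of concurrent workers
        };

        try
        {
            Parallel.For(0, 1000, options, (i, loopState) =>
            {
                if (i == 500) loopState.Stop();   // abandon remaining iterations
                if (loopState.IsStopped) return;
            });
        }
        catch (OperationCanceledException) { /* the token was cancelled */ }
        catch (AggregateException ex)
        {
            // Exceptions thrown inside iterations arrive wrapped here.
            foreach (var inner in ex.InnerExceptions)
                Console.WriteLine(inner.Message);
        }
    }
}
```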
Partitioning Oversubscription and Undersubscription Parallel Loops
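Range partitioning is one way to avoid the oversubscription that comes from scheduling many tiny work items; here is an illustrative sketch (not from the slides) using Partitioner.Create so each worker receives a contiguous chunk of indices.

```csharp
// Chunked parallel loop: one delegate invocation per range, not per element.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Partitioning
{
    static void Main()
    {
        double[] results = new double[1000000];

        // Hand each worker a contiguous index range instead of single indices.
        var ranges = Partitioner.Create(0, results.Length);
        Parallel.ForEach(ranges, range =>
        {
            for (int i = range.Item1; i < range.Item2; i++)
                results[i] = Math.Sqrt(i);
        });

        Console.WriteLine(results[results.Length - 1]);
    }
}
```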
Builds on parallel loops Uses unshared, local variables Multiple inputs, single output Parallel Aggregation
Simple Example: Parallel Aggregation
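The slide's code is not included in this extract; what follows is a minimal sketch of the pattern as described above (an assumed example), using the thread-local overload of Parallel.ForEach so each worker accumulates into an unshared subtotal and only the final merge touches shared state.

```csharp
// Parallel aggregation: many inputs, one output, no shared writes
// inside the hot loop.
using System;
using System.Threading.Tasks;

class Aggregation
{
    static void Main()
    {
        int[] data = new int[100000];
        for (int i = 0; i < data.Length; i++) data[i] = i;

        long total = 0;
        object gate = new object();

        Parallel.ForEach(
            data,
            () => 0L,                                 // per-task local subtotal
            (item, state, local) => local + item,     // unshared accumulation
            local => { lock (gate) total += local; }  // merge once per task
        );

        Console.WriteLine(total);
    }
}
```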
Task Parallelism Also known as the Fork/Join pattern Uses the System.Threading.Tasks namespace TaskFactory Invoke Wait/WaitAny/WaitAll StartNew Handling Exceptions Parallel Tasks
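A fork/join sketch using the members named above (an illustrative example, not the slides' code): StartNew forks the work, WaitAll joins it.

```csharp
// Fork two independent tasks, then join before continuing.
using System;
using System.Threading.Tasks;

class ForkJoin
{
    static void Main()
    {
        Task left  = Task.Factory.StartNew(() => Console.WriteLine("left half"));
        Task right = Task.Factory.StartNew(() => Console.WriteLine("right half"));

        Task.WaitAll(left, right);   // join: block until both finish
        Console.WriteLine("done");
    }
}
```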
Handling exceptions Exceptions are deferred until the task is waited on AggregateException CancellationTokenSource Tasks can also be cancelled from outside the task Parallel Tasks
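A sketch of both behaviors (an assumed example): a CancellationTokenSource cancels the task from outside, and the resulting exception is deferred until the Wait call, where it surfaces wrapped in an AggregateException.

```csharp
// External cancellation plus deferred exception observation.
using System;
using System.Threading;
using System.Threading.Tasks;

class TaskExceptions
{
    static void Main()
    {
        var cts = new CancellationTokenSource();

        Task worker = Task.Factory.StartNew(() =>
        {
            while (true)
            {
                cts.Token.ThrowIfCancellationRequested();
                Thread.Sleep(10);
            }
        }, cts.Token);

        cts.Cancel();   // cancel from outside the task

        try
        {
            worker.Wait();   // deferred exception observed here
        }
        catch (AggregateException ex)
        {
            foreach (var inner in ex.InnerExceptions)
                Console.WriteLine(inner.GetType().Name);   // TaskCanceledException
        }
    }
}
```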
Work stealing: idle worker threads take queued work from busy workers’ local queues to keep all cores busy Parallel Tasks
Works like an assembly line Pipelines
Uses BlockingCollection and CompleteAdding Most problems in this design are due to starvation/blocking (see the sketch below) Pipelines
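A pipeline sketch using BlockingCollection (an illustrative example, not the slides' code): one stage produces, the next consumes, and CompleteAdding signals "no more items" so the consumer drains and exits instead of blocking forever.

```csharp
// Two-stage pipeline over a bounded BlockingCollection.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Pipeline
{
    static void Main()
    {
        var buffer = new BlockingCollection<int>(boundedCapacity: 8);

        Task producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 20; i++) buffer.Add(i);
            buffer.CompleteAdding();   // lets the downstream stage finish
        });

        Task consumer = Task.Factory.StartNew(() =>
        {
            // Blocks while empty; ends once the collection is marked
            // complete and fully drained.
            foreach (int item in buffer.GetConsumingEnumerable())
                Console.WriteLine(item * 2);
        });

        Task.WaitAll(producer, consumer);
    }
}
```

The bounded capacity is the design lever here: it throttles a fast producer so the pipeline degrades into waiting rather than unbounded memory growth.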
Future dependencies Wait/WaitAny/WaitAll Sequential/Parallel Example Futures
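A minimal futures sketch (an assumed example, not the slides' code): in the TPL a future is a Task<TResult>, and reading Result, or attaching a continuation, expresses the dependency between computations.

```csharp
// Futures with Task<TResult>: start work early, consume results later.
using System;
using System.Threading.Tasks;

class Futures
{
    static void Main()
    {
        Task<int> futureA = Task.Factory.StartNew(() => 21);
        Task<int> futureB = Task.Factory.StartNew(() => 2);

        // Reading .Result waits for the future to resolve.
        int product = futureA.Result * futureB.Result;
        Console.WriteLine(product);   // 42

        // A continuation expresses the dependency without blocking.
        futureA.ContinueWith(a => Console.WriteLine(a.Result + 1)).Wait();
    }
}
```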
Model-View-ViewModel Futures
Tasks are added continuously as the computation unfolds Complete small tasks, then larger tasks Examples: binary trees and sorting (see the sketch below) Dynamic Task Parallelism
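A sketch of dynamic task parallelism (an illustrative example, not the slides' code): the set of tasks is not known up front; each visit to a binary-tree node forks child tasks for its subtrees.

```csharp
// Recursive tree walk where tasks are created as the structure unfolds.
using System;
using System.Threading.Tasks;

class Node
{
    public int Value;
    public Node Left, Right;
}

class DynamicTasks
{
    static void Walk(Node node, Action<int> visit)
    {
        if (node == null) return;
        var left  = Task.Factory.StartNew(() => Walk(node.Left, visit));
        var right = Task.Factory.StartNew(() => Walk(node.Right, visit));
        visit(node.Value);
        Task.WaitAll(left, right);   // join the dynamically created tasks
    }

    static void Main()
    {
        var root = new Node
        {
            Value = 1,
            Left  = new Node { Value = 2 },
            Right = new Node { Value = 3 }
        };
        Walk(root, v => Console.WriteLine(v));
    }
}
```

In practice you would stop forking below some subtree size and finish sequentially, since tiny tasks cost more to schedule than to run.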
.NET Performance Profiler (Red Gate) JustTrace (Telerik) GlowCode (Electric Software) Performance Profiler (Visual Studio 2010 Ultimate) Concurrency Visualizer CPU Performance Memory Management Tools
Task Parallel Library PLINQ (Parallel Language-Integrated Query) Easy to learn Rx (Reactive Extensions) Supporting Libraries for .NET
Campbell, Colin, et al. Parallel Programming with Microsoft .NET: Design Patterns for Decomposition and Coordination on Multicore Architectures. Microsoft Press, 2010. Print. Data Parallelism (n.d.). In Wikipedia. Retrieved from Hillar, Gaston C. Professional Parallel Programming with C#: Master Parallel Extensions with .NET 4. Indianapolis: Wiley, 2010. Print. Rx Extensions (n.d.). In Microsoft. Retrieved from T. G. Mattson, B. A. Sanders, and B. L. Massingill. Patterns for Parallel Programming. Addison-Wesley, 2004. References