Lynne Hill, General Manager, Parallel Computing Platform, Visual Studio
Improved Productivity · Immersive Experience · Breakthrough Innovation
Design · Code · Optimize · Validate: applications designed for parallelism and correctness, with actionable performance guidance across multiple programming models with data and task parallelism
Addressing the Hard Problems of Concurrency. Speakers: Lynne Hill, David Callahan. Date/Time: Thursday, Oct. 30, 8:30AM – 10:00AM
Parallel Computing Application Architectures and Opportunities. Speakers: John Feo, Jerry Bautista (Intel). Date/Time: Thursday, Oct. 30, 10:15AM – 11:45AM
Future of Parallel Computing (Panel). Speakers: Dave Detlefs, Niklas Gustafsson, Sean Nordberg, James Reinders (Intel). Moderator: Selena Wilson. Date/Time: Thursday, Oct. 30, 12:00PM – 1:30PM
David Callahan, Distinguished Engineer, Parallel Computing Platform Team, Visual Studio
We need your passionate feedback – make our next steps the right ones
Broad Adoption Complex Systems Diverse Targets
Enable Experts Increase Safety & Automation Reduce Concepts
Efficient Execution · System Services · Constructing Parallel Applications
How do we cheaply build parallel applications that can be efficiently executed and share system resources?
Efficient Execution · System Services · Constructing Parallel Applications
How do we cheaply build parallel applications that can be efficiently executed and share system resources? (Chart not to scale)
E S C · Integrate/Tool/Encapsulate/Raise: Implicit → Explicit, but safe → Explicit, unsafe
Integrate/Tool/Encapsulate/Raise (E S C)
Parallel Programming for C++ Developers … Rick Molloy, Oct. 27, 3:30PM – 4:45PM
Parallel Programming for Managed Developers … Daniel Moth, Oct., 10:30AM – 11:45AM
E S C
Emphasize recursive decomposition; preserves function interfaces ("fork-join")
Structured control constructs: parallel loops, co-begin
Each iteration is a task; all tasks finish before the function returns
Every iteration is a "task": new C++ lambda syntax
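A minimal sketch (not from the deck) of "every iteration is a task" using the Parallel Pattern Library's parallel_for with a C++ lambda; scale_all, data, and factor are illustrative names:

    // Each loop iteration becomes a task via PPL's parallel_for.
    #include <ppl.h>
    #include <vector>
    #include <cmath>

    void scale_all(std::vector<double>& data, double factor)
    {
        // Fork-join: every iteration is a task, and all tasks finish before
        // parallel_for returns, so the function keeps its sequential interface.
        Concurrency::parallel_for(size_t(0), data.size(), [&](size_t i)
        {
            data[i] = std::sqrt(data[i]) * factor;
        });
    }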
Integrate/Tool/Encapsulate/Raise
Design: design and modeling tools to enable developers to start with zero parallelism debt
Debug: debugging across multiple programming models, with data- and task-focused visualizations
Optimize: actionable performance guidance for understanding and optimizing parallel applications
Validate: tools for developers and testers to validate correctness and cope with inherently non-deterministic execution
New Tools in Visual Studio 2010 (Integrate/Tool/Encapsulate/Raise, E S C)
Examples in the talks by Moth and Molloy mentioned earlier
Microsoft Visual Studio: Bringing out the Best in Multicore Systems … Hazim Shafi, Oct. 27, 1:45PM – 3:00PM
MSR: Concurrency Analysis Platform and Tools for Finding Concurrency Bugs … Madan Musuvathi and Tom Ball, Oct. 29 at 10:30
Integrate/Tool/Encapsulate/Raise (E S C)
Best not to know: parallelism inside of libraries without interface change (see the sketch below)
OK to be warned… frameworks with callbacks must document/specify/enforce restrictions
At least get to reuse: new patterns for data structure traversal
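A hedged sketch of "parallelism inside of libraries without interface change", assuming a simple binary-tree sum (TreeNode and sum_tree are hypothetical names): the caller sees an ordinary sequential signature, while the implementation forks tasks with PPL's parallel_invoke.

    #include <ppl.h>

    struct TreeNode
    {
        int value;
        TreeNode* left;
        TreeNode* right;
    };

    // Same signature a sequential library would expose; the parallelism is internal.
    long long sum_tree(const TreeNode* n)
    {
        if (n == nullptr) return 0;
        long long left_sum = 0, right_sum = 0;
        // Recursive decomposition: both subtrees are summed as parallel tasks;
        // parallel_invoke joins them before this function returns (fork-join).
        Concurrency::parallel_invoke(
            [&] { left_sum  = sum_tree(n->left);  },
            [&] { right_sum = sum_tree(n->right); });
        return n->value + left_sum + right_sum;
    }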
Integrate/Tool/Encapsulate/Raise: in domain-specific ways, express work without explicit sequencing
Arbitrate parallel traversal of a graph: "first to visit" (Integrate/Tool/Encapsulate/Raise, E S C)
Sequential:
    if (!m.visited) { m.visited = true; recurse(m); }
Coarse (one lock for the whole graph):
    lock(graph); var v = m.visited; if (!v) m.visited = true; unlock(graph); if (!v) recurse(m);
Fine (one lock per node):
    lock(m); var v = m.visited; if (!v) m.visited = true; unlock(m); if (!v) recurse(m);
Lock-free:
    var v = compare_and_swap(&m.visited, false, true); if (!v) recurse(m);
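A minimal sketch of the lock-free variant, assuming C++11 std::atomic in place of the compare_and_swap pseudocode above (GraphNode and its fields are illustrative):

    #include <atomic>
    #include <vector>

    struct GraphNode
    {
        std::atomic<bool> visited{false};
        std::vector<GraphNode*> neighbors;
    };

    void recurse(GraphNode& m)
    {
        for (GraphNode* next : m.neighbors)
        {
            bool expected = false;
            // Only the task that flips visited from false to true ("first to
            // visit") gets to recurse; everyone else backs off without blocking.
            if (next->visited.compare_exchange_strong(expected, true))
                recurse(*next);
        }
    }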
Specify intent:
1. Run isolated from the effects of other tasks
2. Do nothing if there is an error
Looks "coarse", runs "fine", composes cleanly (Integrate/Tool/Encapsulate/Raise, E S C)
Efficient Execution · System Services · Constructing Parallel Applications
How do we cheaply build parallel applications that can be efficiently executed and share system resources?
Some Efficiency Factors E S C
E S C · Assuming 4 worker threads: publish opportunities to be stolen by idle workers
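A hedged sketch of how structured tasks publish work that idle workers can steal, using PPL's task_group (fib is illustrative only; the four worker threads are the slide's assumption, not something the code controls):

    #include <ppl.h>

    long fib(long n)
    {
        if (n < 2) return n;
        long left = 0, right = 0;

        Concurrency::task_group tasks;
        // The spawned task is published on this worker's local queue; any other
        // worker thread that runs out of work can steal and execute it.
        tasks.run([&] { left = fib(n - 1); });
        right = fib(n - 2);   // keep working on the current worker
        tasks.wait();         // join: all spawned tasks finish before returning

        return left + right;
    }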
Concurrency Runtime stack (key: Exposed APIs for Partners)
Tools: Parallel Debugger Toolwindows; Profiler Concurrency Analysis
Programming Models: Native Library (Parallel Pattern Library, Agents Library); Managed Library (Task Parallel Library, PLINQ); Data Structures
Concurrency Runtime: Task Scheduler, Resource Manager; ThreadPool (Task Scheduler, Resource Manager)
Operating System: Threads
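As a hedged illustration of the "Exposed APIs for Partners" layer (assuming the Concurrency Runtime's SchedulerPolicy and CurrentScheduler from concrt.h; the policy values are arbitrary), a library can attach its own scheduler while sharing cores through the common Resource Manager:

    #include <concrt.h>
    #include <ppl.h>

    void run_with_bounded_concurrency()
    {
        using namespace Concurrency;

        // Ask the shared Resource Manager for at most two cores for this scheduler.
        SchedulerPolicy policy(1, MaxConcurrency, 2);
        CurrentScheduler::Create(policy);

        // Work scheduled here runs on the scheduler created above.
        parallel_for(0, 100, [](int i)
        {
            (void)i;   // placeholder work item
        });

        CurrentScheduler::Detach();   // hand the cores back to the Resource Manager
    }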
E S C
Rich, Connected, Scalable Applications
Domain Specific Frameworks · General Purpose Frameworks
Domain Specific Abstractions · General Purpose Abstractions: Messages + Tasks + Isolation · Legacy Threads + Locks
Domain Specific Scheduler · General Purpose Scheduler · General Purpose (Background) · Real Time Schedule
Common Resource Management
E S C
Rich, Connected, Scalable Applications
LINQ .NET Bindings
Standard "SQL" Query Operators
Online Query Optimizers
General Purpose Scheduler
Common Resource Management
Efficient Execution · System Services · Constructing Parallel Applications
How do we cheaply build parallel applications that can be efficiently executed and share system resources?
E S C
Rich, Connected, Scalable Applications
Domain Specific Frameworks · General Purpose Frameworks
Domain Specific Abstractions · General Purpose Abstractions: Messages + Tasks + Isolation · Legacy Threads + Locks
Domain Specific Scheduler · General Purpose Scheduler · General Purpose (Background) · Real Time Schedule
Common Resource Management
Kernel: Process · Extended Threads · Cooperative Physical Resource Management
E S C · On a single chip: two processor kinds (including several primary processors), cache, a network, and memory and I/O controllers
Constructing Parallel Applications · Efficient Execution · System Services
Stack: Applications; Libraries; Languages, Compilers and Tools; Concurrency Runtime; Kernel/Hypervisor; Hardware
Enable Experts Increase Safety & Automation Reduce Concepts
MSDN.com/concurrency. And download Parallel Extensions to the .NET Framework!
Please fill out your evaluation for this session at:
This session will be available as a recording at:
© 2008 Microsoft Corporation. All rights reserved. Microsoft, Windows, Windows Vista and other product names are or may be registered trademarks and/or trademarks in the U.S. and/or other countries. The information herein is for informational purposes only and represents the current view of Microsoft Corporation as of the date of this presentation. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information provided after the date of this presentation. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS PRESENTATION.