Published by Gerald Cunningham. Modified over 9 years ago.
1
Contemporary Languages in Parallel Computing Raymond Hummel
2
Current Languages
3
Standard Languages
- Distributed Memory Multiprocessors: MPI
- Shared Memory Multiprocessors: OpenMP, pthreads
- Graphics Processing Units: CUDA, OpenCL
4
MPI
Stands for: Message Passing Interface
Pros:
- Extremely Scalable
- Portable
- Can Harness a Multitude of Hardware Setups
Cons:
- Complicated Software
- Complicated Hardware
- Complicated Setup
5
MPI
6
OpenMP
Stands for: Open Multi-Processing
Pros:
- Incremental Parallelization
- Fairly Portable
- Simple Software
Cons:
- Limited Use-Case
7
OpenMP
8
POSIX Threads
Stands for: Portable Operating System Interface Threads
Pros:
- Portable
- Fine-Grained Control
Cons:
- All-or-Nothing
- Complicated Software
- Limited Use-Case
9
POSIX Threads
10
CUDA
Stands for: Compute Unified Device Architecture
Pros:
- Manufacturer Support
- Low-Level Hardware Access
Cons:
- Limited Use-Case
- Only Compatible with NVIDIA Hardware
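The original slide that followed showed example code; as a stand-in (not from the slides), here is a minimal CUDA C vector addition showing the split between host code and a __global__ kernel launched over a grid of threads. It requires NVIDIA hardware and the nvcc compiler, per the compatibility caveat above.

```cuda
/* CUDA example: vector addition on the GPU.
 * Typical build:  nvcc vecadd.cu -o vecadd */
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  /* global thread id */
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1024;
    const size_t bytes = n * sizeof(float);
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = (float)i; hb[i] = 2.0f * i; }

    /* Explicitly manage device memory and host<->device transfers. */
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);  /* 256-thread blocks */
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[10] = %.1f\n", hc[10]);   /* 10 + 20 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```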
11
CUDA
12
OpenCL
Stands for: Open Computing Language
Pros:
- Portability
- Heterogeneous Platform
- Works with All Major Manufacturers
Cons:
- Complicated Software
- Special Tuning Required
14
Future Languages
15
Developing Languages
- D
- Rust
- Harlan
16
D
- Performance of Compiled Languages
- Memory Safety
- Expressiveness of Dynamic Languages
- Includes a Concurrency-Aware Type System
- Nearing Maturity
17
Rust
- Designed for Creation of Large Client-Server Programs on the Internet
- Safety: Memory Layout, Concurrency
- Still Major Changes Occurring
18
Harlan
- Experimental Language Based on Scheme
- Designed to Eliminate the Boilerplate of GPU Programming
- Could Be Expanded to Schedule Work Automatically Across Both CPU and GPU, Depending on Available Resources
19
Questions?