1
Semantics of Minimally Synchronous Parallel ML
Myrto Arapini, Frédéric Loulergue, Frédéric Gava and Frédéric Dabrowski
LACL, Paris, France
2
Outline
- Introduction
- Pure functional minimally synchronous parallel programming
- High level semantics
- Distributed evaluation
- Conclusions and future work
3
Introduction
4
Our previous work: Bulk Synchronous Parallelism + Functional Programming = BSML
Bulk Synchronous Parallelism:
- Scalability
- Portability
- Simple cost model
Functional Programming:
- High-level features (higher-order functions, pattern matching, concrete types, etc.)
- Proofs of programs
- Safety of the environment
5
What is MSPML? Why MSPML?
MSPML = Minimally Synchronous Parallel ML: BSML without synchronization barriers
- Comparison of the efficiency of BSML and MSPML
- Further extensions that are restricted or impossible in BSML (divide-and-conquer, nesting of parallel values)
- Basis for an additional level on top of BSML for Grid computing: BSML on each node (a cluster) + MSPML for coordination
6
Pure Functional Minimally Synchronous Parallel Programming
7
The MSPML Library
A parallel ML library for the Objective Caml language:
- operations on a parallel data structure, the abstract type 'a par
- access to the machine parameters: p: unit -> int, where p () is the number of processes
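The primitives described on the next slides can be summed up in a signature like the one below (a sketch only: the types are the ones given on the slides, but gathering them into a module signature is purely illustrative).

module type MSPML = sig
  type 'a par                                      (* abstract type of parallel vectors *)
  val p : unit -> int                              (* p () = number of processes *)
  val mkpar : (int -> 'a) -> 'a par                (* build a vector from a function of the process id *)
  val apply : ('a -> 'b) par -> 'a par -> 'b par   (* pointwise parallel application *)
  val get : 'a par -> int par -> 'a par            (* communication operation *)
end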
8
Creation of Parallel Vectors
mkpar: (int -> 'a) -> 'a par
(mkpar f) = < f 0, f 1, ..., f (p-1) >
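For instance (an illustrative example, not taken from the slides), the vector holding each processor's own identifier is:

let procs = mkpar (fun pid -> pid)
(* procs = < 0, 1, ..., p-1 > *)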
9
Pointwise Parallel Application
apply: ('a -> 'b) par -> 'a par -> 'b par
apply < f 0, f 1, ..., f (p-1) > < v 0, v 1, ..., v (p-1) > = < f 0 v 0, f 1 v 1, ..., f (p-1) v (p-1) >
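As an illustration (not from the slides), pointwise addition of two integer vectors combines mkpar and apply:

let vadd v1 v2 = apply (apply (mkpar (fun _ x y -> x + y)) v1) v2
(* vadd < x 0, ..., x (p-1) > < y 0, ..., y (p-1) > = < x 0 + y 0, ..., x (p-1) + y (p-1) > *)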
10
Example (1)
let replicate x = mkpar (fun pid -> x)
replicate: 'a -> 'a par
(replicate 5) = < 5, 5, ..., 5 >
11
Example (2)
let parfun f v = apply (replicate f) v
parfun: ('a -> 'b) -> 'a par -> 'b par
Example: parfun (fun x -> x + 1) parallel_vector
parfun f < v 0, v 1, ..., v (p-1) > = apply < f, f, ..., f > < v 0, v 1, ..., v (p-1) > = < f v 0, f v 1, ..., f v (p-1) >
12
Communication Operation: get
get: 'a par -> int par -> 'a par
get < v 0, v 1, ..., v (p-1) > < i 0, i 1, ..., i (p-1) > = < v (i 0), v (i 1), ..., v (i (p-1)) >
(for instance, if i 0 = 1 then processor 0 obtains v 1)
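As an illustrative use of get (hypothetical code, not from the slides), a cyclic shift in which each processor fetches the value of its right neighbour can be written:

let shift_right v = get v (mkpar (fun pid -> (pid + 1) mod (p ())))
(* processor pid obtains the value held by processor (pid + 1) mod p *)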
13
Example (3)
let bcast_direct root parallel_vector =
  if not (within_bounds root) then raise Bcast_Error
  else get parallel_vector (replicate root)
bcast_direct: int -> 'a par -> 'a par
bcast_direct root < v 0, v 1, ..., v (p-1) > = < v root, v root, ..., v root >
14
Global Conditional
if parallel_vector at n then ... else ...
if < b 0, b 1, ..., b (p-1) > at n then E1 else E2 = E1 when b n = true (and E2 when b n = false)
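A typical use (an illustrative sketch written with the slide's special syntax for the global conditional; the concrete form offered by the library may differ) is an iteration whose termination is decided by the component held at one processor:

let rec iterate finished step v =
  if (parfun finished v) at 0 then v
  else iterate finished step (parfun step v)
(* applies step to every component until finished holds on processor 0's component *)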
15
Implementation of MSPML
MSPML v0.05: F. Loulergue, October 2003
Library for OCaml (uses threads and TCP/IP)
http://mspml.free.fr
16
High level semantics
17
Terms
Functional semantics: terms can be evaluated sequentially or in parallel
Terms: given by the grammar on the slide (not reproduced here)
18
Values and judgements
Values: given by the grammar on the slide (not reproduced here)
Judgement: e ⇓ v — term « e » evaluates to value « v »
19
Some rules
20
Distributed evaluation
21
Informal presentation (1)
- MSPML programs are seen as SPMD programs: the parallel machine runs p copies of the same MSPML program
- Rules for local reduction + a rule for communication
- For example, at processor i the expression (mkpar f) is evaluated to (f i)
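To give a rough intuition of this SPMD reading (a toy model with assumed names, not MSPML's actual implementation), each processor only sees the local component of a parallel vector:

(* toy SPMD view: on processor i, a parallel vector is just its i-th component *)
type 'a local_par = 'a
let mkpar_at i f : 'a local_par = f i        (* (mkpar f) evaluated at processor i gives (f i) *)
let apply_at (f : ('a -> 'b) local_par) (v : 'a local_par) : 'b local_par = f v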
22
Informal presentation (2)
(diagram: three processors 0, 1 and 2, each with its local computation and its communication environment; here processor 0 evaluates get v 1 and sends a request to processor 1; a bit later it receives the value that processor 1 had stored, together with the step number, in its communication environment)
23
Informal presentation (3)
(diagram: same setting; here a request from processor 2 to processor 0 concerns a step whose value has not yet been stored in processor 0's communication environment, so the answer is "Not Ready!" and is delayed until the value becomes available)
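A toy model of a communication environment (an assumption for illustration only, not the data structure of the actual library): each processor publishes the value of every get step it has completed, and a request for a step it has not reached yet simply gets no answer for the moment.

module IntMap = Map.Make (Int)
type 'a comm_env = 'a IntMap.t                              (* get-step number -> published value *)
let publish step v (env : 'a comm_env) : 'a comm_env = IntMap.add step v env
let answer (env : 'a comm_env) (step : int) : 'a option = IntMap.find_opt step env
(* None plays the role of "Not Ready": the step has not been reached yet *)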
24
Terms and judgments
New term: request i j — the processor asks for the value stored at processor j during the i-th step
Step: each step is ended by a call to get
Judgment: at processor i, the term e_d with communication environment E_c evaluates to e'_d with a new communication environment E'_c
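One possible way to typeset this judgment (the notation below is an assumption; the paper's own notation may differ):

\mathcal{E}_c \vdash_i e_d \Downarrow e'_d,\ \mathcal{E}'_c

read: at processor i, under communication environment E_c, the term e_d evaluates to e'_d and the communication environment becomes E'_c.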
25
Local computation rules
26
Communication rule
27
Conclusion and Future Work
Conclusion
- Minimally Synchronous Parallel ML: functional semantics & deadlock free
- Two semantics: high level semantics (programmer's view) and distributed evaluation (implementation's view)
- Implementation
Future Work
- MPI implementation
- Comparison with BSMLlib
- Extensions: parallel composition, etc.