1
Problem Solving with NetSolve
Michelle Miller, Keith Moore, Susan Blackford, NetSolve group
Innovative Computing Lab, University of Tennessee
miller@cs.utk.edu, moore@cs.utk.edu, susan@cs.utk.edu, netsolve@cs.utk.edu
www.cs.utk.edu/netsolve
2
NetSolve Problem Statement
– Software libraries could be easier to install and use
  – Locate the library
  – Configure and install it on the local machine
– Need access to bigger/different machines
3
NetSolve Solution
– Simple, consistent interface to numeric packages
– Ease of use: locate and use already configured, installed, and running solvers through a simple procedure call
– Resource sharing (hardware & software)
  – Greater access to machine resources
  – New software packages made available simply
4
NetSolve Architecture
[Diagram: a Matlab client connects through a proxy to the NetSolve agent and five servers: Server 1 (blas, petsc), Server 2 (lapack, mcell), Server 3 (blas, itpack), Server 4 (superLU), Server 5 (aztec, MA28).]
5
NetSolve Architecture
[Diagram: the Matlab client issues netsolve('problemX', A, rhs).]
6
NetSolve Architecture
[Diagram: the proxy asks the agent which servers can solve problemX.]
7
NetSolve Architecture
[Diagram: the agent consults its workload information and replies with Server 1 and Server 3.]
8
NetSolve Architecture
[Diagram: the client sends problemX with inputs A and rhs to the chosen server.]
9
NetSolve Architecture
[Diagram: the server returns the result to the client.]
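The agent's selection step in this walkthrough (pick a lightly loaded server that hosts the needed package) can be sketched as follows. The `server_t` record and `pick_server` function are illustrative, not part of the actual NetSolve agent:

```c
#include <string.h>

/* Illustrative server record: name, space-separated library list,
 * and a current workload figure (lower is better). */
typedef struct { const char *name; const char *libs; double load; } server_t;

/* Return the index of the least-loaded server whose library list
 * contains the requested package, or -1 if no server offers it. */
int pick_server(const server_t *s, int n, const char *pkg) {
    int best = -1;
    for (int i = 0; i < n; i++) {
        if (strstr(s[i].libs, pkg) != NULL &&
            (best < 0 || s[i].load < s[best].load))
            best = i;
    }
    return best;
}
```

With the five servers above, a request for 'blas' would be steered to whichever of Server 1 and Server 3 currently reports the lower workload.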
10
Parallelism in NetSolve: Task Farming
– Single request issued that specifies data partitioning
– Data parallel, SPMD support
11
Task Farming Interface

/*** BEFORE ***/
preamble and initializations;
status1 = netslnb("iqsort", size1, array1, sorted1);
status2 = netslnb("iqsort", size2, array2, sorted2);
...
status20 = netslnb("iqsort", size20, array20, sorted20);
program continues;

/*** AFTER ***/
preamble and initializations;
status_array = netsl_farm("iqsort", "i=0,19",
    netsl_int_array(size_array, "$i"),
    netsl_ptr_array(input_array, "$i"),
    netsl_ptr_array(sorted_array, "$i"));
program continues;
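Conceptually, netsl_farm replaces the hand-written request loop with a single call that iterates "$i" over the index range. A minimal local sketch of that iteration, with an in-process qsort standing in for the remote 'iqsort' service:

```c
#include <stdlib.h>

/* Stand-in for the remote 'iqsort' service; in NetSolve this work
 * would run on a server chosen by the agent. */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Issue one sort task per (size, array) pair, the same "$i" iteration
 * that netsl_farm("iqsort", "i=0,19", ...) expresses in one call.
 * Returns 0 when every task succeeds. */
int farm_iqsort(int n_tasks, const int *sizes, int **arrays) {
    for (int i = 0; i < n_tasks; i++)
        qsort(arrays[i], (size_t)sizes[i], sizeof(int), cmp_int);
    return 0;
}
```

Because each task touches only its own array, the tasks are independent and the real farm can scatter them across servers.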
12
Request Sequencing
– Sequence of computations
– Data dependency analysis to reduce extra data transfers between sequence steps
– Transmit the superset of all input/output parameters once and keep it persistent near the server(s) for the duration of the sequence execution
13
Data Persistence
Without sequencing, each call is a separate client/server round trip:
  netsl("command1", A, B, C);   /* client sends A, B; server returns C */
  netsl("command2", A, C, D);   /* client sends A, C; server returns D */
  netsl("command3", D, E, F);   /* client sends D, E; server returns F */
With sequencing, intermediate results C and D stay near the servers:
  netsl_begin_sequence();
  netsl("command1", A, B, C);
  netsl("command2", A, C, D);
  netsl("command3", D, E, F);
  netsl_end_sequence(C, D);
[Diagram: the whole sequence ships inputs A, B, E once; intermediate outputs C and D persist server-side; the client receives result F.]
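The dependency analysis determines what must actually cross the network: any input produced by an earlier call in the sequence stays server-side. A small sketch of that computation over named parameters (the `call_t` record and `sequence_inputs` function are illustrative, not the NetSolve API):

```c
#include <string.h>

#define MAXP 16

/* One call in a sequence: up to 4 named inputs and one named output. */
typedef struct { const char *in[4]; int n_in; const char *out; } call_t;

static int contains(const char **set, int n, const char *p) {
    for (int i = 0; i < n; i++)
        if (strcmp(set[i], p) == 0) return 1;
    return 0;
}

/* Fill 'send' with the parameters the client must transmit for the
 * whole sequence: every input not produced by an earlier call.
 * Returns the number of parameters to send. */
int sequence_inputs(const call_t *calls, int n_calls, const char **send) {
    const char *produced[MAXP];
    int n_prod = 0, n_send = 0;
    for (int c = 0; c < n_calls; c++) {
        for (int i = 0; i < calls[c].n_in; i++) {
            const char *p = calls[c].in[i];
            if (!contains(produced, n_prod, p) && !contains(send, n_send, p))
                send[n_send++] = p;
        }
        produced[n_prod++] = calls[c].out;
    }
    return n_send;
}
```

For the three-call sequence above, this yields {A, B, E}: C and D never travel back to the client unless netsl_end_sequence asks for them.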
14
NetSolve Applications: MCell (Salk Institute)
– Monte Carlo simulator of cellular microphysiology (synaptic transmission)
– Large numbers of the same computation with different parameters (diffusion and chemical reaction calculations)
– Task farming used for parallel runs
15
NetSolve and Metacomputing Backends
[Diagram: the NetSolve client reaches interchangeable backends through a common client-proxy interface: a NetSolve proxy (NetSolve agent, servers, and services), a Globus proxy (MDS, GASS, GRAM), a Ninf proxy (Ninf services), and a Legion proxy (Legion services).]
16
NetSolve Authentication with Kerberos
Kerberos is used to maintain access control lists and manage access to computational resources. NetSolve properly handles authorized and non-authorized components together in the same system.
17
NetSolve Authentication with Kerberos
[Diagram: NetSolve client, agent, and servers alongside the Kerberos KDC; typical NetSolve transactions are shown next to the Kerberized interactions.]
18
NetSolve Authentication with Kerberos
[Diagram: client, agent, servers, and Kerberos KDC.]
– Servers register their presence with the agent and KDC
– Client issues a problem request; the agent responds with a list of servers
– Client sends a work request to a server; the server replies requesting authentication credentials
– Client requests a ticket from the KDC
– Client sends the ticket and input to the server; the server authenticates the client and returns the solution set
19
NWS Integration
[Diagram: two NetSolve servers, each with an NWS CPU sensor on its host machine, alongside the NetSolve agent, the NWS Forecaster, and NWS Memory. Sensors report their measurements to NWS Memory.]
20
NWS Integration
[Diagram: overview of the same components.]
21
NWS Integration
[Diagram: the agent probes the NWS Forecaster.]
22
NWS Integration
[Diagram: the Forecaster probes NWS Memory for the measurement history.]
23
NWS Integration
[Diagram: the Forecaster makes a forecast.]
24
NWS Integration
[Diagram: the agent chooses a server based on the forecast.]
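The Forecaster's role in these steps can be illustrated with a single exponential-smoothing predictor over the sensor history held in NWS Memory. The real NWS runs a family of predictors and uses whichever has been most accurate recently, so this one-predictor version is only a sketch:

```c
/* One-step-ahead forecast of CPU availability from past sensor
 * readings, using exponential smoothing:
 *   f_new = alpha * measurement + (1 - alpha) * f_old */
double forecast_cpu(const double *measurements, int n, double alpha) {
    double f = measurements[0];    /* seed with the first reading */
    for (int i = 1; i < n; i++)
        f = alpha * measurements[i] + (1.0 - alpha) * f;
    return f;
}
```

The agent can then rank candidate servers by their forecast availability rather than by a stale last measurement.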
25
Newly enabled libraries
26
Sparse Matrices/Solvers: PETSc, SuperLU, SPOOLES, MA28
– Support for compressed row/column sparse matrix storage significantly reduces network data transmission
– Iterative and direct solvers: PETSc, Aztec, SuperLU, MA28, …
– All available solver packages will be made available from UTK NetSolve servers and others
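To see why compressed storage cuts transmission, compare the wire size of a dense n×n double matrix against its compressed sparse row (CSR) form, which ships only the nonzeros plus index arrays. The struct below is a generic CSR layout, not NetSolve's exact wire format:

```c
#include <stddef.h>

/* Generic compressed sparse row (CSR) layout: values[k] and col_idx[k]
 * for each nonzero; row_ptr[i] .. row_ptr[i+1]-1 spans row i. */
typedef struct {
    int n, nnz;
    double *values;   /* nnz nonzero entries */
    int *col_idx;     /* column index of each nonzero */
    int *row_ptr;     /* n+1 row offsets into values/col_idx */
} csr_t;

/* Bytes on the wire for a dense n*n matrix of doubles. */
size_t dense_bytes(int n) {
    return (size_t)n * (size_t)n * sizeof(double);
}

/* Bytes on the wire for the same matrix in CSR form. */
size_t csr_bytes(int n, int nnz) {
    return (size_t)nnz * (sizeof(double) + sizeof(int))
         + (size_t)(n + 1) * sizeof(int);
}
```

For a 1000×1000 matrix with 5000 nonzeros, CSR sends on the order of 64 KB instead of 8 MB.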
27
Matlab interface
Calls to PETSc, Aztec:
  [x, its] = netsolve('iterative_solve_parallel', 'PETSC', A, b, 1.e-6, 500);
  [x, its] = netsolve('iterative_solve_parallel', 'AZTEC', A, b, 1.e-6, 500);
Similar for SuperLU, MA28:
  [x] = netsolve('direct_solve_serial', 'SUPERLU', A, b, 0.3, 1);
  [x] = netsolve('direct_solve_serial', 'MA28', A, b, 0.3, 1);
Calls to LAPACK, ScaLAPACK:
  [lu, p, x, info] = netsolve('dgesv', A, b);
  [lu, p, x, info] = netsolve('pdgesv', A, b);
28
'LinearSolve' interface
Uncertain which library to choose? The 'LinearSolve' interface chooses the library and the appropriate routine for the user:
  [x] = netsolve('LinearSolve', A, b);
29
Heuristics
The interface analyzes the matrix A:
– Matrix shape (square or rectangular)? If rectangular, choose a linear least squares solver from LAPACK
– Matrix element density: if square, is the matrix sparse or dense? Check the percentage of nonzeros and, if warranted, transform to a sparse matrix format
– If dense, is it symmetric?
30
Heuristics cont'd
– If sparse but banded with a dense band, use a LAPACK band solver
– If sparse with a block or diagonal structure, Aztec can yield higher performance (level 3 BLAS)
– If sparse, direct or iterative solver? Consider the size of the matrix: for a large matrix, the fill-in of a direct method can make it costlier than an iterative one
31
Heuristics cont’d –Numerical properties (direct solvers can handle more complicated matrices than iterative methods) How to estimate fill-in and gauge numerical properties of A? Future Work: ‘Eigensolve” interface
32
Interfacing to Parallel Libraries
Improved task scheduling and load balance:
– NWS memory, latency, and bandwidth sensors
– Provide matrix distributions for the user
– Heuristics for the best choice of the number of processors, the amount of matrix per processor, the process grid dimensions, …
33
Get NetSolve 1.3 Now!
Release date: April 2000
– UNIX client/agent/server source code
– UNIX client binaries available
– Win32 DLLs for C/Matlab/Mathematica clients
www.cs.utk.edu/netsolve