Scott Kohn with Tom Epperly and Gary Kumfert Center for Applied Scientific Computing Lawrence Livermore National Laboratory October 3, 2000 Component Technology for High-Performance Scientific Simulation Software

CASC 2 Presentation outline
- Motivation
- DOE community activities (CCA)
- Language interoperability technology (Babel)
- Component type and software repository (Alexandria)
- Research issues in parallel component communication
- Deep thoughts...

Goal: Provide an overview of the approach, techniques, and tools we are exploring in adopting software component technology for scientific computing

CASC 3 Numerical simulation software is becoming increasingly complex and interdisciplinary
- Scientists are asked to develop 3D, massively parallel, high-fidelity, full-physics simulations; and to do it quickly
- This requires the integration of software libraries developed by other teams
  — local resources are limited and expertise may not exist
  — loss of local control over software development decisions
  — language interoperability issues (f77, C, C++, Python, Java, f90)
- Techniques for small codes do not scale to 500K lines

CASC 4 What are the barriers to software re-use, interoperability, and integration?
- Technological barriers
  — incompatible programming languages (f90 calling C++)
  — incompatibilities in C and C++ header files (poor physical design)
  — conflicting low-level run-time support (e.g., reference counting)
- Sociological barriers
  — trust ("how do I know you know what you're doing?")
  — "I could re-write it in less time than it would take to learn it..."
- Domain understanding barriers (the interesting one!)
  — understand interactions of the math and physics packages
  — write software that reflects that understanding
  — this is where we gain insights and make scientific progress

CASC 5 Component technologies address issues of software complexity and interoperability
- Industry created component technology to address...
  — interoperability problems due to languages
  — complexity of large applications with third-party software
  — incremental evolution of large legacy software

Observation: The laboratory must address similar problems but in a different application space (parallel high-performance scientific simulation, not business).

CASC 6 Current industry solutions will not work in a scientific computing environment
- Three competing industry component approaches
  — Microsoft COM
  — Sun JavaBeans and Enterprise JavaBeans
  — OMG CORBA
- Limitations for high-performance scientific computing
  — do not address issues of massively parallel components
  — industry focuses on abstractions for business (not scientific) data
  — typically unavailable on our parallel research platforms
  — lack of support for Fortran 77 and Fortran 90
- However, we can leverage techniques and software

CASC 7 Component technology extends OO with interoperability and common interfaces
- Start with object-oriented technology
- Add language interoperability
  — describe object calling interfaces independent of language
  — add "glue" software to support cross-language calls
- Add common behavior, packaging, and descriptions
  — all components must support some common interfaces
  — common tools (e.g., repositories, builders, ...)
- Component technology is not...
  — object-oriented design, scripting, or frameworks
  — structured programming (e.g., modules)
  — the solution for all of your problems (just some of them)

CASC 8 Component technology approaches help to manage application software complexity

"Monolithic" approach (model equations, solver1, solver2, and time stepper share "hard-coded" data):
- tightly-coupled code
- less flexible, extensible
- re-use is difficult
- well-understood by community

  timestep loop {
    if ( test1 ) solver1( )
    else if ( test2 ) solver2( )
  }

"Building-block" approach:
- loosely-coupled code
- more flexible, extensible
- high re-use potential
- new to community
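As a toy illustration of the building-block column (the names here are invented, not from the slide), the hard-coded `if/else` above can become a time stepper that accepts any solver matching a common signature, so solvers plug in at run time instead of being compiled into the loop:

```c
/* A solver component: advances state u of length n, returns nonzero on success. */
typedef int (*SolverFn)(double *u, int n);

/* Two interchangeable (trivial) solver implementations. */
static int solver1(double *u, int n) { for (int i = 0; i < n; i++) u[i] += 1.0; return 1; }
static int solver2(double *u, int n) { for (int i = 0; i < n; i++) u[i] *= 2.0; return 1; }

/* The time stepper depends only on the interface, not on either solver. */
int timestep(double *u, int n, int steps, SolverFn solve) {
    for (int s = 0; s < steps; s++)
        if (!solve(u, n)) return 0;
    return 1;
}
```

Swapping solvers now means passing a different function, not editing and recompiling the loop.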

CASC 9 SAMRAI ALPS application combines three physics packages with local time refinement

Diagram components: Gradient Detector, Berger-Rigoutsos Clustering, Load Balancer, Time Refinement Integrator, Hierarchy Description, Regridding Algorithm, Ion Advection Numerical Routines, Light Propagation Routines, Electrostatic Field Solver, Conservative Hyperbolic Level Integrator, Laser-Plasma Integrator, Poisson Hierarchy Solver

CASC 10 CCA is investigating high-performance component technology for the DOE
- Common Component Architecture (CCA) forum
  — regular workshops and meetings since January 1998
  — ANL, LANL, LBNL, LLNL, ORNL, SNL, Indiana, and Utah
- Goal: interoperability for high-performance software
  — focus on massively parallel SPMD applications
  — modify industry approaches for the scientific domain
- Writing specifications and a reference implementation
  — leverage technology developed by CCA participants
  — plan to develop a joint reference implementation by FY02

CASC 11 The CCA is researching a variety of component issues in scientific computing
- Communication between components via ports
→ Standard component repository formats and tools
- Composition GUIs
→ Language interoperability technology
- Dynamic component loading
- Distributed components
→ Parallel data redistribution between SPMD components
- Domain interface standards (e.g., solvers, meshes, ...)
- Efficient low-level parallel communication libraries

CASC 12 Presentation outline
- Motivation
- DOE community activities (CCA)
→ Language interoperability technology (Babel)
- Component type and software repository (Alexandria)
- Research issues in parallel component communication
- Conclusions

CASC 13 Motivation #1: Language interoperability
- Motivated by the Common Component Architecture (CCA)
  — cross-lab interoperability of DOE numerical software
  — DOE labs use many languages (f77, f90, C, C++, Java, Python)
  — primary focus is on tightly-coupled, same-address-space codes

Diagram: Simulation Framework (C), Solver Library (C++), Numerical Routines (f90), Scripting Driver (Python), Visualization System (Java)

CASC 14 Motivation #2: Object support for non-object languages
- Want object implementations in non-object languages
  — object-oriented techniques are useful for software architecture
  — but ... many scientists are uncomfortable with C++
  — e.g., PETSc and hypre implement object-oriented features in C
- Object support is tedious and difficult if done by hand
  — inheritance and polymorphism require function lookup tables
  — support infrastructure must be built into each new class
- IDL approach provides "automatic" object support
  — IDL compiler automates generation of object "glue" code
  — polymorphism, multiple inheritance, reference counting, RTTI, ...
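A minimal sketch of the hand-written "function lookup table" pattern the second bullet refers to (the names are illustrative, not actual hypre or PETSc source). Each object carries a pointer to a table of function pointers; callers dispatch through the table, so different implementations of the same interface are interchangeable, exactly what an IDL compiler generates automatically:

```c
#include <stdlib.h>

typedef struct Vector Vector;

/* The "virtual function table": one function pointer per method. */
typedef struct {
    double (*dot)(Vector *self, Vector *x);
    void   (*destroy)(Vector *self);
} VectorVTable;

/* Base object: every concrete vector begins with a vtable pointer. */
struct Vector {
    const VectorVTable *vtbl;
    void *data;               /* implementation-specific state */
};

/* Polymorphic dispatch: the caller never sees the concrete type. */
double Vector_dot(Vector *v, Vector *x) { return v->vtbl->dot(v, x); }
void   Vector_destroy(Vector *v)        { v->vtbl->destroy(v); }

/* One concrete implementation: a dense vector of length n. */
typedef struct { double *vals; int n; } DenseData;

static double dense_dot(Vector *self, Vector *x) {
    DenseData *a = self->data, *b = x->data;
    double s = 0.0;
    for (int i = 0; i < a->n; i++) s += a->vals[i] * b->vals[i];
    return s;
}
static void dense_destroy(Vector *self) {
    DenseData *d = self->data;
    free(d->vals); free(d); free(self);
}
static const VectorVTable dense_vtbl = { dense_dot, dense_destroy };

Vector *DenseVector_new(const double *vals, int n) {
    DenseData *d = malloc(sizeof *d);
    d->vals = malloc(n * sizeof *d->vals);
    d->n = n;
    for (int i = 0; i < n; i++) d->vals[i] = vals[i];
    Vector *v = malloc(sizeof *v);
    v->vtbl = &dense_vtbl;
    v->data = d;
    return v;
}
```

Every new class must repeat this boilerplate (and more, for inheritance and reference counting), which is the "tedious and difficult" infrastructure Babel generates from a SIDL description.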

CASC 15 There are many tradeoffs when choosing a language interoperability approach
- Options: hand generation, wrapper tools (e.g., SWIG), IDLs
- We chose the IDL approach to language interoperability
  — goal: any language can call and use any other language
  — component tools need a common interface description method
  — sufficient information for automatic generation of distributed calls
  — examples: CORBA, DCOM, ILU, RPC, microkernel OSes

Diagram: pairwise language bridges (JNI, native calls, SWIG) among C, C++, f77, f90, Python, and Java, versus a single IDL connecting all of the languages

CASC 16 An IDL for scientific computing requires capabilities not present in industry IDLs
- Industry standard IDLs: CORBA, COM, RPC, ...
- Desired capabilities for a scientific computing IDL
  — attributes for parallel semantics
  — dense dynamic multidimensional arrays and complex numbers
  — bindings for f77/f90 and "special" languages (e.g., Yorick)
  — small and easy-to-modify IDL for research purposes
  — rich inheritance model (Java-like interfaces and classes)
  — high performance for same-address-space method invocations

CASC 17 SIDL provides language interoperability for scientific components
- SIDL is a "scientific" interface definition language
  — we modified industry IDL technology for the scientific domain
  — SIDL describes calling interfaces (e.g., the CCA specification)
  — our tools automatically generate the code that "glues" languages together
- The library writer develops the SIDL description; the user runs the SIDL tools and gets generated bindings for f77, C, C++, Python, and f90:

  package ESI {
    interface Vector {
      void axpy(in Vector x, in double a);
      double dot(in Vector x);
      ...
    };
    interface Matrix { ... };
  };

CASC 18 SIDL incorporates ideas from Java and CORBA to describe scientific interfaces

  version Hypre 0.5;
  version ESI 1.0;
  import ESI;

  package Hypre {
    interface Vector extends ESI.Vector {
      double dot(in Vector y);
      void axpy(in double a, in Vector y);
    };
    interface Matrix {
      void apply(out Vector Ax, in Vector x);
    };
    class SparseMatrix implements Matrix, RowAddressible {
      void apply(out Vector Ax, in Vector x);
    };
  };

Type categories: class, enumeration, exception, interface, package

CASC 19 Users call automatically generated interface code completely unaware of SIDL tools

Fortran 77 test code:

      integer b, x
      integer A
      integer smg_solver
      b = hypre_vector_NewVector(com, grid, stencil)
      ...
      x = hypre_vector_NewVector(com, grid, stencil)
      ...
      A = hypre_matrix_NewMatrix(com, grid, stencil)
      ...
      smg_solver = hypre_smg_solver_new()
      call hypre_smg_solver_SetMaxIter(smg_solver, 10)
      call hypre_smg_solver_Solve(smg_solver, A, b, x)
      call hypre_smg_solver_Finalize(smg_solver)

C++ test code:

  hypre::vector b, x;
  hypre::matrix A;
  hypre::smg_solver smg_solver;
  b = hypre::vector::NewVector(com, grid, stencil);
  ...
  x = hypre::vector::NewVector(com, grid, stencil);
  ...
  A = hypre::matrix::NewMatrix(com, grid, stencil);
  ...
  smg_solver = hypre::smg_solver::New();
  smg_solver.SetMaxIter(10);
  smg_solver.Solve(A, b, x);
  smg_solver.Finalize();

CASC 20 SIDL version management
- Simple version management scheme for SIDL types
  — all symbols are assigned a fixed version number
  — the SIDL version keyword requests a specified version (or the latest)
  — supports multiple versions of specs (e.g., ESI 0.5, ESI 0.5.1)

  version ESI 0.5.1;   // access ESI spec v0.5.1
  version HYPRE 0.7;   // define HYPRE spec v0.7
  package HYPRE {
    // define v0.7 of HYPRE.Vector using v0.5.1
    // of the ESI.Vector interface
    interface Vector extends ESI.Vector {
      ...
    };
  };

CASC 21 Language support in the Babel compiler
- C, f77, and C++ mostly finished using the old SIDL grammar
  — approximately 500 test cases probe the implementation
  — used by the hypre team for exploratory development
- Currently migrating the system to use the new grammar
- Java, Python, and Yorick support next
  — Python and Yorick are scripting languages (Yorick is from LLNL)
  — hope to begin development in the October timeframe
  — "should be quick" because of C interface support in these languages
- f90 and MATLAB will (hopefully) begin early next year

CASC 22 We are collaborating with hypre to explore SIDL technology in a scientific library
- Collaborators: Andy Cleary, Jeff Painter, Cal Ribbens
- SIDL interface description file generated for hypre
  — approximately 30 interfaces and classes for a hypre subset
  — use Babel tools to generate glue code for object support
- Benefits of SIDL use in the hypre project
  — automatic support for object-oriented features in C
  — Fortran capabilities through SIDL in an upcoming version
  — plan to integrate existing C, Fortran, and C++ in one library
  — SIDL is a useful language for discussing software design
  — creating a better hypre design based on SIDL OO support
  — cost overhead in the same address space is too small to measure

CASC 23 Presentation outline
- Motivation
- DOE community activities (CCA)
- Language interoperability technology (Babel)
→ Component type and software repository (Alexandria)
- Research issues in parallel component communication
- Conclusions

CASC 24 We are developing a web-based architecture to simplify access by scientists and tools
- Scientists and library developers must have easy access to our technology; otherwise, they simply will not use it
- Our web-based deployment lowers the "threshold of pain" of adopting component technology

Diagram: scientist and tool clients connect to a web server (SIDL tools, servlets, JSP), which issues SQL queries against a database and serves XML web pages

CASC 25 Alexandria is a web-based repository for component software and type descriptions
- The Alexandria repository supports...
  — SIDL type descriptions for libraries and components
  — library and component implementations
  — an interface to the Babel language interoperability tools

CASC 26 The Babel parser converts SIDL to XML that is stored in the Alexandria repository
- SIDL is used to generate XML interface information
- The XML type description is used to generate glue code

Diagram: the parser reads url://interface.sidl, stores the resulting XML (url://interface.xml) in type repositories, and the code generator combines that XML with a machine configuration to emit f77, C++, C, ... glue code

CASC 27 Sample XML file for Hypre.Vector (listing omitted)

CASC 28 Presentation outline
- Motivation
- DOE community activities (CCA)
- Language interoperability technology (Babel)
- Component type and software repository (Alexandria)
→ Research issues in parallel component communication
- Conclusions

CASC 29 Parallel redistribution of complex data structures between components
- Parallel data redistribution for non-local connections
  — example: connect a parallel application to a visualization server
  — cannot automatically redistribute complex data structures
  — must support redistribution of arbitrary data structures
- Approach: modify SIDL and provide special interface support

CASC 30 Parallel components will require special additions to SIDL interface descriptions
- Special RMI semantics for parallel components
  — provide a local attribute for methods
  — will also need a copy attribute for pass-by-value, etc.
  — however, no data distribution directives: distribution must be handled dynamically

  package ESI {
    interface Vector {
      double dot(copy in Vector v);
      int getGlobalSize();
      int getLocalSize() local;
    };
  };

CASC 31 Dynamic redistribution of arbitrary data: ask the object to do it for you!
- Irregular data is too complex to represent in an IDL
- Basic approach:
  — objects implement one of a set of redistribution interfaces
  — the library queries the object at run time for the supported method

  interface ComplexDistributed {
    void serialize(in Array s);
    void deserialize(in Array s);
  };
  ...
  interface ArrayDistributed {
    // use existing array description
    // from PAWS or CUMULVS
  };

Diagram: component A on machine M1 (two streams) redistributes its data to component B on machine M2 (three streams)
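A hedged sketch of the "ask the object" idea in C, under the same assumptions as the slide: the type names (ByteStream, the function-pointer table standing in for ComplexDistributed) and the ragged-array example are invented for illustration, not the actual CCA or SIDL interfaces. An object whose layout the IDL cannot describe packs itself into a byte stream, and the runtime moves the bytes:

```c
#include <string.h>
#include <stdlib.h>

/* A growable byte stream standing in for the IDL "Array" argument. */
typedef struct { unsigned char *buf; size_t len, cap, pos; } ByteStream;

static void bs_write(ByteStream *s, const void *p, size_t n) {
    if (s->len + n > s->cap) {
        s->cap = (s->len + n) * 2;
        s->buf = realloc(s->buf, s->cap);
    }
    memcpy(s->buf + s->len, p, n);
    s->len += n;
}
static void bs_read(ByteStream *s, void *p, size_t n) {
    memcpy(p, s->buf + s->pos, n);
    s->pos += n;
}

/* The redistribution interface a "complex" object implements. */
typedef struct {
    void (*serialize)(void *self, ByteStream *s);
    void (*deserialize)(void *self, ByteStream *s);
} ComplexDistributed;

/* Example of irregular data: a ragged array with rows of varying length. */
typedef struct { int nrows; int *rowlen; double **rows; } Ragged;

static void ragged_serialize(void *obj, ByteStream *s) {
    Ragged *r = obj;
    bs_write(s, &r->nrows, sizeof r->nrows);
    for (int i = 0; i < r->nrows; i++) {
        bs_write(s, &r->rowlen[i], sizeof(int));
        bs_write(s, r->rows[i], r->rowlen[i] * sizeof(double));
    }
}
static void ragged_deserialize(void *obj, ByteStream *s) {
    Ragged *r = obj;
    bs_read(s, &r->nrows, sizeof r->nrows);
    r->rowlen = malloc(r->nrows * sizeof(int));
    r->rows = malloc(r->nrows * sizeof(double *));
    for (int i = 0; i < r->nrows; i++) {
        bs_read(s, &r->rowlen[i], sizeof(int));
        r->rows[i] = malloc(r->rowlen[i] * sizeof(double));
        bs_read(s, r->rows[i], r->rowlen[i] * sizeof(double));
    }
}
static const ComplexDistributed ragged_cd = { ragged_serialize, ragged_deserialize };
```

The key design point is on the slide: the library never interprets the layout; it only asks the object to pack and unpack itself, so arbitrarily irregular structures can cross machine boundaries.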

CASC 32 Will component technology be part of the future of scientific computing?
- Well, maybe... or maybe not
- Component technology does offer new capabilities
  — techniques to manage large-scale application complexity
  — language interoperability and easier plug-and-play
  — leverage technology, not re-invent the wheel
  — bridges to interoperate with industry software (e.g., SOAP)
- However, capabilities come at a price
  — ties scientists to using component technology tools
  — steep learning curve (needs to become part of the culture)
  — different paradigm for developing scientific applications

CASC 33 Acknowledgements
- Work performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
- Document UCRL-VG