CCA Status and Plans
Common Component Architecture (CCA) Forum Tutorial Working Group

Slide 2: CCTTSS Research Thrust Areas and Main Working Groups
- Scientific Components
  – Scientific Data Objects: Lois Curfman McInnes, ANL
- "MxN" Parallel Data Redistribution: Jim Kohl, ORNL
- Frameworks
  – Language Interoperability / Babel / SIDL
  – Component Deployment / Repository: Gary Kumfert, LLNL
- User Outreach: David Bernholdt, ORNL

Slide 3: Scientific Components
- Abstract interfaces and component implementations
  – Mesh management
  – Linear, nonlinear, and optimization solvers
  – Multi-threading and load redistribution
  – Visualization and computational steering
- Quality of service research
- Fault tolerance
  – Components and frameworks

Slide 4: Scientific Components: Extended R&D Agenda
- Complete development of abstract interfaces and base component prototypes
- Advanced component development
  – Second-level component extensions
  – Application-specific components for chemistry and climate
- Implement fault tolerance and recovery mechanisms
- Develop quality of service models for numerical components
  – Integrate the QoS system into the repository
- Develop interfaces and implementations for multi-level nonlinear solvers and hybrid mesh management schemes
  – Collaboration with the TOPS and TSTT centers

Slide 5: Scientific Data Objects and Interfaces
- Define "standard" interfaces for HPC scientific data
  – Descriptive, not (necessarily) generative
- Basic scientific data object: David Bernholdt, ORNL
- Structured and unstructured mesh: Lori Freitag, ANL
  – Collaboration with the SciDAC TSTT center
- Structured block AMR: Phil Colella, LBNL
  – Collaboration with APDEC and TSTT

Slide 6: Scientific Data Interfaces
- Low-level raw data (sketched below)
  – Supports high-performance access to memory
  – Based on IOVec: assumes a contiguous memory block; supports basic data types such as integer, float, and double; carries no topology information
- Local and distributed arrays
  – Abstract interfaces for higher-level data description: 1D, 2D, and 3D dense arrays
  – Various distribution strategies, including HPF-like decomposition types (block, cyclic, ...)
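The slide names the IOVec abstraction without showing it. As a rough illustration, a descriptive (not generative) raw-data interface along these lines might look like the C++ sketch below; all names here (sketch::IOVec, RawDataPort, describe) are hypothetical stand-ins, not taken from the CCA data-object specification.

```cpp
#include <cstddef>

namespace sketch {

// Basic element types only; the interface carries no topology information.
enum class ElemType { Int32, Float32, Float64 };

// An IOVec-style descriptor: it *describes* an existing contiguous
// block of memory rather than allocating or copying one.
struct IOVec {
    void*       base;   // start of the contiguous block
    std::size_t count;  // number of elements in the block
    ElemType    type;   // element type of the block
};

// A component offering raw, high-performance access to its data
// could implement an interface roughly like this.
class RawDataPort {
public:
    virtual ~RawDataPort() = default;
    // Fill 'vec' with a description of the component's current data
    // block; returns false if the data is not available right now.
    virtual bool describe(IOVec& vec) = 0;
};

}  // namespace sketch
```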

Slide 7: Mesh Interfaces
- Unstructured meshes (traversal sketched below)
  – Abstract interfaces for mesh and geometry access and modification
  – Support geometry and topology access via iterators, arrays, and worksets
  – Separate structured and unstructured mesh access for performance
- Block-structured AMR
  – Abstract interfaces allowing block-structured AMR packages to exchange data
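To make the iterator/workset distinction concrete, here is a minimal C++ sketch of unstructured-mesh traversal in the spirit of the TSTT-style interfaces the slide refers to. The names (sketch::MeshPort, nextEntity, nextWorkset, getCoordinates) are illustrative assumptions, not the actual specification.

```cpp
#include <cstddef>
#include <vector>

namespace sketch {

// Opaque handle to a mesh entity (vertex, edge, face, or cell).
using EntityHandle = std::size_t;

class MeshPort {
public:
    virtual ~MeshPort() = default;

    // Iterator-style access: advance one entity at a time.
    // Returns false when the traversal is exhausted.
    virtual bool nextEntity(EntityHandle& e) = 0;

    // Workset-style access: fetch up to 'maxCount' entities at once,
    // amortizing per-call overhead over a whole block of entities.
    // Returns the number of entities actually appended to 'out'.
    virtual std::size_t nextWorkset(std::vector<EntityHandle>& out,
                                    std::size_t maxCount) = 0;

    // Geometry access for a batch of entities: appends one (x, y, z)
    // coordinate triple to 'xyz' for each handle in 'entities'.
    virtual void getCoordinates(const std::vector<EntityHandle>& entities,
                                std::vector<double>& xyz) = 0;
};

}  // namespace sketch
```

Worksets exist because paying a virtual-call cost per entity is too slow on large meshes; batching keeps the abstract interface while preserving performance, which is the same concern behind separating structured from unstructured access.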

Slide 8: "MxN" Parallel Data Redistribution: The Problem
[Figure: data distributed across "M" processes on one side must be shared with "N" processes on the other]

Slide 9: "MxN" Parallel Data Redistribution: The Problem
- Create complex scientific simulations by coupling together multiple parallel component models
  – Share data on "M" processors with data on "N" processors, where in general M != N (pronounced "M by N")
  – Model coupling, e.g., climate, solver/optimizer
  – Collecting data for visualization ("Mx1")
- Define a "standard" interface
  – Fundamental operations for any parallel data coupler
  – Full range of synchronization and communication options

Slide 10: Hierarchical MxN Approach
- Basic MxN parallel data exchange (sketched below)
  – Component implementation
  – Initial prototypes based on CUMULVS and PAWS; the interface generalizes features of both
- Higher-level coupling functions
  – Units, time, and grid interpolation; flux conservation
- "Automatic" MxN service via the framework
  – Implicit in method invocations ("parallel RMI")
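As a rough picture of the "basic MxN parallel data exchange" layer, the C++ sketch below shows the kind of fundamental coupler operations slide 9 calls for, with a choice of synchronization mode. The names (sketch::MxNPort, registerArray, transfer) are invented for illustration and do not come from the CUMULVS, PAWS, or CCA MxN draft interfaces.

```cpp
#include <string>

namespace sketch {

// Synchronization options for a transfer; a real coupler would offer
// a much fuller range of synchronization and communication choices.
enum class SyncMode { Blocking, NonBlocking };

class MxNPort {
public:
    virtual ~MxNPort() = default;

    // Each side registers its piece of a named distributed array so
    // the coupler can compute the M-to-N communication schedule from
    // the two data decompositions.
    virtual void registerArray(const std::string& name,
                               const double* localData,
                               int localCount) = 0;

    // Move the current contents of 'name' between the M-process side
    // and the N-process side using the requested synchronization mode.
    virtual void transfer(const std::string& name, SyncMode mode) = 0;
};

}  // namespace sketch
```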

Slide 11: CCA Frameworks
- Component containers and run-time environments
- Research areas:
  – Integration of prototype frameworks: unify SCMD/parallel frameworks with distributed ones, and unify framework services and interactions
  – Language interoperability tools: Babel/SIDL, incorporating difficult languages (F90, ...); a production-scale requirement for application areas
  – Component deployment: component repository, interface lookup, and interface semantics

Slide 12: CCA Framework Prototypes
- Ccaffeine: SPMD/SCMD parallel; direct connection
- CCAT / XCAT: distributed; network connection
- SCIRun: parallel, multithreaded; direct connection
- Decaf: language interoperability via Babel
(The provides/uses port pattern these frameworks host is sketched below.)
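All four frameworks host components built on the same provides/uses port pattern. The self-contained C++ sketch below mirrors the shape of that pattern (the framework hands a component a services object, and the component registers its ports), but these simplified types are assumptions for illustration, not the actual gov.cca interfaces.

```cpp
#include <map>
#include <memory>
#include <string>
#include <utility>

namespace sketch {

// Base type for all ports, both provided and used.
class Port {
public:
    virtual ~Port() = default;
};

// Stand-in for the framework's services handle.
class Services {
public:
    // A component publishes a port it implements under a name.
    void addProvidesPort(std::shared_ptr<Port> p, const std::string& name) {
        provided_[name] = std::move(p);
    }
    // Another component (or the framework) looks a port up by name.
    std::shared_ptr<Port> getPort(const std::string& name) {
        auto it = provided_.find(name);
        return it == provided_.end() ? nullptr : it->second;
    }
private:
    std::map<std::string, std::shared_ptr<Port>> provided_;
};

// Every component implements one entry point: the framework calls
// setServices once, and the component registers the ports it provides
// and declares the ports it intends to use.
class Component {
public:
    virtual ~Component() = default;
    virtual void setServices(Services& svc) = 0;
};

}  // namespace sketch
```

In the real specification these roles are played by the gov.cca.Component, gov.cca.Services, and gov.cca.Port types; what differs among Ccaffeine, XCAT, SCIRun, and Decaf is how connected ports are wired together (direct in-process connection versus network connection).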

Slide 13: Outreach and Applications Integration
- Not just "thrown over the fence"
- Several outreach efforts:
  – General education and awareness: tutorials (like this one!), papers, and conference presentations
  – Strong liaison with adopting groups: beyond superficial exchanges; real production requirements and feedback
  – Chemistry and climate work within CCTTSS: actual application development work ($$$)
- SciDAC emphasis: more vital applied advanced computing research!

Slide 14: Current CCA / CCTTSS Status
- CCA specification at version 0.5
- Several working prototype frameworks
- Functional multi-component parallel and distributed demonstration applications
- Draft specifications for:
  – Basic scientific data objects
  – MxN parallel data redistribution
- Demonstration software available for download
  – Four different "direct connect" applications, one distributed
  – 31 distinct components; up to 17 in any single application; 6 used in more than one application

Slide 15: CCA Tutorial Summary
- Go forth and componentize...
  – And ye shall bear good scientific software
- Come together for domain standards
  – Attain true interoperability and code reuse
- Use The Force:
  –
  –