The CrossGrid Project. Marcel Kunze, FZK, representing the X#-Collaboration.

Presentation transcript:

The CrossGrid Project Marcel Kunze, FZK representing the X#-Collaboration

ACAT 2002, Moscow. Marcel Kunze, FZK.

Main Objectives
- A new category of Grid-enabled applications: computing- and data-intensive, distributed, interactive with near-real-time response (a person in the loop), layered
- New programming tools
- A Grid that is more user-friendly, secure and efficient
- Interoperability with other Grids
- Implementation of standards

CrossGrid Collaboration: 21 institutes, 11 countries
- Poland: Cyfronet & INP Cracow, PSNC Poznan, ICM & IPJ Warsaw
- Portugal: LIP Lisbon
- Spain: CSIC Santander, Valencia & RedIris, UAB Barcelona, USC Santiago & CESGA
- Ireland: TCD Dublin
- Italy: DATAMAT
- Netherlands: UvA Amsterdam
- Germany: FZK Karlsruhe, TUM Munich, USTU Stuttgart
- Slovakia: II SAS Bratislava
- Greece: Algosystems, DEMO Athens, AuTh Thessaloniki
- Cyprus: UCY Nicosia
- Austria: U. Linz

IST Grid Project Space (diagram)
- Applications layer: CROSSGRID, DATAGRID, GRIDLAB, GRIA, EGSO, DATATAG
- Middleware & tools layer: GRIP, EUROGRID, DAMIEN
- Underlying infrastructures: science, industry/business
- Links with European national efforts
- Links with US projects (GriPhyN, PPDG, iVDGL, ...)

Collaboration with Other # Projects
- Objective: exchange of information and software components
- Partners: DATAGRID, DATATAG, GRIDLAB, EUROGRID and GRIP, GRIDSTART
- Participation in GGF

Project Phases
- M 1-3: requirements definition and merging
- First development phase: design, first prototypes, refinement of requirements
- Second development phase: integration of components, second prototypes
- Third development phase: complete integration, final code versions
- Final phase: demonstration and documentation

Structure Overview (layers, top to bottom)
- APPLICATIONS: interactive simulation and visualisation of a biomedical system; flooding crisis team support; distributed data analysis in high energy physics; weather forecast and air pollution modeling
- TOOLS: DataGrid set of tools, Grid Visualization Kernel, benchmarks
- Grid SERVICES: Globus Toolkit, Condor-G, ...; remote data access, optimization, monitoring, schedulers, roaming access, portals
- INFRASTRUCTURE: protocols, authentication, authorization, access policy, resource management, etc.
- FABRIC: network infrastructure, archivers, HPC/HPV systems, laboratory instruments, etc.; local domain services

CrossGrid Architecture (diagram)
- Applications and supporting tools: biomedical application, flood application, HEP high-level trigger, HEP interactive distributed data access application, HEP data mining on Grid application, weather forecast application; portal, visualization tools
- Applications development support: performance analysis, MPI verification, metrics and benchmarks, Grid Visualisation Kernel, data mining on Grid
- Grid common services: interactive distributed data access, user interaction service, roaming access, Grid resource management, Grid monitoring, optimization of data access, distributed data collection, MPICH-G; DataGrid replica manager and job manager; Globus replica manager, replica catalog, GRAM, GSI, GASS, MDS, GridFTP, Globus-IO
- Local resources, each behind a resource manager: CPU, secondary storage, tertiary storage, scientific instruments (medical scanners, satellites, radars), detector local high-level trigger, VR systems (caves, immersive desks)

Layered Structure
- Interactive and data-intensive applications (WP1): interactive simulation and visualization of a biomedical system; flooding crisis team support; distributed data analysis in HEP; weather forecast and air pollution modeling
- Grid application programming environment (WP2): MPI code debugging and verification; metrics and benchmarks; interactive and semiautomatic performance evaluation tools
- New CrossGrid services (WP3): portals and roaming access; Grid resource management; Grid monitoring; optimization of data access; Grid Visualization Kernel; data mining; HLA services
- Globus middleware, plus services from DataGrid, GriPhyN, ...
- Fabric infrastructure (testbed, WP4)

Scope of Applications
- Applications in health and environment
- Data federation, processing and interpretation in geographically distributed locations
- Fast, interactive decision making
- Interactive access to distributed databases
- Supercomputers and high-performance clusters
- Visualisation engines
- Medical scanners
- Environmental data input devices

Application Requirements
- High-quality presentation, high frame rate
- Intuitive interaction, real-time response
- Interactive algorithms
- High-performance computing and networking
- Distributed resources and data

Role of Network Latency (figure): for interactive response, communication delay and rendering delay must be negligible.
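The interactivity requirement can be made concrete with a simple budget: at a given frame rate, the per-frame simulation, communication and rendering delays must together fit within one frame period. A minimal sketch follows; the function names and the 25 fps target are illustrative, not taken from the project:

```python
# Hypothetical latency-budget check for an interactive Grid visualisation:
# the sum of per-frame delays must fit within one frame period.

def frame_budget_ms(fps: float) -> float:
    """Time available per frame in milliseconds."""
    return 1000.0 / fps

def is_interactive(fps: float, sim_ms: float, comm_ms: float, render_ms: float) -> bool:
    """True if the combined per-frame delays fit the frame budget."""
    return sim_ms + comm_ms + render_ms <= frame_budget_ms(fps)

# At 25 fps the budget is 40 ms per frame.
print(frame_budget_ms(25))            # -> 40.0
print(is_interactive(25, 20, 10, 5))  # -> True (35 ms fits the 40 ms budget)
```

The same check explains why a wide-area round trip of, say, 100 ms already rules out frame-by-frame interaction and pushes the design toward local rendering.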

CrossGrid Application Development (WP1)
- Interactive simulation and visualisation of a biomedical system: a Grid-based system for pre-treatment planning in vascular interventional and surgical procedures through real-time interactive simulation of vascular structure and flow
- Flooding crisis team support
- Distributed interactive data analysis in HEP: focus on the LHC experiments (ALICE, ATLAS, CMS and LHCb)
- Weather forecast and air pollution modelling: porting distributed/parallel codes to the Grid; Coupled Ocean/Atmosphere Mesoscale Prediction System; STEM-II air pollution code

Interactive Simulation and Visualisation of a Biomedical System
A Grid-based prototype system for treatment planning in vascular interventional and surgical procedures through near-real-time interactive simulation of vascular structure and flow. The system will consist of a distributed near-real-time simulation environment in which a user interacts in Virtual Reality (VR) and other interactive display environments. A 3D model of the arteries, derived using medical imaging techniques, will serve as input to a simulation environment for blood flow calculations. The user will be allowed to change the structure of the arteries, thus mimicking an interventional or surgical procedure.

Current Situation (figure): observation, diagnosis & planning, treatment

Experimental Setup (figure)

Simulation-Based Planning and Treatment (figure): preop alternatives such as Angio w/ Fem-Fem & Fem-Pop; AFB w/ E-S Prox. Anast.; Angio w/ Fem-Fem; AFB w/ E-E Prox. Anast.

VR Interaction (figure)

Flood Crisis Prevention
A support system for the establishment and operation of a Virtual Organization for Flood Forecasting, associating a set of individuals and institutions involved in flood prevention and protection. The system will employ Grid technology to seamlessly connect the experts, data and computing resources needed for quick and correct flood management decisions. Its main component will be a highly automated early warning system based on hydro-meteorological (snowmelt) rainfall-runoff simulations. The system will integrate advanced communication techniques allowing the crisis management teams to consult decisions with various experts, who will be able to re-run the simulations with changed parameters and analyze the impact.

Virtual Organization for Flood Forecasting (diagram)
- Data sources: surface automatic meteorological and hydrological stations; systems for acquisition and processing of satellite information; meteorological radars; storage systems and databases
- External sources of information: global and regional GTS centers; EUMETSAT and NOAA; hydrological services of other countries
- Grid infrastructure (HPC, HTC) running meteorological, hydrological and hydraulic models
- Flood crisis teams: meteorologists, hydrologists, hydraulic engineers
- Users: river authorities, energy, insurance companies, navigation, media, public
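The model cascade in this diagram (meteorological output feeding the hydrological rainfall-runoff model, whose discharge feeds the hydraulic flood-mapping model) can be sketched with placeholder numerics. None of the coefficients below come from the project; only the 2,500 km² catchment size matches the pilot-site figure:

```python
# Illustrative flood-forecasting cascade; every model here is a stand-in.

def meteorological_model(observed_rain_mm: float) -> float:
    """Placeholder forecast: assume a 10% wetter outlook than observed."""
    return 1.1 * observed_rain_mm

def hydrological_model(rain_mm: float, catchment_km2: float) -> float:
    """Placeholder rainfall-runoff step: rain over a catchment -> discharge.
    1 mm over 1 km^2 is 1000 m^3; spread the runoff over one day (86400 s)."""
    runoff_coeff = 0.3  # assumed fraction of rain that becomes runoff
    return runoff_coeff * rain_mm * catchment_km2 * 1000 / 86400  # m^3/s

def hydraulic_model(discharge_m3s: float, bankfull_m3s: float) -> bool:
    """Placeholder flood-mapping step: does discharge exceed channel capacity?"""
    return discharge_m3s > bankfull_m3s

rain = meteorological_model(50)         # 50 mm observed -> 55 mm forecast
q = hydrological_model(rain, 2500)      # pilot-site catchment area
print(round(q, 1), hydraulic_model(q, 400))  # roughly 477.4 m^3/s, above 400 -> True
```

In the real system each stage is a heavyweight simulation on Grid resources; the point of the sketch is only the data flow between them, which is what the Virtual Organization has to orchestrate.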

Váh River Pilot Site (figure)
- Váh River catchment area: 19,700 km², about 1/3 of Slovakia
- Pilot site from Nosice (inflow point) to Strečno (outflow point); catchment area 2,500 km² (above Strečno: 5,500 km²)
- Water stages/discharges from hydrological stations operating in real time
- Mapping of the flooded areas

Flood Simulation Results (figure): flow and water depths

Distributed Analysis in High Energy Physics
Challenging points:
- Access to large distributed databases in the Grid
- Development of distributed data-mining techniques
- Definition of a layered application structure
- Integration of user-friendly interactive access (based on PROOF)
- Focus on the LHC experiments (ALICE, ATLAS, CMS and LHCb)

PROOF (diagram): a local session ships the selection parameters and procedure (Proc.C) to remote PROOF CPUs, which process the distributed databases (DB1-DB6) located via the tag and run databases (TagDB, RDB).
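The PROOF model can be sketched without the ROOT API: the master ships one selection procedure to every worker, each worker applies it to its own database partition, and the master merges the partial results. The sketch below runs the workers sequentially as a stand-in for what PROOF does in parallel; all names are illustrative, not PROOF's actual interface:

```python
# Illustrative master/worker selection over partitioned event data
# (a stand-in for PROOF's Proc.C selection procedure, not the real API).

def select_events(partition, threshold):
    """Worker side: keep events passing a cut (here, a simple comparison)."""
    return [e for e in partition if e > threshold]

def proof_style_query(partitions, threshold):
    """Master side: farm out one procedure to all partitions, merge results."""
    partials = [select_events(p, threshold) for p in partitions]  # parallel in PROOF
    merged = []
    for part in partials:
        merged.extend(part)
    return merged

# Six database partitions, echoing the slide's DB1-DB6 picture.
dbs = [[1, 9], [4, 7], [2, 8], [3, 6], [5, 10], [0, 11]]
print(proof_style_query(dbs, 5))  # -> [9, 7, 8, 6, 10, 11]
```

The design point is that the procedure moves to the data rather than the data to the user, which is what makes interactive analysis over large distributed databases feasible.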

Weather Forecast and Air Pollution Modeling
- Integration of distributed databases into the Grid
- Migration of data-mining algorithms to the Grid
- Porting distributed atmospheric and wave models to the Grid
- Porting parallel codes for air quality models to the Grid
- Integration, testing and demonstration of the application in the testbed environment

COAMPS: Coupled Ocean/Atmosphere Mesoscale Prediction System (atmospheric components)
- Complex data quality control
- Analysis: multivariate optimum interpolation analysis of winds and heights; univariate analyses of temperature and moisture; optimum interpolation analysis of sea surface temperature
- Initialization: variational hydrostatic constraint on analysis increments; digital filter
- Atmospheric model numerics: nonhydrostatic, scheme C, nested grids, sigma-z
- Physics: convection, explicit moist physics, radiation, surface layer
- Features: globally relocatable (5 map projections); user-defined grid resolutions, dimensions, and number of nested grids; 6- or 12-hour incremental data assimilation cycle; usable for idealized or real-time applications; single configuration-managed system for all applications
- Operational at 7 areas, twice daily, using 81/27/9 km or 81/27 km grids; forecasts to 72 hours; operational at all Navy regional centers (with GUI interface)

Status Quo ... Quo Vadis?
Current state (briefly):
- Simulation done on a single system or local clusters
- Visualisation on a single system, locally
What we are going to achieve:
- HPC, HTC and HPV in a geographically distributed environment
- Improved interaction with the end user
- Near-real-time simulations
- Different visualisation equipment, adaptive to end-user needs: PDAs, workstations, VR studio (e.g. CAVE)

Grid Application Programming Environment (WP2)
Specify, develop, integrate and test tools for HPC and HTC applications on the Grid:
- MPI code debugging and verification
- Metrics and benchmarks
- Interactive and semiautomatic performance evaluation tools
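The metrics-and-benchmarks task standardises how performance measurements are taken across heterogeneous Grid sites. As a hedged illustration only (not the actual CrossGrid benchmark suite), such a benchmark typically repeats a kernel several times and reports the best wall-clock time, since the minimum is least distorted by background load:

```python
# Minimal micro-benchmark harness; the workload is an arbitrary stand-in.

import time

def benchmark(fn, *args, repeats=5):
    """Return the best wall-clock time (seconds) over several repeats."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - t0)
    return best

def workload(n):
    """Stand-in compute kernel: sum of squares."""
    return sum(i * i for i in range(n))

t = benchmark(workload, 100_000)
print(f"best of 5: {t * 1000:.2f} ms")
```

Running the same harness on every testbed site gives comparable numbers, which is the prerequisite for the resource brokering and performance evaluation tools elsewhere in the project.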

New Grid Services and Tools (WP3)
- Portals and roaming access
- Grid resource management
- Grid monitoring
- Optimisation of data access
Objectives:
- To develop interactive compute- and data-intensive applications
- To develop user-friendly Grid environments
- To offer easy access to the applications and the Grid
- To achieve a reasonable trade-off between resource usage efficiency and application speedup
- To support management issues while accessing resources

International Testbed Organisation (WP4): 15 sites
AuTh Thessaloniki, UvA Amsterdam, FZK Karlsruhe, TCD Dublin, UAB Barcelona, LIP Lisbon, CSIC Valencia, CSIC Madrid, USC Santiago, CSIC Santander, DEMO Athens, UCY Nicosia, CYFRONET Cracow, II SAS Bratislava, PSNC Poznan, ICM & IPJ Warsaw
- Testbed setup and incremental evolution
- Integration with DataGrid
- Infrastructure support
- Verification and quality control

Summary
- Layered structure of all X# applications
- Reuse of software from DataGrid and other # projects
- Globus as the bottom layer of the middleware
- Heterogeneous computer and storage systems
- Distributed development and testing of software
- 12 partners in applications, 14 partners in middleware, 15 partners in testbeds

1980s: Internet. 1990s: Web. 2000s: Grid.
Where do we need to get to?
- Applications to support an "e-society" ("Cyber-Infrastructure")
- An international Grid infrastructure which hides the complexities from the users ("Invisible Computing")
- A powerful and flexible network infrastructure
Where do we need to invest?
- Applications targeted at realistic problems in "e-science" (Grid-enabled applications)
- Prototypes of Grid infrastructures
- Maintain and improve the GEANT network (world-class networking)
- Expression of interest for the EU FP6 programme: "Enabling Grids and e-Science in Europe (EGEE)"

EGEE Project Space: Enabling Grids and E-Science in Europe (diagram)
- Applications layer: CROSSGRID, DATAGRID, GRIDLAB, GRIA, EGSO, DATATAG
- Middleware & tools layer: GRIP, EUROGRID, DAMIEN
- Underlying infrastructures: science, industry/business

First Results of the EGEE Brainstorming. C. Jones, D. O. Williams, et al.