M.Kunze, NEC2003, Varna. The European CrossGrid Project. Marcel Kunze, Institute for Scientific Computing (IWR), Forschungszentrum Karlsruhe GmbH. www.eu-crossgrid.org



Outline
- Project Overview
- Applications
- Toolbox
- Experience (Testbeds)

CrossGrid Collaboration (21 institutes, 11 countries)
- Poland: Cyfronet & INP Cracow, PSNC Poznan, ICM & IPJ Warsaw
- Portugal: LIP Lisbon
- Spain: CSIC Santander, Valencia & RedIris, UAB Barcelona, USC Santiago & CESGA
- Ireland: TCD Dublin
- Italy: DATAMAT
- Netherlands: UvA Amsterdam
- Germany: FZK Karlsruhe, TUM Munich, USTU Stuttgart
- Slovakia: II SAS Bratislava
- Greece: Algosystems, Demo Athens, AuTh Thessaloniki
- Cyprus: UCY Nikosia
- Austria: U.Linz

IST Grid Project Space (diagram): CrossGrid positioned among the European Grid projects (GridLab, GRIA, EGSO, DataTAG, DataGrid, GRIP, EuroGrid, DAMIEN) across the applications, middleware, and fabric layers, spanning both science and industry/business.

Mission Statement
Development of a Grid environment for interactive applications: make the Grid more user-friendly, secure, and efficient.
Timeline:

Project Phases
- Requirements definition and merging
- First development phase: design, initial prototypes, refinement of requirements
- Second development phase: integration of components, additional prototypes
- Third development phase: complete integration, final versions of software components
- Final phase: demonstration and documentation

Workpackages
- WP1 – Application Development
- WP2 – Grid Application Programming Environment
- WP3 – New Grid Services and Tools
- WP4 – International Testbed Organisation
- WP5 – Project Management

Architecture: Layered Structure (top to bottom: Applications; Middleware & Tools; Fabric / Infrastructure)

High-Level Architecture: Technical Components
WP1 – Application Development
1. Interactive simulation and visualization of a biomedical system
2. Flooding crisis team support
3. Distributed data analysis in HEP
4. Weather forecast and air pollution modeling
WP2 – Programming Environment
1. MPI code debugging and verification
2. Metrics and benchmarks
3. Interactive performance evaluation tools
WP3 – New Services
1. Portals and roaming access
2. Grid resource management
3. Grid monitoring
4. Optimization of data access
WP4 – Testbeds (fabric infrastructure)
Building blocks: Globus middleware; DataGrid and GriPhyN Grid services; Visualization Kernel; Data Mining; High Level Architecture (HLA)

Detailed Architecture (layered diagram, components colour-coded as CrossGrid, DataGrid, or Globus)
- Applications: 1.1 BioMed, 1.2 Flooding, 1.3 Interactive Distributed Data Access, 1.4 Meteo Pollution
- Supporting tools: 3.1 Portal & Migrating Desktop
- Application development support: 2.2 MPI Verification, 2.3 Metrics and Benchmarks, 2.4 Performance Analysis
- Application-specific services: 1.1 Grid Visualisation Kernel, 1.1 User Interaction Services, 1.3 Data Mining on Grid (NN), 3.1 Roaming Access, 3.2 Scheduling Agents, 3.3 Grid Monitoring, 3.4 Optimization of Grid Data Access
- Generic services: GRAM, GSI, GIS/MDS, GridFTP, Globus-IO, Replica Catalog, Globus and DataGrid Replica Managers, DataGrid Job Submission Service, MPICH-G
- Fabric: resource managers (CE, SE, CPU, instruments such as satellites and radars), secondary and tertiary storage, 3.4 Optimization of Local Data Access, 1.1/1.2 HLA and others

CrossGrid Applications

Key Features of CrossGrid Applications
- Data: sources and databases geographically distributed; to be selected on demand
- Processing: large capacity required, both HPC and HTC; interactive
- Presentation: complex data requires versatile 3D visualisation; support for interaction and feedback to other components

Biomedical Application
Pipeline: CT / MRI scan -> segmentation (medical DB) -> LB flow simulation -> visualization and interaction on VE, WD, PC, or PDA clients (HDB).
Scale: 10 simulations/day, 60 GB/simulation, > 20 MB/s.
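A quick back-of-envelope check of the data rates quoted on this slide (the 10-simulations/day, 60 GB, and 20 MB/s figures are from the slide; the decimal unit conversion is an assumption for simplicity):

```python
# Back-of-envelope check of the biomedical application's data rates.
# Assumption: 1 GB = 1000 MB (decimal units), for simplicity.
sims_per_day = 10          # simulations per day (from the slide)
gb_per_sim = 60            # GB produced per simulation (from the slide)
link_mb_s = 20             # quoted minimum link speed, MB/s

daily_volume_gb = sims_per_day * gb_per_sim            # total GB per day
avg_rate_mb_s = daily_volume_gb * 1000 / 86_400        # sustained average, MB/s

# Time to stream a single 60 GB result over the quoted 20 MB/s link:
stream_minutes = gb_per_sim * 1000 / link_mb_s / 60
```

So the deck's 20 MB/s requirement is driven by interactive streaming of a single result (about 50 minutes per 60 GB), well above the roughly 7 MB/s sustained average implied by the daily volume.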

Bypass Surgery: Simulated Treatment Planning

Interactive Treatment Planning
The vascular geometry can be modified using a library of models: draw interactively, or use computational geometry.

Sample Pulsatile Flow Simulation

Flood Simulation
Components: data sources; meteorological, hydraulic, and hydrological simulations; output visualization for users.

3D Visualization

Flood Simulation: Flow and Water Depth

Distributed Data Analysis in HEP
Objectives:
- Distributed data analysis
- Distributed data mining techniques with neural networks
Issues:
- Typical interactive requests will run on O(TB) of distributed data
- Transfer/replication time for the whole data set is about one hour
- Data is therefore transferred once, in advance of the interactive session
- Allocation, installation, and set-up of the corresponding database servers before the interactive session
- Integration of user-friendly interactive access (based on PROOF)
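The "O(TB) in about one hour" figure above implies a substantial aggregate replication bandwidth; a minimal sketch of the arithmetic (assuming exactly 1 TB, one hour, and decimal units):

```python
# Aggregate bandwidth implied by the slide: O(TB) of data replicated
# in about one hour. Assumptions: exactly 1 TB, 1 TB = 10**6 MB.
data_tb = 1
transfer_hours = 1

aggregate_mb_s = data_tb * 1_000_000 / (transfer_hours * 3600)  # MB/s
```

Around 280 MB/s sustained, which explains why the data is staged once, in advance, rather than pulled on demand during the interactive session.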

Parallel ROOT Facility: PROOF
Diagram: a local session ships a selection (parameters) and a procedure (Proc.C) to a remote PROOF facility, whose CPUs process distributed databases (DB 1 to DB 6) located via a tag database (TagDB) and a remote database (RDB).

Weather Forecast and Air Pollution Modeling
- Distributed/parallel codes on the Grid: Coupled Ocean/Atmosphere Mesoscale Prediction System; STEM-II air pollution code
- Integration of distributed databases
- Data mining applied to downscaling weather forecasts

Weather Forecast and Air Pollution Modeling

CrossGrid Toolbox

Migrating Desktop
- Idea: save and resume a user's Grid session; look and feel of a Windows desktop
- Implementation: Roaming Access Server and clients; Java Web Services (portability)
- Integrated tools: job submission wizard, job monitoring dialog, GridExplorer dialog, GridCommander dialog

User Login Dialog

Grid Commander Tool

Grid Explorer Tool

Job Submission Dialog

Job Monitoring / Visualisation Plugin

CrossGrid Testbeds
Example sites: Valencia GoG farm, Santander (GridWall), FZK.
Various instances: production, validation, and test.

Testbed Monitoring
The Mapcenter grid monitoring framework, developed by DataGrid, was adapted to CrossGrid.

Production Resource Broker Statistics (EDG 1.4)
Total users: 61
Submitted: 2903
Accepted: 2716
Matching: 2554
Sent by JSS: 2514
Jobs run: 2355
Jobs done: 2299
Graphics with RB statistics are available from Mapcenter.

Production RB Statistics (continued, EDG 1.4)
Most of the failures are related to authorization, matchmaking, and testbed sites.
Submitted: 2903 (187 not accepted)
Accepted: 2716 (162 matching failures)
Matching: 2554 (40 not submitted)
Sent by JSS: 2514 (159 did not run)
Jobs run: 2355 (56 did not reach the end)
Jobs done: 2299 (604 jobs failed in total)
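The per-stage failure counts on this slide follow directly from the cumulative Resource Broker counts; a small sketch reconstructing the funnel (stage names and counts are from the slide, the dictionary layout is illustrative):

```python
# Reconstruct the EDG 1.4 Resource Broker job funnel from the
# cumulative counts on the slide; losses at each transition are
# just differences of consecutive counts.
stages = [
    ("Submitted",   2903),
    ("Accepted",    2716),
    ("Matching",    2554),
    ("Sent by JSS", 2514),
    ("Jobs run",    2355),
    ("Jobs done",   2299),
]

losses = {f"{a} -> {b}": na - nb
          for (a, na), (b, nb) in zip(stages, stages[1:])}

total_failed = stages[0][1] - stages[-1][1]  # jobs lost end to end
```

The differences reproduce the slide's breakdown exactly (187, 162, 40, 159, 56) and sum to the quoted 604 failed jobs, i.e. roughly a 79% end-to-end success rate.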

The Future

1980s: Internet. 1990s: Web. 2000s: Grid.
Where do we need to get to?
- Applications to support an "e-society" ("Cyber-Infrastructure")
- A Grid infrastructure which hides the complexities from the users ("Invisible Computing")
- A powerful and flexible network infrastructure (GEANT 2)
Where do we need to invest?
- Applications targeted at realistic problems in "e-science"
- Prototypes of Grid infrastructures
- Maintaining and improving the GEANT network
EU FP6 proposal: EGEE
- Vision: create a European e-Infrastructure
- 70 partners, 31.5 MEuro
- Start: April 2004