Grid Computing at PSNC Jarosław Nabrzyski Poznań Supercomputing and Networking Center (PSNC) and Information Sciences Institute, Poznan University of Technology.

Talk Outline
- What is Grid Computing about?
- Background: Polish Information Infrastructure (PIONIER Program)
- Poznan Supercomputing and Networking Center
- PSNC's Grid research
- International collaboration: IST projects
- The future (6FP)

The Grid Problem
"Flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" (from "The Anatomy of the Grid: Enabling Scalable Virtual Organizations").
Enable communities ("virtual organizations") to share geographically distributed resources as they pursue common goals, assuming the absence of a central location, central control, omniscience, and pre-established trust relationships.

The Grid - Not a New Idea
- Late 70's: networked operating systems
- Late 80's: distributed operating systems
- Early 90's: heterogeneous computing
- Mid 90's: metacomputing (C.C. Catlett and L. Smarr) - transparent access to heterogeneous and geographically distributed computational resources, giving a user access to the required computing power
- Metacomputing -> then the Grid: 1999, C. Kesselman and I. Foster

How are Grids Different?
- Autonomy
- Heterogeneity
- Focus on the user
These three differences create many of the problems, but also make the system much more usable than its predecessors.

Elements of the Problem
- Resource sharing: computers, storage, sensors, networks, …; sharing is always conditional: issues of trust, policy, negotiation, payment, …
- Coordinated problem solving: beyond client-server; distributed data analysis, computation, collaboration, …
- Multi-institutional virtual organizations: community overlays on classic organizational structures; large or small, static or dynamic

PSNC
- Founded in 1993 by the State Committee for Scientific Research
- One of the 5 supercomputing centres in Poland
- 4 departments: APPS, HPC, NET, SERVICES
- Staff: 75 people

PSNC's Main Activities
- POZMAN metropolitan area network operator
- Operator of the Polish National Research Network POL-34/155 (now the PIONIER network)
- Management of high performance computing resources, network file servers, archive systems
- Development and deployment of new broadband network services
- Grid computing: resource management, Grid information services, Grid application development toolkits

PSNC's International Activities
- European Grid Forum (EGrid): 1st EGrid Workshop in Poznań (April 2000); EGrid Testbed - the European Grid Forum testbed (demo at SC'2000)
- Global Grid Forum (GGF): GGF Steering Group (J. Nabrzyski, AD); 1st Global Grid Forum (Amsterdam 2001) - Program Chair; 5th GGF to be held in Edinburgh, July 2002

EGrid Testbed
- Cactus worm: dynamic grid computing
- 12 sites
- Presented at SC2000

Cactus Worm
- Dynamic grid computing
- Remote steering and visualization
- Mobile user access
- 35 supercomputers at 25+ sites
- 10+ teraflops!

EC-funded Grid Project Space
- Applications: GRIDLAB, GRIA, EGSO, DATATAG, CROSSGRID, DATAGRID
- Middleware & tools: GRIP, EUROGRID, DAMIEN
- Underlying infrastructures spanning science and industry/business
- Links with European national efforts
- Links with US projects (GriPhyN, PPDG, iVDGL, …)
- Polish participants (PSNC, CYFRONET, ICM) in GRIDLAB, CROSSGRID and ENACTS
- ~35 M€ of EC funding for the above initiatives; the volume of EC projects in which Poland participates is 13 M€

GridLab - A Grid Application Toolkit and Testbed
- PSNC: Project Coordinator
- 11 institutions/vendors from Europe, 3 from the US
- Total budget: 5.086 M€
- 3-year project, started January 2002
- Contact: Jarek Nabrzyski, Project Manager

GridLab Participants
- Poznań Supercomputing and Networking Center (PSNC), Poland - Project Coordinator
- Albert Einstein Institute (AEI), Germany
- SZTAKI, Hungary
- Masaryk University, Czech Republic
- University of Lecce, Italy
- Konrad-Zuse-Zentrum (ZIB), Germany
- University of Wales, UK
- Vrije Universiteit, Netherlands
- Sun Microsystems Gridware GmbH, Germany
- Compaq EMEA, France
- University of Athens, Greece
- Argonne National Laboratory (Ian Foster's group)
- ISI (Carl Kesselman's group)
- University of Wisconsin (Miron Livny's group)
Several subcontracting sites provide additional testbed resources; many other partners will work unfunded (GGF Apps WG connections).

Grand Challenge Applications
- Black hole simulations
- Gravitational wave detection
- Other generic applications

GridLab Aims
- Get computational scientists using the "Grid" and Grid services for real, everyday, production work (AEI relativists, EU Network, gravitational wave data analysis, Cactus user community).
- Make it easier for applications to make flexible, efficient and robust use of the resources available to their virtual organizations.
- Dream up, prototype, and test new application scenarios which make adaptive, dynamic, wild, and futuristic uses of resources.

Solution
"Grid Application Toolkit" (GAT)
- Provides a layer between applications and emerging grid technologies.
- Provides an application-developer-oriented API, allowing the flexible use of different tools and services, and shielding applications from the still-evolving grid software underneath.
"GridLab Testbed/VO"
- A diverse, controllable environment for developing and testing applications and tools, with software maintained by people who know it.
(Layer diagram: End Users, GAT-API Developers, GAT, Tool Developers, Grid Infrastructure Developers.)
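
To make the layering idea concrete, below is a minimal, hypothetical Python sketch of such an abstraction layer: the application codes against one job-submission interface, while interchangeable adaptors map it onto whatever middleware happens to be available. The names (JobAdaptor, JobService, LocalForkAdaptor) are invented for illustration and are not the actual GAT API.

    # Hypothetical sketch of a GAT-style abstraction layer: the application
    # talks to one interface, adaptors hide the underlying grid technology.
    # All names are illustrative, not the real GAT API.
    from abc import ABC, abstractmethod
    import subprocess


    class JobAdaptor(ABC):
        """One adaptor per underlying grid technology."""

        @abstractmethod
        def available(self) -> bool:
            """Report whether this adaptor can be used on this resource."""

        @abstractmethod
        def submit(self, executable: str, args: list[str]) -> str:
            """Submit a job and return a job identifier."""


    class LocalForkAdaptor(JobAdaptor):
        """Fallback adaptor: run the job on the local machine."""

        def available(self) -> bool:
            return True

        def submit(self, executable: str, args: list[str]) -> str:
            proc = subprocess.Popen([executable, *args])
            return f"local:{proc.pid}"


    class JobService:
        """Application-facing API: picks the first usable adaptor."""

        def __init__(self, adaptors: list[JobAdaptor]):
            self.adaptors = adaptors

        def submit(self, executable: str, args: list[str]) -> str:
            for adaptor in self.adaptors:
                if adaptor.available():
                    return adaptor.submit(executable, args)
            raise RuntimeError("no usable adaptor found")


    # The application code stays the same no matter which middleware is deployed.
    service = JobService([LocalForkAdaptor()])
    job_id = service.submit("/bin/echo", ["hello from the grid"])
    print("submitted", job_id)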

What GridLab Isn't …
- We don't want to develop low-level Grid infrastructure.
- We don't want to repeat work which has already been done; we want to incorporate and assimilate it (Globus APIs, OGSA, ASC Portal (GridSphere/Orbiter), GPDK, GridPort, DataGrid, GriPhyN).

GridLab Architecture

Grid Scenario
- WashU: "We see something, but too weak. Please simulate to enhance the signal!"
- Potsdam / Thessaloniki: "OK!"
- Resource Estimator: "Need 5 TB and 2 TF. Where can I do this?"
- Resource Broker: "LANL is the best match…"
- Resource Broker: "NCSA + Garching OK, but need 10 Gbit/sec…"
(Map diagram with sites WashU, Potsdam, Thessaloniki, RZG, NCSA, LANL and Hong Kong, connected at 1 Tbit/sec.)
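
As an illustration of the matchmaking step in the scenario above, here is a small, hypothetical Python sketch of a broker that matches a resource request (storage, compute, network) against advertised resources. The data and names are invented for illustration; this is not the actual GridLab broker (GRMS).

    # Hypothetical resource-broker sketch: pick a site that satisfies the
    # request and has the most spare capacity. Real brokers also consider
    # policy, cost, queue times and data locality.
    from dataclasses import dataclass
    from typing import Optional


    @dataclass
    class Resource:
        name: str
        free_storage_tb: float
        peak_tflops: float
        network_gbps: float


    @dataclass
    class Request:
        storage_tb: float
        tflops: float
        min_network_gbps: float = 0.0


    def broker(request: Request, resources: list[Resource]) -> Optional[Resource]:
        """Return the best-matching resource, or None if nothing fits."""
        candidates = [
            r for r in resources
            if r.free_storage_tb >= request.storage_tb
            and r.peak_tflops >= request.tflops
            and r.network_gbps >= request.min_network_gbps
        ]
        if not candidates:
            return None
        return max(candidates,
                   key=lambda r: (r.peak_tflops - request.tflops,
                                  r.free_storage_tb - request.storage_tb))


    # Example mirroring the slide's "need 5 TB, 2 TF, 10 Gbit/sec" (capacities invented).
    sites = [
        Resource("LANL", free_storage_tb=8.0, peak_tflops=3.0, network_gbps=10.0),
        Resource("NCSA", free_storage_tb=6.0, peak_tflops=2.5, network_gbps=10.0),
        Resource("RZG", free_storage_tb=4.0, peak_tflops=1.5, network_gbps=2.5),
    ]
    best = broker(Request(storage_tb=5.0, tflops=2.0, min_network_gbps=10.0), sites)
    print("best match:", best.name if best else "none")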

Workpackages
- WP1: Grid Application Toolkit (AEI). A key component of GridLab: the link between Grid middleware and applications, usable by any conforming application or middleware component. It requires input from, and connects to, most other workpackages and components.
- WP2: Cactus Grid Application Toolkit (AEI). Provides an extended GAT interface for Cactus, a very general toolkit framework supporting different Grid applications, from astrophysics to chemical engineering. Cactus will be one of the primary application drivers for the GAT, and for the project generally.
- WP3: Work-flow Application Toolkit (CARDIFF). Will develop Grid capabilities for Triana, a widely used dataflow programming environment used in gravitational-wave and other data analysis areas (see the sketch below).
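
To give a flavour of the dataflow programming model that WP3 targets, here is a tiny, hypothetical Python sketch of processing units wired into a pipeline. It is not Triana code; the unit names and the pipeline helper are invented for illustration.

    # Hypothetical dataflow sketch in the spirit of a workflow toolkit:
    # small processing units are composed into a pipeline through which
    # a data stream flows. Not Triana code; purely illustrative.
    from typing import Callable, Iterable, Iterator

    Unit = Callable[[Iterable[float]], Iterator[float]]


    def source(n: int) -> Iterator[float]:
        """Produce a simple synthetic signal."""
        return (float(i % 5) for i in range(n))


    def smooth(window: int) -> Unit:
        """Unit: moving-average smoothing over a sliding window."""
        def run(data: Iterable[float]) -> Iterator[float]:
            buf: list[float] = []
            for x in data:
                buf.append(x)
                if len(buf) > window:
                    buf.pop(0)
                yield sum(buf) / len(buf)
        return run


    def threshold(level: float) -> Unit:
        """Unit: keep only samples above the given level."""
        def run(data: Iterable[float]) -> Iterator[float]:
            return (x for x in data if x > level)
        return run


    def pipeline(data: Iterable[float], units: list[Unit]) -> Iterable[float]:
        """Wire units in sequence and return the resulting lazy stream."""
        for unit in units:
            data = unit(data)
        return data


    # Compose: source -> smooth -> threshold, then consume the stream.
    events = list(pipeline(source(20), [smooth(3), threshold(2.0)]))
    print(len(events), "samples above the threshold")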

Workpackages (cont.)
- WP4: Grid Portals (AEI). Highly application driven, aimed at providing uniform, flexible and intuitive user access to Grid resources from anywhere, as well as administration tools for maintaining a Grid environment.
- WP5: Testbed Management (MU). Will administer and maintain an active development testbed across roughly a dozen EU sites (leveraging the work of the EGrid), deploying technologies as they are developed by the project. This workpackage will also coordinate with sites in the US-based NCSA Alliance and others to test and develop interoperability.
- WP6: Security (PSNC). Will develop the required security mechanisms and ensure the integration of all the technologies developed under other WPs, taking into account the various local security requirements and state-of-the-art solutions.

Workpackages (cont.)
- WP7: Adaptive Application Components (VU). Develops a set of components and APIs to be plugged into the toolkit, for example to take monitoring information and implement basic techniques for short-term forecasting and behavior adaptation/optimization (a minimal sketch follows this list).
- WP8: Data Handling and Visualization (ZIB). Will provide Grid-aware techniques for data management, analysis, and visualization, needed especially for applications that use multiple sites in a dynamic, time-dependent manner, leaving data unpredictably scattered across the Grid.
- WP9: Resource Management (PSNC). Will develop resource-need estimators, resource brokers, and other tools, for both Grid users and the applications themselves, to make intelligent decisions about which Grid resources should be used at any instant in the lifetime of a simulation.
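
The sketch below illustrates, again purely hypothetically, the kind of short-term forecasting an adaptive component (WP7) might perform: an exponentially weighted moving average over recent CPU-load samples, used to decide whether to ask the broker for a new resource. The weights, threshold and function names are invented for illustration.

    # Hypothetical adaptive-component sketch: forecast the next load value
    # from recent samples and suggest migration when the forecast is high.
    # Weights, thresholds and names are invented for illustration.
    def ewma_forecast(samples: list[float], alpha: float = 0.5) -> float:
        """One-step-ahead forecast via an exponentially weighted moving average."""
        forecast = samples[0]
        for value in samples[1:]:
            forecast = alpha * value + (1.0 - alpha) * forecast
        return forecast


    def should_migrate(load_history: list[float], threshold: float = 0.85) -> bool:
        """Suggest moving to another resource when forecast load exceeds the threshold."""
        return ewma_forecast(load_history) > threshold


    # Example: the load on the current host has been climbing steadily.
    history = [0.40, 0.55, 0.70, 0.82, 0.95, 0.97]
    print("forecast:", round(ewma_forecast(history), 2))
    print("migrate?", should_migrate(history))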

Workpackages (cont.)
- WP10: Information Services (ISUFI). Will extend existing Grid middleware toolkits with the dynamic features needed by applications to select appropriate Grid resources and to provide simulation information to collaborative user groups.
- WP11: Monitoring (SZTAKI). Will develop new components that fit into the general Grid monitoring architecture to support application steering, adaptive monitoring, and automatic analysis and prediction of performance data.
- WP12: Access for Mobile Users (PSNC, ZIB). Will develop and test Grid access and monitoring technologies on a variety of mobile devices.

Workpackages (cont.)
- WP13: Information Dissemination and Exploitation (PSNC). Will ensure the active dissemination of the project results through a variety of channels, including active participation in international organizations (e.g. GGF), co-development with other Grid projects in the USA and EU, participation in international conferences, training programs, introduction of GridLab technologies into various communities, and introduction into the commercial vendor world.
- WP14: Project Management (PSNC). Day-to-day scientific, financial and administrative management of the project, including careful orchestration and monitoring of work across groups, major project decisions, liaison with external projects and with the international advisory board, and reporting.

6FP, Grids and PSNC
- Europe needs the Distributed EU-wide Supercomputing Facility
- PSNC (and others) has already started to promote this idea; EoI in preparation
- Strong co-operation with colleagues from the UK
- Project proposal ( MEuro) in preparation

Computational Needs Not Met
- Great scientific and engineering talent in the EU
- Many EU programs now beginning, with a training mission, but: no EU-wide facilities, no collaborative infrastructure, no HPC training (except do-it-yourself), reliance on US connections
- What do others do? In many cases, they do toy problems; EU science and engineering fall behind
- Germany, the UK and France are slight exceptions, but nothing at large scale
- Many countries (mainly NAS, but not only) are simply cut off from modern science

Our Vision for the EU Facility
Science and engineering drivers:
- Expertise centers (CFD, astrophysics, computational biology, aerospace, etc.)
- Shepherding/building communities across Europe
- Present science/engineering calculations require TB / Tflop/sec scales, but are just scratching the surface; much more is needed
- Enabling new science
Computational science drivers:
- Need Grid software infrastructure
- Cycles, networks, visualization centers
- Partnerships with existing centers, networks, etc.

Vision of the Facility
Distributed nature:
- Pushes networking technologies through apps
- Works to integrate European efforts: scientific, cultural, etc.
- East and West at 100 Gb/sec! Multi-lambda backplane (GEANT?)
Unique EU facility in the HPC world:
- Leapfrog TeraGrid with a big machine, but also draw on unique, more diverse EU research strengths
Beginning of a worldwide Grid effort:
- Unique challenges in integrating diverse EU grids
- Already strong support from US, NSF and DOE principals
- Scientists/engineers know they will have an expanding facility over the coming decades (which can affect their projects)

Our Vision
(Map of the facility: major centers and satellite sites across Europe.)

More info / summary