Conference xxx - August 2003
Anders Ynnerman, Director, Swedish National Infrastructure for Computing
Griding the Nordic Supercomputer Infrastructure

Presentation transcript:

Griding the Nordic Supercomputer Infrastructure: The importance of national and regional Grids
Anders Ynnerman, Director, Swedish National Infrastructure for Computing
Research Infrastructures – e-Infrastructure: Grid initiatives, 27 May 2005

INFO Event, May 26th

Outline of presentation
- The Nordic HPC landscape
- SweGrid testbed for production
- Benefits of a national Grid effort
- NorduGrid - ARC
- Nordic DataGrid Facility

DENMARK

HPC infrastructure:
- HPC program initiated 2001: Danish Center for Scientific Computing (DCSC), a distributed program
- Location of facilities based on scientific evaluation of the participating groups, not on computer centers or infrastructure
- Sites: Århus, Odense, DTU, Copenhagen University
- Budget: 2.1 MEuros (2001), 1.6 MEuros (2002), 1.5 MEuros (2003), 1.4 MEuros (2004)

Grid initiatives:
- Copenhagen University is a partner in NorduGrid
- Danish Center for Grid Computing (DCGC): a virtual center located at 5 sites, with its main location at the Niels Bohr Institute, Copenhagen University; 0.75 MEuros/year
- Operations and support of a Danish Grid
- Research agenda with 10 senior researchers and 5 PhD students
- Strong ties to NorduGrid and the Nordic DataGrid Facility
- Participation in EGEE through NBI

NORWAY

HPC infrastructure:
- NOTUR – The Norwegian High-Performance Computational Infrastructure
- Sites: Trondheim, Oslo, Bergen, Tromsö
- Budget: 168 MEuros; 50% from research councils, Statoil and DNMI; in-kind contributions
- A similar new program is being initiated and placed under the NREN (UNINETT)
- Metacenter goal: to establish uniform, and as seamless as possible, access to HPC resources

Grid initiatives:
- Parallab is contributing to the EGEE security team
- Bergen and Oslo are partners in NorduGrid
- The NORGRID project has been initiated under the umbrella of NOTUR

FINLAND

HPC infrastructure:
- Center for Scientific Computing (CSC) is the (main) center in Finland and the largest HPC center in Scandinavia
- Runs services for academic users all over Finland and for FMI
- Budget comes directly from the Ministry of Education
- Smaller university HPC systems (clusters) are beginning to appear

Grid initiatives:
- CSC is contributing resources to NorduGrid
- CSC is a partner in DEISA
- Helsinki Institute of Physics is running several Grid projects and is contributing to the EGEE security team
- A Finnish Grid dedicated to Condensed Matter Physics: an example of topical Grids

SWEDEN

HPC infrastructure:
- Swedish National Infrastructure for Computing (SNIC); participating centers: Umeå, Uppsala, Stockholm, Linköping, Göteborg, Lund
- National resource allocation (SNAC)
- Budget: 4.5 MEuros/year (government), 4.0 MEuros/year from private foundations

Grid initiatives:
- Lund and Uppsala are partners in NorduGrid
- Large resource contributions to NorduGrid are SNIC clusters at Umeå and Linköping
- SweGrid production Grid: 600 CPUs over 6 sites, 120 TB storage
- SNIC is hosting the EGEE Nordic Regional Operations Center
- Stockholm (PDC) is coordinating the EGEE security activity
- PDC is a partner in the European Grid Support Center

Grids in the context of HPC centers

[Chart: a spectrum of systems suited to throughput computing on Grids, ordered from loosely coupled workstations, through clusters with Ethernet, clusters with high-speed interconnect, and large shared-memory systems, to parallel vector processors; price/performance scales roughly with 1 / number of users.]

SweGrid production testbed

- The first step towards HPC center Gridification
- Initiative from all HPC centers in Sweden, IT researchers wanting to study Grid technology, and users in life science, earth sciences, space & astrophysics, and high-energy physics
- PC clusters with large storage capacity, built for Grid production
- Participation in international collaborations: LCG, EGEE, NorduGrid, …

SweGrid subprojects

- Grid research (0.25 MEuro/year): portals, databases, security; Globus Alliance; EGEE security
- Technical deployment and implementation (0.25 MEuro/year): 6 technicians, forming the core team for the Northern EGEE ROC
- Hardware (2.5 MEuro): 6 PC clusters, 600 CPUs for throughput computing, the SweGrid test bed

SweGrid production test bed

Total budget 3.6 MEuro; 6 Grid nodes, 600 CPUs:
- IA-32, 1 processor/server; 875P chipset with 800 MHz FSB and dual memory busses
- 2.8 GHz Intel P4, 2 GByte memory, Gigabit Ethernet
- 12 TByte temporary storage: FibreChannel for bandwidth, 14 x 146 GByte drives
- 200 TByte nearline storage: 140 TByte disk, 270 TByte tape
- 1 Gigabit direct connection to SUNET (10 Gbps)

SUNET connectivity

[Diagram: the GigaSunet 10 Gbit/s backbone, with a typical university POP attached at 2.5 Gbit/s; SweGrid has a dedicated 1 Gbps connection alongside the 10 Gbit/s university LAN.]
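As a back-of-envelope check on that dedicated link (a rough sketch only: it ignores protocol overhead and assumes decimal units), moving the testbed's 12 TByte of temporary storage over a single 1 Gbps connection takes more than a full day:

```python
# Rough estimate: time to move SweGrid's 12 TByte temporary storage
# over the dedicated 1 Gbps link, at full line rate with no overhead.
link_gbps = 1.0     # dedicated SweGrid link (Gbit/s)
data_tbyte = 12.0   # temporary storage, per the hardware slide

bits = data_tbyte * 1e12 * 8          # TByte -> bits (decimal units)
seconds = bits / (link_gbps * 1e9)    # ideal transfer time
print(f"{seconds / 3600:.1f} hours")  # ~26.7 hours
```

Even this idealized figure makes the later observation concrete: staging data in and out of the Grid nodes, rather than fetching everything home, quickly becomes the only workable option.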

Persistent storage on SweGrid?

Open questions: size, administration, bandwidth, availability.

SweGrid status

- All nodes installed during January 2004; extensive use of the resources already
- Local batch queues; Grid queues through the NorduGrid middleware, ARC
- Some nodes also available on LCG
- 60 national users
- 1/3 of SweGrid (200 CPUs) is dedicated to HEP:
  - Contributing to ATLAS Data Challenge 2 as a partner in NorduGrid
  - Also supporting LCG (gLite)
  - Investigating compatibility between ARC and LCG
- Forms the core of the Northern EGEE ROC
- Accounting is being introduced (SGAS)

The first users of SweGrid

How have you found porting your applications to SweGrid to be?

What is your overall impression of use of SweGrid resources?

Do you think all supercomputing resources should be available on a Grid?

SweGrid observations

- Global user identity: each SweGrid user must receive a unique x509 certificate. All centers must agree on a common lowest level of security; this will affect general security policy for HPC centers.
- Unified support organization: all helpdesk activities and other support need to be coordinated between the centers. Users cannot (and should not) decide where their jobs will run, and they expect the same level of service at all sites.
- More bandwidth is needed: to move data between the nodes in SweGrid before and after job execution, continuously increasing bandwidth will be required.
- More storage is needed: even with increasing bandwidth, users cannot fetch all data back home. Storage for both temporary and permanent data will be needed in close proximity to processor capacity.
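As an illustration of what such a global identity looks like, the sketch below creates and inspects a certificate with OpenSSL. The distinguished name is hypothetical; a real SweGrid user would request a certificate from an accredited certification authority rather than self-sign one.

```shell
# Create a throwaway self-signed certificate and key (illustration only;
# real Grid certificates are issued by a trusted CA, never self-signed).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout userkey.pem -out usercert.pem \
  -subj "/O=Grid/O=NorduGrid/OU=example.se/CN=Test User"

# Print the subject DN: the globally unique identity that each
# participating center maps to a local account.
openssl x509 -noout -subject -in usercert.pem
```

The point of the common security level is precisely that every center trusts the same set of issuing authorities, so one DN works everywhere.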

SweGrid results

- Unified the Swedish supercomputing community
- Raised additional private funding for computing infrastructure
- Spawned national Grid R&D projects
- Created early user Grid awareness
- Created an interface to EU projects:
  - EGEE: regional operations center co-operated with SweGrid; co-ordination of EGEE security
  - DEISA participation through PDC at KTH
  - Co-ordination of the Baltic Grid initiative

National projects and services

- Grid resource brokering (Umeå)
- Grid accounting: SweGrid Accounting System (SGAS) (Stockholm, Uppsala, Umeå)
- Semi-automatic Grid interface generation for numerical software libraries (Umeå)
- Grid security research, focused on trust models (Stockholm, Umeå and Uppsala)
- Parallel Object Query System for Expensive Computations (POQSEC) on Grids (Uppsala)
- National helpdesk (all SweGrid centers)
- National storage solutions (Linköping, Stockholm, Umeå)

North European EGEE ROC (SA1)

SweGrid forms the core with three sites:
- HPC2N: 100 CPU cluster; LCG 2.4, SL3 CE, Debian WN; SE: 10 TByte; worker nodes are shared between EGEE and SweGrid (ARC)
- PDC: migration of the SweGrid 100 CPU system to EGEE; eventually a new 884 CPU Xeon EM64T cluster
- NSC: 32 CPU cluster on the pre-production testbed (SL3, LCG 2.3; upgrade the week after the 3rd EGEE Conference)

SGAS (Grid bank): a module for exporting accounting information from SGAS to GOC accounting has been developed; tests and implementation late April / early May (specifically for accounting of ATLAS VO jobs).

Preparation for RCs in Finland and Norway has started.

EGEE security effort (JRA3)

- 12 FTE from the Northern Federation: KTH (lead), FOM, UvA, UiB and UH-HIP
- The existence of a national Grid and high security competence at PDC was key in accepting the co-ordination role
- Responsible for the overall EGEE security architecture; partner in the middleware design
- Responsible for EGEE security software development, in close collaboration with middleware development

BalticGrid

Extending the European eInfrastructure to the Baltic region:
- Partners: 10 partners from five countries in the Baltic region and Switzerland (CERN)
- Budget: 4.4 M€ over 30 months
- Coordinator: KTH PDC, Stockholm
- Resources: 17 resource centres from the start
- The major part of the work is to be performed by the Estonian, Latvian and Lithuanian partners

SweGrid future

- Long-term vision: all HPC resources (computers & storage) in Sweden available through SweGrid
- Accounting is, or will soon be, available (SGAS)
- Clusters will be connected to SweGrid: new loosely coupled clusters for throughput computing, plus existing and new clusters with high-speed interconnect
- SweGrid should grow 10x to provide a national throughput Grid that can be a resource in international projects
- Provide a platform for Swedish participation in new EU projects

The NorduGrid project

- Started in January 2001, funded by NorduNet-2
- Initial goal: to deploy DataGrid middleware to run the ATLAS Data Challenge
- NorduGrid essentials:
  - Built on GT-2
  - Replaces some Globus core services and introduces some new ones: Grid Manager, GridFTP server, user interface & broker, information model, monitoring
  - Middleware named ARC
- Track record:
  - Used in the ATLAS DC tests in May 2002
  - Contributed 30% of the total resources to ATLAS DC II
- Continuation:
  - Could be included in the framework of the Nordic Data Grid Facility
  - Co-operation with EGEE/LCG
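The ARC user interface accepts job descriptions in xRSL, an extended form of the Globus RSL language. A minimal sketch of such a description follows; the file names, script and attribute values are hypothetical, and the exact set of supported attributes varies between ARC versions:

```
&(executable="simulate.sh")
 (arguments="run01")
 (jobName="swegrid-demo")
 (stdout="stdout.txt")
 (stderr="stderr.txt")
 (cpuTime="120")
 (inputFiles=("simulate.sh" ""))
 (outputFiles=("result.dat" ""))
```

A description like this would typically be submitted with the classic ARC client (e.g. `ngsub -f job.xrsl`) and monitored with `ngstat`; the broker in the user interface consults the information system to pick a suitable cluster, which is exactly the replacement of Globus core services the slide refers to.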

Resources running ARC

Currently available resources:
- 10 countries, 40+ sites, ~4000 CPUs, ~30 TB storage
- 4 dedicated test clusters (3-4 CPUs)
- SweGrid
- A few university production-class facilities (20 to 60 CPUs)
- Three world-class clusters in Sweden and Denmark, listed in the Top500
- Other resources come and go:
  - Canada, Japan: test set-ups
  - CERN, Russia: clients
  - Australia, Estonia
  - Anybody can join or leave

People: the core team has grown to 7 persons; local sysadmins are called up when users need an upgrade.


Reflections on NorduGrid

- A bottom-up project driven by an application-motivated group of talented people
- Middleware adaptation and development has followed a flexible and minimally invasive approach
- Nordic HPC centers have "connected" large general-purpose resources because it is good PR for the centers
  - As soon as NorduGrid usage of these resources increases, they will be disconnected; there is no such thing as free cycles!
- Motivation of resource allocations is missing: NorduGrid lacks an approved procedure for allocating resources to VOs and individual user groups based on scientific review of proposals
  - This is a major obstacle to the utilization of resources on Grids in general!

Challenges: current HPC setup

[Diagram: in each of four countries, a national funding agency funds its own HPC center, which serves only its own national users; there are no links between the countries.]

Grid management

[Diagram: a Grid management layer sits between the national funding agencies with their HPC centers and the users organized in VOs (VO 1-3), handling MoUs, SLAs, proposals, authorization, middleware, accounting, and interfaces to other Grids.]

eIRG

Policy framework:
- Authentication, authorization, accounting
- Resource allocations, SLAs, MoUs
- Grid economies

Strong Nordic interest and representation in eIRG; collaboration between the Nordic national Grids needs eIRG policy.

Nordic Data Grid Facility - Vision

"To establish and operate a Nordic computing infrastructure providing seamless access to computers, storage and scientific instruments for researchers across the Nordic countries." (Taken from the proposal to NOS-N.)

Nordic DataGrid Facility - Mission

- Operate a Nordic production Grid, building on the national production Grids
- Operate a core facility focusing on Nordic storage resources for collaborative projects
- Develop and enact the policy framework needed to create the Nordic research arena for computational science
- Co-ordinate and host Nordic-level development projects in high performance and Grid computing
- Create a forum for high performance computing and Grid users in the Nordic countries
- Be the interface to international large-scale projects for the Nordic high performance computing and Grid community

Conclusions

- The Nordic countries are early adopters of Grid technology
- There are several national Grids, and these Grids have enabled participation in EU infrastructure projects
- Nordic representation in GGF, eIRG, etc.
- Several advanced partners for future Grid-related EU projects are available in the Nordic region
- A Nordic Grid is currently being planned: a Nordic policy framework and a Nordic interface to other projects
