SNIC 2006, - 1 Swedish National Infrastructure for Computing SNIC & Grids Anders Ynnerman

SNIC 2006, - 2 GRID-Vision
– Hardware, networks and middleware are used to put together a virtual computer resource
– Users should not have to know where computation is taking place or where data is stored
– Users will work together across disciplinary and geographical borders and form virtual organizations

SNIC 2006, - 3 Flat GRID
[Diagram: a single Grid layer connecting many users directly to many resources]

SNIC 2006, - 4 Hierarchical GRID
[Diagram: a Grid management layer above regional centers, each regional center serving users and several local resources]

SNIC 2006, - 5 Collaborative GRID
[Diagram: users and resources contributing to and drawing from a shared Grid]

SNIC 2006, - 6 Power plant GRID
[Diagram: users served by an HPC center through the Grid, in a utility or "power plant" model]

SNIC 2006, - 7 Some important Grid "projects"
– Globus: middleware project, provides the foundation for many other projects
– GGF (Global Grid Forum): worldwide meetings and standardization efforts
– LCG (Large Hadron Collider Computing Grid): CERN's Grid project for LHC data analysis
– NorduGrid/ARC (Advanced Resource Connector): middleware driving SweGrid
– NDGF (Nordic Data Grid Facility): Nordic organisation for national Grids, Tier-1 facility
– EGEE (Enabling Grids for E-science in Europe): EU-funded, CERN-driven project involving 74 partners
– BalticGrid: EGEE outreach project to the Baltic states, coordinated by KTH
– DEISA: EU-funded project connecting "large" HPC centers in Europe
– eIRG: advisory body to the EU on e-Infrastructures
– ESFRI expert panel on HPC: European advisory panel on HPC-related issues

SNIC 2006, - 8 Computer trends
[Chart: system classes from loosely coupled workstations, clusters with Ethernet, clusters with high-speed interconnect, large shared-memory systems and parallel vector processors to Grids, plotted by price/performance against number of users]

SNIC 2006, - 9 SweGrid production testbed
The first step towards HPC center Gridification
Initiative from:
– all HPC centers in Sweden
– IT researchers wanting to do research on Grid technology
– users: Life Science, Earth Sciences, Space & Astro Physics, High Energy Physics
PC clusters with large storage capacity, built for Grid production
Participation in international collaborations: LCG, EGEE, NorduGrid, …

SNIC 2006, - 10 SweGrid production testbed
Total budget 3.6 MEuro
6 Grid nodes, 600 CPUs
– IA-32, 1 processor per server
– 875P chipset with 800 MHz FSB and dual memory buses
– 2.8 GHz Intel P4
– 2 GByte memory
– Gigabit Ethernet
12 TByte temporary storage
– FibreChannel for bandwidth
– 14 x 146 GByte drives
370 TByte nearline storage
– 120 TByte disk
– 250 TByte tape
1 Gigabit direct connection to SUNET (10 Gbps backbone)
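
As a rough sanity check on these figures, the per-node share can be worked out as below. This is a minimal sketch assuming the CPUs and temporary storage are split evenly over the 6 nodes, which is an assumption for illustration rather than the actual SweGrid node layout.

    # Back-of-the-envelope per-node figures, assuming an even split over the
    # 6 SweGrid nodes (an assumption; the real nodes need not be identical).
    nodes = 6
    total_cpus = 600
    temp_storage_tbyte = 12      # FibreChannel temporary storage
    nearline_tbyte = 370         # disk + tape nearline storage

    print(f"CPUs per node:         {total_cpus // nodes}")                           # 100
    print(f"Temp storage per node: {temp_storage_tbyte / nodes:.0f} TByte")          # 2
    print(f"Nearline per CPU:      {nearline_tbyte * 1000 / total_cpus:.0f} GByte")  # ~617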

SNIC 2006, - 11 SUNET connectivity
[Diagram: a typical POP at a university on GigaSunet, showing the 10 Gbit/s backbone, a 2.5 Gbit/s access link, a dedicated 1 Gbps SweGrid connection and a 10 Gbit/s university LAN]

SNIC 2006, - 12 Persistent storage on SweGrid
[Diagram: three storage levels (1-3) compared by size, administration, bandwidth and availability]

SNIC 2006, - 13 SweGrid Observations
Global user identity
– AA services that scale must be implemented
– All centers must agree on a common lowest level of security; this will affect general security policy for HPC centers
Unified support organization
– All helpdesk activities and other support need to be coordinated between the centers; users cannot (and should not) decide where their jobs will run, yet expect the same level of service at all sites
More bandwidth is needed
– Moving data between the SweGrid nodes before and after job execution will require continuously increasing bandwidth
More storage is needed
– Despite increasing bandwidth, users cannot fetch all data back home; storage for both temporary and permanent data will be needed in close proximity to processor capacity
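
To make the bandwidth observation concrete, the toy calculation below (not from the slides; dataset sizes and the link efficiency are assumed) estimates how long staging data to or from a node takes over the 1 Gbit/s SweGrid links compared with a 10 Gbit/s connection.

    # Illustrative staging times; the dataset sizes and the 70% link
    # efficiency are assumptions for the example, not SweGrid measurements.
    def transfer_hours(dataset_tbyte, link_gbps, efficiency=0.7):
        """Hours needed to move dataset_tbyte TByte over a link_gbps link."""
        bits = dataset_tbyte * 1e12 * 8
        return bits / (link_gbps * 1e9 * efficiency) / 3600

    for tbyte in (1, 10):
        print(f"{tbyte:2d} TByte over  1 Gbit/s: {transfer_hours(tbyte, 1):5.1f} h")
        print(f"{tbyte:2d} TByte over 10 Gbit/s: {transfer_hours(tbyte, 10):5.1f} h")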

SNIC 2006, - 14 SweGrid status
All nodes installed during January 2004
Extensive use of the resources already
– local batch queues
– Grid queues through the NorduGrid middleware (ARC)
– 60 users
1/3 of SweGrid is dedicated to HEP (200 CPUs)
Contributed to ATLAS Data Challenge 2
– as a partner in NorduGrid
Consistently a large contributor to LCG
– compatibility between ARC and gLite
Forms the core of the Northern EGEE ROC
Accounting is now in place

SNIC 2006, - 15 SweGrid II
New proposal under development
10x capacity
– CPU
– storage
– technical specification being developed
Point-to-point connections
Application will be submitted in January
Installation during 2007
Application-specific portals
Improved user support
Interface to international projects
– NDGF/NorduGrid
– EGEE
Special agreements for "large users"

SNIC 2006, - 16 The NorduGrid project
Started in January 2001, funded by NorduNet-2
– initial goal: to deploy DataGrid middleware to run the ATLAS Data Challenge
NorduGrid essentials
– built on the Globus Toolkit (GT)
– replaces some Globus core services and introduces some new ones
– Grid Manager, GridFTP, user interface & broker, information model, monitoring
– the middleware is named ARC
Track record
– contributed 30% of the total resources to ATLAS DC II
– enabling Nordic participation in LCG Service Challenges
Continuation
– provides middleware for the Nordic Data Grid Facility
– co-operation and interoperability with EGEE/LCG
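
For readers unfamiliar with ARC, a job is normally described in xRSL and handed to the NorduGrid/ARC client tools, which query the information system and broker the job to a matching cluster. The sketch below is illustrative only: the script and file names are invented, and the exact set of xRSL attributes and client commands depends on the ARC release.

    # Illustrative sketch: write a minimal xRSL job description for ARC.
    # Attribute names follow common xRSL usage but should be checked against
    # the documentation of the ARC version actually deployed.
    xrsl = (
        '&(executable="run_analysis.sh")'                     # hypothetical user script
        '(arguments="input.dat")'
        '(inputFiles=("run_analysis.sh" "") ("input.dat" ""))'
        '(stdout="stdout.txt")'
        '(stderr="stderr.txt")'
        '(jobName="swegrid-demo")'
    )

    with open("job.xrsl", "w") as f:
        f.write(xrsl)

    # The file would then be submitted with the ARC command-line client
    # (the classic NorduGrid "ngsub" tool), which brokers it to a suitable
    # cluster and returns a job identifier for later monitoring.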

SNIC 2006, - 17 Resources running ARC
Currently available resources:
– 10 countries, 40+ sites, ~4000 CPUs, ~30 TB storage
– 4 dedicated test clusters (3-4 CPUs)
– SweGrid
– a few university production-class facilities (20 to 60 CPUs)
– three world-class clusters in Sweden and Denmark, listed in the Top500
Other resources come and go
– Canada, Japan: test set-ups
– CERN, Russia: clients
– Australia, Estonia
– anybody can join or leave
People:
– the "core" team has grown to 7 persons
– local sysadmins are called upon when users need an upgrade

SNIC 2006, - 18

SNIC 2006, - 19 One Economic Model (Buyya)
[Diagram: a user's Grid broker exchanges resource requests and allocations with resource providers (CPU, storage, network) on the basis of published prices; payment in money/tokens is settled through a Grid bank in return for services]
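
As a hedged illustration of the core idea in such a model (this sketch is not from the presentation; all provider names, prices and limits are invented), a broker can simply pick the cheapest published offer that still meets the user's budget and deadline:

    from dataclasses import dataclass

    @dataclass
    class Provider:
        """A resource provider with a published price and an estimated run time."""
        name: str
        price_per_cpu_hour: float   # published price, in arbitrary currency/tokens
        est_hours: float            # estimated time to finish the job

    def broker(providers, cpu_count, budget, deadline_hours):
        """Pick the cheapest provider that fits both the budget and the deadline."""
        cost = lambda p: p.price_per_cpu_hour * cpu_count * p.est_hours
        affordable = [p for p in providers
                      if p.est_hours <= deadline_hours and cost(p) <= budget]
        return min(affordable, key=cost, default=None)

    # Hypothetical published prices from three providers.
    offers = [
        Provider("hpc-center-a", 0.10, 12.0),
        Provider("hpc-center-b", 0.05, 30.0),
        Provider("hpc-center-c", 0.20, 6.0),
    ]
    choice = broker(offers, cpu_count=64, budget=100.0, deadline_hours=24.0)
    print(choice.name if choice else "no provider fits budget and deadline")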

SNIC 2006, - 20 Nordic Data Grid Facility - Vision
"To establish and operate a Nordic computing infrastructure providing seamless access to computers, storage and scientific instruments for researchers across the Nordic countries."
(Taken from the proposal to NOS-N)

SNIC 2006, - 21 NDGF - Mission
– Operate a Nordic production Grid, building on national production Grids
– Operate a core facility focusing on Nordic storage resources for collaborative projects
– Develop and enact the policy framework needed to create the Nordic research arena for computational science
– Co-ordinate and host Nordic-level development projects in high performance and Grid computing
– Create a forum for high performance computing and Grid users in the Nordic countries
– Be the interface to international large-scale projects for the Nordic high performance computing and Grid community

SNIC 2006, - 22 NDGF - Status
– Approved by NOS-N
– Placed under NORDUnet A/S
– Steering committee appointed
– High Energy Physics Advisory Committee
– Interface to NorduGrid/ARC being defined
– Still mostly a paper construction
– Some centers are already operating as a distributed Tier-1 center

SNIC 2006, - 23 The Swedish HPC landscape
– Forms the basis of the SNIC strategy
– Describes trends: science, services, hardware
– Analyzes needs
– Service-oriented landscape painted
– Roadmaps for the landscape specified

SNIC 2006, - 24 Increased Productivity
Develop
– state-of-the-art integrated development environments
– high-quality user support and training
Compute
– fast and easy access to a multitude of heterogeneous computers in a homogeneous way
Store
– temporary (fast), project (available), long term (reliable)
Transport
– fast and seamless access to data from several locations
Analyze
– visualization locally or remotely

SNIC 2006, - 25 Network Landscape
2006
– 10 Gbit/s connections: investigate how the HPC centres can integrate 10 Gbit/s
– Lambda networks: investigate how point-to-point connections can be used
– SweGrid II networks: include costs of 10 Gbit/s and lambda network connections in the SweGrid II proposal
– OptoSunet: connect to OptoSunet with 10 Gbit/s; test and demonstrate the established connections
– Full OptoSunet connectivity: connect the remaining HPC centres
2008
– Point-to-point connections: test and demonstrate usage of point-to-point connections
– Dynamic point-to-point connections: establish operational procedures together with SUNET for establishing, maintaining and removing point-to-point connections

SNIC 2006, - 26 Visualization Landscape
Paradigm shift is under way
– visualize locally or remotely
– remotely for large (untransportable) data
– locally for smaller data
Processors, storage and rendering closely coupled
Distribution of rendered images or graphics primitives to clients over networks
Visualization services provided by SNIC centers
Gradual build-up and evaluation of concepts

SNIC 2006, - 27 Remote Rendering
[Pipeline diagram: Capture -> Store -> Render -> Display; bandwidth on the capture/store/render side today ~Gbit/s, tomorrow ~Tbit/s; bandwidth towards the display today ~100 Mbit/s, tomorrow ~Gbit/s]
Visualization has become a data reduction pipeline
Today: download the data; tomorrow: ?
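
A small illustrative calculation (numbers assumed, not from the slides) of the data-reduction point: downloading a raw dataset over today's ~100 Mbit/s display-side link takes hours, while streaming rendered frames needs on the order of a Gbit/s uncompressed, matching the "tomorrow ~Gbit/s" figure for the display link, and considerably less with compression.

    # Illustrative only: raw-data download versus streaming rendered frames.
    dataset_gbyte = 500          # hypothetical raw dataset size
    display_link_mbps = 100      # "today" bandwidth towards the display

    download_hours = dataset_gbyte * 8000 / display_link_mbps / 3600
    print(f"Download {dataset_gbyte} GByte at {display_link_mbps} Mbit/s: {download_hours:.1f} h")

    # Streaming uncompressed 1280x1024 frames, 24-bit colour, 25 frames/s.
    frame_bits = 1280 * 1024 * 24
    stream_mbps = frame_bits * 25 / 1e6
    print(f"Uncompressed 1280x1024 @ 25 fps: {stream_mbps:.0f} Mbit/s")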

SNIC 2006, - 28 NVIS remote rendering project
Evaluate remote rendering solutions
Joint project with IBM and SGI
– IBM Deep Computing Visualization: server in Malmö, client in Linköping
– SGI visual serving: server in Norrköping, client in Linköping
– pilot applications in medical visualization, …
– project report Q4 2006

SNIC 2006, - 29