Nordic Data Grid Facility NDGF – www.ndgf.org
Paula Eerola, paula.eerola [at] hep.lu.se
1st Iberian Grid Infrastructure Conference


Nordic Data Grid Facility NDGF – www.ndgf.org
Paula Eerola, paula.eerola [at] hep.lu.se
1st Iberian Grid Infrastructure Conference, Santiago de Compostela, May 14-16, 2007
What – Why – How

Nordic Countries
– Common history and culture
– Similar societies
– Long and productive tradition of co-operation (e.g. free mobility long before the EU)

Vision of the NDGF
– The vision of the Nordic Data Grid Facility (NDGF) is to establish and operate a Nordic computing infrastructure providing researchers in the Nordic countries with seamless access to computers, storage and scientific instruments.
– NDGF coordinates activities – NDGF does not own resources or middleware.
– NDGF is the interface to e-Science in the Nordic countries:
  – Single Point of Entry for collaboration, middleware development, deployment, and e-Science projects
  – Represents the Nordic Grid community internationally

History of Nordic Grid Collaboration
– Collaboration initiated (initial partners: the universities of Copenhagen (DK), Lund and Uppsala (SE), Oslo (NO), and Helsinki (FI))
  – "Nordic Testbed for Wide Area Computing and Data Handling" → NorduGrid
  – Funded by a Nordunet2 R&D grant
  – It soon became evident that the middleware prototypes existing at the time were not suitable or mature enough → development of an own middleware, ARC (Advanced Resource Connector), was necessary for getting the testbed operational
  – Testbed in operation since August 2002
  – Successful participation in particle physics (LHC-ATLAS experiment) Data Challenges already in 2004
  – Formation of the NorduGrid middleware consortium (2005)
– NDGF Pilot Project (joint Nordic funding): a pilot production facility based on national resources
– NDGF Production Phase

NDGF basic facts
– NDGF is in its production phase
– Co-owned by DK, FI, NO and SE
– Hosted by NORDUnet A/S (the Nordic IP backbone network, fully owned by the Nordic Research Councils)
– Funding (2 MEUR/year) shared between the four Nordic countries, provided by their National Research Councils
– Status: organization in place, integration of the national resources ongoing

NDGF Organization
[Organization chart: NDGF Board of Directors; NDGF subcommittees (CERN Committee, …); CEO and CTO; key technical and project coordination functions: CERN Coordinator, Technical Coordinator, Software Coordinator, National Node Coordinators, System Integrator, Software Developers]
– Currently 13+2 persons
– Top management structure, administration and office shared with NORDUnet

NDGF Technical Platform
– NDGF is the Nordic Production Grid, giving access to national resources
– National resources are purchased and owned nationally
  – NDGF creates a single Grid from existing resources (supercomputers, clusters, storage) in the participating countries
  – Provides grid project storage
– Critical issues:
  – Planning of available resources is more complicated than with a single-site centre
  – Funding of new resources (different national policies, funding sources, time scales); coordination of funding and purchases
  – Integration of existing resources; user policies

National resources (for academic use)
– SWEDEN: major resources at 6 national computer centres under a virtual metacentre, SNIC. Swegrid is the Swedish national Grid.
– NORWAY: major resources coordinated by the UNINETT Sigma metacentre. NORGRID is responsible for establishing and maintaining the national grid infrastructure. The eVITA e-Science programme was recently established.
– FINLAND: major resources at CSC, the Center for Scientific Computing. The Material Sciences National Grid Infrastructure (M-grid) is a joint project between CSC and universities.
– DENMARK: major resources coordinated by the Danish Center for Grid Computing (DCGC).

NDGF tasks – what does it mean to "coordinate"?
– Facilitate, coordinate and host major e-Science projects (e.g. the Nordic LHC Tier-1 for particle physics research)
– Operate a Nordic storage facility for major projects
– Create a common policy framework for the Nordic Production Grid
– Facilitate joint Nordic planning and collaboration
– Contribute to the development of the ARC middleware
– Other potential future activities include e.g.:
  – Supporting Nordic HPC/Grid conference activities
  – Promotion and outreach of computational sciences in various ways

Supporting Nordic e-Science
– Provides a single point of contact for e-Science initiatives
  – A place to go for researchers initiating projects
  – Supports all sciences in need of compute resources
– Services for user groups and applications
  – Portals for user groups
  – Gridifying major applications
– Hosts and coordinates major e-Science projects
  – Coordinates technical implementation
  – Coordinates collaboration
– NDGF represents the Nordic Grid community internationally
  – Allows the Nordic Grid community to speak with one voice
  – Creates visibility for Nordic technology and solutions
  – Creates a single Nordic representation in major e-Science projects
– Single Point of Entry for international partners
  – To reach the Nordic Grid community, talk to us
  – Point of contact for e-Science projects, middleware development, and grid facility deployment

NDGF Users
– NDGF has not yet formalized user policies; the plan is to "transfer" the NorduGrid Virtual Organization (VO) to NDGF
– Currently, access to the NorduGrid resources is granted to the following users:
  – Members of the NorduGrid Virtual Organization, i.e. people who are affiliated with one of the Nordic academic institutions and have agreed to the Acceptable Use Policy document
  – Members of other Virtual Organizations who are accepted by the NorduGrid Steering Committee; acceptance is decided on a case-by-case basis
– Users: physics, chemistry, computer science, bioinformatics, molecular medicine, production technology, …

NDGF Partnerships
– NDGF is a facilitator and an integrator, not a resource owner
  – NDGF does not own the network
  – NDGF does not own computing resources
  – NDGF does not own the middleware platform
– Resources are owned and managed by partners
– NDGF partnerships:
  – Network: NORDUnet
  – Grid compute resources: national Grid projects (DCSC, SweGrid, NorGrid, M-Grid, …)
  – Other kinds of compute resources (supercomputers, shared-memory processors, …): national computer centres
  – Middleware and services: NorduGrid, KnowARC, EGEE/EGI, …
  – Storage: dCache/DESY

NorduGrid Consortium
– No longer limited to the Nordic countries
– Currently about 20 partners around the world; open to new partners
– The NorduGrid consortium:
  – Controls the ARC middleware
  – Contributes to the development of Grid standards, e.g. via the Global Grid Forum
– Currently 60 sites, 6000 processors, 60 TB of storage, 1600 Grid users
[Map: current NorduGrid partners]

Middleware
– One of the NDGF core tasks: deliver the middleware for collaborative e-Science projects
  – Contribute to the development of middleware(s)
  – Installation, testing and maintenance
  – Interoperability and co-existence of different middlewares
  – Development of tools and services not currently available
– ARC, the Advanced Resource Connector, was developed for the Nordic distributed and heterogeneous computing and storage structure
– Does not require specific hardware or a specific operating system
  – Open Source Grid middleware platform, free download
  – Source code available for modification and incorporation in new projects
  – Open for international collaboration, contribution of code and ideas, and deployment
  – Interoperability and co-existence of different middlewares, e.g. ARC–LCG/gLite
– A number of collaborative projects, e.g. KnowARC (EU-funded) and New and Innovative Services for NorduGrid (Nordic-funded)
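As a concrete illustration of how ARC is typically driven by a user, the sketch below writes a minimal job description in ARC's xRSL language and sanity-checks it. The executable, file names and job name are illustrative assumptions, not taken from the slides; the `ngsub`/`ngstat`/`ngget` commands in the comments belong to the classic ARC client and are shown but not executed, since real submission needs a grid certificate and a live resource.

```shell
# Hypothetical minimal xRSL job description for ARC (all names are examples)
cat > hello.xrsl <<'EOF'
&(executable="/bin/echo")
 (arguments="Hello NorduGrid")
 (stdout="hello.out")
 (jobName="ndgf-demo")
 (cputime="10")
EOF

# With the classic ARC client installed, submission would look like:
#   ngsub -f hello.xrsl     # submit to any matching ARC-enabled cluster
#   ngstat -a               # poll the status of all submitted jobs
#   ngget <jobid>           # fetch hello.out once the job has finished
grep -c '(' hello.xrsl      # sanity check: count attribute lines
```

Each parenthesised pair is an xRSL attribute; the broker matches the requirements (here only a CPU-time limit) against the information published by the sites.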

e-Science case – Particle Physics Research: NDGF Tier-1 for Large Hadron Collider (LHC) data processing and storage

LHC Computing Hierarchy
[Diagram: CERN Tier-0 (PBs of disk; tape robot) → Tier-1 centres (e.g. FNAL, IN2P3, INFN, NDGF) → Tier-2 centres → institute workstations with physics data caches]
– Tier 0 = CERN. Tier 0 receives raw data from the experiments and records them on permanent mass storage: 5–8 petabytes of data per year, potentially growing to 100 PB/year. It also performs first-pass reconstruction of the data, producing summary data.
– Tier 1 centres = large computer centres (12). Tier 1s provide permanent storage and management of raw, summary and other data needed during the analysis process.
– Tier 2 centres = smaller computer centres (several tens). Tier 2 centres provide disk storage and concentrate on simulation and end-user analysis.
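The Tier-0 figures above imply a substantial sustained recording rate. A back-of-envelope check, using the slide's 8 PB/year upper figure and the simplifying assumption 1 PB = 10^9 MB, can be sketched as:

```shell
# Sustained data rate implied by recording 8 PB/year at Tier-0
# (idealised: an even rate over the whole year, 1 PB taken as 10^9 MB)
awk 'BEGIN {
  pb_per_year = 8                  # upper figure from the slide
  mb_per_year = pb_per_year * 1e9  # total volume in MB
  secs = 365 * 24 * 3600           # seconds per year
  printf "%.0f MB/s sustained\n", mb_per_year / secs
}'
# prints: 254 MB/s sustained
```

In practice the raw-data rate is burstier than this average, which is one reason the Tier-0 needs large disk buffers in front of the tape robots.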

NDGF Tier-1 for LHC
– The Nordic high-energy physics community prefers a Tier-1 of its own, but none of the Nordic countries is big enough to host a Tier-1 alone
– NDGF will:
  – Host the Nordic Tier-1 for the ATLAS and ALICE experiments
  – Coordinate network, storage and computing resources
  – Coordinate communication towards CERN and the LHC project partners
– NDGF creates one technical facility to host the Tier-1

NDGF Tier-1 Sites
– Distributed Tier-1 with 7 sites
– 24/7 operation
– Appears as a single site to the outside world
– Has one interface towards CERN (via GEANT2)
– A "long-reach" LAN (Optical Private Network, OPN) connects the 7 sites
– The computing resources and storage are distributed at national computer centres
– Most resources run ARC
– LCG–ARC interoperability
[Map: OPN to CERN-LHC at 10 Gbps; OPN to the sites at 10 Gbps]
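To put the 10 Gbps OPN links in perspective, here is a rough estimate of bulk transfer time between sites; it is idealised (full link utilisation, protocol overhead ignored), so real transfers would be slower.

```shell
# Idealised time to move 1 TB over a 10 Gbps OPN link (no protocol overhead)
awk 'BEGIN {
  gbps = 10
  bytes_per_sec = gbps * 1e9 / 8   # link rate in bytes/s (1.25e9)
  tb = 1e12                        # one terabyte of data
  printf "%.1f minutes per TB\n", tb / bytes_per_sec / 60
}'
# prints: 13.3 minutes per TB
```

At this rate the distributed sites can exchange multi-terabyte datasets within hours, which is what makes it plausible for seven centres to appear as a single Tier-1.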

NDGF Tier-1 Resources
– NDGF LHC Tier-1 status, April 2007: 70% of CPU, 34% of disk, 0% of tape in place

NDGF Tier-2 resources

Tier-1 performance
– ATLAS Data Challenges = processing of simulated data
  – In 2007, NorduGrid/NDGF has processed approx. 8% of all ATLAS production, with an overall efficiency of 89%. In 2006: an approx. 10% share with 74% efficiency.
– Critical issues:
  – Hardware purchases
  – Interoperability and accounting need more work
  – Incorporating ALICE
  – 24/7 operation in practice
[Chart: ATLAS production walltime, days equivalent on a single processor; NorduGrid/NDGF (NG) shown in red]

SUMMARY
– NDGF is a collaborative Nordic Grid production facility
– NDGF is in its production phase
– Status 2007: central organization with 15 persons in place and operational; integration of resources ongoing
– NDGF LHC Tier-1 with 7 sites; hardware status April 2007: 70% of CPU, 34% of disk, 0% of tape in place
– Other e-Sciences to follow
– ARC middleware development is going ahead successfully
– Challenges:
  – Hardware purchases (national resources)
  – 24/7 Tier-1 operation: not yet tested

Thank you!