NDGF and the distributed Tier-1 Center Michael Grønager, PhD, Technical Coordinator, NDGF dCache Tutorial Copenhagen, March the 27th, 2007.


Overview: NDGF Background – The Distributed Nordic Tier-1 – Practical info

Background The Nordic countries together have about 25 million people; no single country has more than 10 million. A Nordic Tier-1 therefore needs to utilize hardware at the bigger Nordic compute sites. Strong Nordic grid tradition: NorduGrid / ARC – Deployed at all Nordic compute centers – Used heavily also by non-HEP users (>75% of usage) Hence the need for a pan-Nordic organization for the Tier-1 and possibly other large inter-Nordic e-Science projects

NORDUnet A/S The Nordic regional research and education network, owned by the 5 Nordic national research and education networks (NRENs). 25 years of Nordic network collaboration. Leverages national initiatives, participates in major international efforts, and represents the Nordic NRENs internationally – the gateway to the Nordic area

What is the NDGF? A co-operative Nordic data and computing grid facility –Nordic production grid, leveraging national grid resources –Common policy framework for the Nordic production grid –Joint Nordic planning and coordination –Operates the Nordic storage facility for major projects –Co-ordinates & hosts major e-Science projects (e.g., the Nordic WLCG Tier-1) –Develops grid middleware and services NDGF is funded (2 MEUR/year) by the national research councils of the Nordic countries, builds on a history of Nordic grid collaboration; strategic planning is ongoing. [Diagram: NOS-N and the national research councils of Denmark (DK), Finland (SF), Norway (N) and Sweden (S) funding the Nordic Data Grid Facility]

Organization – CERN related

NDGF Partners NorduGrid / ARC – Responsible for the stable branch KnowARC EU project – Development EGEE-II / EGEE-III / EGI projects – Operations dCache / DESY – Part of the development team (1–2 FTEs) GIN

NDGF Tier-1 Resource Centers The 7 biggest Nordic compute centers, the dTier-1 sites, form the NDGF Tier-1. Resources (storage and computing) are scattered across the sites, while services can be centralized. This gives advantages in redundancy – especially for 24×7 data taking
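The redundancy advantage of scattering storage across independent sites can be made concrete with a small back-of-the-envelope calculation. A minimal sketch (the 98% per-site uptime is an assumed, illustrative figure, not an NDGF number):

```python
# Probability that at least one replica of a file is reachable
# when replicas live at independent sites. The per-site uptime
# of 0.98 is an assumption for illustration only.

def replica_availability(per_site_uptime: float, n_replicas: int) -> float:
    """P(at least one of n independent replicas is reachable)."""
    return 1.0 - (1.0 - per_site_uptime) ** n_replicas

for n in (1, 2, 3):
    print(f"{n} replica(s): {replica_availability(0.98, n):.6f}")
```

Even with modest per-site uptime, two or three replicas at independent sites push the combined availability close to the levels 24×7 data taking requires.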

NDGF Facility

Storage dCache installation: admin and door nodes at the GEANT endpoint, pools at the sites. Very close collaboration with DESY to ensure dCache is suited also for distributed use
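The door/pool split works by redirection: the client negotiates with a central door node, which hands back the location of a site-local pool holding the data, and the transfer then runs against that pool directly. A conceptual sketch of the pattern (the catalogue, host names, and paths are hypothetical stand-ins, not dCache's actual protocol or API):

```python
# Conceptual sketch of the door/pool redirection pattern:
# a central "door" resolves a file path to the pool that holds it,
# and the client is redirected there for the actual transfer.
# All paths and endpoints below are hypothetical.

POOL_LOCATIONS = {
    "/atlas/raw/run001.dat": "pool-copenhagen.example.org:33115",
    "/alice/esd/run042.dat": "pool-oslo.example.org:33115",
}

def door_lookup(path: str) -> str:
    """Central door: return the pool endpoint holding the file."""
    try:
        return POOL_LOCATIONS[path]
    except KeyError:
        raise FileNotFoundError(path)

# Client side: ask the central door, then transfer from the pool.
endpoint = door_lookup("/atlas/raw/run001.dat")
print(f"redirected to {endpoint}")
```

This is what lets NDGF centralize the doors while the pools stay at the seven sites: only the lightweight lookup is central; the bulk data never passes through the door.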

NDGF Storage

Storage Central installation: –7 Dell dual-core 2 GHz Xeon servers, 4 GB RAM, 2 x 73 GB 15k SAS disks (mirrored) (one for spare) –2 x Dell PowerVault MD-1000 direct-attached storage enclosures with 7 x 143 GB 15k SAS in RAID-10 each Running: –2 x PostgreSQL for PNFS in HA mode (master-slave), DB on the MD-1000 –1 PNFS manager and pool manager –1 SRM, location manager, statistics, billing, etc. –1 GridFTP and xrootd door on one machine –1 monitoring and intrusion detection on one machine
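The master-slave HA setup for the PNFS databases follows the usual pattern: clients use the master and fail over to the slave only when the master is unreachable. A generic sketch of that failover logic (the host names and the connect() callable are hypothetical stand-ins, not the actual HA tooling used):

```python
# Generic master/slave failover sketch. Hosts are tried in
# priority order: master first, then slave. The connect()
# callable and host names are hypothetical; a real deployment
# would use the database driver's own connection call.

def connect_with_failover(hosts, connect):
    """Return a connection to the first reachable host."""
    last_error = None
    for host in hosts:
        try:
            return connect(host)
        except ConnectionError as exc:
            last_error = exc  # host down: fall through to the next one
    raise ConnectionError(f"no host reachable: {last_error}")

# Demo with a fake connect() in which the master is down.
def fake_connect(host):
    if host == "pnfs-db-master.example.org":
        raise ConnectionError("master unreachable")
    return f"connected to {host}"

print(connect_with_failover(
    ["pnfs-db-master.example.org", "pnfs-db-slave.example.org"],
    fake_connect,
))
```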

Storage Resources: –Copenhagen: Disk: IBM DS4100, 130 TB, 2 Gbit SAN; Tape: IBM 3484 with 4 LTO-3 drives, Tivoli (on p520 / AIX) –Norway: Disk: 75 TB; Tape: 60 TB –Sweden and Helsinki follow later this year...

Practical info Dinner arranged tonight at: –"Rex Republic", opposite NBI –Please tell me if you want to join Coffee served next door Lunch again tomorrow – before you leave