ScotGRID Report: Prototype Tier-2 Centre for the LHC
Akram Khan, on behalf of the ScotGRID Team (http://www.scotgrid.ac.uk)

Overview of Talk
- What are we hoping to do?
- Hardware / Operation
- Misc Bits
- Future Plans
- Summary & Outlook

Never Forget the Spirit of the Project: The LHC Computing Challenge for Scotland

2000 JREI bid: the JREI funds will make it possible to commission and fully exercise a prototype LHC computing centre in Scotland. The Centre would provide:
1. A technical service base for the Grid (GIIS, VO services, ...)
2. A DataStore to handle samples of data for analysis
3. Significant simulation production capability
4. Excellent network connections to RAL and regional sites
5. Support for Grid middleware development with CERN and RAL
6. Support for core software development within LHCb and ATLAS
7. Support for user applications in other scientific areas

This will enable us to answer:
- Is the Grid a viable solution for the LHC computing challenge?
- Can a two-site Tier-2 centre be set up and operated effectively?
- How should the network topology between Edinburgh, Glasgow, RAL and CERN be organised?

ScotGRID: Glasgow / Edinburgh

Glasgow (MC farm):
- 59 x330 dual PIII 1 GHz / 2 GB compute nodes
- 2 x340 dual PIII 1 GHz / 2 GB head nodes
- 3 x340 dual PIII 1 GHz / 2 GB storage nodes, each with 11 x 34 GB disks in RAID 5
- 1 x340 dual PIII 1 GHz / 0.5 GB masternode

Edinburgh (DataStore):
- xSeries quad Pentium Xeon 700 MHz / 16 GB server
- 1 FAStT 500 storage controller
- 7 disk arrays of 10 x 73 GB disks
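A quick sanity check on these numbers (a sketch; the only inputs are the node counts above, and a one-disk-per-array parity overhead for RAID 5 is assumed) reproduces the 4.6 TB Edinburgh total quoted on the schematic slide later in the talk:

```python
# Rough capacity estimate for the ScotGRID hardware listed above.
# Assumption: RAID 5 sacrifices one disk per array for parity.

def raid5_usable(disks: int, disk_gb: float) -> float:
    """Usable capacity of one RAID-5 array in GB."""
    return (disks - 1) * disk_gb

# Glasgow: 3 storage nodes, each 11 x 34 GB in RAID 5
glasgow_gb = 3 * raid5_usable(11, 34)

# Edinburgh: 7 arrays of 10 x 73 GB (RAID level assumed to match)
edinburgh_gb = 7 * raid5_usable(10, 73)

# Glasgow: 59 compute nodes x 2 CPUs each
glasgow_cpus = 59 * 2

print(f"Glasgow storage:   ~{glasgow_gb:.0f} GB")        # ~1020 GB
print(f"Edinburgh storage: ~{edinburgh_gb/1000:.1f} TB")  # ~4.6 TB
print(f"Glasgow CPUs:      {glasgow_cpus}")               # 118
```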

ScotGRID - Glasgow

ScotGRID: Glasgow - Schematic
[Network diagram: compute nodes, head nodes, storage nodes and the masternode on 100/1000 Mbps VLANs, connected via the campus backbone (the bottleneck) to the Internet.]

ScotGRID: Edinburgh - Schematic
[Diagram: disk arrays (total 4.6 TB) behind a FAStT 500 storage controller, attached to the server (4 x Pentium Xeon, 16 GB RAM) on the SRIF network.]

Towards a Prototype Tier-2 (timeline, from the 2000 JREI proposal through 2002)
- 2000: JREI proposal (Glasgow: MC farm; Edinburgh: DataStore)
- Dec 2001: delivery of ScotGRID kit
- xCAT tutorial, first installation attempt on the masternode
- ScotGRID room handed over to builders; building work complete
- xCAT reinstall; installation of software
- Configuring the disk array; reconfiguring kernel drivers for the FAStT storage controller
- User registration, trial production
- Upgrade of the storage controller; group disk (re)organisation to match projects

ScotGRID 1st Year Review - ScotGrid Meeting at the IBM Briefing Centre (Greenock), Friday 10th Jan

09:45 Arrive - coffee
10:00-10:15 Welcome (Freddie Moran)
10:15-10:35 ScotGrid Introduction (Tony Doyle)
10:35-10:50 Technical Status Overview (Akram Khan)
10:50-11:05 Cluster Operations (David Martin)
11:05-11:30 Coffee
11:30-11:50 ScotGrid Upgrade Plans (Steve Playfer)
11:50-13:00 IBM IT Briefing Discussion
13:00-14:00 Lunch
14:00-14:30 IBM IT Briefing Discussion
14:40-14:55 Grid Data Management - Simulations (David Cameron)
15:10-15:30 Tea
Particle Physics Applications:
15:30-15:45 ATLAS (John Kennedy)
15:45-16:00 LHCb (Akram Khan)
16:00-16:15 BaBar (Steve Playfer)
16:15-16:30 CDF (Rick St Denis)

A complete success, as you will see!

ScotGRID Statistics
[Charts: the amount of storage space in ScotGRID used by each group - Edinburgh (5 TB), Glasgow (600 GB).]

ScotGRID: CPU Usage, 24/6/02 - 6/1/2003
[Chart: the percentage used by each group over the previous weeks, reflecting the startup phase, the Christmas period, and the different applications.]

Forward Look: Introduction
The ScotGrid JREI project includes a mid-term hardware upgrade. As part of GridPP planning, we need to upgrade from Prototype to Production Tier-2 status. JREI funding left to be spent by June 2003:
- Edinburgh: £220k
- Glasgow: £30k
- Total: £250k

Forward Look: Possible Upgrade Plan?
[Diagram: Edinburgh kit moves to Glasgow; dual FAStT storage controllers with expanded disk capacity (TB figure lost in transcription); xSeries 440 with 8 x Xeon (1.9 GHz); a scalable configuration.]

Forward Look: Front-End Grid Servers
We would like to install Grid software on dedicated (modest-sized) servers, decoupling the Grid software from the compute and storage hardware:
- A front end for an EDG-style Compute Engine / LCFG
- A front end for an EDG-style Storage Engine
- An overall ScotGrid front end to arbitrate the Grid services being requested?
Will there be a standard configuration for Grid access to Tier-2 sites? (RLS/SlashGrid)

Towards a Production Tier-2 and Beyond (timeline)
- Delivery of more kit
- End of JREI funding
- Start of ScotGRID-II
- Start of GridPP-II
- Links to other applications
- Production Tier-2 site
- Future upgrades?

Technical Support Group
Core members of the group, plus invited members to discuss wider issues:

CORE: Akram Khan (Chair: Edinburgh), David Martin (sysadmin: Glasgow), Roy de Ruiter-Koelemeiger (sysadmin: Edinburgh), Gavin McCance (EDG: Glasgow), RA post (EDG: Edinburgh)
INVITED: Paul Mitchell (sysadmin: Edinburgh), Alan J. Flavell (networking: Glasgow), Steve Traylen (EDG: RAL), IBM team

See the "technical group" webpage. Support is a real issue: we are just about OK now, but for a production Tier-2?

[Diagram: Glasgow and Edinburgh connected by 1 Gb/s and 2.5 Gb/s links carrying all University traffic, with packet filtering.]

Traceroute from Glasgow to Edinburgh (IP addresses and round-trip times lost in transcription):
1.-3. (addresses lost)
4. glasgow-bar.ja.net
5. po9-0.glas-scr.ja.net
6. po3-0.edin-scr.ja.net
7. po0-0.edinburgh-bar.ja.net
8. (address lost)
9. vlan686.kb5-msfc.net.ed.ac.uk
10. (address lost)
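A trace like this is easy to regenerate and summarise. A minimal sketch, assuming a Unix host with the standard traceroute utility installed (the target hostname is illustrative):

```python
# Sketch: run traceroute and report the mean RTT per responding hop.
import re
import subprocess

def hop_latencies(host: str):
    """Yield (hop_number, hop_name, mean_rtt_ms) for each responding hop."""
    out = subprocess.run(["traceroute", "-q", "3", host],
                         capture_output=True, text=True, check=True).stdout
    for line in out.splitlines()[1:]:          # skip the header line
        fields = line.split()
        if not fields or not fields[0].isdigit():
            continue
        rtts = [float(m) for m in re.findall(r"([\d.]+) ms", line)]
        if rtts:                                # hops answering "* * *" yield nothing
            yield int(fields[0]), fields[1], sum(rtts) / len(rtts)

if __name__ == "__main__":
    # Illustrative target: the Edinburgh end of the Glasgow-Edinburgh route.
    for hop, name, rtt in hop_latencies("www.ed.ac.uk"):
        print(f"{hop:2d}  {name:40s}  {rtt:6.2f} ms")
```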

EDG Middleware: Replica Optimiser Simulation
Using ScotGrid for large-scale simulation runs:
- uses ~15 MB of memory for ~60 threads
- 2-12 hours per simulation
Results to appear in IJHPCA 2003.
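These runs parallelise naturally, one thread per independent configuration. A minimal sketch of that pattern (run_simulation is a hypothetical stand-in for the real optimiser code, not the EDG implementation):

```python
# Sketch: farming out independent optimiser-simulation runs over a thread pool.
from concurrent.futures import ThreadPoolExecutor

def run_simulation(config: dict) -> float:
    # Hypothetical placeholder: a real run takes 2-12 hours and
    # would return e.g. a mean job completion time.
    return sum(config.values())  # dummy result

# Illustrative parameter grid of independent configurations.
configs = [{"sites": s, "files": f} for s in (3, 5, 10) for f in (100, 1000)]

# ~60 threads fitted in ~15 MB in the runs quoted above; scale to taste.
with ThreadPoolExecutor(max_workers=60) as pool:
    for cfg, result in zip(configs, pool.map(run_simulation, configs)):
        print(cfg, "->", result)
```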

BaBar: Monte Carlo Production (SP4)
ScotGrid (= edin): 8 million events in 3 weeks. We expect to import some streams/skims to Edinburgh in 2003. After the upgrade to ~30 TB there may be interest in using ScotGrid to add to the storage available at the RAL Tier-A site.

LHCb: Production Centres (events produced)
- CERN (932k) and Bologna (857k)
- RAL (471k)
- Imperial College and Karlsruhe (437k)
- Lyon (202k)
- ScotGrid (194k)
- Cambridge (100k)
- Bristol (92k)
- Moscow (87k)
- Liverpool (70k)
- Barcelona (56k)
- Rio (32k)
- CESGA (28k)
- Oxford (25k)

We can be confident about the TDR production: with the current configuration we can produce 10 million events in 56 days (March-April 2003). Included in the draft of the LHCC document: B0 -> J/psi K0s.
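As a quick check on that claim (a sketch; the only inputs are the 10 million events and 56 days quoted above):

```python
# Implied sustained rate for the TDR production target quoted above.
target_events = 10_000_000   # events (from the slide)
days = 56                    # days  (from the slide)

per_day = target_events / days
print(f"Required rate: ~{per_day:,.0f} events/day "
      f"(~{per_day / 24:,.0f} events/hour)")
# ~178,571 events/day, ~7,440 events/hour
```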

Summary and Outlook
Exciting time for ScotGRID: there has been a lot of effort during the past year to get ScotGRID up and operational - we have learnt many tricks!

Operational prototype centre:
- We have an operational centre
- Meeting the short-term needs of the applications with modest resources (HEP + middleware + non-PP)
- Proof of principle for Tier-2 operation (pre-grid)

There is still a lot that needs to be done:
- Having a full production system (24x7) (opt-grid)
- Prototyping various architectural solutions for a Tier-2
- Looking towards upgrades with a view to the LHC timetable

Support and resources are a real issue for the near-term future (Q1-2004).

RLS Architecture
A Replica Location Service (RLS) is a system that maintains and provides access to information about the physical location of copies of data items.

[Diagram: Local Replica Catalogues (LRCs) on Storage Elements at Glasgow, Edinburgh and CERN, with Replica Location Indices (RLIs) above them. An LRC may be multiply indexed for higher availability; an RLI may index over the full namespace (all LRCs are indexed) or over a subset of LRCs; an LRC may be indexed by only one RLI.]

Gavin McCance, Alasdair Earl, Akram Khan (starting Feb).
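To make the LRC/RLI split concrete, here is a toy model (illustrative only, not the EDG code): each LRC maps logical file names to the physical replicas held at one site, while an RLI answers only "which LRCs know this name", so a lookup fans out in two steps. The site names match the diagram; the logical name and gsiftp URLs are made up.

```python
# Toy model of the two-level RLS lookup described above (illustrative only).

class LRC:
    """Local Replica Catalogue: logical name -> physical replicas at one site."""
    def __init__(self, site: str):
        self.site = site
        self.replicas: dict[str, list[str]] = {}

    def register(self, lfn: str, pfn: str) -> None:
        self.replicas.setdefault(lfn, []).append(pfn)

    def lookup(self, lfn: str) -> list[str]:
        return self.replicas.get(lfn, [])

class RLI:
    """Replica Location Index over some subset of LRCs."""
    def __init__(self, lrcs: list[LRC]):
        self.lrcs = lrcs  # full namespace if this list covers all LRCs

    def find(self, lfn: str) -> dict[str, list[str]]:
        # Step 1: identify the LRCs that know the name; step 2: query them.
        return {lrc.site: lrc.lookup(lfn)
                for lrc in self.lrcs if lfn in lrc.replicas}

# Example: one logical file replicated at two of the three sites.
glasgow, edinburgh, cern = LRC("Glasgow"), LRC("Edinburgh"), LRC("CERN")
glasgow.register("lhcb/mc/bd2jpsiks.dst", "gsiftp://se.gla.example/data/f1")
cern.register("lhcb/mc/bd2jpsiks.dst", "gsiftp://se.cern.example/lhcb/f1")

rli = RLI([glasgow, edinburgh, cern])  # an RLI indexing the full namespace
print(rli.find("lhcb/mc/bd2jpsiks.dst"))
# {'Glasgow': ['gsiftp://se.gla.example/data/f1'],
#  'CERN': ['gsiftp://se.cern.example/lhcb/f1']}
```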