Slide 1: GridPP2 – Application Requirements & Developments
Nick Brook, University of Bristol
Oxford eScience – 1st July 2003
Outline: Hardware Projections; Applications Programme

Slide 2: Hardware
- LHC experiment numbers are based on an ongoing re-assessment exercise
  - Computing Technical Design Reports are due in 2005
  - Experiments will be using the LCG system: its hardware and chosen middleware, and its software tools (e.g. POOL for persistency)
  - Numbers include Tier-2 needs
- Non-LHC experiments also gave an estimated forward look, based on MC production & analysis
- Experiments are expecting a single, integrated Tier-1 centre

Slide 3: Hardware (continued)
- Experiments are expecting a single, integrated Tier-1 centre
- In the short term, LHC experiments expect some form of centralised planning via the LCG project:
  - Project Execution Board
  - Grid Deployment Board
  - GridPP participation in LCG bodies
- GridPP will continue with an annual hardware review (CPU vs disk)

Slide 4: Ongoing Activities
- Example: LHCb Data Challenge
  - >40M events – 170 years on a 1 GHz PC
  - ~1/3 of the events were produced in the UK
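As a quick sanity check, the per-event cost and farm wall-time follow from simple arithmetic; a minimal sketch treating the quoted 170 CPU-years and a round 40M events as inputs (the 500-CPU farm size below is invented for illustration):

```python
# Back-of-envelope check on the LHCb Data Challenge numbers.
# Inputs: the quoted 170 CPU-years on a 1 GHz PC and a round 40M events;
# the 500-CPU farm size is purely illustrative.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

events = 40e6          # ">40M events" from the slide
cpu_years = 170        # on a single 1 GHz PC

seconds_per_event = cpu_years * SECONDS_PER_YEAR / events
print(f"~{seconds_per_event:.0f} s per event on a 1 GHz PC")  # ~134 s

# Spread across a farm of 500 such CPUs, the challenge would take
# roughly 170/500 years, i.e. about 4 months of wall time.
print(f"~{cpu_years / 500 * 12:.1f} months on 500 CPUs")
```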

Slide 5: Ongoing Activities (continued)
- Current usage of the Tier-1/A centre is dominated by BaBar: 60% of CPU, 90% of disk

Slide 6: Networking
- Bandwidth is dominated by replication in the analysis stage of data processing
  - Tools such as OptorSim are used to understand the networking
  - LHC experiments need to understand their computing & analysis models
  - Early estimates suggest a factor-of-5 increase
- Current problems with MC production & bulk transfer
  - Unrelated to SuperJANET
  - Often attributable to links into the MAN (metropolitan area network)
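The scale of the replication traffic behind such estimates can be seen with simple arithmetic before a simulator like OptorSim refines it; a minimal sketch in which the dataset size, replica count, and transfer window are invented inputs, not GridPP figures:

```python
# Rough replication-bandwidth estimate of the kind OptorSim-style
# studies refine. All inputs are invented for illustration.

dataset_tb = 1.0       # analysis dataset to replicate (TB)
replicas = 2           # number of sites wanting a copy
window_hours = 24      # acceptable transfer window

bits = dataset_tb * 1e12 * 8 * replicas
mbps = bits / (window_hours * 3600) / 1e6
print(f"Sustained rate needed: ~{mbps:.0f} Mb/s")   # ~185 Mb/s
```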

Slide 7: CPU Estimates
- CPU resource requirements are equivalent to 14k 2.4 GHz dual processors running continuously
- LHC experiments account for ~65% of the need in 2004, rising to >80% in 2007
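For comparison with the kSI2k units used elsewhere in the planning, the headline figure can be converted under an assumed benchmark rating; a sketch assuming, purely for illustration, ~900 SI2k per 2.4 GHz processor (the real rating would come from the annual hardware review):

```python
# Convert the headline CPU figure into kSI2k.
# The 900 SI2k per processor rating is an assumption, not a GridPP number.

dual_processor_boxes = 14_000   # "14k 2.4 GHz dual processors"
si2k_per_processor = 900        # assumed benchmark rating

total_ksi2k = dual_processor_boxes * 2 * si2k_per_processor / 1000
print(f"~{total_ksi2k:,.0f} kSI2k equivalent")   # ~25,200 kSI2k
```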

Slide 8: Disk Requirements
- LHC experiments account for 60% of disk requirements in 2004 and 70% of disk storage in 2007
- Non-LHC experiments are still data taking and need disk to finish their analyses

Slide 9: Tape Requirements
- Tape usage is completely dominated by LHC usage: ~90%
- Large level of uncertainty, e.g. in 2007: ATLAS (850 TB) vs CMS (1150 TB)

Slide 10: Needs in 2004 (new GridPP-funded procurement on top of the end-2003 baseline)

  New CPUs:   500   (667 kSI2k)   £500k
  New disks:  624   (146 TB)      £437k
  New tapes:  510   (100 TB)      £96k, including 2 new drives and 2 new servers
  ADS maintenance: £50k; replacements: £20k
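Taking the reconstructed figures above at face value, the implied unit costs are easy to check; a minimal sketch (the ratios depend entirely on the table reading above):

```python
# Implied unit costs from the 2004 procurement figures,
# taking the reconstructed table at face value.

cpu_cost = 500 / 667    # £k per kSI2k
disk_cost = 437 / 146   # £k per TB of disk
tape_cost = 96 / 100    # £k per TB of tape (incl. drives & servers)

print(f"CPU:  ~£{cpu_cost * 1000:.0f} per kSI2k")   # ~£750
print(f"Disk: ~£{disk_cost * 1000:.0f} per TB")     # ~£2993
print(f"Tape: ~£{tape_cost * 1000:.0f} per TB")     # ~£960
```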

Slide 11: Application Development
- Building on current collaborative activity:
  - GANGA: ATLAS & LHCb (see the sketch below)
  - SAM: CDF & DØ
  - BaBar: adoption of EDG s/w
- Prototyping:
  - a production environment – Monte Carlo production activity
  - an analysis environment
- Grid technologies are becoming more widely accepted across the HEP community:
  - old experiments – UKDMC, ZEUS, …
  - new activities – LCFI, MICE, …
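The idea behind a uniform Grid user interface such as GANGA is that the same job description runs unchanged against different backends; a toy illustration of that pattern (this is not GANGA's actual API, and all class names are invented):

```python
# Toy illustration of the "same job, different backend" pattern that a
# Grid user interface such as GANGA provides. Not GANGA's actual API.

import subprocess

class LocalBackend:
    def submit(self, executable, args):
        # Run immediately on the local machine.
        subprocess.run([executable, *args], check=True)

class GridBackend:
    def submit(self, executable, args):
        # A real backend would generate a job description (e.g. JDL)
        # and hand it to the grid submission client.
        print(f"would submit {executable} {args} to the grid")

class Job:
    def __init__(self, executable, args=(), backend=None):
        self.executable, self.args = executable, list(args)
        self.backend = backend or LocalBackend()

    def submit(self):
        self.backend.submit(self.executable, self.args)

# The job definition is identical; only the backend changes.
Job("/bin/echo", ["test event loop"]).submit()
Job("/bin/echo", ["test event loop"], backend=GridBackend()).submit()
```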

Slide 12: Application Development (continued)
- A similar pattern of needs emerges from all experiments (not too surprisingly!):
  - Storage & location of data – replication issues
  - Monte Carlo production tools – seen as an obvious area for efficiency savings
  - Analysis interfaces – intelligent bookkeeping of a user's analysis activities (see the sketch below)
  - Persistency solutions – composite objects spread across several storage systems
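What "intelligent bookkeeping" might track per analysis job can be made concrete with a small record type; a minimal sketch in which every field name, and the LHCb-flavoured example values, are invented for illustration:

```python
# Minimal sketch of a per-job analysis bookkeeping record.
# All field names and example values are invented for illustration.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnalysisRecord:
    user: str
    application: str                 # analysis executable + version
    input_dataset: str               # logical dataset name
    output_files: list = field(default_factory=list)
    submitted: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    status: str = "submitted"

rec = AnalysisRecord(user="auser",
                     application="DaVinci v8r0",    # illustrative
                     input_dataset="lhcb/mc/2003/bb-inclusive")
rec.output_files.append("ntuple_001.root")
rec.status = "done"
print(rec)
```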

Slide 13: New Experiments

  Experiment   Proposed Activity
  UKDMC        MC production; analysis
  ZEUS         MC production
  phenoGrid    Metadata
  MICE         MC production; data handling
  ANTARES      Data handling
  CALICE       Data handling
  LC-ABD       Metadata

Slide 14: LHC Experiments

  Experiment             Activity
  ATLAS                  MC production; metadata; LCG integration; persistency & data management
  CMS                    MC production; persistency & data management; workload management; monitoring
  LHCb                   MC production; metadata; LCG integration; persistency & data management
  GANGA (ATLAS & LHCb)   Grid user interface; LCG integration; monitoring

Slide 15: Non-LHC Experiments

  Experiment   Activity
  BaBar        Job submission; data handling; persistency; MC production
  CDF          MC production; SAM deployment
  DØ           MC production; SAM deployment
  UKQCD        Data handling; metadata

Slide 16: Application Call
- It is essential to continue to develop the application interface through GridPP2:
  - Expand the activity to allow experiments not currently supported by GridPP to participate
  - Benefit from LCG developments
- Call for application posts – January 2004
  - Responses by April
  - Reviewed à la GridPP (Williams committee), on the basis of:
    - experiment activity in the UK
    - scientific output
    - track record in Grid activity