LHCb(UK) Computing/Grid: RAL Perspective
Glenn Patrick (RAL), LHCb(UK) Meeting, 08/06/00


Central UK Computing (what is hoped for)
JIF bid: prototype UK national computing centre (Tier 1) for all 4 LHC experiments; outcome known in ~November.
Integrated resources (table on original slide): processors (PC99-450 MHz units), disk (TB), tape (TB).

What exists now? RAL-CSF: Main LHCb Platform
- Currently 160 P450-equivalent processors; hope to expand to ~300 P450 in September.
- Linux Red Hat 6.1 being phased in on all machines (HPs being shut down) to give compatibility with CERN (e.g. lxplus).
- PBS (Portable Batch System), not NQS (submission sketch below).
- 1 TB+ of robotic tape space for LHCb.
- 500 GB+ of disk space for LHCb (need to request).
- Globus toolkit v1.1.1 installed on the front-end (with a testbed service on another machine).
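Since jobs on RAL-CSF go through PBS rather than NQS, a minimal sketch of what a scripted submission might look like is given here; the queue name (lhcb), CPU limit and run_simulation.sh executable are hypothetical, and only qsub itself plus the standard #PBS directives are assumed to exist.

```python
import subprocess
import tempfile

# Minimal sketch: wrap an LHCb executable in a PBS job script and hand it to
# qsub on the front-end. Queue name, resource limit and executable are
# placeholders, not the actual RAL-CSF configuration.
JOB_SCRIPT = """#!/bin/sh
#PBS -q lhcb
#PBS -l cput=04:00:00
#PBS -o sicbmc.log
cd $PBS_O_WORKDIR
./run_simulation.sh
"""

def submit(script_text):
    """Write the job script to a temporary file and submit it with qsub."""
    with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as handle:
        handle.write(script_text)
        path = handle.name
    result = subprocess.run(["qsub", path], capture_output=True,
                            text=True, check=True)
    return result.stdout.strip()  # qsub prints the identifier of the new job

if __name__ == "__main__":
    print("Submitted:", submit(JOB_SCRIPT))
```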

RAL Particle Physics Unix Services (diagram): HP, Linux and Sun interactive services plus HP and Linux batch services on a 100 Megabit switched network, sharing an n TB disk farm, scratch space, NIS /home userids, FDDI, AFS and the DataStore.

LHCb Software
- LHCb software stored in 4 GB AFS project space: /afs/rl.ac.uk/lhcb (freshness check sketched below).
- Updated just after midnight every night.
- CMT/CVS installed (although no remote updating to the CERN repository).
- Crude LHCb environment at the moment, but managed to process events through SICBMC with little knowledge of LHCb software.
- Available for LHCb to exploit for detector, physics & Grid(?) studies.
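As an illustration of relying on the nightly mirror, a small sketch that checks how recently the RAL AFS copy was refreshed; only the /afs/rl.ac.uk/lhcb path comes from the slide, the 24-hour threshold and the directory layout below it are assumptions.

```python
import os
import time

# Minimal sketch: verify that last night's mirror job refreshed the RAL AFS
# copy of the LHCb software. The top-level path is from the slide; anything
# underneath it is assumed, not documented here.
LHCB_AFS = "/afs/rl.ac.uk/lhcb"

def hours_since_update(path=LHCB_AFS):
    """Return the age, in hours, of the most recently modified entry under path."""
    newest = max(
        os.path.getmtime(os.path.join(path, entry))
        for entry in os.listdir(path)
    )
    return (time.time() - newest) / 3600.0

if __name__ == "__main__":
    age = hours_since_update()
    status = "OK" if age < 24 else "STALE"
    print(f"{LHCB_AFS} last updated {age:.1f} hours ago [{status}]")
```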

MC Production: RAL NT Farm
- 18 × 450 MHz PII + 9 × 200 MHz Pentium Pro; LHCb front-end in addition to the dual-CPU front-end.
- Production capacity 100k-200k events/week (rough transfer implications sketched below).
- 500k bb events processed so far and stored in the RAL DataStore.
- Events now transferred over the network to CERN using the RAL VTP protocol instead of DLTs. Thanks to Dave Salmon, Eric van H & Chris Brew.
- Latest production code being installed (DS).
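For scale, a rough back-of-envelope sketch of what 100k-200k events/week means for the network transfer to CERN; only the event rates come from the slide, the per-event size is an assumed illustrative figure.

```python
# Rough capacity sketch for the weekly RAL-to-CERN transfer.
# EVENT_SIZE_MB is an assumption for illustration (simulated events of order
# a few hundred kB); the weekly event rates are the ones quoted on the slide.
EVENT_SIZE_MB = 0.3
SECONDS_PER_WEEK = 7 * 24 * 3600

for events_per_week in (100_000, 200_000):
    volume_gb = events_per_week * EVENT_SIZE_MB / 1024
    avg_rate_mbit = volume_gb * 1024 * 8 / SECONDS_PER_WEEK
    print(f"{events_per_week:>7} events/week -> ~{volume_gb:.0f} GB/week, "
          f"average ~{avg_rate_mbit:.2f} Mbit/s sustained")
```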

RAL NT Farm: new front-end & extra batch nodes (diagram). Shows front-end and batch nodes, peripherals, PDC/BDC domain controllers and a file server (18 GB disks, DAT) connected via a 100 Mb/s switch to the LAN & WAN; the new systems add 14 CPUs.

Grid Developments
- There is now a "CLRC Team" for the particle physics grid plus several work groups (GNP represents LHCb, with CAJB also a member). Important that this is beneficial for LHCb.
- EU (DataGrid) application to distribute 10^7 events & 3 TB using MAP/RAL/... does not start production until … (per-event arithmetic sketched below).
- Need to start now and acquire some practical experience and expertise → decide way forward.
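Both DataGrid figures above (10^7 events, 3 TB) come from the slide; dividing one by the other gives a feel for the average event size involved, as a pure illustration.

```python
# Back-of-envelope check of the EU DataGrid exercise quoted on the slide:
# distributing 10^7 events totalling 3 TB implies an average event size of
# roughly 300 kB.
events = 10**7
total_tb = 3
bytes_per_event = total_tb * 1024**4 / events
print(f"Average event size: {bytes_per_event / 1024:.0f} kB")
```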

Grid Developments II
Meetings:
- 14th June (RAL): Small technical group to discuss short-term LHCb aims, testbeds, etc. (CERN, RAL, Liverpool, Glasgow…)
- 21st June (RAL): Globus Toolkit User Tutorial
- 22nd June (RAL): Globus Toolkit Developer Tutorial. Open to all, register at …
- 23rd June (RAL): Globus "strategy" meeting (invitation/nomination)

From UK Town Meeting
Which Grid topology for LHCb(UK)? Flexibility important.
Proposed hierarchy (diagram): Tier 0: CERN → Tier 1: INFN, RAL, IN2P3, etc. → Tier 2: Liverpool, Glasgow, Edinburgh, … → Department → Desktop users.

Grid Issues
- Starting to be asked for estimates of LHCb resources (central storage, etc.) and Grid requirements for applications and testbeds.
- Useful to have an LHCb(UK) forum for discussion & feedback → define a model for all UK institutes, not just RAL.
- Any documentation (including this talk) on computing/software/Grid at...