RAL Tier A
Tim Adye
Rutherford Appleton Laboratory
BaBar Collaboration Meeting
Imperial College, London
12th September 2002

Hardware
- 104 "noma"-like machines allocated to BaBar
- 156 + old farm machines shared with other experiments
- 6 BaBar Suns (4-6 CPUs each)
- 20 TB disk for BaBar
  - Also using ~10 TB of pool disk for data transfers
- All disk servers on Gigabit Ethernet; pretty good server performance
… as well as existing RAL facilities:
- 622 Mbit/s network to SLAC and elsewhere; RAL connection now 2.5 Gbit/s
- AFS server
- 100 TB tape robot (upgrading to 330 TB, then 1 PB)
- Many years' experience running BaBar software

Problems
- Disk problems tracked down to a bad batch of drives
- All drives are now being replaced by the manufacturer
  - Our disks should be done in ~1 month
  - By using spare servers, replacement shouldn't interrupt service
- Some (inevitable) scaling problems due to the major expansion of the system
- Now that installation and (most) BaBar-requested features are set up, support staff can concentrate on reliability

Support
- Initially suffered from a lack of support staff and of out-of-hours cover
- Two new system managers now in post
- Two more being recruited (one just for BaBar)
- Additional staff have been able to help with problems at weekends
- Discussing more formal arrangements


RAL Batch CPU Use
[chart]

RAL Batch Users
[chart: users running at least one non-trivial job each week]
- A total of 113 new BaBar users registered since December
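As an aside, a count like "users running at least one non-trivial job each week" can be extracted from PBS accounting logs. The sketch below is a hypothetical illustration, not the procedure actually used for this plot: the log location, the record-format details, and the 60-second CPU threshold for "non-trivial" are all assumptions.

    import glob
    import re
    from collections import defaultdict
    from datetime import datetime

    LOG_GLOB = "/var/spool/pbs/server_priv/accounting/*"  # assumed log location
    USER_RE = re.compile(r"user=(\w+)")
    CPUT_RE = re.compile(r"resources_used\.cput=(\d+):(\d+):(\d+)")

    users_by_week = defaultdict(set)
    for path in sorted(glob.glob(LOG_GLOB)):
        for line in open(path):
            fields = line.strip().split(";", 3)
            if len(fields) < 4 or fields[1] != "E":  # 'E' records mark job end
                continue
            when = datetime.strptime(fields[0], "%m/%d/%Y %H:%M:%S")
            user = USER_RE.search(fields[3])
            cput = CPUT_RE.search(fields[3])
            if not (user and cput):
                continue
            h, m, s = (int(x) for x in cput.groups())
            if h * 3600 + m * 60 + s > 60:  # "non-trivial": > 1 minute of CPU
                year, week = when.isocalendar()[:2]
                users_by_week[(year, week)].add(user.group(1))

    for (year, week), users in sorted(users_by_week.items()):
        print("%d week %02d: %3d users" % (year, week, len(users)))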

Data at RAL
- All data in Kanga format is at RAL
  - 19 TB currently on disk
  - Series-8 + series-10 + reskimmed series-10
  - AllEvents + streams data + signal and generic MC
- New data copied from SLAC within 1-2 days
- RAL is now the primary Kanga analysis site
- See Nicole's talk for details

Changes since July
- Two new RedHat 6 front-end machines
  - Dedicated to BaBar use
  - Login to babar.gridpp.rl.ac.uk
- Trial RedHat 7.2 service
  - One front-end and (currently) 5 batch workers
  - Once we are happy with the configuration, many/all of the remaining batch workers will be rapidly upgraded
- ssh AFS token passing installed on the front-ends
  - So your local (e.g. SLAC) token is available when you log in
- Trial Grid Gatekeeper available (EDG 1.2)
  - Allows job submission from the Grid (see the sketch after this list)
- Improved new-user registration procedures
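To make the Gatekeeper item concrete, a Grid test job of this era could be driven roughly as below. This is a hypothetical sketch, not the documented RAL recipe: the gatekeeper contact string is an assumption, and a valid Grid proxy (created with grid-proxy-init) must already exist.

    import subprocess

    # Assumed gatekeeper contact string; check the Testbed pages for the
    # real RAL endpoint.
    GATEKEEPER = "babar.gridpp.rl.ac.uk/jobmanager-pbs"

    # Run a trivial test job via the standard Globus tools (requires a
    # valid Grid proxy from grid-proxy-init).
    result = subprocess.run(
        ["globus-job-run", GATEKEEPER, "/bin/hostname"],
        capture_output=True, text=True, check=True)
    print(result.stdout, end="")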

Plans
- Upgrade the full farm to RedHat 7.2
  - Leave a RedHat 6 front-end for use with older releases
- Upgrade Suns to Solaris 8 and integrate them into the PBS queues
- Install dedicated data import-export machines
  - Fast (Gigabit) network connection
  - Special firewall rules to allow scp, bbftp, bbcp, etc. (see the sketch after this list)
- AFS authentication improvements
  - PBS token passing and renewal
  - Integrated login (AFS token on login, like SLAC)
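As an illustration of how the import-export machines might be driven, the sketch below pulls a single file with bbftp over several parallel TCP streams. The remote host, paths, account name, and stream count are all assumptions made for the example; it also assumes a bbftpd server is reachable on the remote side.

    import subprocess

    REMOTE_HOST = "bbr-xfer.slac.stanford.edu"           # assumed SLAC transfer node
    REMOTE_FILE = "/store/kanga/series-10/example.root"  # assumed remote path
    LOCAL_DIR = "/stage/import"                          # assumed local staging area

    # Fetch one file with bbftp, using 4 parallel streams.
    subprocess.run(
        ["bbftp",
         "-u", "babar",                               # assumed remote account
         "-p", "4",                                   # parallel TCP streams
         "-e", "get %s %s" % (REMOTE_FILE, LOCAL_DIR),
         REMOTE_HOST],
        check=True)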

Plans (continued)
- Objectivity support
  - Works now for private federations, but no data import yet
- Support Grid generic accounts, so special RAL user registration is no longer necessary
- Procure next batch of hardware
  - Delivery probably early 2003

Summary
- Significant hardware available, and now being fully used
- Disk problems now understood and being fixed
- Improvements planned and underway to make using RAL as SLAC-like as possible (but faster, and maybe better!)
- Join us!
  - See the BaBar home page -> New Accounts
  - Contact Emmanuel Olaiya (at SLAC) or me (at RAL) for help