RAL Tier A Status
Tim Adye, Rutherford Appleton Laboratory
BaBar UK Collaboration Meeting, Liverpool, 11th April 2003

Slide 2: BaBar Batch CPU Use at RAL [chart]

Slide 3: BaBar Batch Users at RAL (running at least one non-trivial job each week) [chart]

Slide 4: Kanga Disk Saga
- In December we had filled up all ~20 TB at RAL.
- Freed up some space by deleting (most) old Series-8 data and started importing the backlog.
- A minor upgrade of our old data server, csfsun02, on 19 February prompted a major loss of data. Recovered:
  - 1.3 TB scavenged from the csfsun02 disks
  - 1.4 TB re-imported from SLAC disk
  - 0.3 TB restored from SLAC HPSS
- Halfway through recovering, we discovered that csfsun02 was still bad; all data was migrated to borrowed servers.
- All Kanga data restored and up-to-date with SLAC production on 28 March.

Slide 5: Security Incident
- The SucKIT Linux root exploit has been spreading throughout the HEP community.
- An infected machine records all passwords typed on that machine, including passwords used to connect to other machines (ssh included; fortunately not klog).
- It is quite possible that CSF passwords have been compromised via another system.
- To protect CSF from further attack, all passwords that had been used recently were reset on Tuesday.
- Users were contacted by phone and post; I can give you your new password today.

Slide 6: Linux Upgrade
- Nearly all machines at RAL now run RedHat 7.2. The exceptions are:
  - the babar-old.gridpp.rl.ac.uk front-end (AKA csfc), which will be switched off next week
  - the babarbuild batch queue
- RH72 batch workers can run RH6 jobs, but RH72 machines cannot build code in release analysis-13 and earlier, so either:
  - upgrade to analysis-13b or later, or
  - use the babarbuild queue to compile and link, then run in the normal queues (see the sketch below)
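A possible workflow for the second option, sketched under two assumptions: that the -q queue option shown later for the babarsol queue also selects the babarbuild queue, and that build.sh and run.sh are hypothetical stand-ins for your own build and analysis scripts.

  # Compile and link in an RH6-compatible environment on the babarbuild queue
  bbrbsub -q babarbuild build.sh
  # ...then submit the actual analysis job to the normal RH72 queues
  bbrbsub -l cput=01:00:00 run.sh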

Slide 7: CSF Batch System
- Much work behind the scenes on reliability and optimising the queuing algorithms.
- Use bbrbsub to submit jobs, e.g.
    bbrbsub -l cput=01:00:00 BetaApp myAnalysis.tcl
- bbrbsub is a wrapper for qsub, so you can use qsub options (see "man qsub").
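Since bbrbsub passes its options through to qsub, the standard PBS flags should work; a sketch, assuming the usual PBS option set (-N for the job name, -o/-e for stdout/stderr) alongside the -l resource limit shown above:

  # Name the job, set a CPU-time limit, and redirect stdout/stderr
  bbrbsub -N myjob -l cput=02:00:00 -o myjob.out -e myjob.err BetaApp myAnalysis.tcl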

Slide 8: Recently Planned Improvements – 1 (since November)
- Install dedicated import/export machines:
  - fast (Gigabit) network connection
  - special firewall rules to allow scp, bbftp, bbcp, etc.
  - two new RH72 Linux machines; csfmove01.rl.ac.uk for exports
- AFS authentication improvements:
  - PBS token passing and renewal
  - integrated login (AFS token on login, like SLAC): not yet implemented
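To give the flavour, pulling an exported file from csfmove01 might look like the following; the login name and path are invented, and the bbftp line uses the generic IN2P3 client syntax rather than a RAL-specific recipe:

  # Plain scp through the firewall rules on the export machine (hypothetical path)
  scp mylogin@csfmove01.rl.ac.uk:/export/area/file.root .
  # Or a multi-stream transfer with bbftp (generic client syntax)
  bbftp -u mylogin -e "get /export/area/file.root file.root" csfmove01.rl.ac.uk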

Slide 9: Recently Planned Improvements – 2 (since November)
- Objectivity support:
  - works now for private federations, but no data import
  - the first step will be to provide Objy conditions database access
  - an Objy conditions snapshot was installed by Tim Barrass… then we lost our Objy server, csfsun02
- Upgrade the Suns to Solaris 8 and integrate them into PBS:
  - 4 x 4-CPU Solaris 8 systems now available in the babarsol queue, e.g.
      bbrbsub -q babarsol job.sh

Slide 10: Recently Planned Improvements – 3 (since November)
- Support Grid "generic accounts", so special RAL user registration is no longer necessary.
  - Users without an entry in the grid-mapfile will be assigned to babar001, babar002, … babar050.
  - The pool account will forever more be bound to that certificate DN, so you will always run under the same babar0NN.
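For illustration only: once a pool account has been leased, the effect is equivalent to grid-mapfile entries like the ones below. The DNs are invented, babar007 is an arbitrary pool member, and the actual mechanism at RAL may use the EDG "gridmapdir" pool machinery rather than literal per-user lines.

  # A user registered in the conventional way maps to a named account
  "/C=UK/O=eScience/OU=CLRC/L=RAL/CN=some user"  tadye
  # An unregistered DN is leased a pool account and keeps it thereafter
  "/O=Grid/O=BaBar/CN=another user"  babar007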

Slide 11: Support
- For help, post to the "RAL Tier A" HyperNews forum, or contact Emmanuel Olaiya (at SLAC) or me (at RAL).