Scientific Computing in PPD and other odds and ends - Chris Brew

Tier 2/3 This Year
Large expansion:
– 960 extra job slots
– 400 TB more disk
Total now:
– 1,584 job slots
– 650 TB of disk

Usage
– 1.66 million jobs
– 6 million kSI2k·hours of CPU
– 23 VOs
– ~7% of UK Tier 2 CPU
– ~12% of UK Tier 2 disk

Scientific Linux 5
Almost all of the batch nodes have moved to SL5
– SL5 boxes are 64-bit but should have the 32-bit compatibility software installed
10 nodes (40 slots) are still on SL4
– Accessed through the sl4-prod queue
One (test) SL5 front end/UI
– heplnx109
Will replace most of the remaining UIs over the next year
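For illustration, here is a minimal Python sketch of how a user might steer work onto the remaining SL4 capacity, assuming a Torque/PBS-style batch system where `qsub -q` selects the queue; the job script name is a placeholder, not something from the slides.

```python
#!/usr/bin/env python
"""Hedged sketch: submit a job script to the SL4 batch queue (sl4-prod).

Assumes a Torque/PBS-style batch system where `qsub -q <queue>` selects the
queue; the job script name used below is purely illustrative.
"""
import subprocess
import sys

def submit_to_queue(job_script, queue="sl4-prod"):
    """Submit job_script to the given batch queue and return the job ID."""
    result = subprocess.run(
        ["qsub", "-q", queue, job_script],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # qsub prints the identifier of the new job

if __name__ == "__main__":
    job_id = submit_to_queue(sys.argv[1] if len(sys.argv) > 1 else "myjob.sh")
    print("Submitted as", job_id)
```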

Other Changes
Mostly hidden:
– Replaced the batch server and compute elements with new, faster hardware
– Major dCache upgrade and reconfiguration
  Now at the “golden release”
  Simplified configuration
  Replaced the PNFS namespace with Chimera
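The namespace switch should be largely invisible to users if, as with PNFS, the Chimera namespace is still exported over NFS and mounted under /pnfs; the sketch below, with an assumed mount point, simply walks that tree to count entries.

```python
#!/usr/bin/env python
"""Hedged sketch: walk the dCache namespace through its NFS mount.

Assumes the Chimera namespace is exported over NFS and mounted at /pnfs,
as the old PNFS tree was; the mount point is an assumption, not a detail
taken from the slides.
"""
import os

def count_namespace_entries(root="/pnfs"):
    """Return (number of directories, number of file entries) under root."""
    n_dirs = n_files = 0
    for _dirpath, dirnames, filenames in os.walk(root):
        n_dirs += len(dirnames)
        n_files += len(filenames)
    return n_dirs, n_files

if __name__ == "__main__":
    dirs, files = count_namespace_entries()
    print("directories:", dirs, "file entries:", files)
```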

Network Upgrade
Installed a 10 Gb/s link between R1 Lab 8 and Atlas A5 Lower
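A rough back-of-envelope sketch of what the new link means, using only the figures quoted in these slides (real transfers would of course run well below line rate):

```python
#!/usr/bin/env python
"""Hedged back-of-envelope: what a 10 Gb/s link means for the site's data.

Uses only figures from these slides: the 10 Gb/s link and the 650 TB of
disk quoted earlier in the talk.
"""
LINK_GBIT_PER_S = 10   # new link between R1 Lab 8 and Atlas A5 Lower
DISK_TB = 650          # total disk quoted earlier in the talk

bytes_per_s = LINK_GBIT_PER_S * 1e9 / 8   # 10 Gb/s = 1.25 GB/s at line rate
total_bytes = DISK_TB * 1e12
days = total_bytes / bytes_per_s / 86400  # ~6 days at full line rate

print("Line rate: %.2f GB/s" % (bytes_per_s / 1e9))
print("Time to move %d TB at line rate: ~%.1f days" % (DISK_TB, days))
```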

TWiki
Still slightly pre-production
– Though some groups are already using it
Uses grid certificates for authentication
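As a hedged illustration of certificate-based access, the sketch below fetches a wiki page while presenting an X.509 user certificate; the URL and key locations are assumptions, with ~/.globus/ being the conventional place for grid credentials.

```python
#!/usr/bin/env python
"""Hedged sketch: fetch a TWiki page using a grid (X.509) user certificate.

The wiki URL and certificate/key paths are assumptions for illustration;
the slide only states that the TWiki authenticates with grid certificates.
"""
import os
import requests

WIKI_URL = "https://www.example.org/twiki/bin/view/Main/WebHome"  # placeholder URL
CERT = os.path.expanduser("~/.globus/usercert.pem")  # conventional grid cert location
KEY = os.path.expanduser("~/.globus/userkey.pem")    # key must be unencrypted for requests

def fetch_page(url=WIKI_URL):
    """GET the page, presenting the user certificate for client authentication."""
    response = requests.get(url, cert=(CERT, KEY))
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    print(fetch_page()[:200])
```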

Local Bastion Host
Will talk to the Computing Forum about requirements
Some open questions:
– Linux or OpenBSD?
– Passwords, ssh keys, or grid certificates?
– Shared or separate accounts?
– Access to the home file system?
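Purely to illustrate one of the options under discussion (ssh keys plus a jump host), a sketch with placeholder host and user names follows; nothing here reflects a decision.

```python
#!/usr/bin/env python
"""Hedged sketch: reach an internal machine through a bastion with ssh keys.

Host and user names are placeholders; this only illustrates the
"ssh keys + jump host" option from the list of open questions above.
"""
import subprocess

BASTION = "bastion.example.org"   # placeholder bastion host name
TARGET = "heplnx109.example.org"  # placeholder internal host

def run_via_bastion(command, user):
    """Run a command on TARGET, tunnelling through BASTION via ProxyCommand."""
    proxy = "ssh -q {user}@{bastion} nc %h %p".format(user=user, bastion=BASTION)
    return subprocess.call([
        "ssh",
        "-o", "ProxyCommand=" + proxy,
        "{user}@{target}".format(user=user, target=TARGET),
        command,
    ])

if __name__ == "__main__":
    run_via_bastion("hostname", user="someuser")
```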

Support Matrix
OS                 Desktop          Laptop
Windows            Fully Supported  Fully Supported
Mac OS X           Not Supported    Best Effort
Scientific Linux   Supported        Not Supported
Other Linux        Not Supported    Not Supported

Computing Group Choices
Person   Desktop   Laptop
Dave
Chris
Kevin
Alan
Rob

Conclusions
No major changes this year, mainly consolidation and preparation for LHC data
That will probably continue for most of next year
We believe we are in a good position to help the experiments and local users do the best physics possible