Oxford PP Computing Site Report HEPSYSMAN 28th April 2003 Pete Gronbech.

General Strategy: Approx 200 Windows 2000 desktop PCs, with Exceed used to access central Linux systems. Digital Unix and VMS phased out for general use. Red Hat Linux 7.3 is becoming the standard.

Network Access (diagram): the Campus Backbone Router connects to Super Janet 4 at 2.4Gb/s. Traffic reaches the Physics Backbone Router through the OUCS firewall and the Physics firewall over 100Mb/s and 1Gb/s links, while other departments hang off Backbone Edge Routers at 100Mb/s and 1Gb/s.

Physics Backbone Upgrade to Gigabit, Autumn 2002 (diagram): the Physics Backbone Router links the Physics Firewall, Clarendon Lab, Astro, Theory and Atmos at 1Gb/s; Particle Physics desktops connect at 100Mb/s through a Gb/s switch to the Linux and Win 2k servers.

Particle Physics systems (diagram, Autumn 2002): general purpose systems pplx1, pplx2, pplxgen and the file server pplxfs1, plus morpheus (CDF), connected at 1Gb/s; DAQ machines ppcresst1/ppcresst2 (CRESST), ppatlas1/atlassbc (Atlas), ppminos1/ppminos2 (MINOS); pplx3 (SNO) and ppnt117 (HARP); Grid development nodes grid, pplxbatch, pptb01, pptb02 (edg ui, sam testing) and the testbed tblcfg, tbse01, tbce01; and the PBS batch farm of dual 2.4GHz systems. Operating systems are a mix of RH 6.2, RH 7.1, RH 7.3 and Fermi Linux.

General Purpose Systems (diagram): pplxfs1 and pplxgen (RH 7.3) and pplx2 (RH 6.2), connected at 1Gb/s to the PBS batch farm of dual 2.4GHz systems (RH 7.3, Autumn 2002).

Zero-D X-3i SCSI-IDE RAID with 12 * 160GB Maxtor drives, supplied by Compusys. This proved to be a disaster and was rejected in favour of bare SCSI disks, which we mounted internally in our rack-mounted file server.

The Linux File Server: pplxfs1, with 8 * 146GB SCSI disks.

General Purpose Linux Server: pplxgen. pplxgen is a dual 2.2GHz Pentium 4 Xeon based system with 2GB RAM, running Red Hat 7.3. It was brought online at the end of August 2002 to share the load with pplx2 as users migrated off al1 (the Digital Unix server).

The PP batch farm, running Red Hat 7.3 with OpenPBS, can be seen below pplxgen. This service became fully operational in February 2003.
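
For illustration only (not from the original slides): work is submitted to an OpenPBS farm like this with a short job script; the job name, resource limits and executable below are hypothetical.

    #!/bin/sh
    #PBS -N test_job              # hypothetical job name
    #PBS -l nodes=1:ppn=1         # one CPU on one worker node
    #PBS -l walltime=01:00:00     # one hour wall-clock limit
    #PBS -j oe                    # merge stdout and stderr into one file
    cd $PBS_O_WORKDIR             # run from the directory the job was submitted from
    ./run_analysis                # hypothetical user executable

Such a script would be submitted with "qsub job.sh" and monitored with "qstat".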

Systems diagram, February 2003: pplx1 (new) and morpheus, connected at 1Gb/s; the CDF cluster matrix with cdfsam and node1 to node9 (Fermi Linux); Grid development nodes grid, pplxbatch, pptb01 and pptb02 (edg ui, sam testing); the testbed nodes tblcfg, tbse01, tbce01, tbwn01, tbwn02 and tbgen01; and LHCb MC. Operating systems are a mix of Fermi Linux, RH 6.1, RH 6.2, RH 7.1 and RH 7.3.

Grid development systems, including the EDG software testbed setup.

New Linux Systems: Morpheus is an IBM x370 8-way SMP with 700MHz Xeons, 4GB RAM and 1TB of Fibre Channel disks. Installed August 2001. Purchased as part of a JIF grant for the CDF group. Runs Red Hat 7.1, and will use CDF software developed at Fermilab and here to process data from the CDF experiment.

Tape backup is provided by a Qualstar TLS4480 tape robot with 80 slots and dual Sony AIT3 drives; each tape can hold 100GB of data. Installed in January. NetVault software from BakBone, running on morpheus, is used for backup of both CDF and particle physics systems.

Second round of the CDF JIF tender: Dell cluster - MATRIX. 10 dual 2.4GHz P4 Xeon servers running Fermi Linux and SCALI cluster software. Installed December 2002.

Approx 7.5TB of SCSI RAID 5 disks are attached to the master node; each shelf holds GB drives. These are shared via NFS with the worker nodes. OpenPBS batch queuing software is used.
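
As an illustrative sketch only (paths, hostnames and subnet are hypothetical, not taken from the slides), sharing RAID volumes from a master node to PBS worker nodes over NFS amounts to a few export lines on the master and matching mounts on each worker:

    # /etc/exports on the master node (hypothetical paths and farm subnet)
    /raid/data1   192.168.1.0/24(rw,sync,no_root_squash)
    /raid/data2   192.168.1.0/24(rw,sync,no_root_squash)

    # /etc/fstab entry on each worker node (hypothetical hostname "master")
    master:/raid/data1   /data1   nfs   rw,hard,intr   0 0

Running "exportfs -ra" on the master re-reads the exports after a change.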

Plenty of space in the second rack for expansion of the cluster.

LHCb Monte Carlo Setup (diagram): an 8-way 700MHz Xeon compute server (RH6.2, OpenAFS, OpenPBS) sitting behind a Grid gateway, grid (RH6.2, Globus 1.1.3, OpenAFS, OpenPBS). The 8-way SMP has now been reloaded as an MS Windows Terminal Server, and LHCb MC jobs will be run on the new PP farm.

Problems: The IDE RAID proved to be unreliable and caused lots of downtime. Problems with NAT (using iptables caused NFS problems and hangs) were solved by dropping NAT and using real IP addresses for the PP farm. Trouble with ext3 journal errors. Hackers…
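
For context only (a generic sketch, not the site's actual configuration): NAT for a private farm subnet under iptables of that era typically came down to a masquerading rule like the one below, and it is this kind of address rewriting that interacted badly with NFS; giving the farm routable addresses removes the need for it.

    # Hypothetical masquerading rule for a private farm subnet (assumed 192.168.1.0/24)
    # leaving via the gateway's external interface eth0 - the sort of NAT dropped here.
    iptables -t nat -A POSTROUTING -s 192.168.1.0/24 -o eth0 -j MASQUERADE
    echo 1 > /proc/sys/net/ipv4/ip_forward   # enable IP forwarding on the gateway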

Problems: Lack of manpower! The number of operating systems is slowly reducing; Digital Unix and VMS are very nearly gone, and NT4 is also practically eliminated. Getting closer to standardising on RH 7.3, especially as the EDG software is now heading that way. Still finding it very hard to support laptops, but we now have a standard clone and recommend IBM laptops. Would be good to have more time to concentrate on security… (see later talk).