Edinburgh Site Report – 1 July 2004 – Steve Thorn, Particle Physics Experiments Group


1 July 2004, Steve Thorn – UK HEP System Managers Meeting

Slide 2: Computing structure
– Tiered approach:
  – Physics and Astronomy – Computing Support Team (CST)
  – PPE Research Group – system manager (physman)
– Works quite well, but the group doesn't manage everything itself, so there have been some problems, e.g. videoconferencing

Slide 3: Computing Support Team
– 5 FTEs
– Provide common platform based on RHEL WS 3.0 or WinXP
– Responsible for most department-wide computing infrastructure: network, DHCP, NIS, DNS, firewall, tape backups, updates
– No support for other OSes
– Strict security regulations – firewall issues
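The "strict security regulations" imply a default-deny border firewall. A minimal iptables sketch of that kind of policy is shown below; the permitted port and all rule choices are illustrative assumptions for a RHEL-3-era netfilter setup, not the department's actual rules:

```shell
#!/bin/sh
# Minimal default-deny firewall sketch (iptables). Every rule here is an
# illustrative assumption, not taken from the site report.
iptables -P INPUT DROP                                            # deny inbound by default
iptables -P FORWARD DROP                                          # this host does not route
iptables -A INPUT -i lo -j ACCEPT                                 # allow loopback traffic
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT  # allow replies to outbound
iptables -A INPUT -p tcp --dport 22 -j ACCEPT                     # e.g. inbound ssh only
```

A policy like this is what typically causes the "firewall issues" a slide like this alludes to: any new inbound service (videoconferencing, grid ports) needs an explicit hole punched.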

Slide 4: physman
– ~0.3 FTE per research group (RA, PhD student)
– First point of contact for all computing issues within the group
– Responsible for:
  – purchasing
  – group-specific software and customization – OpenAFS, CERNLIB, etc.
  – security implications of group-specific software/customization
  – printer queue management with CUPS
  – laptops
  – intrusion detection monitoring – Tripwire
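Two of the routine tasks above can be illustrated with standard commands. The queue name and printer address are hypothetical, and the Tripwire invocations are the stock ones from its documentation rather than anything site-specific:

```shell
# Hypothetical CUPS queue setup: create and enable a queue "ppe-laser"
# pointing at an illustrative network printer (JetDirect-style socket URI).
lpadmin -p ppe-laser -E -v socket://printer1.example.ac.uk:9100

# Stock Tripwire usage: initialise the integrity database once, then run
# periodic checks against it (typically from a cron job).
tripwire --init
tripwire --check
```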

Slide 5: Network

Slide 6: Network
– Physics and Astronomy: 100 Mbit/s
  – 'Physics' network for supported WinXP and RHEL
  – 'Private' network for anything else, laptops and visitors
– SRIF-funded network: 1 Gbit/s fibre parallel to EdLAN, used by ScotGrid
– Wireless in selected areas (University-wide)

Slide 7: PPE hardware
– Sun Enterprise 250 server, Solaris 8, TB RAID serving home directories; Web server
– 23 desktops
  – 85% RHEL WS 3.0
  – 15% Windows, MS Office, VRVS, lab
  – almost all Dell
  – 0–6 years old
– 5 laptops – mainly dual-boot Windows/Linux
– BaBar compute farm: 4 × Sun Ultra 80, 1 × Ultra 5, currently offline

Slide 8: ScotGrid
– Storage bias
  – IBM xSeries 440, 8 × Xeon 1.9 GHz, 32 GB RAM
  – 2 × FAStT900 Storage Server, total 22 TB RAID 5
– Front ends
  – 2 × IBM xSeries 205, P4 1.8 GHz, 256 MB RAM
  – 1 × IBM xSeries 340, 2 × PIII 1.0 GHz, 2 GB RAM
– LTO Ultrium tape library
– Need worker node(s)
– Currently installing LCG2 – Test Zone in next 1–2 weeks

Slide 9: RHEL experience
– Subscription under Red Hat's Education Programme
  – Base package (Proxy server, students): £1500 p.a.
  – RHEL: £5 p.a. per FTE, ~100 FTEs
  – A few AS licences with phone support
  – Total: < £4000 p.a. for Physics and Astronomy
  – Coverage: unlimited use for staff and students on University- or privately-owned hardware
– In use since April 2004
– Updates via up2date and the Proxy server, nightly cron job
– Good value for money
  – Reduced OS upgrades
  – Updates come out quickly
  – Web-based status monitoring an unexpected extra
– Removal of some useful packages, e.g. Pine
– Red Hat not clear on exactly how the unlimited licence works – we have a nominal upper limit of 200 seats
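As a sanity check of the pricing, the quoted figures add up as sketched below (the FTE count is approximate per the slide, and the AS licences account for the remaining headroom up to the < £4000 total):

```shell
# Back-of-envelope check of the Education Programme figures quoted above.
BASE=1500        # base package (Proxy server, students), GBP per year
PER_FTE=5        # RHEL WS per FTE, GBP per year
FTES=100         # approximate FTE count from the slide
TOTAL=$((BASE + PER_FTE * FTES))
echo "RHEL core subscription: £${TOTAL} p.a."   # £2000; AS licences keep the total under £4000
```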

Slide 10: Hack reports
– No Linux-based intrusions in the last two years
– Blaster worm before installation of firewall (Jan 2004)

Slide 11: Future plans
– Phase out Sun/Solaris
– Purchase Linux server and unify PPE group storage
– Move ScotGrid to a dedicated computing facility (out of town)
– There will be more and more ScotGrid hardware…