14th October 2010 Graduate Lectures 1
Oxford University Particle Physics Unix Overview
Pete Gronbech, Senior Systems Manager and SouthGrid Technical Co-ordinator

14th October 2010 Graduate Lectures 2
- Strategy
- Local Cluster Overview
- Connecting to it
- Grid Cluster
- Computer Rooms

14th October 2010 Graduate Lectures 3
Particle Physics Strategy: The Server / Desktop Divide
[Diagram] Desktops: Windows XP PCs, Windows 7 PCs and Linux desktops. Servers: general purpose Unix servers, group DAQ systems, Linux worker nodes, web server, Linux file servers, virtual machine host, NIS server and torque server.
Approx. 200 Windows XP desktop PCs, with Exceed, PuTTY or ssh used to access the central Linux systems.

14th October 2010 Graduate Lectures 4
Particle Physics Linux
- Unix Team (Room 661):
  - Pete Gronbech - Senior Systems Manager and SouthGrid Technical Coordinator
  - Ewan MacMahon - Systems Administrator
  - Kashif Mohammad - Deputy Technical Coordinator
- Aim: to provide a general purpose Linux-based system for code development, testing and other Linux-based applications.
- Interactive login servers and batch queues are provided.
- Systems run Scientific Linux, a free distribution based on Red Hat Enterprise Linux.
- Systems currently run a mixture of SL4 and SL5.
- The systems are currently being migrated to SL5, the same version used on the Grid and at CERN. Students are encouraged to test pplxint5 and let us know of any problems.
- Worker nodes form a PBS (aka torque) cluster accessed via batch queues; a minimal submission example is sketched below.
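As a rough sketch of how such a PBS/torque batch queue is typically used (the job name and the executable here are illustrative, not taken from the slides):

    #!/bin/bash
    # myjob.sh - minimal PBS/torque batch script (illustrative names)
    #PBS -N myjob             # job name shown in the queue
    #PBS -l nodes=1:ppn=1     # one core on one worker node
    #PBS -l walltime=01:00:00 # one hour wall-clock limit
    cd "$PBS_O_WORKDIR"       # start in the directory the job was submitted from
    ./my_analysis             # replace with your own program

Submitted from an interactive login node with "qsub myjob.sh"; "qstat -u $USER" then shows its state.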

14th October 2010 Graduate Lectures 5
Current Clusters
- Particle Physics local batch cluster
- Oxford's Tier 2 Grid cluster

14th October 2010 Graduate Lectures 6
PP Linux Batch Farm (Scientific Linux 4), 88 active slots
[Diagram] An alias points to pplxint2. Interactive login nodes pplxint1, pplxint2 and pplxint3 (8 Intel 5420 cores each); worker nodes pplxwn01-pplxwn11 (8 Intel 5420 cores each); NFS servers pplxfs2 (6 TB), pplxfs3, pplxfs4 (9 TB) and pplxfs6 holding the home and data areas (ATLAS, CDF and LHCb data); plus 19 TB Lustre volumes.
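For orientation, the state of the farm can be inspected from a login node with the standard torque commands (no site-specific options assumed):

    qstat -q        # summary of the batch queues and how many jobs each holds
    pbsnodes -a     # state of each worker node (pplxwn01, pplxwn02, ...)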

Particle Physics Computing: Lustre
[Diagram] Lustre MDS, Lustre OSS01, OSS02 and OSS03 (18 TB and 44 TB units), a Lustre NFS gateway, SL4 nodes and an SL5 node.

    df -h /data/atlas
    Filesystem                                  Size  Used Avail Use% Mounted on
    pplxlustrenfs.physics.ox.ac.uk:/data/atlas   76T   46T   27T  64% /data/atlas

    df -h /data/lhcb
    Filesystem                                  Size  Used Avail Use% Mounted on
    pplxlustrenfs2.physics.ox.ac.uk:/data/lhcb   18T  8.5T  8.6T  50% /data/lhcb

14th October 2010 Graduate Lectures 8
PP Linux Batch Farm: Scientific Linux 5 migration plan
[Diagram] SL5 interactive login nodes pplxint5 and pplxint6; SL5 worker nodes pplxwn12-pplxwn16, pplxwn18 and pplxwn19 (a mixture of 8 * Intel 5420 and 8 * Intel 5345 cores); some of these are currently acting as NFS-Lustre gateways for the SL4 nodes.

14th October 2010 Graduate Lectures 9

14th October 2010 Graduate Lectures 10
Strong Passwords etc
- Use a strong password, not one open to dictionary attack!
  - fred123 - no good
  - Uaspnotda!09 - much better
- Better still, use ssh with a passphrased key stored on your desktop (see the sketch below).
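A minimal sketch of creating a passphrased key on a Linux or Mac desktop (on Windows the PuTTYgen route on slide 13 does the same job); the key type, filename and full hostname are common defaults and an assumption, not prescribed by the slides:

    ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa          # prompts for a passphrase; choose a strong one
    ssh-copy-id <username>@pplxint2.physics.ox.ac.uk     # appends the public key to ~/.ssh/authorized_keys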

14th October 2010 Graduate Lectures 11
Connecting with PuTTY
Demo:
1. Plain ssh terminal connection
2. With key and Pageant
3. ssh with X windows tunnelled to passive Exceed
4. ssh, X windows tunnel, passive Exceed, KDE session
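For reference, the equivalent of steps 1 and 3 from a Linux or Mac terminal looks roughly like this (the full hostname is an assumption based on the physics.ox.ac.uk names elsewhere in these slides):

    ssh <username>@pplxint2.physics.ox.ac.uk      # plain terminal login
    ssh -X <username>@pplxint2.physics.ox.ac.uk   # tunnel X windows so graphical programs display locally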

14th October 2010 Graduate Lectures 12

14th October 2010 Graduate Lectures 13
PuTTYgen to create an ssh key on Windows
- Paste the public key into ~/.ssh/authorized_keys on pplxint.
- If you are then likely to hop to other nodes, add "ForwardAgent yes" to a file called config in the .ssh directory on pplxint (see the sketch below).
- Save the public and private parts of the key to a subdirectory of your H: drive.
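A sketch of what that config looks like on pplxint; the Host pattern is illustrative, and ssh ignores these files unless the permissions are tight:

    # contents of ~/.ssh/config on pplxint
    Host pplxwn* pplxint*
        ForwardAgent yes    # pass your loaded key on when hopping to other nodes

    # from a shell, keep the permissions tight:
    # chmod 700 ~/.ssh && chmod 600 ~/.ssh/config ~/.ssh/authorized_keys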

14th October 2010 Graduate Lectures 14
Pageant
- Run Pageant once after login to load your (Windows) ssh key.
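On a Linux or Mac desktop, the rough equivalent of Pageant is ssh-agent, which most desktop sessions already run for you; a minimal sketch:

    eval "$(ssh-agent)"     # start an agent if your session has not already done so
    ssh-add ~/.ssh/id_rsa   # type the passphrase once; later ssh logins reuse the loaded key
    ssh-add -l              # list the keys the agent currently holds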

14th October 2010 Graduate Lectures 15
SouthGrid Member Institutions
- Oxford
- RAL PPD
- Cambridge
- Birmingham
- Bristol
- JET at Culham

14th October 2010 Graduate Lectures 16
Oxford Tier 2 Grid Upgrade 2008
- 13 systems, 26 servers, 52 CPUs, 208 cores. Intel 5420 Clovertown CPUs provide ~540 kSI2K.
- 3 servers, each providing 20 TB usable storage after RAID 6; ~60 TB total.
- One rack, 2 PDUs, 2 UPSs, 3 3COM 5500G switches.

14th October 2010 Graduate Lectures 17
2010 Upgrade, Due in November
- Compute servers
  - Twin-squared nodes: dual 8-core AMD Opteron 6128 CPUs provide 64 cores per unit.
- Storage
  - 24 * 2 TB disks per unit (~44 TB after RAID 6)
    - Increase in LHCb capacity
    - Allow migration off older servers
  - 36 * 2 TB disks per unit (~68 TB after RAID 6)
    - Grid cluster upgrade, ~200 TB
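For reference, RAID 6 reserves two disks' worth of capacity for parity, which is where the quoted usable figures come from:

    (24 - 2) * 2 TB = 44 TB usable per 24-disk unit
    (36 - 2) * 2 TB = 68 TB usable per 36-disk unit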

14th October 2010 Graduate Lectures 18
Get a Grid Certificate
- You must use the same web browser to request and retrieve the Grid certificate.
- Once you have it in your browser you can export it to the Linux cluster to run grid jobs.
- Details of these steps, and of how to request membership of the SouthGrid VO (if you do not belong to an existing group such as ATLAS or LHCb), are here:
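The export step usually means saving the certificate from the browser as a PKCS#12 file and converting it on the Linux cluster; a common recipe (the filenames are conventional, not mandated by the slides) is:

    # copy the browser-exported file (e.g. mycert.p12) to the cluster, then:
    mkdir -p ~/.globus
    openssl pkcs12 -in mycert.p12 -clcerts -nokeys -out ~/.globus/usercert.pem   # public certificate
    openssl pkcs12 -in mycert.p12 -nocerts -out ~/.globus/userkey.pem            # private key, kept passphrased
    chmod 444 ~/.globus/usercert.pem
    chmod 400 ~/.globus/userkey.pem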

14th October 2010 Graduate Lectures 19
Two New Computer Rooms provide excellent infrastructure for the future
The new computer room built at Begbroke Science Park, jointly for the Oxford Super Computer and the Physics department, provides space for 55 (11 kW) computer racks, 22 of which will be for Physics. Up to a third of these can be used for the Tier 2 centre. This £1.5M project is funded by SRIF and a contribution of ~£200K from Oxford Physics. The room was ready in December, and the Oxford Tier 2 Grid cluster was moved there during spring. All new Physics High Performance Clusters will be installed here.

14th October 2010 Graduate Lectures 20
Oxford Grid Cluster

14th October 2010 Graduate Lectures 21
Local Oxford DWB Physics Infrastructure Computer Room
Completely separate from the Begbroke Science Park, a computer room with 100 kW cooling and >200 kW power has been built, for ~£150K of Oxford Physics money. This local Physics department infrastructure computer room was completed in September. It allowed local computer rooms to be refurbished as offices again, and racks that were in unsuitable locations to be re-housed.

14th October 2010 Graduate Lectures 22
The end for now…
- Ewan will give more details of the use of the clusters next week
- Help pages
- Questions…
- Network topology (backup slides follow)

14th October 2010 Graduate Lectures 23
Network
- Gigabit connection to campus operational since July.
- Second gigabit connection installed in September.
- Dual 10 gigabit links installed in August 2009.
- Gigabit firewall installed for Physics. A commercial unit (a Juniper ISG 1000 running NetScreen) was purchased to minimise the manpower required for development and maintenance.
- The firewall also supports NAT and VPN services, which is allowing us to consolidate and simplify the network services.
- Moving to the firewall NAT has solved a number of problems we were having previously, including unreliable videoconferencing connections.
- Physics-wide wireless network, installed in DWB public rooms, Martin Wood, AOPP and Theory. The new firewall provides routing and security for this network.

14th October 2010 Graduate Lectures 24
Network Access
[Diagram] The campus backbone router connects to SuperJanet 4 (2 * 10 Gb/s with SuperJanet 5) and, via backbone edge routers (10 Gb/s), to departments at 100 Mb/s or 1 Gb/s. Physics sits behind the OUCS firewall, its own Physics firewall and the Physics backbone router, with 1 Gb/s and 10 Gb/s links.

14th October 2010 Graduate Lectures 25
Physics Backbone
[Diagram] The Physics backbone router sits behind the Physics firewall (1 Gb/s) and connects the sub-department server switches at 1 Gb/s (Particle Physics, Clarendon Lab, Astro, Theory, Atmos). Desktops connect at 100 Mb/s; Linux and Win 2k servers hang off the Particle Physics switch at 1 Gb/s.