Oxford University Particle Physics Unix Overview

Presentation transcript:

Oxford University Particle Physics Unix Overview
Sean Brisbane, Particle Physics Systems Administrator
14th October 2014, Graduate Lectures

Overview
- Strategy
- Local Cluster Overview
- Connecting to it
- Grid Cluster
- Computer Rooms
- How to get help

Particle Physics Strategy: The Server / Desktop Divide

[Diagram: the server side comprises virtual machine host servers, a general purpose Unix server, Linux worker nodes, group DAQ systems, Linux file servers, a web server, a NIS server and a Torque server; the desktop side comprises Windows 7 PCs, an Ubuntu PC and Linux desktops.]

Use Exceed, PuTTY or ssh/X windows from your desktop PC to connect to the Unix servers; approximately 200 desktop PCs access the PP Linux systems this way. Do not store files on the desktop, as they are not backed up. The home areas on the servers are backed up.

Particle Physics Linux

Unix Team (Room 661):
- Pete Gronbech - Senior Systems Manager and GridPP Project Manager
- Ewan MacMahon - Grid Systems Administrator
- Kashif Mohammad - Grid and Local Support
- Sean Brisbane - Local Server and User Support

General purpose interactive Linux systems are provided for code development, short tests and access to Linux-based office applications; these are accessed remotely. Batch queues are provided for longer and more intensive jobs, and are provisioned to meet peak demand and give a fast turnaround for final analysis. The systems run Scientific Linux (SL), a free distribution based on Red Hat Enterprise Linux. The Grid and CERN have migrated to SL6, and the majority of the local cluster is also on SL6, but some legacy SL5 systems are provided for those who need them. We will be able to offer you the most help running your code on the newer SL6; some experimental software frameworks still require SL5.

Current Clusters
- Particle Physics local batch cluster
- Oxford's Tier 2 Grid cluster

PP Linux Batch Farm - Scientific Linux 6

Users log in to the interactive nodes pplxint8 and pplxint9. The home directories and all the data disks (the /home area and the /data/group areas) are shared across the cluster and visible on the interactive machines and on all the batch system worker nodes. There are approximately 300 cores (430 including JAI/LWFA), each with 4 GB of RAM.
- The /home area is where you should keep your important text files such as source code, papers and your thesis.
- The /data/ area is where you should put your big, reproducible input and output data.

[Diagram: interactive login nodes pplxint8 and pplxint9 in front of the SL6 worker nodes - jailxwn01 and jailxwn02 (64 AMD cores each), pplxwn59 and pplxwn60 (16 Intel cores each), pplxwn41 onwards (16 Intel 2650 cores each), pplxwn31 to pplxwn38 (12 Intel 5650 cores each), and pplxwn15 and pplxwn16 (8 Intel 5420 cores each).]
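The farm's batch service is Torque (shown as the "torque server" on the strategy slide), so longer jobs are submitted from pplxint8/9 with qsub. Below is a minimal sketch of a submission script; the job name, resource requests and the run_analysis executable are illustrative assumptions only, so check the local help pages for the recommended settings.

    #!/bin/bash
    # myjob.sh - minimal Torque/PBS submission script (illustrative sketch)
    #PBS -N myanalysis            # job name (assumed)
    #PBS -l nodes=1:ppn=1         # one core on one worker node
    #PBS -l walltime=04:00:00     # requested wall-clock time
    #PBS -j oe                    # merge stdout and stderr into one file

    cd $PBS_O_WORKDIR             # start in the directory the job was submitted from
    ./run_analysis                # hypothetical executable; write big output under /data/<group>

Submit it from an interactive node with "qsub myjob.sh" and check its progress with "qstat -u $USER".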

PP Linux Batch Farm - Scientific Linux 5

Legacy SL5 jobs are supported by a smaller selection of worker nodes: currently eight servers with 16 cores each and 4 GB of RAM per core. All of your files are available from both SL5 and SL6, but the software environment is different, so code compiled for one operating system may not run on the other.

[Diagram: interactive login nodes pplxint5 and pplxint6 in front of the SL5 worker nodes pplxwn23 to pplxwn30 (16 AMD 6128 cores each).]
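If you are unsure which release a given node is running, the standard Red Hat style release file tells you; a quick check after logging in:

    # Prints e.g. "Scientific Linux release 6.x ..." on the SL6 nodes
    cat /etc/redhat-release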

PP Linux Batch Farm - Data Storage

NFS is used to export data to the smaller experimental groups, where the partition size is less than the total size of a server.

[Diagram: NFS file servers (pplxfsn) providing 40 TB, 30 TB and 19 TB data areas plus the home areas.]

The data areas are too big to be backed up. The servers have dual redundant PSUs and RAID 6, and run on uninterruptible power supplies. This safeguards against hardware failures, but does not help if you delete files.

The home areas are backed up nightly by two different systems: the Oxford ITS HFS service and a local backup system. If you delete a file, tell us as soon as you can, including when you deleted it and its full name. The latest nightly backup of any lost or deleted files from your home directory is available at the read-only location /data/homebackup/{username}. The home areas are quota'd, but if you require more space, ask us. Store your thesis on /home, NOT /data.
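For example, to recover a file you deleted from your home area, you can copy it straight back from the read-only backup tree; the path below is purely illustrative:

    # Restore a hypothetical file from the latest nightly home-area backup
    cp /data/homebackup/$USER/thesis/chapter1.tex ~/thesis/chapter1.tex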

Particle Physics Computing - Lustre

The Lustre file system is used to group multiple file servers together to provide extremely large continuous file spaces. This is used for the ATLAS and LHCb groups.

[Diagram: SL5 and SL6 interactive nodes (pplxint5, pplxint8) mounting a Lustre MDS and object storage servers OSS01-OSS04 of 18 TB and 44 TB each.]

    df -h /data/atlas
    Filesystem             Size  Used  Avail  Use%  Mounted on
    /lustre/atlas25/atlas  366T  199T  150T   58%   /data/atlas

    df -h /data/lhcb
    Filesystem             Size  Used  Avail  Use%  Mounted on
    /lhcb25                118T   79T   34T   71%   /data/lhcb25


Strong Passwords etc.

Use a strong password that is not open to dictionary attack!
- fred123 - no good
- Uaspnotda!09 - much better
Better still, use ssh with a passphrase-protected key stored on your desktop.
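For a Linux or Mac desktop, a minimal sketch of setting up key-based login looks like the following; the full hostname pplxint8.physics.ox.ac.uk is an assumption used for illustration, and the Windows/PuTTY route is covered on the next slides.

    # Generate a key pair; choose a strong passphrase when prompted
    ssh-keygen -t rsa -b 4096

    # Copy the public key to your account on an interactive node
    ssh-copy-id username@pplxint8.physics.ox.ac.uk

    # Subsequent logins use the key; the passphrase unlocks it locally
    ssh -X username@pplxint8.physics.ox.ac.uk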

Connecting with PuTTY

Question: how many of you are using Windows, and how many Linux, on the desktop?

Demo:
1. Plain ssh terminal connection, from 'outside of physics' and from the office (no password)
2. ssh with X windows tunnelled to passive Exceed
3. ssh, X windows tunnel, passive Exceed, KDE session
4. Password-less access from 'outside physics'

http://www2.physics.ox.ac.uk/it-services/ppunix/ppunix-cluster
http://www.howtoforge.com/ssh_key_based_logins_putty


Puttygen: create an ssh key on Windows (previous slide, point 4)

- Enter a strong passphrase.
- Save the private part of the key to a subdirectory of your local drive.
- Paste the public key text shown by Puttygen into ~/.ssh/authorized_keys on pplxint.
- If you are then likely to hop on to other nodes, add "ForwardAgent yes" to a file called config in the .ssh directory on pplxint.
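As a rough sketch of what ends up on pplxint after following these steps (the key text and comments are illustrative placeholders, not real values):

    # ~/.ssh/authorized_keys on pplxint: one line per public key,
    # exactly as shown in the Puttygen window (truncated here for illustration)
    ssh-rsa AAAAB3NzaC1yc2E... user@windows-desktop

    # ~/.ssh/config on pplxint: forward the agent so you can hop to other nodes
    Host *
        ForwardAgent yes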

Pageant

Run Pageant once after login. Right-click on the Pageant icon in the system tray and choose "Add Key" to load your private (Windows) ssh key.

SouthGrid Member Institutions
- Oxford
- RAL PPD
- Cambridge
- Birmingham
- Bristol
- Sussex
- JET at Culham

Current Capacity

Compute servers: twin and twin-squared nodes, 1770 CPU cores.
Storage: a total of ~1300 TB. The servers have between 12 and 36 disks each; the more recent disks are 4 TB in capacity. These use hardware RAID and UPS to provide resilience.

Get a Grid Certificate

Apply at http://www.ngs.ac.uk/ukca. The new UKCA page uses a Java-based certificate wizard; you must remember to use the same PC to request and retrieve the Grid certificate.

You will then need to contact central Oxford IT. They will need to see you, with your university card, to approve your request, e.g.:

    To: help@it.ox.ac.uk
    Dear Stuart Robeson and Jackie Hewitt,
    Please let me know a good time to come over to the Banbury Road IT office for you to approve my grid certificate request.
    Thanks.

When you have your Grid certificate…

Save it to a file in your home directory on the Linux systems, e.g. Y:\Linuxusers\particle\home\{username}\mycert.p12, then log in to pplxint9 and run:

    mkdir .globus
    cd .globus
    openssl pkcs12 -in ~/mycert.p12 -clcerts -nokeys -out usercert.pem
    openssl pkcs12 -in ~/mycert.p12 -nocerts -out userkey.pem
    chmod 400 userkey.pem
    chmod 444 usercert.pem
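To confirm the conversion worked and to see when the certificate expires, a quick check with standard openssl options:

    # Print the certificate subject (your Grid identity) and its validity dates
    openssl x509 -in ~/.globus/usercert.pem -noout -subject -dates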

Now Join a VO

This is the Virtual Organisation, such as "ATLAS", so that:
- you are allowed to submit jobs using the infrastructure of the experiment;
- you can access data for the experiment.
Speak to your colleagues on the experiment about this; it is a different process for every experiment!
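Once your VO membership has been approved, you can test it from the interactive nodes by creating a proxy. A minimal sketch, assuming the Grid user interface tools are available there and using the atlas VO purely as an example:

    # Create a VOMS proxy from the certificate in ~/.globus
    # (replace "atlas" with your own experiment's VO)
    voms-proxy-init --voms atlas

    # Show the proxy's attributes and remaining lifetime
    voms-proxy-info --all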

Two Computer Rooms provide excellent infrastructure for the future

The new computer room built at Begbroke Science Park, jointly for the Oxford supercomputer and the Physics department, provides space for 55 (11 kW) computer racks, 22 of which will be for Physics; up to a third of these can be used for the Tier 2 centre. This £1.5M project was funded by SRIF with a contribution of ~£200K from Oxford Physics. The room was ready in December 2007, the Oxford Tier 2 Grid cluster was moved there during spring 2008, and all new Physics high-performance clusters will be installed here.

Local Oxford DWB Physics Infrastructure Computer Room

Completely separate from the Begbroke Science Park, a local Physics department infrastructure computer room with 100 kW of cooling and >200 kW of power has been built with ~£150K of Oxford Physics money. It was completed in September 2007, which allowed local computer rooms to be refurbished as offices again and racks that were in unsuitable locations to be re-housed.

Cold aisle containment

The end of the overview - now for more details on the use of the clusters.

Help pages:
http://www.physics.ox.ac.uk/it/unix/default.htm
http://www2.physics.ox.ac.uk/research/particle-physics/particle-physics-computer-support

Email: pp_unix_admin@physics.ox.ac.uk