Edinburgh Investment in e-Science Infrastructure, Dr Arthur Trew

Presentation transcript:

Edinburgh Investment in e-Science Infrastructure Dr Arthur Trew

The last mile: SRIF1 (11 October)
[Network diagram: the EaStMan campus backbone linking Edinburgh sites to JANET via the BAR router. Sites shown include the Royal Edinburgh Hospital, the Library, Appleton Tower, Kings Buildings, Holyrood, the Robson Building, Old College, the Western General, Pollock Halls, New College, RESNET, the Sick Children's Hospital, the Medical School at Little France, and Bush, with link speeds of 2 Mbit/s, 100 Mbit/s and 1000 Mbit/s; the e-science sites at Kings Buildings, Appleton Tower and the ACF connect at 10 Gb/s.]

… and data too (11 October)
- 155 TB SAN available to all e-science researchers
- total investment £2.2M

Pulling it together: SRIF2 (11 October)
- created the Advanced Computing Facility
- secure site outside Edinburgh
- contains SAN and HPC servers
- total investment £3.8M
- all nodes will be NGS clients (see the sketch below)
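
In practice, "NGS client" meant the nodes would speak the National Grid Service's Globus-based middleware, so a researcher holding a grid certificate could run work on them remotely. A minimal sketch using Globus Toolkit 2 era commands; the host name acf-node.epcc.ed.ac.uk and the job script are hypothetical, and the job-manager suffix would depend on the local batch system:

    # Create a short-lived proxy credential from the user's grid certificate.
    grid-proxy-init

    # Run a trivial command on a (hypothetical) ACF node via its PBS job manager.
    globus-job-run acf-node.epcc.ed.ac.uk/jobmanager-pbs /bin/hostname

    # Submit a longer-running job; prints a job contact usable for status queries.
    globus-job-submit acf-node.epcc.ed.ac.uk/jobmanager-pbs /home/user/myjob.sh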

…but you need people (11 October)
- Professor Peter Clarke, Chair of e-Science
- Dr Lorna Smith, e-Science research
- Dr Phil Clark, e-Science Lectureship
- Sean McGeever, senior e-Science computing support

Glasgow Investment in e-Science Infrastructure, Dr Richard Sinnott

Glasgow e-Science Infrastructure: Consolidating resources (11 October)

Story started with building around ScotGrid
- Providing shared Grid resource for a wide variety of scientists inside/outside Glasgow
  - HEP, CS, BRC, EEE, …
  - Target shares established (see the fair-share sketch below)
  - Non-contributing groups encouraged

Hardware
- 59 IBM X Series 330, dual 1 GHz Pentium III, 2 GB memory
- 2 IBM X Series 340, dual 1 GHz Pentium III, 2 GB memory
- 3 IBM X Series 340, dual 1 GHz Pentium III, 2 GB memory and Mbit/s ethernet
- 1 TB disk
- LTO/Ultrium Tape Library
- Cisco ethernet switches
New:
- IBM X Series 370, PIII Xeon, 32 x 512 MB RAM
- 5 TB FastT500 disk: 70 x 73.4 GB IBM FC Hot-Swap HDD
- eDIKT: 28 IBM blades, dual 2.4 GHz Xeon, 1.5 GB memory
- eDIKT: 6 IBM X Series 335, dual 2.4 GHz Xeon, 1.5 GB memory
- CDF: 10 Dell PowerEdge, GHz Xeon, 1.5 GB memory
- CDF: 7.5 TB RAID disk
ScotGrid totals: disk ~15 TB; CPU ~330 x 1 GHz

Usage
- Over 1 million CPU hours completed (June 2004)
- Over 100,000 jobs completed
- Includes time out for major rebuilds
- Typically running at ~90% usage
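
The "target shares" above are typically implemented through a batch scheduler's fair-share policy, which biases job priority so that each group's long-run usage tracks its agreed fraction of the machine. A minimal sketch, assuming a Maui-style scheduler; the group names and percentages are illustrative, not ScotGrid's actual allocations:

    # Maui fair-share sketch (hypothetical targets).
    FSPOLICY     DEDICATEDPS   # account usage as dedicated processor-seconds
    FSDEPTH      7             # keep 7 historical fair-share windows
    FSINTERVAL   24:00:00      # each window spans one day
    FSDECAY      0.80          # older windows carry less weight

    # Per-group usage targets as percentages of the machine.
    GROUPCFG[hep] FSTARGET=50
    GROUPCFG[cs]  FSTARGET=20
    GROUPCFG[brc] FSTARGET=15
    GROUPCFG[eee] FSTARGET=15

Groups running below their target get a priority boost and groups above it are deprioritised, which is how contributions can be honoured without hard-partitioning the cluster.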

Glasgow e-Science Infrastructure: Future Plans (11 October)

But not enough…
- Computer Services second HPC facility (128 processor) (being procured)
- University SAN (50 TB: 25 TB mirrored at separate locations across campus) (being procured)
  - ~£850k investment
- Access to campus-wide resources
  - Physics and astronomy training labs
  - NeSC training lab condor pool (see the sketch below)
  - EEE compute clusters and larger SMP machines
  - Computer services
  - others…
- NGS
- SRDG proposals for Scottish Grid Service infrastructure
- SBRN equipment funds
- …
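
For the Condor pool mentioned above: Condor scavenges cycles from idle machines (such as training-lab desktops), and each job is described by a small submit file. A minimal sketch, with hypothetical file names:

    # sweep.sub: a hypothetical Condor submit description file.
    universe   = vanilla       # plain serial job, no relinking required
    executable = analyse       # program shipped to the execute machine
    arguments  = input.dat
    output     = run.out       # stdout returned to the submitter
    error      = run.err
    log        = run.log       # event log of the job's lifecycle
    queue                      # enqueue one instance

Submitted with condor_submit sweep.sub, the job runs whenever a lab machine is idle and is evicted and rescheduled elsewhere if the machine's owner returns.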

Glasgow e-Science: Organisational Aspects (11 October)
- Essential to nurture relationships across the university at ALL levels
- Social issues almost as important for Grid success as technical issues
- Governance structure: Steering Committee, Management Board, Technical Board, User Groups

Glasgow e-Science: People Plans (11 October)
- E-Science Business Plan accepted by Glasgow Senior Management Group
- Identifies:
  - Research Computing Director (new position)
  - E-Science applications co-ordinator
  - Grid Systems administrator (new position)
  - Underwriting of NeSC staff contracts
  - Underwriting of existing Grid systems administrator positions
  - E-Science lectureship
  - Future funds for NeSC running costs
- To be funded through contributions from university-wide e-Science activities and university Strategic Investment Funds
- University-wide engagement and support of e-Science