A Research Computing Infrastructure for Edinburgh
Dr Arthur Trew, Director, EPCC
+44 131 650 5025



who are we building this for?
local researchers, e.g.
– Computational: UoE HPC service, BlueGene/L, BlueDwarf …
– Data: ScotGrid, QCDGrid, eDIKT, …
national research consortia
– Computational: QCDOC, HPCx, DEISA
– Data: NDCC
visitors
– Computational: HPC-Europa visitor programme
– Data: eSI …
new projects/bids
– HECToR, Brain Imaging, systems biology …
– next generation machine design …

what’s the problem?
we thought it was networking and CPU …
… but a survey of users added data … BIG TIME!
… and this projection only included projects with secured funding
[chart: projected data storage requirements, in TB]
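The projection itself survives only as a chart, so the figures are lost. As a purely illustrative sketch of how such a compound-growth storage estimate is made, here is a short Python example; the starting size and growth rate are hypothetical, not the survey's numbers.

```python
# Hypothetical illustration only: the starting size and growth rate below are
# invented, not the survey's figures. It shows the shape of a compound-growth
# storage projection like the one charted on the slide.

def project_storage(initial_tb: float, annual_growth: float, years: int) -> list[float]:
    """Return projected storage needs (TB) for years 0..years at a fixed growth rate."""
    needs = [initial_tb]
    for _ in range(years):
        needs.append(needs[-1] * (1.0 + annual_growth))
    return needs

if __name__ == "__main__":
    # e.g. 40 TB today, growing 60% per year (roughly doubling every 18 months)
    for year, tb in enumerate(project_storage(40.0, 0.60, 5)):
        print(f"year {year}: {tb:7.1f} TB")
```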

in summary
while individual needs varied, all required some mixture of
– more CPU/memory
– more data storage
– faster access
researchers were used to buying their own CPU servers
so, our strategy was to complement this by providing a support infrastructure
Stage 1: we used £1.68M of SRIF1 to:
– create a fast research network in parallel with EdLAN
– install a research data storage facility

The last mile: SRIF1
[network diagram: the SRIF research network, with SRIF links at Kings Buildings (KB) and Appleton Tower (AT), running in parallel with the EaStMan router, linking the JANET BAR, Appleton Tower, Kings Buildings, Old College, New College, Holyrood, the Robson Building, the Library, Pollock Halls (RESNET), the Medical School at Little France, the Western General, the Royal Edinburgh Hospital, the Sick Children’s Hospital and BUSH, with links at 2 Mbit/s, 100 Mbit/s and 1000 Mbit/s]

… and data too
155 TB SAN + 36 TB tape backup
– available to all e-science researchers
total investment, SRIF1 + project funds: £2.2M

short of space
… but all the new facilities could not be fitted in the JCMB machine room
£4.2M from SRIF2 to refurbish a new research computing facility

The last mile … Phase 2
[network diagram: as for SRIF1, with the ACF added and connected to the SRIF links at Kings Buildings and Appleton Tower at 10 Gb/s; site links remain at 2 Mbit/s, 100 Mbit/s and 1000 Mbit/s]

the research SAN
[diagram: the research SAN spanning KB and AT over Fibrechannel and the 10 Gb/s SRIF/SJ4 link, serving HPCx, QCDOC and BG/L, with iSCSI and NAS access]
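An aside not in the slides: the diagram distinguishes block-level access (Fibrechannel, iSCSI) from file-level access (NAS). A minimal sketch of the difference from a client's point of view, with hypothetical device and mount paths:

```python
import os

# Block access (e.g. over iSCSI or Fibrechannel): the client sees a raw
# device and reads fixed-size blocks; any filesystem on top is the client's
# responsibility. The device path is hypothetical.
with open("/dev/sdb", "rb", buffering=0) as dev:
    first_block = dev.read(4096)

# File access (e.g. via NAS/NFS): the server exports a filesystem and the
# client reads named files over the network. The mount point is hypothetical.
with open("/mnt/nas/results/run42.dat", "rb") as f:
    data = f.read()
```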

what facilities are available to me?
CPU
– UoE HPC service (lomond) – 52-pe Sun E15000
– BlueGene/L (bluesky) – 2,000-pe IBM R&D machine
– capability computing for the initiated
data
– research SAN – 155TB (disk), 36TB (tape) Sun 6290
networking
– SRIF network
secure machine room space
… me with your needs and we’ll see what we can do
other facilities by arrangement with project owners
– BlueDwarf, ScotGrid, …
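Not from the slides: assuming the research SAN appears to users as an ordinary mounted filesystem, a researcher could check its capacity with a few lines of Python; the mount point below is hypothetical.

```python
import shutil

SAN_MOUNT = "/exports/research-san"  # hypothetical mount point; substitute your own

# disk_usage returns total, used and free bytes for the filesystem at the path
total, used, free = shutil.disk_usage(SAN_MOUNT)
TB = 1000**4  # decimal terabytes, matching the slide's 155 TB figure
print(f"total: {total / TB:6.1f} TB")
print(f"used:  {used / TB:6.1f} TB")
print(f"free:  {free / TB:6.1f} TB")
```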

a free lunch?
the ACF is a strategic University facility
– and hence open to all researchers
although a real capital asset, the recurrent support for
– facilities management
– maintenance, power, space charges …
– and, perhaps, user support and applications porting/tuning …
has to come from project funds
what is it you want/need?