University of Bristol Jon Wakelin Information Services & Dept. Physics.

Background
Originally departmental clusters
–Linked together via Globus
–Very similar to the OeRC Campus Grid (D. Wallom)
Arrival of HPC
–Fewer isolated clusters around the University
Move to Condor
–Already a number of isolated Condor pools around the University

Systems
Dept                                   Nodes  OS
Biochemistry*                          218    Linux, Solaris, Windows
Computer Science+                      97     Linux
Civil Engineering*                     60     Windows
Electronic and Electrical Engineering  280    Linux
Mathematics                            24     Linux
Physics*                               50     Windows
Total                                  729
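With a mixed pool like this, each job must state which platform it can run on. A minimal Condor submit-file sketch (the executable name "analyse" is illustrative, not from the talk):

```
# Illustrative vanilla-universe submit file; "analyse" is a hypothetical binary
universe     = vanilla
executable   = analyse
# Match only Linux execute nodes (e.g. the EEE or Mathematics machines above)
requirements = (OpSys == "LINUX")
output       = analyse.out
error        = analyse.err
log          = analyse.log
queue
```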

Issues 1
If the goal is to create a campus-wide resource
–… then the main issue is meeting the requirements of all users on all resources
–The pools have grown up in isolation, with each dept meeting the needs of its own local user base
  CompSci require NFS and Java
  EEE require a licence server and Matlab
  Physics used AFS (may use NFS)
–Although some users could easily use everyone's resources (Biochem)
–Both a technical and a political issue
  Technically, how easy is it to provide all of these requirements?
  How willing are depts. to support software for other (non-local) users?
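One technical route to the requirements problem is for each pool to advertise its locally installed software in machine ClassAds, so jobs only match nodes that meet their needs. A sketch (attribute names such as HAS_MATLAB are hypothetical, not standard Condor attributes):

```
# In each execute node's condor_config: advertise local software
HAS_JAVA     = True
HAS_MATLAB   = True
STARTD_ATTRS = $(STARTD_ATTRS), HAS_JAVA, HAS_MATLAB

# A submit file can then request only matching machines, e.g.:
# requirements = (HAS_MATLAB =?= True) && (OpSys == "WINNT51")
```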

Issues 2
Departmental engagement
–Related to the previous point
–EEE: paid for externally
  Will not flock to other depts
–Maths: political issues
  Use desktop machines (reluctant to allow non-departmental use)
–CompSci: support Condor on a best-efforts basis
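The "will not flock" stance can be expressed directly in Condor configuration: FLOCK_TO lists the pools a schedd may send overflow jobs to, and FLOCK_FROM lists the pools whose jobs the local machines will accept. A sketch of one plausible setup (hostnames are hypothetical):

```
# condor_config sketch: accept jobs from other campus pools,
# but never send local jobs elsewhere (the EEE position above)
FLOCK_TO   =
FLOCK_FROM = condor.phy.bris.ac.uk, condor.cs.bris.ac.uk
# FLOCK_FROM entries must also be granted access in the
# pool's HOSTALLOW_WRITE / security settings to take effect
```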