E-Science Update
David Spence, University of Reading, 20 April 2014
© University of Reading

Overview
– Reading Campus Grid Summary
– Campus Grid Update
– e-Science Mapping
– Other Developments
– The Future
– User Story: Kevin Hodges
– Discussion

Reading Campus Grid Summary
– Access to about 500 lab/library PCs while they are not in use by students
  – The machines are available all night, as the labs are closed
  – Ideal for research that involves running a program many times, where each run takes about an hour or less (a minimal submit sketch follows below)
  – Works with any program that can be run on Linux machines
– Uses a combination of Condor and coLinux to run Linux jobs on Windows
  – Technology developed in SSE by Chris Chapman and Ian Bland
  – ITS service by Chris de la Force and others
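
To make the "run a program many times" pattern concrete, here is a minimal sketch of how such a batch could be submitted to Condor. It assumes a working Condor pool with condor_submit on the PATH; the executable name "analyse", the run count, and the file names are illustrative placeholders, not details from the talk.

    # Sketch: queue 100 runs of a hypothetical analysis program on the
    # campus grid. Assumes condor_submit is on the PATH; "analyse" and
    # the input/output file names below are placeholders.
    import subprocess
    import textwrap

    SUBMIT_DESCRIPTION = textwrap.dedent("""\
        universe     = vanilla
        executable   = analyse
        arguments    = input_$(Process).dat
        output       = run_$(Process).out
        error        = run_$(Process).err
        log          = analyse.log
        # Lab PCs appear as Linux execute nodes via coLinux.
        requirements = (OpSys == "LINUX")
        queue 100
    """)

    with open("analyse.sub", "w") as f:
        f.write(SUBMIT_DESCRIPTION)

    # Each queued run occupies one idle lab PC for about an hour or less.
    subprocess.run(["condor_submit", "analyse.sub"], check=True)

Condor expands $(Process) to 0 through 99, so a single submission fans out across up to 100 idle machines at once.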

Campus Grid Update
– CondorView usage statistics are now available

Campus Grid Update
– The Reading Campus Grid is now an NGS Affiliate

Campus Grid Update
– CondorView usage statistics are now available
– The Reading Campus Grid is now an NGS Affiliate
– Link to the Oxford Campus Grid and NGS resources
  – Using the NGS interfaces
  – Technically complete, with policy issues still to be finalised
– About 11,500 CPU hours used in the last two months, with a peak of 200 CPUs (see the cross-check sketch below)
– Kevin's user story to follow
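
As a rough cross-check of those figures, 11,500 CPU hours over roughly 61 days works out at about eight CPUs busy around the clock, well below the peak of 200. The sketch below shows one way such totals might be pulled from Condor's completed-job history; it assumes condor_history is on the PATH and that the pool retains job records (RemoteWallClockTime is the standard Condor job attribute holding accumulated run time, in seconds).

    # Sketch: sum the CPU time recorded in the pool's job history.
    # Assumes condor_history is available and history is retained.
    import subprocess

    out = subprocess.run(
        ["condor_history", "-format", "%f\n", "RemoteWallClockTime"],
        capture_output=True, text=True, check=True,
    ).stdout
    cpu_hours = sum(float(s) for s in out.split()) / 3600.0

    # 11,500 CPU hours over ~61 days is ~8 CPUs busy on average.
    print(f"{cpu_hours:.0f} CPU hours, "
          f"{cpu_hours / (61 * 24):.1f} CPUs busy on average")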

e-Science Mapping
– To improve software provision, resources and support for those working with e-Science
– To discover areas of academic and e-Science expertise, enabling inter-disciplinary collaborations to be formed inside and outside the University using e-Science
– To inform the development of an e-Research agenda for the University
– So far, 33 researchers and IT staff have been interviewed or surveyed

Other Developments
– Providing support for users and groups getting started with e-Science/Grid:
  – e-Science certificates (see the inspection sketch below)
  – University resources (e.g. the Campus Grid)
  – External resources (e.g. the National Grid Service)
– ACET now has its BladeCentre, which is available for collaborative University research projects:
  – 3040 PowerPC cores
  – 60 TB of storage
  – Myrinet interconnect
  – 51st in the November 2008 Top500 list of supercomputers
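
The e-Science certificates mentioned above are standard X.509 credentials, so ordinary tools can inspect them. A small sketch follows, assuming the openssl command-line tool is installed and that the certificate sits in the conventional Globus location (~/.globus/usercert.pem); the path may differ from site to site.

    # Sketch: report the subject and expiry date of a grid user certificate.
    # Assumes the openssl CLI; ~/.globus/usercert.pem is the conventional
    # Globus location, but your certificate may live elsewhere.
    import subprocess
    from pathlib import Path

    cert = Path.home() / ".globus" / "usercert.pem"
    info = subprocess.run(
        ["openssl", "x509", "-in", str(cert), "-noout", "-subject", "-enddate"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(info)  # prints "subject=..." and "notAfter=..." lines

Checking the notAfter date ahead of time warns a user before their grid access silently expires.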

The Future
– New Campus Grid developments
– The mapping exercise will inform the e-Research plan
– e-Science technology allows new forms of collaboration:
  – We are seeking to increase inter-disciplinary collaboration around e-Science
  – The University of Reading is working closely with the Oxford e-Research Centre (OeRC) to promote collaboration between the two universities