Setting the Scene
Cliff Addison, University of Liverpool
Campus Grids Workshop, October 2007

A Unique North-West Project: Advancing Grid Technologies and Applications
[Slide diagram: the layers of the regional e-infrastructure]
● Top end: HPCx and CSAR
● Mid range: NW-GRID and local clusters
● Desktop pools: Condor etc.
● User interfaces: portals, client toolkits, active overlays
● Hooks to other Grid consortia: NGS, WRG
● Applications and industry: pharma, meds, bio, social, env, CCPs
● Sensor networks and experimental facilities
● Advanced network technology
● Technology “tuned to the needs of practicing scientists”

The NW-GRID Project: Aims and Partners
● Aims:
  – Establish, for the region, a world-class activity in the deployment and exploitation of Grid middleware
  – Realise the capabilities of the Grid in leading-edge academic, industrial and business computing applications
  – Leverage 100 posts plus £15M of additional investment
● Project partners:
  – Daresbury Laboratory: CSED and e-Science Centre
  – Lancaster University: Management School, Physics, e-Science and Computer Science
  – University of Liverpool: Physics and Computing Services
  – University of Manchester: Research Computing, Computer Science, Chemistry, Bioinformatics

The NW-GRID Project: Funding
● North West Development Agency: £5M over 4 years, commencing April 2004
  – £2M capital for systems at the four participating sites, with initial systems in year 1 (Jan 2006) and upgrades in year 3 (Jan 2008)
  – £3M for staff: about 15 staff for 3 years
● Complemented by a “TeraGrid-competitive” private Gbit/s link among the sites.

Condor Pools
● Often an excellent starting point for a campus grid.
● General issues for pooled systems:
  – Which PCs to allocate to a pool?
    ● How to decide when a Condor job can run? (see the policy sketch below)
  – Who can run jobs?
  – What executables can they run?
  – How is the input / output handled?
  – Energy issues?
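In Condor, the “when can a job run here” question is answered per machine: each machine’s owner sets a START expression, with companion SUSPEND, CONTINUE and PREEMPT expressions, in that machine’s condor_config. A minimal sketch of the classic desktop-owner policy follows; the attribute names are standard Condor machine ClassAds, but the thresholds are illustrative rather than recommended:

    # condor_config fragment: desktop-friendly owner policy (illustrative values)

    # Start a job only after the console has been idle for 15 minutes
    # and the owner's own work is not loading the machine.
    START    = (KeyboardIdle > 15 * $(MINUTE)) && (LoadAvg < 0.3)

    # Suspend the job the moment the owner returns to the keyboard.
    SUSPEND  = (KeyboardIdle < $(MINUTE)) || (LoadAvg > 0.5)

    # Resume once the machine has been quiet again for 5 minutes.
    CONTINUE = (KeyboardIdle > 5 * $(MINUTE)) && (LoadAvg < 0.3)

    # Evict a job that has sat suspended for more than 10 minutes.
    PREEMPT  = (Activity == "Suspended") && ((time() - EnteredCurrentActivity) > 600)

Policies like this are what make owners willing to donate cycles: the pool only ever uses a desktop its owner is demonstrably not using.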

Other Interesting Topics
● Mark Calleja’s (Cambridge) ideas:
  – Virtualisation
    ● Can applications be sand-boxed in a “safe” OS?
    ● Do we run node images (i.e. application + OS) rather than simple applications? (see the sketch below)
  – Where do we go from here?
    ● How do we avoid “painting campus grids into a corner”?
    ● Where do we obtain future funding?
    ● How do we integrate with external grids?
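The “node images” idea maps onto a concrete Condor feature: the vm universe (introduced in the Condor 7.0 series) dispatches a whole virtual-machine image rather than a bare executable. A sketch of a submit description file, assuming a KVM-capable pool; node_image.img is a hypothetical locally built image, and the exact knobs vary by Condor version and hypervisor:

    # Sketch of a Condor "vm" universe submit file: the job is a whole
    # node image (application + OS) rather than a single executable.
    universe      = vm
    vm_type       = kvm                    # xen and vmware also exist, with extra settings
    vm_memory     = 512                    # MB of RAM for the guest
    vm_disk       = node_image.img:hda:w   # image file : guest device : writable
    vm_networking = false
    log           = node_image.log
    queue

Sand-boxing falls out naturally here: the application only ever sees the guest OS inside the image, never the donated desktop itself.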

Ideas for Today
● Most talks have a 45-minute slot; I hope there is plenty of time for questions, but please try to leave them to the end.
● Lunch is at 12:30.
● Tea / coffee at 15:00, to go along with the discussion.
● I’ll attempt to record the questions that are asked and bring them back into the discussion at the end of the day.
● Ideas have also been raised about a follow-up “work-in” meeting to discuss practical issues, likely at Manchester in the December-January timescale, but nothing is confirmed yet.