Campus grids: e-Infrastructure within a University
Mike Mineter, National e-Science Centre
22 February 2006

2 Thanks to Mark Calleja (Cambridge), David McBride (Imperial College) and David Wallom (Oxford) for descriptions of their campus initiatives.

3 Overview
Harvesting CPU time
 – Computers within an “administrative domain”
Campus grids
 – An example: OxGrid
 – Middleware for campus grids
Some opportunities and implications

4 Harvesting CPU time
Teaching labs + researchers: often-idle processors on one side, analyses constrained by CPU time on the other!

5 Harvesting CPU time
Teaching lab machines lie idle for most of the time.
Harvest spare compute cycles to create a low-cost “high throughput computing” (HTC) platform
 – Goal: run many tasks in a week, month, …
 – Typically: many similar tasks invoked from a workflow or a script, e.g. Monte-Carlo simulation, parameter sweeps
Pool processors as a batch-processing resource
Submit jobs that run when a machine is free
Condor is the most common approach (see the example submit file below)
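To make the “many similar tasks” pattern concrete, here is a minimal sketch (not from the slides) of a Condor submit description file for a 100-task parameter sweep; the executable and file names are hypothetical:

   # sweep.sub - hypothetical Condor submit description file
   universe   = vanilla          # ordinary serial jobs
   executable = analyse          # the analysis program to run on each idle machine
   arguments  = $(Process)       # each task receives its index, 0..99
   output     = out.$(Process)   # per-task stdout
   error      = err.$(Process)   # per-task stderr
   log        = sweep.log        # Condor's event log for the whole sweep
   queue 100                     # create 100 jobs

Running condor_submit sweep.sub queues all 100 tasks; Condor then matches each task to a pool machine as it falls idle.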

6 Example: viewshed analyses
Derive viewsheds for all points in a “digital elevation model” (DEM)
Build a database to allow
 – Derivation of indices to characterise viewsheds
 – Applications to access pre-calculated viewsheds
[Figure: viewsheds – what can be seen from the point marked “+”]

7 Example: viewshed analyses
[Figure: a coordinating processor starts and monitors Condor jobs on worker processors running Windows; viewshed creation reads the DEM, writes intermediate files, and the coordinator builds the database from them]
Typical run: 39 PCs (Windows), 330 CPU days of computation in 9.94 days elapsed

8 Campus grids
Single sign-on to virtualised resources in multiple administrative domains
Need AA (authentication and authorisation) mechanisms
 – X.509 certificates…
 – For users who only wish to access internal (university) resources, a Kerberos CA (e.g. Oxford, Imperial College)
Need brokering: where should a job be run?
Need information systems
Scalability requires that each VO or institute contributes resources matching its average requirement

9 Example: OxGrid, a University Campus Grid
Single entry point for Oxford users to shared and dedicated resources
Seamless access to the National Grid Service (NGS) and the Oxford Supercomputing Centre (OSC) for registered users
Single sign-on using PKI technology, integrated with current methods
[Figure: Oxford users connect through OxGrid central management (resource broker, MDS/VOM, storage) to college resources, departmental resources, the NGS and the OSC]
David Wallom, Oxford Interdisciplinary e-Research Centre

10 Middleware for campus grids
Globus Toolkit
 – Tools built on the Grid Security Infrastructure, including:
   Job submission (GRAM): run a job on a remote computer
   Information services: so I know which computer to use
Storage Resource Broker (SRB)
 – Virtual filesystem for files held in multiple locations
 – NIEES offers a testbed to give SRB experience
SRB and Globus Toolkit 2 are part of the NGS stack

11 Globus
A software toolkit: a modular “bag of technologies”
 – Made available under a liberal open-source licence
Not turnkey solutions, but building blocks and tools for application developers and system integrators
International production grids are (currently) based on Globus Toolkit release 2
Globus Alliance:

12 Globus is a Toolkit
To submit a job to run on a remote computer:

   globus-job-submit grid-data.rl.ac.uk/jobmanager-pbs /bin/hostname -f

globus-job-submit returns a job-contact URL; passing it to the other commands (written <job-contact> here) gives:

   globus-job-status <job-contact>        reports: DONE
   globus-job-get-output <job-contact>    prints: grid-data12.rl.ac.uk

NEED
 – Brokers to allow jobs to be submitted to “a grid”
 – Portals… to empower those interested in their research rather than in UNIX and scripting languages!
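A detail the slide leaves implicit: under Globus Toolkit 2 these commands authenticate with a short-lived proxy credential derived from the user's X.509 certificate, created before the first submission. A minimal sketch of that preceding step:

   grid-proxy-init    # prompts for the certificate passphrase and writes a temporary proxy credential

From then on, globus-job-submit and the other GSI-based tools use the proxy rather than the long-term private key.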

13 Storage Resource Broker
[Figure: SRB servers, each with a resource driver, sit in front of filesystems in different administrative domains]
The user sees a virtual filesystem:
 – Command line (S-Commands; example session below)
 – MS Windows (InQ)
 – Web based (MySRB)
 – Java (JARGON)
 – Web Services (MATRIX)
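As an illustration of the command-line interface, a short S-Commands session might look like the sketch below; the file name is hypothetical, and Sinit assumes the connection details are already set up in the client configuration (conventionally ~/.srb/.MdasEnv):

   Sinit                 # start an SRB session
   Sls                   # list the contents of the current SRB collection
   Sput results.dat .    # store a local file into the current collection
   Sget results.dat      # fetch a copy back from wherever SRB stored it
   Sexit                 # end the session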

14 How SRB Works
[Figure: an SRB client request is resolved via the MCAT-enabled SRB server, which consults the MCAT database to locate the data, and is then served by the SRB server that holds it]
4 major components:
 – The Metadata Catalogue (MCAT)
 – The MCAT-enabled SRB server
 – The SRB storage server
 – The SRB client

15 Overview
Harvesting CPU time
 – Computers within an “administrative domain”
Grid-enabling
 – Computers
 – Data held by collaborating researchers (files in this talk; databases are deferred to the next talk!)
Some implications

16 Before you start!
You need specific initial user communities, which matter far more than a vague sense that this is a good idea!
Engage with systems managers from the first thoughts
 – Operations effort must be factored in and justifiable! Researcher enthusiasm is not enough!
Be alert to the National Grid Service's deployment of middleware, and to the potential for interoperability

17 Motivations
Most campus grids are motivated by enhanced use of resources, e.g. to avoid purchasing new clusters
A campus grid gives an infrastructure spanning multiple institutes
Potential for empowering collaborations
 – Enabling easier access to and reuse of research data
 – Sharing services: data, computation, …
Platform for
 – Interdisciplinary research
 – Higher-level services + ontologies/semantics + workflow

18 Summary
By using established middleware…
 – Condor: high-throughput computing
 – Globus Toolkit: basis for resource sharing across admin domains, i.e. a grid
 – Storage Resource Broker: virtual filesystems
…you can build admin and researcher experience in distributed, collaborative computing
…gaining the advantages of using the same stack as the National Grid Service
 – Expertise and support available
 – Smaller leaps to partnership in and use of the NGS
…and once the infrastructure exists
 – Enhanced potential for interdisciplinary research