
OMII-UK Campus Grid Toolkit
NW-GRID Campus Grids Workshop, 31st October 2007, University of Liverpool
Tim Parkinson, OMII-UK Southampton Operations Manager

What is OMII-UK?
o A collaboration between open-source software developers at Southampton, Manchester, and Edinburgh.
Mission:
o OMII-UK aims to provide software and support to enable a sustained future for the UK e-Science community and its international collaborators.
Software solutions for e-Research, led by user / community requirements.

Characteristics of a Campus Grid
Generally speaking, one or more of:
o HPC clusters
o Condor pools
o Specialised computation devices or data resources
And typically:
o Owned and managed by a single institution
o Subject to a single AuthN regime and internal AuthZ policies?
o Shared by users belonging to the institution
o Applications / services useful to users
o Maintenance and support of that environment

Uses of a Campus Grid
Typically:
o To run large-scale computations or simulations
o That can be partitioned into scenarios or parameter sweeps
o And that therefore benefit from running many scenarios in parallel, or on larger machines
The aim: reduce time to publication, and obtain results that would be unobtainable in the available time if run sequentially or on a smaller desktop machine.
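To make the parameter-sweep pattern concrete, here is a minimal sketch (Python, standard library only) of partitioning a sweep into independent scenarios. The run_scenario function and its parameters are placeholders of ours; on a campus grid each scenario would be submitted as a separate job rather than run in a local process pool.

# A minimal sketch of partitioning a parameter sweep into independent
# scenarios. run_scenario is a stand-in for the real simulation; the
# local Pool merely mimics the fan-out a campus grid would provide.
from itertools import product
from multiprocessing import Pool

def run_scenario(params):
    temperature, pressure = params
    # ... the real simulation would go here ...
    return (temperature, pressure, temperature * pressure)

if __name__ == "__main__":
    sweep = list(product([280, 290, 300], [1.0, 1.5, 2.0]))
    with Pool() as pool:   # local stand-in for grid fan-out
        for result in pool.map(run_scenario, sweep):
            print(result)

Because the scenarios share no state, they can be dispatched to as many machines as are available, which is exactly what makes this workload a good fit for a campus grid.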

Requirements of Campus Grid Users
Determined from a survey of the UK Campus Grid SIG membership
o Chaired by David Wallom, OeRC
Backed up by the SUPER report
o Study of User Priorities for e-Infrastructure for e-Research (NeSC 2007)

SIG Requirements
Five areas:
Applications
o How to transparently execute applications across heterogeneous resources, to maximise throughput and minimise execution time
o Identify common applications for shrink-wrapping
Security
o Users: just want to log in once and use what they are allowed to use transparently, or be refused gracefully
o Admins: want to be able to apply a suitable authentication method and a suitable access-control policy; want to support attribute-based virtual organisations (VOMS / Shibboleth) to support collaborations

SIG Requirements (ctd)
Accounting
o Admins: driven by FEC (full economic costing); associate usage with a user; the ability to log, price, and ultimately to bill
o Users: any such billing should be fair and trustworthy
Monitoring
o Admins: track resource utilisation and availability
o Users: see what is happening to their jobs

SIG Requirements (ctd)
Storage
o Users: the ability to seamlessly transfer data to and from a variety of distributed storage systems (such as SRB) or databases
o Admins: the ability to configure such storage mechanisms into job services and to monitor them
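As a sketch of what such "seamless transfer" might wrap underneath, the snippet below shells out to the SRB Scommand client tools. Sput and Sget are the standard SRB copy commands, but the exact argument forms shown, and the assumption of an authenticated session (Sinit) already existing, are ours and may differ between SRB releases.

# A hedged sketch: staging files to/from SRB by wrapping the Scommand
# client tools. Assumes an authenticated SRB session (via Sinit)
# already exists; argument forms may vary between SRB releases.
import subprocess

def stage_in(local_path, srb_path):
    # copy a local file into SRB space before the job runs
    subprocess.run(["Sput", local_path, srb_path], check=True)

def stage_out(srb_path, local_path):
    # copy a result file back out of SRB space after the job completes
    subprocess.run(["Sget", srb_path, local_path], check=True)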

SUPER Requirements
Similar to the CG SIG requirements.
A spectrum of user interaction styles:
o Web portals / wrapped applications
o Desktop GUI
o CLI
o Scripting languages
o Programmatic API / custom applications

Quality Requirements
Installability
o Ease of installation and configuration
Reliability
o Should stay up and running, or at least fail gracefully
Portability / Availability
o Should work on a range of different architectures, operating systems, and back-end job managers
Scalability
o Should work for personal installations up to campus-wide and beyond

What is the Campus Grid Toolkit?
An enhanced packaging of existing and future OMII-UK components, principally GridSAM, that provides:
o Consistent job submission across heterogeneous resources for the scientist (via the OGF-standard JSDL) over a Web Services interface
o Ease of installation and configuration for the administrator
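To make the JSDL idea concrete, here is a minimal sketch (Python, standard library only) that generates the kind of job description a GridSAM service consumes. The echo job and file names are illustrative choices of ours; the namespaces are the published JSDL 1.0 (GFD.56) ones, though your GridSAM version's documentation remains the reference.

# A minimal sketch of building a JSDL 1.0 job description of the kind
# GridSAM accepts. The job itself (echoing a message) is illustrative.
import xml.etree.ElementTree as ET

JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
POSIX = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"

job_def = ET.Element(f"{{{JSDL}}}JobDefinition")
job_desc = ET.SubElement(job_def, f"{{{JSDL}}}JobDescription")
app = ET.SubElement(job_desc, f"{{{JSDL}}}Application")
posix = ET.SubElement(app, f"{{{POSIX}}}POSIXApplication")
ET.SubElement(posix, f"{{{POSIX}}}Executable").text = "/bin/echo"
ET.SubElement(posix, f"{{{POSIX}}}Argument").text = "hello, campus grid"
ET.SubElement(posix, f"{{{POSIX}}}Output").text = "stdout.txt"

# serialise for submission to a GridSAM endpoint
ET.ElementTree(job_def).write("job.jsdl", xml_declaration=True,
                              encoding="UTF-8")

The point of describing the job in JSDL rather than in a DRM-specific language is that the same document can be routed to Condor, a PBS cluster, or any other back end the service supports.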

What is the Campus Grid Toolkit? (ctd)
It also provides:
o A range of interaction styles for the scientist (desktop, CLI, portal / portlets)
o A range of configurable security policies for the administrator (OMII-AuthZ, SPAM-GP, VOMS / Shibboleth integration)
o A way to wrap legacy, unmodified applications (AHE, OGRSH)
o A way to create new applications that access grid resources directly (SAGA and its scripting bindings)
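As an illustration of the SAGA route, the sketch below uses the API names of the later Python bindings (radical.saga); the bindings available in 2007 were C++ and Java, and the Condor service URL here is a placeholder, so treat this as the shape of the SAGA job model rather than the toolkit's shipped API.

# A sketch of SAGA-style job submission, using the later Python
# bindings (radical.saga). The service URL is a placeholder; adaptors
# and URL schemes vary by installation.
import radical.saga as saga

js = saga.job.Service("condor://localhost")   # placeholder endpoint
jd = saga.job.Description()
jd.executable = "/bin/echo"
jd.arguments = ["hello, campus grid"]
jd.output = "stdout.txt"

job = js.create_job(jd)   # bind the description to the service
job.run()                 # submit
job.wait()                # block until the job finishes
print("final state:", job.state)

The design intent is the same as JSDL's: the application is written once against the SAGA job model, and swapping the service URL retargets it at a different back end.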

What is the Campus Grid Toolkit? (ctd)
A vehicle to deliver an integrated set of OMII-UK components that work together to enhance the scientist's ability to submit large numbers of computational jobs, in a seamless fashion, to the resources available on campus and beyond.
It should become the installation of choice for campus grid providers.

First Release (Nov 2007)
o Address the installability of GridSAM onto a Condor pool: "GridSAM on Condor in a box"
o Autoconfiguration
o Better example programs
o Address reliability issues
o Prototype demonstrated at OGF21
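For context, this is the kind of job a GridSAM-on-Condor deployment ultimately drives on the pool. The sketch below generates a vanilla-universe Condor submit description from Python (to keep the examples in one language); the job and file names are illustrative, and condor_submit is Condor's standard submission command.

# A sketch of the vanilla-universe Condor submit description sitting
# underneath a GridSAM/Condor deployment. File names are illustrative.
submit = """\
universe   = vanilla
executable = /bin/echo
arguments  = hello, campus grid
output     = stdout.txt
error      = stderr.txt
log        = job.log
queue
"""

with open("job.sub", "w") as f:
    f.write(submit)

# Then, on a Condor submit host:  condor_submit job.sub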

The Old Way: Component-Based
Follow an installation and configuration process (1-2 hours):
1. Install the OMII-Server bundle and select GridSAM as an option
2. Use a temporary server certificate
3. Configure the Condor DRM yourself
4. Download and install the client
5. Use a temporary client certificate
6. Test with the uname application (too trivial)
7. Make the client available to users
8. Replace the temporary CA and certificates with real ones

The New Way
Move towards one-step installation and configuration (10 mins):
1. Install the CGT, which offers the option to trust the UK CA and configures the server accordingly
2. It attempts to install trust for the UK CA and your own 'real' certificate
3. It attempts to detect and auto-configure the Condor DRM
4. It automatically installs and configures a client to match the server that was just installed
5. Test with the Mandelbrot set: a real application requiring significant processing time
6. Make the pre-configured client available
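For a flavour of why the Mandelbrot set makes a better test job than uname, here is a toy sketch of ours (not the shipped example) of the per-point iteration count: it is CPU-bound, tunable via max_iter, and trivially partitioned by image rows or tiles.

# A toy sketch of the Mandelbrot computation used as a non-trivial
# test job: CPU-hungry and embarrassingly parallel across rows/tiles.
def mandel(c, max_iter=1000):
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n          # escaped: point is outside the set
    return max_iter           # assumed inside the set

# one image row's worth of work; a real test renders many such rows
row = [mandel(complex(-2.0 + x * 0.01, 0.5)) for x in range(300)]
print(sum(row))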

Future Plans (in no particular order)
o Add one-step installation and auto-configuration for AHE (demo at SC2007)
o Extend one-step installation to other DRMs (Globus, PBS/Torque, Platform LSF, etc.)
o Extend GridSAM to propagate user identity and to collect resource usage for jobs
o Investigate ways to add grid monitoring, perhaps with OGM
o Investigate job-service brokering, perhaps with Knoogle or Grid-BS

Future Development
Enhance all major OMII-UK components to interact with the main storage-management solutions.

Questions?