Best Practice Project: Rapid Prototyping of Usable Middleware (Peter Coveney, Centre for Computational Science, University College London)

Presentation transcript:

Best Practice Project: Rapid Prototyping of Usable Middleware
Peter Coveney, Centre for Computational Science, University College London
EPSRC Annual e-Science Meeting, 21 April 2005

Scientists developing middleware!
- Rapid prototyping of usable grid middleware (EPSRC funded, starts April 2005)
- Partners include Manchester, Southampton (Comb-e-Chem), Oxford (IB), …
- Robust application hosting under WSRF::Lite (OMII funded)
- Combined value: £500K cash + £100K in-kind support
- OMII = Open Middleware Infrastructure Institute (UK)

Talk contents
- Grid computing: what is it?
- Problems with existing grid middleware
- The case for lightweight middleware
- Robust application hosting
- Enabling grid-based computational science
- Materials science example

Grid Computing
My preferred definition: grid computing is distributed computing performed transparently across multiple administrative domains.
Notes:
- Computing means any activity involving digital information; no distinction between numeric/symbolic, or numeric/data/viz
- Transparency implies minimal complexity for users of the technology
See: Phil Trans R Soc London A (2005)

Grid Computing
Problem: no so-called “Grid” we use today fulfils this definition.

TeraGyroid Grid
[Network diagram. Labels: Visualization; Computation; Starlight (Chicago); Netherlight (Amsterdam); BT provision; PSC; ANL; NCSA; Phoenix; Caltech; SDSC; UCL; Daresbury; Manchester; SJ4; MB-NG; Network PoP; Access Grid node; Service Registry; production network; dual-homed system; 10 Gbps; 2 × 1 Gbps links.]

STIMD Grid
[Network diagram, AHM 2004: computation sites include PSC, SDSC, NCSA, UCL, Leeds, Oxford, Manchester and RAL, spanning the US TeraGrid and UK NGS, linked via Starlight (Chicago), Netherlight (Amsterdam) and UKLight; network PoPs and a service registry shown; steering clients on local laptops, PDAs, and a Manchester vncserver. All sites connected by the production network (not all shown).]
Grid infrastructure: both the US TeraGrid and the UK NGS use GT2 middleware.

Problems for users
- Lack of a common API for usable core functionality (e.g. file transfer) across distinct grid applications and domains
- Heterogeneous software stacks make grid-application portability a nightmare for users
- Security: a high barrier to getting certificates accepted beyond the issuing domain (more tomorrow)
- Non-uniform scheduling and job-launching mechanisms, and often incompatible policies, in different admin domains
- Complex grid middleware is detrimental to scientific research, and contrary to the stipulated goals of grid computing

Grid computing headaches
- Deployment: it takes a long time and much effort by many people to get applications properly deployed; lots of things can go wrong; most people give up because the return on investment is too low
- Lack of persistence of grid infrastructure and capabilities
- Security issues (more in tomorrow’s talk)
- Clunky, not very usable
- The existing model is not taken seriously by the people who care about it

How we build services on GT2 grids
Globus Toolkit 2 has limited usable functionality, so we:
- track specs and standards
- provide functionality as easily as possible
- put this on top of GT2 grid middleware
We do NOT wait for heavyweight/generic solutions provided by others:
- GT3 (obsolescent)
- GT4 (yes, but when?)
Waiting is a recipe for being sidelined indefinitely…
Lightweight middleware makes provision of a service-oriented architecture a pleasant experience for all.

Lightweight middleware
What do we mean by lightweight?
- Minimal dependencies on third-party software
- Shallow learning curve for new users, obviating the need to learn new programming methods
- Interoperable with other WSRF implementations
- Easy to write, and so easy to adapt to new specs, etc.
We originally used OGSI-compliant services; we now have WSRF::Lite (interoperable with other WSRF implementations), which tracks the evolving WSRF standards, implementing the stable areas of the specifications.

Peter Coveney Lightweight middleware OGSI::Lite/WSRF::Lite by Mark McKeown of Manchester University Lightweight OGSI/WSRF implementation, written in Perl uses existing software (eg for SSL) where possible; simple installation Necessary for all RealityGrid grid work Using OGSI::Lite (2003): Grid-based job submission and steering retrofitted onto the LB2D workstation class simulation code within a week Standards compliance: we were able to steer simulations from a web browser, with no custom client software needed Now developing extended capabilities using WSRF::Lite on US TeraGrid & UK NGS We have developed WEDS--a web service hosting environment for distributed simulation

About WEDS
- Developed to make life easy for application scientists, for once
- Easy to deploy: sits inside a WSRF::Lite container and has no additional software requirements
- Provides all the tools and glue required to expose an unaltered binary application as a service, and to create and interact with service instances (a wrapper sketch follows below)
- A broker service manages creation of services, to load-balance across a pool of machines
- For grid deployment, needs a security solution (WS-Security, TLS) and grid job-submission tools (from OMII_1, others from the GridSAM project)
See Coveney et al., 2004, NeSC Tech Rpt
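The “tools and glue” boil down to wrapping the unaltered executable: stage the input, run the binary, hand back the output. Below is a minimal sketch of that pattern in Python, not WEDS itself (the real wrapper is Perl inside the WSRF::Lite container); the run_job name, file names, and binary path are invented for illustration.

# Illustrative sketch only: the shape of a wrapper that exposes an
# unaltered binary as a service operation. All names here are invented.
import pathlib
import shutil
import subprocess
import tempfile

APPLICATION = "/usr/local/bin/lb2d"  # hypothetical hosted binary

def run_job(input_bytes: bytes) -> bytes:
    """Stage input, invoke the unmodified executable, return its output."""
    workdir = pathlib.Path(tempfile.mkdtemp(prefix="weds-"))
    try:
        (workdir / "input.dat").write_bytes(input_bytes)
        subprocess.run([APPLICATION, "input.dat"], cwd=workdir, check=True)
        return (workdir / "output.dat").read_bytes()
    finally:
        shutil.rmtree(workdir)  # clean up the scratch space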

WEDS Architecture
[Diagram: client, broker, and a managed resource running a machine service, factory services, and a wrapper service around the invoked application.]
- Each resource runs a WSRF::Lite container holding a WEDS machine service and factory services for each hosted application
- Each machine that a user wishes to use is registered with a broker service
- The user contacts the broker with the details of the job to run
- The broker match-makes the job details against the capabilities advertised by each machine service and decides where to invoke the service (see the sketch below)
- The broker passes the contact details of the service instance back to the client
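The match-making step can be pictured as follows. This is only a sketch under simple assumptions (each machine service advertises the applications it hosts and a load figure, and the broker picks the least-loaded match); the actual WEDS selection policy is not specified in the slides.

# Sketch of broker match-making under assumed advertisement contents.
from dataclasses import dataclass

@dataclass
class MachineService:
    endpoint: str            # URL of this resource's WSRF::Lite container
    applications: set[str]   # applications exposed by its factory services
    load: float              # advertised load; lower is better

def match_make(app: str, machines: list[MachineService]) -> str:
    """Return the endpoint of the least-loaded machine hosting `app`."""
    candidates = [m for m in machines if app in m.applications]
    if not candidates:
        raise LookupError(f"no registered machine hosts {app!r}")
    return min(candidates, key=lambda m: m.load).endpoint

machines = [
    MachineService("https://hostA.example.org/weds", {"lb2d", "namd"}, 0.7),
    MachineService("https://hostB.example.org/weds", {"namd"}, 0.2),
]
print(match_make("namd", machines))  # -> https://hostB.example.org/weds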

About WEDS
- Can interact flexibly with OMII middleware: OMII interface to WEDS resources; the WEDS broker will soon interact with GT2 and OMII resources
- Delegates file transfer to existing transports (HTTP(S), FTP, GridFTP, etc.); a dispatch sketch follows below
- Provides a C and Fortran API to allow an application programmer to expose a richer service interface to the application
- Already hosted: LB2D, DL_MESO, NAMD, LAMMPS, CPMD
- RealityGrid steering will be incorporated as those tools move towards WSRF compliance
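Delegating file transfer to existing transports usually reduces to dispatching on the URL scheme. The sketch below shows one plausible shape for that in Python; it is not WEDS code, and the GridFTP branch simply shells out to the Globus globus-url-copy command-line client as an example.

# Sketch: delegate transfers to whichever transport the URL names.
import subprocess
import urllib.request
from urllib.parse import urlparse

def fetch(url: str, dest: str) -> None:
    """Fetch `url` to the local absolute path `dest`."""
    scheme = urlparse(url).scheme
    if scheme in ("http", "https", "ftp"):
        urllib.request.urlretrieve(url, dest)  # stdlib handles these
    elif scheme == "gsiftp":
        # Hand GridFTP off to the Globus command-line client, if installed.
        subprocess.run(["globus-url-copy", url, f"file://{dest}"], check=True)
    else:
        raise ValueError(f"unsupported transport: {scheme}")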

OGSA/WSRF compliance
In the main, the hosting environment is WSRF- and OGSA-compliant, BUT we have to go outside these specifications (with DataProxy) because they require binary data to be moved within XML files! The W3C has spotted the problem and has now issued Recommendations addressing it.

Transferring binary data
“World Wide Web Consortium Issues Three Web Services Recommendations” (January 2005): the W3C has published three new Web Services Recommendations: XML-binary Optimized Packaging (XOP), SOAP Message Transmission Optimization Mechanism (MTOM), and Resource Representation SOAP Header Block (RRSHB). These Recommendations provide ways to efficiently package and transmit binary data included or referenced in a SOAP 1.2 message. Web services applications need effective, standard methods for handling binary data.

Transferring binary data
“One of the biggest technical and performance issues for Web services occurs when a user or application is handling large binary files. Encoding binary data as XML produces huge files, which absorbs bandwidth and measurably slows down applications. For some devices, it slows down so much that the performance is considered unacceptable.”
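The overhead the quote describes is easy to quantify: carrying binary inside XML means base64-encoding it, which emits four characters for every three input bytes, roughly 33% inflation before any XML escaping or parsing cost (MTOM/XOP avoid this by sending the raw bytes as MIME attachments referenced from the XML). A quick check:

# Quantifying base64 bloat: 4 output characters per 3 input bytes.
import base64
import os

payload = os.urandom(3_000_000)      # 3 MB of arbitrary binary data
encoded = base64.b64encode(payload)
print(len(payload), len(encoded))    # 3000000 4000000
print(len(encoded) / len(payload))   # ~1.33, i.e. one-third larger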

Robust application hosting
- Developing our lightweight hosting tools to meet the needs of application scientists
- No preconceptions about the “right way” to do things, and no pre-determined adherence to particular specifications or “workflows”
- Gain experience by working with real-world problems, refactoring the design as required
Projects/people we are collaborating with as “end-users”:
- Daniel Mason (Imperial): polystyrene-surface interactions (see demo)
- CCP5’s DL_MESO project (Rongshan Qin, Daresbury Laboratory): mesoscale modelling/simulation
- Jonathan Essex (Southampton): NAMD for protein modelling
- Integrative Biology EPSRC e-Science project
- IBiS (Integrative Biological Simulation) BBSRC Bioinformatics & e-Science project
Close collaboration with OMII and its middleware

The Polysteer Application
- Developed by Daniel Mason (Imperial; a RealityGrid partner)
- New Monte Carlo polymer simulation code: create as many chain conformations as possible; task farming of configuration generation
- Equilibration is difficult from an arbitrary start point: we need to watch the chains relax, so we attach a visualisation client
- Monte Carlo moves are complex: we modify parameters on-the-fly to optimise efficiency, so we attach a steering client (see the sketch below)
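Why steer Monte Carlo parameters on-the-fly? Acceptance rates depend strongly on move size, and the right size is hard to guess in advance. The toy sampler below (a plain Metropolis walk on a Gaussian, not Polysteer’s actual moves) shows how varying the step parameter, the kind of knob a steering client would expose, changes the acceptance rate.

# Toy illustration only: a Metropolis sampler with a tunable step size.
import math
import random

def metropolis(n_steps: int, beta: float, step: float) -> float:
    """Sample x ~ exp(-beta * x**2) and return the acceptance rate."""
    x, accepted = 0.0, 0
    for _ in range(n_steps):
        trial = x + random.uniform(-step, step)
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if random.random() < math.exp(-beta * (trial**2 - x**2)):
            x, accepted = trial, accepted + 1
    return accepted / n_steps

# "Steering" the move size: too small wastes moves, too large gets rejected.
for step in (0.1, 1.0, 5.0):
    print(f"step={step}: acceptance={metropolis(100_000, 1.0, step):.2f}")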

Lightweight hosting of the Polysteer application

Visualisation
- Showing each atom is unreadable
- The potentials treat CHx and Bz groups as single entities, so we visualise ellipsoids rather than spheres
- The visualisation client attaches to a running simulation
- Data transfer is via files, using the ReG steering library (Fortran main code to Java visualiser)

Summary
- Lightweight middleware greatly facilitates the deployment of applications on grids
- We’re now working with several “computational user communities”, from physics through to biology
- All our middleware will be in the public domain

Acknowledgements
Matt Harvey, Laurent Pedesseau, Mark McKeown, Stephen Pickles, Daniel Mason, Jonathan Chin
EPSRC
OMII