Using ICENI to run parameter sweep applications across multiple Grid resources
Murtaza Gulamali, Stephen McGough, Steven Newhouse, John Darlington
London e-Science Centre, Department of Computing, Imperial College London
Case Studies on Grid Applications – GGF10

2 Contents
1. The GENIE project
2. The ICENI middleware
3. GENIE as an ICENI application
4. Summary and conclusions
5. Acknowledgements

3 The GENIE project – Background
• Grid ENabled Integrated Earth system model.
• Investigate long-term changes to the Earth's climate (e.g. global warming) by integrating numerical models of various components of the Earth system.
[Figure: schematic diagram of the GENIE model framework, showing its component models: 3D atmosphere, 3D ocean, 2D sea ice, atmospheric CO2, 2D land surface, land biogeochemistry, ocean biogeochemistry, ocean sediments and 3D ice sheets. Courtesy of T. Lenton, CEH Edinburgh, UK.]

4 The GENIE project – Background
• Grid ENabled Integrated Earth system model.
• Investigate long-term changes to the Earth's climate (e.g. global warming) by integrating numerical models of various components of the Earth system.
• Requires a Grid infrastructure to:
  – flexibly couple together components to form a unified Earth System Model (ESM);
  – execute the resultant ESM efficiently and accurately;
  – archive and share the data produced by the model;
  – provide a high-level, open-access system that allows a virtual organisation of Earth System modellers to collaborate.

5 The GENIE project – Previous scientific work
• Investigate the vulnerability of the thermohaline circulation of the world ocean using a prototype model consisting of just 3 coupled components.
• Run the simulation across two different parameter ranges (31 values of each).
• Perform 31 × 31 = 961 individual simulations.
• A parameter sweep application!
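To make the sweep concrete: the 961 runs are simply the Cartesian product of two 31-value parameter ranges. A minimal Python sketch is below; the parameter names and numeric ranges are placeholders for illustration, not the actual GENIE model parameters.

```python
# Minimal sketch of enumerating a 31 x 31 parameter sweep.
# Parameter names and ranges are illustrative placeholders only.
import itertools

def linspace(lo, hi, n):
    """n evenly spaced values from lo to hi inclusive."""
    step = (hi - lo) / (n - 1)
    return [lo + i * step for i in range(n)]

# Two hypothetical parameter ranges, 31 values each.
param_a = linspace(0.0, 3.0, 31)
param_b = linspace(-1.0, 1.0, 31)

# Cartesian product: 31 * 31 = 961 individual simulation configurations.
jobs = [{"a": a, "b": b} for a, b in itertools.product(param_a, param_b)]
assert len(jobs) == 961
```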

6 The GENIE project – Previous e-scientific work
• Provided a Grid infrastructure to support this activity:
  – a flocked Condor pool spanning three institutions;
  – a web portal for experiment management;
  – a database management system (based on Geodise) for data archiving and retrieval.
• Disadvantages of this infrastructure:
  – firewalls! …between the institutions hosting the Condor pools;
  – the web portal is not very flexible: model and parameter choices are hard-coded;
  – no true resource brokering: not all of the compute and database resources belonging to the virtual organisation are utilised.
• Solution: use the ICENI middleware.

7 The ICENI middleware – Background
• IC e-Science Networked Infrastructure.
• Developed by the LeSC Grid Middleware Group.
• Service-oriented Grid middleware.
• Represents compute, storage and software resources as services.
• Services can communicate using standard protocols (e.g. Jini, SOAP, JXTA).
• ICENI provides an end-to-end middleware consisting of:
  – a Grid service infrastructure;
  – a dynamic service management framework;
  – an application toolkit.
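As a toy illustration of the service-oriented view, the sketch below registers compute and storage resources as discoverable services. All class names, service names and endpoint URIs are invented; ICENI's actual service infrastructure is Java/Jini-based, not Python.

```python
# Toy sketch of resources represented as discoverable services
# (all names and endpoints invented for illustration).
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    kind: str       # "compute", "storage" or "software"
    endpoint: str   # where the service can be reached

registry = []

def advertise(service):
    """Register a resource as a service in the (toy) registry."""
    registry.append(service)

def discover(kind):
    """Find all advertised services of a given kind."""
    return [s for s in registry if s.kind == kind]

advertise(Service("beowulf-cluster", "compute", "proto://example/cluster"))
advertise(Service("geodise-db", "storage", "proto://example/geodise"))
print([s.name for s in discover("compute")])
```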

8 The ICENI middleware – Application development in ICENI
• ICENI uses a component programming model to describe Grid applications.
• Application development becomes application composition.
• Example: a linear equation solver.
[Component diagram: a matrix source and a vector source feed a linear equation solver component, whose output flows to a vector sink. The abstract linear equation solver has two concrete implementations: Cholesky decomposition and LU decomposition.]
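A minimal sketch of this composition idea follows, in plain Python rather than ICENI's own component framework; all class and function names are invented for illustration. The point is that the workflow is wired against an abstract component interface, and a concrete implementation is chosen at deployment time.

```python
# Sketch of ICENI-style component composition (names invented):
# matrix source + vector source -> abstract solver -> vector sink.

class LinearEquationSolver:
    """Abstract component interface: solve A x = b."""
    def solve(self, A, b):
        raise NotImplementedError

class LUSolver(LinearEquationSolver):
    def solve(self, A, b):
        # Tiny Gaussian-elimination stand-in for a real LU solver.
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[pivot] = M[pivot], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
        x = [0.0] * n
        for r in reversed(range(n)):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

class CholeskySolver(LinearEquationSolver):
    def solve(self, A, b):
        ...  # would exploit symmetric positive-definite structure (omitted)

def run_workflow(solver):
    A = [[4.0, 1.0], [1.0, 3.0]]   # matrix source
    b = [1.0, 2.0]                 # vector source
    x = solver.solve(A, b)         # abstract solver component
    print(x)                       # vector sink

run_workflow(LUSolver())  # the middleware could equally pick CholeskySolver
```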

9 The ICENI middleware – Application development in ICENI
[Screenshot of the ICENI application composition tool, showing the service list, the composition pane and the parameters panel.]

10 GENIE as an ICENI application – Parameter sweep as a component application
[Component diagram: a setup component feeds a splitter component, which fans out to multiple GENIE binary components; their outputs are gathered by a collator component and passed to an archive component.]
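A minimal sketch of the splitter/collator pattern in the diagram, with invented function names: in ICENI these stages are separate components wired together by the middleware, whereas here they are plain functions showing only the data flow.

```python
# Sketch of the setup -> splitter -> GENIE binaries -> collator -> archive
# pipeline (function names invented for illustration).

def setup():
    """Produce the full list of parameter configurations (see slide 5)."""
    return [{"a": a, "b": b} for a in range(31) for b in range(31)]

def splitter(configs, n_groups):
    """Fan the sweep out into n_groups batches, one per GENIE binary component."""
    return [configs[i::n_groups] for i in range(n_groups)]

def genie_binary(batch):
    """Stand-in for running the GENIE executable over one batch of configs."""
    return [{"config": c, "result": None} for c in batch]  # placeholder result

def collator(partial_results):
    """Gather the per-batch results back into a single list."""
    return [r for batch in partial_results for r in batch]

def archive(results):
    print(f"archiving {len(results)} results")

batches = splitter(setup(), n_groups=2)
archive(collator([genie_binary(b) for b in batches]))  # archives 961 results
```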

11 GENIE as an ICENI application – Executing over multiple resources
[Component diagram: as before, but a resource launcher now deploys the GENIE binary components onto two separate resources, a Beowulf cluster and a Condor pool, between the splitter and collator components.]
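The sketch below illustrates, under a simple round-robin assumption, how a launcher might split the 961 jobs between two job managers; the interface is invented, and the real ICENI launcher is a service that submits to Sun Grid Engine and Condor rather than counting jobs.

```python
# Sketch of a resource launcher splitting jobs between two job managers
# (interface invented; round-robin policy assumed for illustration).

class JobManager:
    def __init__(self, name):
        self.name = name
        self.submitted = 0
    def submit(self, job):
        self.submitted += 1  # a real manager would queue the job here

def launch(jobs, managers):
    """Distribute jobs across the available job managers, round-robin."""
    for i, job in enumerate(jobs):
        managers[i % len(managers)].submit(job)

sge, condor = JobManager("Sun Grid Engine"), JobManager("Condor")
launch(list(range(961)), [sge, condor])
print(sge.submitted, condor.submitted)  # 481 and 480, matching slide 12
```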

12 GENIE as an ICENI application – Results
• Using ICENI, ran 4 GENIE parameter sweep experiments on a Beowulf cluster (using Sun Grid Engine) and a Linux PC based Condor pool:
  – Sun Grid Engine: 481 jobs
  – Condor pool: 480 jobs
  – Total: (31 × 31 =) 961 jobs
• ICENI takes ~2 minutes to schedule and submit the jobs to the two high-throughput job managers.
• Each experiment took ~5 days to run.

13 Summary and conclusions
• We are able to execute GENIE parameter sweep experiments across multiple resources administered by members of the virtual organisation.
• Execution time is the same as before, but:
  – we can leverage all the flexibility of a service-oriented Grid middleware;
  – we can create an ICENI Grid based on resources owned and federated by collaborators in the virtual organisation;
  – we don't have to contend with firewalls… (sort of).

14 Acknowledgements
• My co-authors: Dr. Stephen McGough, Dr. Steven Newhouse, Prof. John Darlington.
• The ICENI development team.
• The GENIE team.