Performance guided scheduling in GENIE through ICENI

2 Contents
1. What is GENIE?
2. Previous work – Grid infrastructure
3. Limitations of present infrastructure
4. Introduction to ICENI
5. Performance experiments
6. Summary and conclusions
7. Future work

3 What is GENIE?
- Grid ENabled Integrated Earth system model.
- Investigates long-term changes to the Earth's climate (e.g. global warming) by integrating numerical models of the Earth system.
- e-Science aims:
  - Flexibly couple state-of-the-art components to form a unified Earth System Model (ESM).
  - Execute the resultant ESM on a Grid infrastructure.
  - Share the resultant data produced by simulation runs.
  - Provide high-level open access to the system, creating and supporting a virtual organisation of Earth System modellers.

4 GENIE model components

5 Previously in GENIE… Investigating the thermohaline circulation
- Investigated the influence of freshwater transport upon global ocean circulation.
- Performed several parameter sweep experiments, each consisting of ~1000 simulations of a GENIE prototype.
- Used a Grid infrastructure:
  i.   Portal – create, submit and manage experiments.
  ii.  Condor pool – execute simulations in parallel.
  iii. Database management system – archive and process resultant data.
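A parameter sweep of this kind amounts to generating one job per point in a parameter grid. A minimal Python sketch of that idea (the parameter names and ranges here are illustrative, not those used in the actual GENIE experiments):

```python
from itertools import product

def make_sweep(atl_flux, pac_flux):
    """One simulation job per (Atlantic, Pacific) freshwater-flux pair."""
    return [{"run_id": i, "flux_atl": fa, "flux_pac": fp}
            for i, (fa, fp) in enumerate(product(atl_flux, pac_flux))]

# a 32 x 32 grid gives ~1000 simulations, the scale quoted on the slide
atl = [round(-0.3 + 0.02 * i, 2) for i in range(32)]
pac = [round(-0.3 + 0.02 * i, 2) for i in range(32)]
jobs = make_sweep(atl, pac)
print(len(jobs))  # 1024
```

Each generated job dictionary would then become one Condor submission, with the database system archiving the results as they complete.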

6 Previously in GENIE… Scientific achievements
- Intensity of the thermohaline circulation as a function of freshwater flux between the Atlantic and Pacific oceans and between the mid-Atlantic and North Atlantic.
- Surface air temperature difference between extreme states (off – on) of the thermohaline circulation: the North Atlantic is ~2 °C colder when the circulation is off.
- New scientific findings → papers published!

7 Limitations of Grid infrastructure
- GENIE model hard-coded – alternative models cannot be used without recoding.
- Format of input/output data is not flexible.
- Parameter space being investigated is fixed.
- No true resource brokering is taking place.
Solution: ICENI.

8 Advantages of ICENI
- The ICENI Netbeans client allows an experiment to be built in a systematic and repeatable way.
- The component-based programming model provides flexibility and extensibility.
- The service-oriented architecture allows for true resource brokering and Grid enablement of the application.

9 ICENI Binary Component
- Allows a binary executable to be wrapped as an ICENI component.
[Diagram: a JDML description, stdin and input files feed the binary executable, which produces stdout and stderr.]
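Conceptually, the wrapped component runs the executable with its declared inputs and captures the output streams as files. A minimal sketch of that contract in Python (the real component is described in JDML and managed by ICENI; the function and file names here are hypothetical):

```python
import pathlib
import subprocess

def run_binary_component(executable, stdin_path, workdir):
    """Run a wrapped binary: stdin and input files go in,
    stdout/stderr are captured as files in the working directory."""
    workdir = pathlib.Path(workdir)
    with open(stdin_path, "rb") as stdin, \
         open(workdir / "stdout", "wb") as out, \
         open(workdir / "stderr", "wb") as err:
        proc = subprocess.run([executable], stdin=stdin,
                              stdout=out, stderr=err, cwd=workdir)
    return proc.returncode
```

The captured stdout/stderr files can then flow to downstream components just like any other component output.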

10 A GENIE experiment as an ICENI application
[Workflow diagram: a setup component feeds a splitter component, which fans out to several GENIE binary components running in parallel; a collator component gathers their results and passes them to an archive component.]

11 A GENIE experiment as an ICENI application
- Introduce a high-throughput resource launcher…
[Workflow diagram: between the setup and archive components, each GENIE binary component is handed to the resource launcher, which launches it to Sun Grid Engine, a Condor pool, or Bash.]
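The launcher's role is to route the same job to whichever execution back-end is selected. A sketch of that dispatch step (the submission commands shown are the standard CLI tools for each system; the ICENI launcher's actual interface is not reproduced here):

```python
def submit_command(job_script, backend):
    """Map a job script to the submission command of the chosen back-end."""
    commands = {
        "bash": ["bash", job_script],            # run locally, in sequence
        "condor": ["condor_submit", job_script], # submit file for a Condor pool
        "sge": ["qsub", job_script],             # Sun Grid Engine queue
    }
    if backend not in commands:
        raise ValueError(f"unknown back-end: {backend}")
    return commands[backend]
```

Keeping the job description independent of the back-end is what lets the same GENIE experiment run on a shared-memory server, a Condor pool, or a plain shell without change.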

12 Performance experiments
- Ran 8 different types of experiment to evaluate the performance of ICENI.
- Each experiment was run several times in order to obtain an average sojourn time.

  #  Resource  Job type  Middleware
  1  Solaris   Bash      Non-ICENI
  2  Solaris   Bash      ICENI
  3  Solaris   Condor    Non-ICENI
  4  Solaris   Condor    ICENI
  5  Linux     Bash      Non-ICENI
  6  Linux     Bash      ICENI
  7  Linux     Condor    Non-ICENI
  8  Linux     Condor    ICENI

Solaris: shared-memory server, 8 × 900 MHz UltraSparc II CPUs, 16 GB memory.
Linux: Beowulf cluster, 16 × 2 GHz Intel Dual Xeon CPUs, 2 GB memory.
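The measurement itself is simple to state: sojourn time is the wall-clock time from submission to completion, averaged over repeated runs. A minimal sketch (helper names are hypothetical):

```python
import time

def sojourn_time(run_job):
    """Wall-clock time from submission to completion of one job."""
    start = time.perf_counter()
    run_job()
    return time.perf_counter() - start

def mean_sojourn(run_job, repeats=5):
    """Repeat the experiment and average, as done for each table row."""
    times = [sojourn_time(run_job) for _ in range(repeats)]
    return sum(times) / len(times)
```

Averaging matters here because scheduling and queueing delays vary from run to run, especially on a shared Condor pool.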

13 Performance results

14 Performance results
Under Solaris, Condor is ~5 times faster than Bash: parallel rather than sequential execution.

15 Performance results
Under Linux, Condor and Bash perform similarly: there are only 2 nodes in the Linux Condor pool.

16 Performance results ICENI introduces scheduling and deployment overhead to the sojourn time.

17 Summary
- Shown how the ICENI middleware can be used to launch GENIE experiments.
- The performance overhead is insignificant compared to the advantages of using ICENI to deploy jobs.
- Ensemble experiments can be created and scheduled across multiple computational resources using true resource brokering.

18 Future work
- Repeat the experiments with the Sun Grid Engine launcher.
- Incorporate the component-based GENIE model in the experiments.
- ICENI is evolving…
  - Adopt new web services.
  - Decouple into separate functionalities.
  - Use GridSAM to launch experiments.
(Please visit the LeSC booth for a demo.)

19 Acknowledgments
- GENIE investigators: Prof. Paul Valdes (Bristol), Prof. John Shepherd (SOC, Southampton), Prof. Andrew Watson (UEA), Prof. Melvyn Cannell (CEH Edinburgh), Dr. Anthony Payne (Bristol), Prof. Richard Harding (CEH Wallingford), Prof. Simon Cox (SReSC), Dr. Steven Newhouse (OMII) and Prof. John Darlington (LeSC).
- Recognised researchers: Dr. Stephen McGough (LeSC), Andrew Yool (SOC), Dr. Robert Marsh (SOC), Dr. Timothy Lenton (UEA) and Dr. Neil Edwards (Bern).