Building the e-Minerals Minigrid
Rik Tyer, Lisa Blanshard, Kerstin Kleese (Data Management Group)
Rob Allan, Andrew Richards (Grid Technology Group)

AHM, Nottingham 2003
Project Members: Royal Institution, University of Reading

AHM, Nottingham 2003
Who we are: Council for the Central Laboratory of the Research Councils (CCLRC)
One of Europe's largest research support organisations
Provides large-scale experimental, data and computing facilities
Serves the UK research community in both academia and industry
Annually supports scientists from all major scientific domains
1800 members of staff over three sites: Rutherford Appleton Laboratory in Oxfordshire, Daresbury Laboratory in Cheshire, Chilbolton Observatory in Hampshire
Large quantities of data associated with the various facilities
A major e-Science centre in the UK

AHM, Nottingham 2003
Environmental Issues
Radioactive waste disposal
Crystal growth and scale inhibition
Pollution: molecules and atoms on mineral surfaces
Crystal dissolution and weathering

AHM, Nottingham 2003
Examples of Codes
DL_POLY3: parallel molecular dynamics code. Modifications aimed at running efficient simulations with millions of atoms for studies of radiation damage (Daresbury)
SIESTA: order-N quantum mechanics code. Objective is to run with large samples and realistic fluids (Cambridge)
Surface simulations: new developments aimed at efficient scanning of many configurations of complex fluid–mineral interfaces, for studies of crystal growth and dissolution (Bath)

AHM, Nottingham 2003
Data Management Requirements
Many output files are produced from each simulation run
Each set of input and output files is a dataset
Information about each simulation must be kept – metadata
Other scientists need access to this information and to the datasets
Need to search different metadata repositories at once
Access could be from anywhere in the world
Data must be categorised so it can be found by someone else
These requirements are the same for all scientists

AHM, Nottingham 2003
Integrated Portals using web services
[Architecture diagram: external applications access the DataPortal and HPCPortal through web services; the portals connect to metadata databases and to high-performance computers on the Grid]
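To make the web-services coupling concrete, the sketch below shows how an external application might call one of the portal services over SOAP. It is only illustrative: the WSDL URL and the operation name are hypothetical placeholders, and the real DataPortal/HPCPortal interfaces are defined by their own WSDL documents.

```python
# Hypothetical example of an external application calling a portal web service.
from zeep import Client

# Point a SOAP client at the (hypothetical) DataPortal Lookup service WSDL.
client = Client("https://dataportal.example.ac.uk/services/Lookup?wsdl")

# Ask the service which metadata repositories are available to search
# (operation name is assumed for illustration only).
repositories = client.service.listRepositories()

for repo in repositories:
    print(repo)
```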

AHM, Nottingham 2003
Scientific Metadata Model: the DataPortal metadata object
Topic – discipline, e.g. Earth Sciences / Soil Contamination / Heavy Metals / Arsenic
Study – provenance about what the study is, who did it and when
Access Conditions – conditions of use: who may access the data and how
Data Description – detailed description of the organisation of the data into datasets and files
Data Location – locations providing navigation to where the data on the study can be found
Related Material – references into the literature and community providing context about the study
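As a rough illustration of how a record organised along these lines might look, here is a small sketch. The field names and values are hypothetical and chosen for readability; they do not reproduce the actual CSMD schema used by the DataPortal.

```python
# Illustrative metadata record following the categories of the model above
# (field names are assumptions, not the real CSMD schema).
metadata_object = {
    "topic": "Earth Sciences/Soil Contamination/Heavy Metals/Arsenic",
    "study": {
        "title": "Arsenic adsorption on mineral surfaces",   # what the study is
        "investigators": ["A. Scientist"],                    # who did it
        "date": "2003-07-01",                                 # when
    },
    "access_conditions": "Project members only until publication",
    "data_description": {
        "datasets": [
            {"name": "run-001", "files": ["input.dat", "output.log"]},
        ],
    },
    "data_location": ["<placeholder for storage location>"],
    "related_material": ["<placeholder for literature reference>"],
}
```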

AHM, Nottingham 2003
DataPortal – Use Cases
[Use-case diagram: a scientist or an external application uses the DataPortal to request metadata from multiple metadata repositories, to store associated data files, and to transfer data files to and from remote machines]

AHM, Nottingham 2003
Plan of Work
High-requirement science
High-performance codes
Collaborative environment

AHM, Nottingham 2003
e-Minerals Minigrid and Portal
The minigrid makes the project's shared computing resources available through the UK e-Science Grid (built using Globus and incorporating the Storage Resource Broker)
The e-Minerals minigrid is accessed through the e-Minerals portal, based on the HPCPortal and DataPortal developed at Daresbury Laboratory
The minigrid links to the Storage Resource Broker to store the outputs of simulation runs
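A minimal sketch of what this looks like from a user's point of view is given below: a job is handed to a compute resource through the Globus GRAM tools and its output is then archived with the SRB Scommands. It assumes Globus Toolkit 2 client tools and the Scommands are installed and a valid grid proxy already exists; the gatekeeper host name, executable and file paths are hypothetical.

```python
# Sketch of running a simulation on a minigrid resource and storing the output
# in the Storage Resource Broker. Host names and paths are placeholders.
import subprocess

# Submit the executable to a (hypothetical) Globus gatekeeper via GRAM.
subprocess.run(
    ["globus-job-run", "compute.example.ac.uk/jobmanager-pbs",
     "/home/eminerals/bin/dl_poly3"],
    check=True,
)

# Archive the resulting output file in an SRB collection for the project.
subprocess.run(
    ["Sput", "OUTPUT.000", "run-001/OUTPUT.000"],
    check=True,
)
```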

AHM, Nottingham 2003 e-Minerals Portal

AHM, Nottingham 2003 DataPortal Results

AHM, Nottingham 2003
Condor Technologies
Condor: mature-ish technology for building small or large distributed computing systems from standard desktop computers
The important point is that Condor lets you use idle time on desktops, and hence harness the potential of powerful processors
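For context, the sketch below shows what submitting a batch of simulation jobs to a Condor pool (such as the UCL pool described on the next slide) might look like. The executable and file names are hypothetical; real submit files depend on the pool's configuration.

```python
# Sketch of submitting ten jobs to a Condor pool via a submit description file.
import subprocess

submit_description = """\
universe   = vanilla
executable = run_simulation.exe
arguments  = input.$(Process).dat
output     = job.$(Process).out
error      = job.$(Process).err
log        = jobs.log
queue 10
"""

# Write the submit description and hand the job cluster to the Condor scheduler.
with open("simulation.sub", "w") as f:
    f.write(submit_description)

subprocess.run(["condor_submit", "simulation.sub"], check=True)
```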

AHM, Nottingham 2003
The UCL Windows Condor Pool
Runs WTS (Windows Terminal Server)
Approximately 750 CPUs in 30 clusters; most are 1 GHz Pentium 4 machines with 256/512 MB RAM and 40 GB hard disks
All 90%+ underutilised and running 24/7…
We are using Condor to exploit this pool as a massive distributed computing system

AHM, Nottingham 2003
Technology Used
Operating system: SuSE Linux 8.1
Sun Microsystems J2SDK version 1.4
All DataPortal web services built and deployed under Apache Tomcat, built with Apache Ant
Apache Axis as the SOAP engine
MyProxy server used for authentication
Systinet UDDI Server version 4.5 for the Lookup Web Service
PostgreSQL databases for the Lookup, Session Manager, Access & Control and Shopping Cart Web Services
HPCPortal services built using the Globus 2 toolkit with gSOAP 2 libraries, deployed under the standard Apache HTTP server
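On the authentication point, a minimal sketch of how a user or portal might retrieve a short-lived proxy credential from the MyProxy server before touching minigrid resources is given below. The server host and username are hypothetical, and the exact client command differs between MyProxy releases (older toolkits ship myproxy-get-delegation rather than myproxy-logon).

```python
# Sketch of fetching a delegated proxy credential from a MyProxy server.
# Host name and account name are placeholders.
import subprocess

subprocess.run(
    ["myproxy-logon",
     "-s", "myproxy.example.ac.uk",   # MyProxy server (hypothetical host)
     "-l", "eminerals_user",          # account name held on the MyProxy server
     "-t", "12"],                     # lifetime of the retrieved proxy, in hours
    check=True,
)
```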

AHM, Nottingham 2003
Further Information
Environment from the Molecular Level (the eMinerals project)
e-Minerals minigrid (requires an X.509 certificate)
Integrated e-Science Environment Portal
HPC Grid Services Portal
DataPortal demonstration
UK CCLRC e-Science Centre