GECEM: Grid-Enabled Computational Electromagnetics
David W. Walker, School of Computer Science, Cardiff University

Contributing Project Partners
At WeSC:
– David Walker
– Jon Giddy
At University of Wales, Swansea:
– Nigel Weatherill
– Jason Jones
At BAE SYSTEMS, Advanced Technology Centre, Filton:
– David Rowse
– Alan Gould
– Michael Turner

Why CEM?
CEM is important in the civil and defence sectors.
Complex electronic systems are key to platforms such as the More Electric Aircraft and the All Electric Ship.
CEM predicts the response of such systems to lightning strikes and EM pulses.
CEM simulations of these systems are usually computationally intensive.

Why Use the Grid?
Industrial and academic partners form an “extended enterprise” in which resources are intrinsically distributed and only partially shared.
Partners may be prepared to share data, but not the hardware and proprietary software that produce the data.

GECEM Objectives
To use and develop Grid technology to enable large-scale, globally distributed science and engineering. Key areas of interest include:
– Performance and fault tolerance.
– Secure remote execution.
– Collaborative analysis and visualisation.
– Easy and uniform access to resources via a problem-solving environment (PSE).

Typical GECEM Mode of Use
Geometry is created at location A (e.g., in IGES, STEP, or FLITE format).
The geometry data are transferred to location B, where the corresponding mesh is generated.
The mesh is transferred to location C, where a CEM simulation computes the solution on the mesh.
The simulation results are visualised collaboratively at multiple locations. A minimal sketch of this pipeline follows.
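The following is a minimal, runnable sketch of this pipeline. Everything in it is a hypothetical stand-in for illustration: the stage functions are stubs, and in the real system each stage would be a Grid job, with transfer() performed by Grid file transfer (GridFTP in the GT2 demo described later).

```python
# Minimal sketch of the GECEM mode of use. All functions and file names
# are hypothetical stand-ins: in the real system each stage is a Grid job
# and transfer() is a Grid file transfer between sites.

def transfer(filename, src, dst):
    print(f"transfer {filename}: {src} -> {dst}")

def create_geometry(site):
    print(f"[{site}] create geometry (e.g. IGES/STEP/FLITE)")
    return "model.iges"                          # hypothetical file name

def generate_mesh(site, geometry):
    print(f"[{site}] generate mesh from {geometry}")
    return "model.mesh"

def run_cem_solver(site, mesh):
    print(f"[{site}] run CEM simulation on {mesh}")
    return "model.out"

def run_pipeline(site_a, site_b, site_c, viz_sites):
    geometry = create_geometry(site_a)           # step 1: geometry at A
    transfer(geometry, site_a, site_b)
    mesh = generate_mesh(site_b, geometry)       # step 2: mesh at B
    transfer(mesh, site_b, site_c)
    solution = run_cem_solver(site_c, mesh)      # step 3: simulation at C
    for site in viz_sites:                       # step 4: collaborative viz
        transfer(solution, site_c, site)

run_pipeline("A", "B", "C", ["A", "B"])
```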

GECEM Prototype Grid
[Diagram: geometry is created at BAE SYSTEMS; the geometry data are sent to UWS, where the mesh is generated; the mesh is sent to WeSC, where the CEM simulation runs; the output is shared with other locations.]

GECEM Production Grid
[Diagram: as in the prototype, geometry is created at BAE SYSTEMS and the mesh is generated at UWS, but the CEM simulation runs on resources in Singapore; the output is shared with other locations.]

Issues in Remote Execution
Performance and resilience: mainly network issues.
Security: we would like to migrate code to a remote resource, execute it, and return the results with minimum risk of unauthorised interference.

Issues in Collaborative Analysis and Visualisation
Two modes of use:
– “Follow the leader”.
– Independent exploration.
Analysis and visualisation may be asynchronous, so output data must be annotatable with metadata (in addition to provenance data); see the sketch after this slide.
An archive of annotated output data sets needs to be created and maintained.
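As one concrete illustration of annotating output data with metadata alongside provenance, here is a minimal sketch that assumes a JSON sidecar file next to each data set; the sidecar layout, field names, and the annotate() helper are hypothetical, not the actual GECEM archive format.

```python
import json
import time

def annotate(dataset, user, note, provenance):
    """Write a JSON sidecar file (hypothetical format) holding a user
    annotation plus the provenance record for an output data set."""
    record = {
        "dataset": dataset,
        "annotated_by": user,
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "note": note,
        "provenance": provenance,  # which mesh/solver/site produced it
    }
    with open(dataset + ".meta.json", "w") as f:
        json.dump(record, f, indent=2)

# Example use (names are illustrative).
annotate("run42.out", "analyst1",
         "Resonance near 3 GHz; re-check mesh resolution on the tail fin.",
         {"mesh": "model.mesh", "solver": "RDS", "site": "WeSC"})
```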

Issues in PSEs
The PSE provides user access to GECEM data, software, and hardware resources:
– Data includes geometries, meshes, and simulation output.
– Software includes the mesh generator, CEM solvers, and analysis and visualisation tools.
It also provides a visual environment for composing applications, and tools for discovering and monitoring the GECEM Grid.

GECEM Demo
Based on Globus Toolkit 2 (GT2).
Uses the UK e-Science Certificate Authority.
A script initiates the GECEM session from the client (a laptop).
The mesh and PML layer are generated at UWS.
The mesh is sent to WeSC.
The CEM radar cross-section solver (RDS) is migrated to WeSC.
The CEM solver is executed at WeSC.
The output is sent back to the client, where it is visualised.

Use of GT2
The following GT2 commands are used:
– grid-proxy-init to create a proxy credential.
– globus-url-copy for copying files.
– globus-job-run for controlling remote execution.
We aim to move to a service-based architecture using GT3 in the near future. A sketch of the demo script follows.
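As a minimal sketch of how the script-based demo session might chain these commands, the following Python fragment drives them via the subprocess module; the host names, paths, and file names are hypothetical, although the three GT2 commands themselves are real.

```python
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)   # raise if a Grid command fails

# Hypothetical host names and paths; adjust to the actual GECEM resources.
MESH_HOST  = "grid.uws.example.ac.uk"    # mesh generation (UWS)
SOLVE_HOST = "grid.wesc.example.ac.uk"   # CEM simulation (WeSC)

run(["grid-proxy-init"])  # create a proxy credential (prompts for passphrase)

# Generate the mesh and PML layer remotely at UWS.
run(["globus-job-run", MESH_HOST, "/opt/gecem/bin/meshgen", "model.iges"])

# Copy the mesh from UWS to WeSC over GridFTP.
run(["globus-url-copy",
     f"gsiftp://{MESH_HOST}/tmp/model.mesh",
     f"gsiftp://{SOLVE_HOST}/tmp/model.mesh"])

# Migrate the RDS solver to WeSC, execute it, and pull the output back.
run(["globus-url-copy", "file:///opt/gecem/bin/rds",
     f"gsiftp://{SOLVE_HOST}/tmp/rds"])
run(["globus-job-run", SOLVE_HOST, "/tmp/rds", "/tmp/model.mesh"])
run(["globus-url-copy",
     f"gsiftp://{SOLVE_HOST}/tmp/model.out", "file:///tmp/model.out"])
```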

The Output
Status messages are returned to the client throughout execution.
The results of the CEM simulation are returned to the client and visualised.

Visualisation

Future Work
The demo maps out the basic features of the GECEM Grid, but there is much work still to do in all areas:
– Migration from GT2 to the service-based GT3.
– Incorporation of Grid resources at the Singapore Institute of HPC into the GECEM Grid.
– Development of the PSE.
– Evaluation of collaborative visualisation software, with further development where necessary.
– Secure migration and execution.

Concluding Remarks
GECEM aims to demonstrate a globally distributed extended enterprise based on Grid technologies.
A simple script-based demonstration of the main features of the GECEM Grid has been developed.
A messaging system is needed through which services can interact.
The GECEM PSE will control all aspects of the user’s access to GECEM resources.