Creating and running an application on the NGS

Presentation transcript:

Creating and running an application on the NGS Mike Mineter

2 Policy for re-use
This presentation can be re-used for academic purposes. However, if you do so then please let training- know. We need to gather statistics on re-use: the number of events and the number of people trained. Thank you!

3 Acknowledgements
This presentation re-uses material on Globus commands from Stephen Pickering (University of Leeds).

4 Outline
– A “User Interface” machine and our set-up today
– Commands to be used
– Practical:
  – Port code and data from the desktop/UI to the NGS compute nodes
  – Compile and run code
  – Invoke your application from the UI machine
– Summary

5 Goals
You’ve got your proxy on a UI machine and want to know how to port and run an application.
Er… what’s a UI machine?

6 The “UI” machine
The user’s interface to the grid:
– where you upload your certificate for your session
– where you create proxy certificates
– where you can run the various commands, including…
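
On a GT2 UI machine, a session typically begins by creating a proxy certificate; a minimal sketch using the standard Globus Toolkit 2 commands:

    # Create a proxy certificate from the credentials you uploaded
    grid-proxy-init

    # Inspect the proxy: subject, issuer and remaining lifetime
    grid-proxy-info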

7 Globus Tools
Globus Toolkit version 2:
– GT from VDT 1.2 (VDT comprises several middleware packages used in EU, US and UK grids)
– Job submission (GRAM)
– File transfer (GridFTP)
– Shell (GSI-SSH)
– Information services (MDS/GIIS/GRIS), with information providers from the GLUE schema

8 Our setup
[Diagram: tutorial room machines connect by ssh to a UI machine, which reaches the core NGS nodes, e.g. grid-data.rl.ac.uk, over the Internet.]

9 Secure file copy
gsiscp copies code and data files from the UI to an NGS compute node, using the proxy certificate to allow authentication and authorisation (AA).
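
A sketch of such a copy (the hostname and port follow the tutorial’s examples; file names are illustrative, and you may need to prefix the host with your remote account name if it differs from your local one):

    # Copy a source directory from the UI to your NGS home directory
    gsiscp -P 2222 -r my_code/ grid-compute.leeds.ac.uk:

    # Copy a single data file the same way
    gsiscp -P 2222 input.dat grid-compute.leeds.ac.uk: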

10 Open a shell on an NGS compute node
gsissh opens a shell from the UI to an NGS compute node; it can act as an X-Windows client. On the compute node you can compile, edit, recompile and build your code; SHORT interactive (sequential) runs are OK, and the TotalView debugger is available.
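
For instance, once the shell is open you might build and briefly test the code you copied over (the compiler and file names are illustrative):

    # On the NGS compute node: build the application
    cd my_code
    gcc -O2 -o myapp main.c

    # A SHORT interactive, sequential test run is acceptable
    ./myapp input.dat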

11 Establishing contact
– Connect to the NGS compute node with gsissh via port 2222.
– Modify the environment in your NGS account (note which node you use and keep using it, or repeat the commands on the other nodes):
  – .bashrc: used when you connect with gsissh, and used by Globus.
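
A hypothetical .bashrc fragment of the kind you might add (the path is illustrative):

    # ~/.bashrc on the NGS compute node
    # Read both on gsissh logins and by jobs started through Globus
    export PATH=$HOME/my_code:$PATH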

12 gsissh -X -p 2222 grid-compute.leeds.ac.uk
– -X: enables forwarding of X11 connections
– -p: sets the port through which gsissh works
[You can set -X and -p 2222 as defaults in configuration files.]
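
As a sketch of those defaults: gsissh is built on OpenSSH, so, assuming your installation reads the standard per-user client configuration, an entry like the following in ~/.ssh/config makes the flags unnecessary:

    # ~/.ssh/config: defaults for the NGS compute node
    Host grid-compute.leeds.ac.uk
        Port 2222
        ForwardX11 yes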

13 Run jobs from the UI
From the UI, executables on an NGS compute node are run with globus-job-run, or with globus-job-submit followed by globus-job-get-output. Files, e.g. parameters for a job, can be passed with these commands.
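
A minimal sketch of both styles (the hostname follows the tutorial’s examples; the jobmanager name and job contact are illustrative of a typical GT2 setup):

    # Run a command on the compute node and wait for its output
    globus-job-run grid-compute.leeds.ac.uk /bin/hostname

    # Submit to the batch system instead; this prints a job contact URL
    globus-job-submit grid-compute.leeds.ac.uk/jobmanager-pbs ./myapp input.dat

    # Later, retrieve the job's standard output via that contact URL
    globus-job-get-output <job-contact-URL>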

14 PRACTICAL 2

15 The “UI” machine
The user’s interface to the grid:
– where you upload your certificate for your session
– where you create proxy certificates
– where you can run the various commands
That was the good news. The bad news: (currently!) you install your own.
– It may be your desktop with additional software (GT2), if you run a UNIX derivative.

16 A multi-VO Grid
[Diagram: one User Interface machine gives its user access to the Grid services of several virtual organisations.]

17 globus-job-run
[Diagram] A job request goes from the user to the Globus gatekeeper on the head-node. The gatekeeper reads the gridmapfile to map the user’s ID from their proxy certificate to a local account, then forks a process to run the command. The job runs on the head-node, which also hosts the information system (I.S.).

18 globus-job-submit
[Diagram] As with globus-job-run, the job request goes to the Globus gatekeeper on the head-node, and the gatekeeper reads the gridmapfile to map the user’s ID from their proxy certificate to a local account. Here, however, the job is placed in the batch queue (PBSPro) and runs on a cluster node.
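
The practical difference shows up in the jobmanager named in the contact string; a sketch, assuming a typical GT2 configuration in which jobmanager-fork is the default:

    # jobmanager-fork: run immediately, on the head-node
    globus-job-run grid-compute.leeds.ac.uk/jobmanager-fork /bin/date

    # jobmanager-pbs: queue through PBSPro, run on a cluster node
    globus-job-submit grid-compute.leeds.ac.uk/jobmanager-pbs ./myapp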

19 Questions - 1
“How do I know which compute node to use?”
– Use the Information Service. You should find that the core nodes of the NGS all run the same software; in general you are likely to keep using the same machine.
“Is my NGS compute node account shared across all machines?”
– NO. You must synchronise your accounts on the different machines yourself, and your account names may differ on each machine. Use GridFTP (from the portal) or gsi-scp.
– Alternatively, you can hold files in the SRB (Storage Resource Broker, covered tomorrow) and read/write them from any compute node.

20 Questions - 2
“Should I stage an executable?” (stage = send it to a compute node from my desktop/UI)
– Only if the UI produces a binary-compatible file! It is safer to:
  – check that the code compiles locally,
  – copy it to a compute node,
  – and compile it there.
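
In shell terms, that safer route looks roughly like this (the hostname and port follow the tutorial’s examples; the compiler and file names are illustrative):

    # 1. Check that the source compiles locally on the UI
    gcc -O2 -o myapp main.c

    # 2. Copy the source (not the binary) to the compute node
    gsiscp -P 2222 main.c grid-compute.leeds.ac.uk:

    # 3. Rebuild on the compute node so the binary matches its environment
    gsissh -p 2222 grid-compute.leeds.ac.uk "gcc -O2 -o myapp main.c"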

21 Further information
– VDT documentation
– NGS user pages