Intersecting UK Grid and EGEE/LCG/GridPP
UK eScience BoF Session, AHM, Nottingham, September 2003
R. Middleton, GridPP8, Bristol, 23rd September 2003


Slide 1: Intersecting UK Grid and EGEE/LCG/GridPP (UK eScience BoF Session, AHM, Nottingham, September 2003)

Slide 2: BoF Agenda
Applications & Requirements
Technical Exchanges & Collaboration
Common Strategies / Roadmap
Discussion

Slide 3: Applications…

Slide 4: What does the eScience Grid currently look like? (Mark Hayes)
Globus v2 installed at all regional eScience centres
Heterogeneous resources (Linux clusters, SGI O2/3000, SMP Sun machines)
eScience certificate authority
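To make the GT2 setup concrete, here is a minimal sketch, assuming the Globus Toolkit 2 client tools (grid-proxy-init, globus-job-run) are installed and a UK eScience CA certificate is already in place, of the classic test job against a centre's gatekeeper. The host name is hypothetical.

```python
import subprocess

# Hypothetical gatekeeper at a regional eScience centre.
GATEKEEPER = "grid-compute.example.ac.uk"

def run(cmd):
    """Echo and run a command, stopping on failure."""
    print("$ " + " ".join(cmd))
    subprocess.run(cmd, check=True)

# Create a short-lived proxy credential from the user's eScience
# CA certificate (prompts for the private-key passphrase).
run(["grid-proxy-init"])

# Run a trivial job on the remote gatekeeper via its default job manager.
run(["globus-job-run", GATEKEEPER, "/bin/hostname"])
```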

Slide 5: GridPP & EDG (Mark Hayes)
Dedicated Linux clusters running EDG middleware (globus++)
Very homogeneous resources
Resource broker (based on Condor)
LDAP-based VO management
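As a contrast with the plain GT2 route, a minimal sketch of an EDG-style submission as used on GridPP resources. The JDL attributes shown are the standard EDG workload-management ones, but the virtual organisation name is illustrative and the exact edg-job-submit flags varied between EDG releases.

```python
import subprocess

# A minimal Job Description Language (JDL) file: the resource broker
# matches it against published resource information and VO membership.
JDL = '''\
Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
'''

with open("hello.jdl", "w") as f:
    f.write(JDL)

# Hand the job to the EDG resource broker (the VO name is illustrative).
subprocess.run(["edg-job-submit", "--vo", "example-vo", "hello.jdl"], check=True)
```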

Slide 6: Applications (Mark Hayes)
Applications on the eScience Grid:
E-Minerals: Monte Carlo simulations of radiation damage to crystal structures (Condor-G, home-grown shell scripts)
Geodise: genetic algorithm for optimisation of satellite truss design (Java CoG plugins in Matlab)
GENIE: ocean-atmosphere modelling (flocked Condor pools)
Other tools in use: HPCPortal, InfoPortal, Nimrod/G, SunDCG
GridPP & EDG (for example):
ATLAS data challenge: Monte Carlo event generation, tracking & reconstruction
Large FORTRAN/C++ codes, optimised for Linux (and packaged as RPMs)
Runs scripted using the EDG job-submit tools; GUIs under development (e.g. GANGA)
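For the Condor-G route mentioned for E-Minerals, the job is described in a Condor submit file that targets a GT2 gatekeeper. A sketch follows; the gatekeeper address and script name are hypothetical, and the 'globus' universe is the Condor-G spelling of this era (before the newer 'grid' universe).

```python
# Condor-G submit description targeting a GT2 gatekeeper (address and
# script name are hypothetical).
SUBMIT = """\
universe        = globus
globusscheduler = grid-compute.example.ac.uk/jobmanager-pbs
executable      = run_simulation.sh
output          = sim.out
error           = sim.err
log             = sim.log
queue
"""

with open("sim.submit", "w") as f:
    f.write(SUBMIT)

# The job is then queued with:  condor_submit sim.submit
```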

Slide 7: Technical…

Slide 8: Comparison (Peter Clarke)
[Diagram comparing the two software stacks]
GridPP / EDG 2.0: GT2 base plus EDG value-added components (EDG Info, VOMS, resource broker, data management); only Linux 7.3 in places; PP applications on top.
UK L2 Grid: GT2 base; simple user management (???); UK MDS; value-added components; applications; network monitoring; eScience CA.

Slide 9: UK Grid: Deployment Phases (Rob Allan)
Level 0: resources with Globus GT2 registering with the UK MDS at ginfo.grid-support.ac.uk (a query sketch follows this slide);
Level 1: resources capable of running the Grid Integration Test Script;
Level 2: resources with one or more application packages pre-installed and capable of offering a service, with local accounting and tools for simple user management, discovery of applications and description of resources in addition to MDS;
Level 3: GT2 production platform with a widely accessible application base, distributed user and resource management, auditing and accounting. Resources signing up to Level 3 will be monitored to establish their availability and the service level offered.
7 Centres of Excellence
Globus GT3 testbed (later talks and mini-workshop)
JISC JCSR resources
Level 4: TBD, probably OGSA-based.
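Level 0 registration can be checked by asking the MDS what a resource publishes. A minimal sketch, assuming the python-ldap package; the port and base DN shown are the usual GT2 defaults for a site GRIS, and the national GIIS at ginfo.grid-support.ac.uk grouped site servers under its own VO name, so host, port and base DN may all need adjusting.

```python
import ldap  # assumes the python-ldap package

# GT2's MDS is an LDAP service; 2135 and the base DN below are the common
# GT2 defaults for a site GRIS. Adjust for the national GIIS as needed.
conn = ldap.initialize("ldap://ginfo.grid-support.ac.uk:2135")
results = conn.search_s("Mds-Vo-name=local, o=Grid",
                        ldap.SCOPE_SUBTREE,
                        "(objectclass=*)")

# Print the first few entry names published by the server.
for dn, attrs in results[:10]:
    print(dn)
```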

Slide 10: UK Grid: Key Components (Rob Allan)
ETF Coordination: activities are coordinated through regular Access Grid meetings and the Web site;
Resources: the components of this Grid are the computing and data resources contributed by the UK e-Science Centres, linked through the SuperJanet4 backbone to regional networks;
Middleware: many of the infrastructure services available on this Grid are provided by Globus GT2 software;
Directory Services: a national Grid directory service using MDS links the information servers operated at each site and enables tasks to call on resources at any of the e-Science Centres;
Security and User Authentication: the Grid operates a security infrastructure based on X.509 certificates issued by the e-Science Certificate Authority at the UK Grid Support Centre at CCLRC (see the grid-mapfile sketch after this slide);
Access Grid: on-line meeting facilities with dedicated rooms and multicast network access.
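The security item ultimately lands in each gatekeeper's grid-mapfile, which pairs a certificate subject DN issued by the eScience CA with a local Unix account. A small sketch of that mapping; the DN below is purely illustrative, not a real eScience CA entry.

```python
import shlex

# A grid-mapfile line pairs a quoted certificate subject DN with a local
# account name. The DN below is illustrative only.
EXAMPLE = '"/C=UK/O=eScience/OU=SomeSite/L=SomeUnit/CN=jane doe" janedoe\n'

def parse_gridmap(text):
    """Return {subject DN: local user} for non-comment grid-mapfile lines."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        dn, local_user = shlex.split(line)[:2]  # shlex handles the quoted DN
        mapping[dn] = local_user
    return mapping

print(parse_gridmap(EXAMPLE))
# {'/C=UK/O=eScience/OU=SomeSite/L=SomeUnit/CN=jane doe': 'janedoe'}
```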

Slide 11: UK Grid: GT3 Testbeds (Rob Allan)
Two testbeds funded:
Edinburgh / Newcastle / UCL / Imperial:
–OGSA-DAI
–E-Materials e-Science pilot demonstrator
–AstroGrid application demonstrator
Portsmouth / Daresbury / Westminster / Manchester / Reading:
–Tackle issues of inter-working between OGSI implementations
–Report on deployment and ease of use
–HPCPortal services demonstrator
–CCP application demonstrator
–E-HTPX e-Science pilot demonstrator
–InfoPortal demonstrator using OGSA-DAI

Slide 12: UK Grid: Issues to be Tackled (Rob Allan)
A number of areas significant for a production Grid environment have hardly been tackled using GT2. Issues include:
Grid information systems, service registration, discovery and definition of facilities (schema are important);
Security, in particular role-based authorisation and security of the middleware;
Portable parallel job specifications (sketched after this slide);
Meta-scheduling, resource reservation and on demand;
Linking and interacting with remote data sources;
Wide-area visualisation and computational steering;
Workflow composition and optimisation;
Distributed user, software and application management;
Data management and replication services;
Grid programming environments, PSEs and user interfaces;
Auditing, advertising and billing in a Grid-based resource market;
Semantic and autonomic tools;
etc.
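To make the "portable parallel job specifications" gap concrete: the same MPI job had to be re-expressed for each middleware. A minimal sketch of building a GT2 GRAM RSL string; the attribute names are standard GRAM RSL, but whether a given site honours count, jobType or maxWallTime depends entirely on its local job manager, which is exactly the portability problem.

```python
# Build a GT2 GRAM RSL string for a parallel job. The attributes are
# standard GRAM RSL; how (and whether) a site's job manager honours them
# is site-specific, which is the portability gap noted above.
def make_rsl(executable, procs, minutes, job_type="mpi"):
    attrs = [
        ("executable", executable),
        ("count", procs),
        ("jobType", job_type),
        ("maxWallTime", minutes),   # wall-clock limit in minutes
    ]
    return "&" + "".join("(%s=%s)" % (k, v) for k, v in attrs)

print(make_rsl("/home/user/my_mpi_app", procs=16, minutes=60))
# Submitted with something like:  globusrun -r <gatekeeper> '<rsl string>'
```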

Slide 13: Coordination & Management

Slide 14: SR2004 e-Science soft-landing (Paul Jeffreys)
Key e-Science infrastructure components:
–Persistent National e-Science Research Grid
–Grid Operations Centre
–UK e-Science Middleware Infrastructure Repository
–National e-Science Institute (cf. Newton Institute)
–National Digital Curation Centre
–AccessGrid Support Service
–e-Science/Grid collaboratories Legal Service
–International Standards Activity

Slide 15: Post-2006 e-Science soft-landing (Paul Jeffreys)
Components foreseen:
–Baseline OST Core Programme, and collaborate with JISC/JCSR on long-term support issues
–OST should support the Persistent Research Grid and the e-Science Institute
–JISC should support the Grid Operations Centre and the AccessGrid Support Service
–OST and JISC should jointly support the Repository, the Curation Centre, the e-Science Legal Service and the International Standards Activity

Slide 16: Intersecting UK Grid and EGEE/LCG/GridPP: Coordination and Management (Paul Jeffreys)
GridPP:
–Very substantial resources (computing and infrastructure)
–Built-in international connections: EGEE, LCG
–Need to create a production service
Possible points of intersection of the UK programme with GridPP/EGEE/LCG:
–Resources: shared sites, shared personnel
–Grid operations and support
–Security (operational level: firewalls etc.)
–Change management of the software suite
–OMII
–CA
–Interface to Europe
–Training, dissemination
–Collection of requirements

Slide 17: Discussion

Slide 18: Discussion
EDG not yet heterogeneous; OGSA; timelines; packaging issues
Get people working together as an investment for the future
–Working parties: Security, Resource Brokering, Information Services, Workflow, Ops & Support
Avoid being too ambitious or complex
Need an over-arching strategic / roadmap view
Outcomes:
–Pursue grass-roots collaboration and the strategic view in parallel
–Set up a working party to recommend common elements of the roadmap
–Set up a small number of technical working parties; definition and details being taken forward by Andy Parker (report in preparation)

Slide 19: Status
[Diagram: two stacks, the GT2-based L2 Grid with its applications and the GT2-based PP Grid with PP applications, linked by a proposed common core layer and shared engineering; question marks (???) against LCG-1/EGEE-0 (GT2) and OGSA.]

Slide 20: Conclusions
Possible route for technical collaboration:
–Shared engineering: working together
–Common objective in OGSA (similar timescales)
No overall co-ordination yet
Much still to do to make it happen!