3rd Campus Grid SIG Meeting

Agenda
Welcome
OMII Requirements document
Grid Data Group
HTC Workshop
Research Computing SIG?
AOB
Next meeting (AG)

What is the SIG?
Group for the promotion of Campus Grid facilities throughout the UK academic community
Solidify best practice throughout the UK community
Groundwork for larger collaboration, including regional grids

The story so far
1st meeting in Oxford in April
–Presentations from each campus grid represented
–Discussion of top four issues facing campus grid providers
2nd meeting over Access Grid
–Presentation on the OMII roadmap

OMII Requirements Document
Give OMII a concrete list of developments needed by our community
Separated into five areas:
–Applications
–Security
–Accounting
–Monitoring
–Storage

Storage, Jon Blower
Support reading/writing from/to SRB systems, as these appear to be fairly widely used.
Some groups appear to be moving towards WebDAV-based systems, so it is recommended that OMII consider supporting these.
SFTP/SCP is also commonly used - I think this might already be supported, but if it isn't it would be very useful. It is available to most users on campus (since they can access files on any server they can SSH to), so it might be a good fallback even where other options are available.
Filesystem-based solutions (e.g. AFS, Parrot) are independent of the middleware and so are probably not important for OMII.
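As an illustration of the SFTP/SCP fallback described above, a minimal sketch of staging files over SFTP from Python using the paramiko library; the hostname, username, key path and file paths are hypothetical examples, not anything provided by OMII:

```python
# Minimal sketch of the SFTP fallback: copy files to and from a campus
# machine that the user can already SSH to. Host, user and paths are
# hypothetical examples, not part of any OMII component.
import os
import paramiko

def open_sftp(host, user, key_file):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # fine for a sketch; pin host keys in practice
    ssh.connect(host, username=user, key_filename=os.path.expanduser(key_file))
    return ssh, ssh.open_sftp()

if __name__ == "__main__":
    ssh, sftp = open_sftp("headnode.campus.ac.uk", "griduser", "~/.ssh/id_rsa")
    try:
        sftp.put("input.dat", "scratch/input.dat")    # stage input data in
        # ... job runs on the resource ...
        sftp.get("scratch/output.dat", "output.dat")  # stage results back out
    finally:
        sftp.close()
        ssh.close()
```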

Accounting, David Wallom
Driven by:
–fEC
–Tracking of donated resources to show fair-share usage, especially as we move towards inter-institutional sharing and regional grids
Important that this be auditable
Standards based
OMII should develop a lightweight RUS server and client set:
–Independent of platform, requiring no external software (e.g. Tomcat)
–Covering more than just processing usage: storage and services as well
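To make the RUS suggestion concrete, a hedged sketch of building a minimal OGF Usage Record of the sort a lightweight client might submit; the element names follow the UR format only approximately (consult the specification for the exact schema), and the job and user values are invented:

```python
# Hedged sketch: build a minimal OGF Usage Record of the kind a lightweight
# RUS client might submit. Element names approximate the UR 1.0 format;
# record, job and user values are hypothetical.
import xml.etree.ElementTree as ET

URF = "http://schema.ogf.org/urf/2003/09/urf"
ET.register_namespace("urf", URF)

def usage_record(record_id, global_job_id, local_user, wall_seconds, cpu_seconds):
    rec = ET.Element(f"{{{URF}}}JobUsageRecord")
    ident = ET.SubElement(rec, f"{{{URF}}}RecordIdentity")
    ident.set(f"{{{URF}}}recordId", record_id)
    job = ET.SubElement(rec, f"{{{URF}}}JobIdentity")
    ET.SubElement(job, f"{{{URF}}}GlobalJobId").text = global_job_id
    user = ET.SubElement(rec, f"{{{URF}}}UserIdentity")
    ET.SubElement(user, f"{{{URF}}}LocalUserId").text = local_user
    # Durations are ISO 8601, e.g. PT3600S for one hour of wall clock
    ET.SubElement(rec, f"{{{URF}}}WallDuration").text = f"PT{wall_seconds}S"
    ET.SubElement(rec, f"{{{URF}}}CpuDuration").text = f"PT{cpu_seconds}S"
    return ET.tostring(rec, encoding="unicode")

print(usage_record("oxgrid-2007-000123", "gridsam:job/42", "dwallom", 3600, 3480))
```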

Security, Ian Bland
Users do not like using grid certificates:
–they prefer the same authentication methods as they use for more traditional computing/IT systems
–mainly SSO systems such as Kerberos, but increasingly Shibboleth
OMII should actively support the inclusion of Shibboleth support in its middleware.
Support for authentication via SSO systems needs to be matched with support for authorisation via directory services, notably LDAP.
Integrity issues for users are that data sets are consistent and not corrupt, and that code executes in a predictable manner; the need for a trusted platform extends to one that is robust and cannot be compromised. This is something OMII should consider, possibly in the context of a trusted computing platform.
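As a rough illustration of LDAP-backed authorisation (not any specific OMII component), a sketch that checks group membership in a campus directory after the user has authenticated via SSO, using the Python ldap3 library; the server, base DN and group name are hypothetical:

```python
# Hedged sketch: after SSO authentication, check group membership in a
# campus LDAP directory before authorising job submission. Server name,
# base DN and group name are hypothetical.
from ldap3 import Server, Connection, ALL

def is_authorised(username, group_cn="campus-grid-users"):
    server = Server("ldap.campus.ac.uk", get_info=ALL)
    # A read-only service account would normally bind here; anonymous bind for the sketch
    conn = Connection(server, auto_bind=True)
    conn.search(
        search_base="ou=groups,dc=campus,dc=ac,dc=uk",
        search_filter=f"(&(cn={group_cn})(memberUid={username}))",
        attributes=["cn"],
    )
    return len(conn.entries) > 0

if __name__ == "__main__":
    print(is_authorised("abcd1234"))
```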

Monitoring, Mark Calleja
GridSAM to provide a means of monitoring running jobs through connectivity to the underlying scheduler/system:
–by providing a standard view of the contents of the scratch space on a remote resource
–by being able to pipe stdout/stderr back on request
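GridSAM's own interfaces are not shown here; purely as an illustration of the behaviour being asked for, a sketch that lists a job's scratch directory and returns the tail of its stdout over SSH (the host and paths are hypothetical):

```python
# Rough illustration (not the GridSAM API): list a job's scratch directory
# and stream the tail of its stdout over SSH. Host and paths are hypothetical.
import subprocess

def show_scratch(host, scratch_dir):
    # Standard view of the scratch space contents on the remote resource
    return subprocess.run(["ssh", host, "ls", "-l", scratch_dir],
                          capture_output=True, text=True).stdout

def tail_stdout(host, stdout_path, lines=20):
    # Pipe back the last few lines of the job's stdout on request
    return subprocess.run(["ssh", host, "tail", f"-n{lines}", stdout_path],
                          capture_output=True, text=True).stdout

if __name__ == "__main__":
    print(show_scratch("headnode.campus.ac.uk", "/scratch/job-42"))
    print(tail_stdout("headnode.campus.ac.uk", "/scratch/job-42/stdout"))
```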

Applications

Grid Data Storage UK

HTC Workshop
Last week in November
Including:
–Condor
–United Devices
–DataSynapse
–Digipede
Attendance from users, vendors and IT managers
Practical demonstrations and exercises, as well as talks on research that has been made possible using HTC

Group Collaboration
Campus Grid SIG and the UK e-Science Engineering Task Force have similar aims and objectives - can we work together more formally?
Research Computing SIG
–Many campus grid providers also run university HPC facilities; push for more synergies between them
Launchpad for regional grids? Build on the example of NW-GRID

AOB
Visualization
–Talk by RAVE team
Networks
–Firewall settings
–Network performance
Access Methods
–Portals
NGS Application Repository
Application Hosting Environment
–Standard command line applications
–API

Next Meeting?