QCDgrid: UKQCD Achievements and Future Priorities
Alan Irving, University of Liverpool, UKQCD
- Who and what
- Achievements
- QCDgrid middleware
- Future priorities
- Demo of meta-data catalogue browser

QCDgrid, January 2003

UKQCD and the Grid: QCDgrid architecture
- Phase 1: data grid → Phase 2: distributed processing
- Roll out to all UKQCD
- ILDG: International Lattice Data Grid (ILDG meeting in Edinburgh: Dec 19/20)

QCDgrid people

The main players:
- James Perry (EPCC): Globus, EDG and QCDgrid middleware, browser (OGSA)
- Chris Maynard (Edinburgh): XML schema, QCDgrid administrator
- Craig McNeile (Liverpool): UKQCD Software Manager, QCDgrid user interface

QCDgrid achievements

- We have a working data grid (J Perry, EPCC)
- We have used it for routine work (C McNeile, Liverpool)
- We can build our own 0.6 Tbyte RAID disk arrays for < £2000 (S Downing, Liverpool)
- We have explicitly tested RAID systems by reformatting one drive (C Allton, Swansea)
- We have a draft XML schema for lattice QCD data (C Maynard, Edinburgh)
- We have started an International Lattice Data Grid project (R Kenway, Edinburgh)

QCDgrid middleware (Globus 2.0, Linux 7.x)

User commands:
- qcdgrid-list
- put-file-on-qcdgrid
- get-file-from-qcdgrid
- i-like-this-file
- qcdgrid-delete

Configuration files:
- qcdgrid.conf
- nodes.conf
- nodeprefs.conf

Grid administration:
- control-thread.sh
- add-qcdgrid-node / disable-qcdgrid-node / enable-qcdgrid-node
- remove-qcdgrid-node / retire-qcdgrid-node / unretire-qcdgrid-node
- get-disk-space
- qcdgrid-checksum
- qcdgrid-filetime
- create-qcdgrid-rc / rebuild-qcdgrid-rc / delete-qcdgrid-rc / verify-qcdgrid-rc

QCDgrid: the future

QCDOC and the grid

- UKQCD is a distributed physics organisation: distributed computing → Grid; local disk farms and clusters
- QCDOC is the source of configs: primary data comes from QCDOC, but QCDOC is not directly on the grid
- Output → binary + XML; the front end feeds data to QCDgrid
- Data storage on RAID; RAIDs act as QCDgrid nodes
- Analyses on clusters
- Phase I: analysis on specified nodes pulling/pushing data from QCDgrid
- Phase II: cluster processing via grid middleware?

[Diagram: QCDOC front end connected to grid nodes]

QCDgrid: future plans

Short term:
- Decide on database: native XML (Xindice) or relational + XML interface tools
- Complete meta-data catalogue browser
- Expand number of sites in the grid to 4 UKQCD + RAL
- Decide on data-binding or other strategy for QCDOC I/O
- Construct more RAID arrays

Longer term:
- Agree ILDG standard XML schema
- Agree ILDG middleware functionality at web-services level
- Monitor EDG middleware
- Investigate EDG middleware for remote job processing
- Test interoperability with other ILDG sites
- Develop middleware to implement any new ILDG strategy

QCDgrid and HPC resources requested

Hardware:
- 'Tier 1': QCDOC + front end [JIF]
- 'Tier 2' [PPARC, 'son-of-SRIF']: 6 x 100-node PC clusters; 6 x 20 Tbyte disk farms (could be integrated with LHC Tier 2 facilities?)

Staff:
- Grid-specific [PPARC/GridPP, eSci]: 2 FTE x 3 years for QCDgrid and ILDG development; 1 FTE x 3 years for maintenance and dissemination (shared with other theory groups)
- Core software development [PPARC/HPC]: 4 physics programmers across all UKQCD
- Subdetector-specific: N/A

QCDgrid logical name browser

QCDgrid metadata catalogue browser

Output from metadata catalogue search

trumpton.ph.ed.ac.uk:aci|aci> Query returned 427 results
Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195 K3450U tar
Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195 K3450U tar
Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195 K3450U tar
Grid filename: NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/D526C195 K3450U tar
Grid filename: NF2/BETA526/CLOVER195...
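
The logical filenames returned by the catalogue encode the physics parameters of each ensemble in their path components (NF2 for two dynamical flavours, BETA526 for beta = 5.26, V16X32 for a 16^3 x 32 lattice, and so on). A minimal Python sketch of a parser for this naming convention follows; the decoding of each component is our reading of standard lattice-QCD conventions, not something specified in these slides, and the trailing filename is a placeholder:

```python
import re

def parse_logical_name(name):
    """Extract physics parameters from a QCDgrid-style logical
    filename such as NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/...
    The numeric decodings below are assumptions based on common
    lattice conventions."""
    params = {}
    for part in name.split("/"):
        if m := re.fullmatch(r"NF(\d+)", part):
            params["n_flavours"] = int(m.group(1))
        elif m := re.fullmatch(r"BETA(\d+)", part):
            d = m.group(1)
            params["beta"] = float(d[0] + "." + d[1:])    # BETA526 -> 5.26 (assumed)
        elif m := re.fullmatch(r"CLOVER(\d+)", part):
            d = m.group(1)
            params["clover"] = float(d[0] + "." + d[1:])  # CLOVER195 -> 1.95 (assumed)
        elif m := re.fullmatch(r"V(\d+)X(\d+)", part):
            params["volume"] = (int(m.group(1)), int(m.group(2)))
        elif m := re.fullmatch(r"KAPPA(\d+)", part):
            params["kappa"] = float("0." + m.group(1))    # KAPPA3450 -> 0.3450 (assumed)
    return params

# "config.tar" stands in for the actual configuration file name
params = parse_logical_name("NF2/BETA526/CLOVER195/V16X32/KAPPA3450/GAUGE/config.tar")
```

Keeping the parameters in the name like this lets the catalogue answer queries (such as the 427-result search above) by simple prefix matching on logical names.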

QCDgrid command line use (get-file-from-qcdgrid)

ulgbcm.liv.ac.uk:aci|gridwork> getmoreconfigs
creating an initial todo list
7 configs in initial list
todo list shows 7 configs still to do
but 5 have .tar already done, so delete any duplicates
updating todolist
now 2 configs still to be done
getting NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U tar from the grid
getting NF2/BETA52/CLOVER202/V16X32/KAPPA3500/GAUGE/D52C202K3500U tar from the grid
now have:
-rw-r--r-- 1 aci aci Dec 11 21:41 /users/aci/configs/D52C202K3500U tar
-rw-r--r-- 1 aci aci Jan 27 21:07 /users/aci/configs/D52C202K3500U tar
-rw-r--r-- 1 aci aci Jan 27 21:52 /users/aci/configs/D52C202K3500U tar
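
The getmoreconfigs command in the session above is a site-local wrapper around get-file-from-qcdgrid, not part of the core middleware. Its logic — build a todo list, drop configurations whose .tar files are already held locally, then fetch the remainder — can be sketched as follows; the function names and the fetch callback are hypothetical stand-ins:

```python
from pathlib import Path
import tempfile

def configs_still_needed(todo, local_dir):
    """Drop from the todo list any configuration whose .tar file is
    already present locally, mirroring the duplicate check that
    getmoreconfigs performs before fetching."""
    local = {p.name for p in Path(local_dir).glob("*.tar")}
    return [name for name in todo if Path(name).name not in local]

def get_more_configs(todo, local_dir, fetch):
    """Fetch each configuration not yet held locally; fetch() stands
    in for invoking get-file-from-qcdgrid on a grid logical name."""
    remaining = configs_still_needed(todo, local_dir)
    for name in remaining:
        fetch(name)
    return remaining

# Minimal demonstration in a temporary directory (paths are illustrative)
with tempfile.TemporaryDirectory() as d:
    Path(d, "cfg1.tar").touch()  # pretend one config is already held
    todo = ["NF2/GAUGE/cfg1.tar", "NF2/GAUGE/cfg2.tar"]
    fetched = []
    remaining = get_more_configs(todo, d, fetched.append)
```

Deduplicating on the final path component works here because each configuration's file name is unique across the grid's logical namespace.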