UK GridPP Tier-1/A Centre at CLRC

UK GridPP Tier-1/A Centre at CLRC
- Prototype Tier 1 centre for the CERN LHC and FNAL experiments
- Tier A centre for the SLAC BaBar experiment
- Testbed for the EU DataGrid project

Computing Farm
The newly procured hardware consists of 4 racks holding 156 dual-CPU PCs: a total of 312 1.4 GHz Pentium III Tualatin CPUs. Each box has 1 GB of memory, a 40 GB internal disk and a 100 Mb Ethernet connection.

GridPP: The Grid for UK Particle Physics
Slogan: "From Web to Grid… exploiting the next IT revolution"

First idea for a poster
- Background of data stream? Standard background for all GridPP posters.
- Standard logo and slogan (or parts of slogan?)
- Poster emulates web page?
- This one should illustrate the central role of the UK in the worldwide collaboration.
- Emphasise middleware in handout/description?
- Other posters are: BaBar, LHCb, Query Optimisation, Tier 1 – RAL, Tier 2 – ScotGRID, …others?

50 TByte Disk-based Mass Storage Unit
The new 50 TB mass storage unit is equipped with RAID5 disk management. Data is accessed through 26 Linux disk servers, each with a Gigabit Ethernet connection to the Tier-1/A Centre network.

Inside the Tape Robot
The tape robot was recently upgraded and now uses 60 GB STK 9940 tapes. It currently holds 45 TB but will hold 330 TB when full.
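The hardware figures above are self-consistent, as a quick back-of-the-envelope check shows. Note the implied tape-slot count is an inference, not a figure stated on the poster:

```python
# Cross-checking the hardware figures quoted above.

pcs = 156                # dual-CPU PCs across the 4 racks
cpus = pcs * 2           # two CPUs per box
assert cpus == 312       # matches the quoted 312 Pentium III CPUs

tape_capacity_gb = 60    # per STK 9940 tape
robot_full_tb = 330      # quoted robot capacity when full
# Implied number of tape slots (an inference, not a poster figure):
tapes_when_full = robot_full_tb * 1000 // tape_capacity_gb
print(tapes_when_full)   # 5500
```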

European DataGrid Middleware
Middleware provides the interface between the applications and the physical computing, storage and network fabrics.

[Architecture diagram: Applications (HEP Apps, EO Apps, Bio Apps) sit on the Middleware layer (Workload Management, Fabric Management, Data Management, Information and Monitoring Services, Mass Storage Management, Network Services), which sits on Globus Middleware and the Physical Fabric (Networking Fabric, Storage Element, Computing Fabric); areas of major UK involvement are highlighted.]

Mass Storage Management
The Storage Element provides a grid interface to diverse mass storage systems, allowing remote access to data on both tape and disk.

Network Services
Network Services make use of existing monitoring tools to enable the publication of network metrics, from a Grid application's perspective, to the DataGrid middleware.

Security
CLRC is leading the Security Co-ordination group of the EU DataGrid project. An important part of Grid security is authentication via X.509 certificates. CLRC administers the UK eScience Grid Certificate Authority, which issues certificates for GridPP users, machines and services.

Information and Monitoring
A new information and monitoring infrastructure is being developed: the Relational Grid Monitoring Architecture (R-GMA). It consists of consumers, producers and a directory service (the registry). Producers register with the registry and describe the type and structure of the information they want to make available to the Grid. Consumers query the registry, then contact the producer directly to obtain the relevant data.

HEP Applications
HEP application software is used to analyse very large volumes of data from particle physics experiments. It will use the grid middleware to manage the storage of data and to control the submission of jobs.

For more information: http://www.gridpp.ac.uk/eu-datagrid/
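The producer/consumer/registry flow described under Information and Monitoring can be sketched in miniature. This is a toy illustration of the pattern only, not the real R-GMA API: all class names, method names and the table name are invented for the example.

```python
# Toy sketch of the R-GMA pattern: producers register what they publish
# with a registry; consumers look up matching producers in the registry,
# then contact each producer directly for its data.

class Registry:
    """Directory service mapping table names to producers."""
    def __init__(self):
        self._producers = []  # list of (table_name, producer) pairs

    def register(self, table_name, producer):
        self._producers.append((table_name, producer))

    def lookup(self, table_name):
        return [p for t, p in self._producers if t == table_name]

class Producer:
    """Publishes rows for one table; registers itself on creation."""
    def __init__(self, registry, table_name, rows):
        self._rows = rows
        registry.register(table_name, self)

    def query(self):
        return list(self._rows)

class Consumer:
    """Finds producers via the registry, then fetches from them directly."""
    def __init__(self, registry):
        self._registry = registry

    def fetch(self, table_name):
        rows = []
        for producer in self._registry.lookup(table_name):
            rows.extend(producer.query())  # direct contact, not via registry
        return rows

registry = Registry()
Producer(registry, "networkMetrics", [{"site": "RAL", "rtt_ms": 12}])
print(Consumer(registry).fetch("networkMetrics"))  # [{'site': 'RAL', 'rtt_ms': 12}]
```

The key design point mirrored here is that the registry holds only metadata about who publishes what; the data itself flows directly between producer and consumer.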