Grid Glasgow – Outline: LHC Computing at a Glance, Glasgow Starting Point, LHC Computing Challenge, CPU Intensive Applications, Timeline, ScotGRID

Grid Glasgow – Outline
LHC Computing at a Glance
Glasgow Starting Point
LHC Computing Challenge
CPU Intensive Applications
Timeline
ScotGRID
EDG TestBed 1 Status
Middleware
–Overview of SAM
–Spitfire – security mechanism
–Optor – replica optimiser simulation
–Monitoring
Prototype: Hardware + Software + People
Summary

Grid Team
Areas: Hardware, Software, System, Middleware, Applications
Hardware: see David.. (# working)

LHC Computing at a Glance
The investment in LHC computing will be massive
–LHC Review estimated 240 MCHF
–80 MCHF/year afterwards
These facilities will be distributed
–Political as well as sociological and practical reasons
Europe: 267 institutes, 4603 users; elsewhere: 208 institutes, 1632 users

Rare Phenomena – Huge Background
The Higgs signal sits some 9 orders of magnitude below the total interaction rate.
(Figure: event-rate comparison of all interactions vs. the Higgs)

CPU Requirements
Complex events
–Large number of signals
–"good" signals are swamped by background
Many events
–10^9 events/experiment/year
–1–25 MB/event raw data
–several passes required
Need world-wide: 7×10^6 SPECint95 (3×10^8 MIPS)
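A back-of-the-envelope check on what these event counts imply for raw-data volume, using only the figures on this slide:

$$10^{9}\ \text{events/year} \times (1\text{--}25)\ \text{MB/event} \approx 1\text{--}25\ \text{PB per experiment per year}$$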

LHC Computing Challenge
–One bunch crossing every 25 ns; 100 triggers per second; each event is ~1 MByte
–Online System (~PBytes/sec off the detector) feeds the Offline Farm (~20 TIPS) and the CERN Computer Centre (>20 TIPS, Tier 0) at ~100 MBytes/sec
–Tier 1 Regional Centres (RAL, US, French, Italian) linked at ~Gbits/sec or by air freight
–Tier 2 Centres at ~1 TIPS each, e.g. ScotGRID++ ~1 TIPS, linked at ~Gbits/sec
–Tier 3 institutes (~0.25 TIPS) and Tier 4 workstations, linked at Mbits/sec
–Physicists work on analysis "channels"; Glasgow has ~10 physicists working on one or more channels, and data for these channels is cached by the Glasgow server (physics data cache)
–1 TIPS = 25,000 SpecInt95; a PC (1999) = ~15 SpecInt95
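A quick consistency check of the trigger rate and event size against the quoted link rate:

$$100\ \text{triggers/s} \times 1\ \text{MB/event} \approx 100\ \text{MB/s},$$

which matches the ~100 MBytes/sec figure into the computer centre.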

Starting Point

CPU Intensive Applications
Numerically intensive simulations:
–Minimal input and output data
–ATLAS Monte Carlo (gg → H → bb): 182 sec / 3.5 MB event on a 1000 MHz Linux box
Standalone physics applications:
1. Simulation of neutron/photon/electron interactions for 3D detector design
2. NLO QCD physics simulation
Compiler tests:
Compiler        Speed (MFlops)
Fortran (g77)   27
C (gcc)         43
Java (jdk)      41
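The slide does not say which kernel produced the MFlops figures; the sketch below is only a generic illustration of how such a floating-point rate can be timed in Java (the loop size, operations and class name are invented for this example):

```java
// Illustrative only: a generic MFlops-style timing loop, not the benchmark used on the slide.
public class FlopsSketch {
    public static void main(String[] args) {
        final int n = 10_000_000;
        double a = 1.0;
        final double b = 1.000001;
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            a = a * b + 0.5;   // one multiply + one add per iteration = 2 flops
        }
        double seconds = (System.nanoTime() - start) / 1e9;
        double mflops = (2.0 * n) / seconds / 1e6;
        System.out.printf("result=%.3e  ~%.1f MFlops%n", a, mflops);
    }
}
```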

Timeline (milestones by quarter, Q1–Q4)
–Prototype of Hybrid Event Store (Persistency Framework)
–Hybrid Event Store available for general users
–Distributed production using grid services
–First Global Grid Service (LCG-1) available
–Distributed end-user interactive analysis
–Full Persistency Framework
–LCG-1 reliability and performance targets
–"50% prototype" (LCG-3) available
–LHC Global Grid TDR
(applications / grid tracks)
ScotGRID ~300 CPUs + ~50 TBytes

ScotGRID
ScotGRID processing nodes at Glasgow:
–59 IBM X Series 330: dual 1 GHz Pentium III, 2 GB memory
–2 IBM X Series 340: dual 1 GHz Pentium III, 2 GB memory, dual ethernet
–3 IBM X Series 340: dual 1 GHz Pentium III, 2 GB memory, Mbit/s ethernet
–1 TB disk
–LTO/Ultrium tape library
–Cisco ethernet switches
ScotGRID storage at Edinburgh:
–IBM X Series 370: PIII Xeon, 512 MB memory
–32 x 512 MB RAM
–70 x 73.4 GB IBM FC hot-swap HDD
CDF equipment at Glasgow:
–8 x 700 MHz Xeon IBM xSeries, GB memory, 1 TB disk
Griddev test rig at Glasgow:
–4 x 233 MHz Pentium II

EDG TestBed 1 Status
Web interface showing the status of ~400 servers at Testbed 1 sites.
Grid extended to all experiments.

Glasgow within the Grid

GridPP EDG – UK Contributions
£17m 3-year project funded by PPARC, with funding for staff and hardware across CERN, DataGrid, Tier-1/A, Applications and Operations (£3.78m, £5.67m, £3.66m, £1.99m, £1.88m).
EDG contributions: Architecture, Testbed-1, Network Monitoring, Certificates & Security, Storage Element, R-GMA, LCFG, MDS deployment, GridSite, SlashGrid, Spitfire, Optor, GridPP Monitor Page (Glasgow elements highlighted on the slide).
Applications (start-up phase): BaBar, CDF+D0 (SAM), ATLAS/LHCb, CMS, (ALICE), UKQCD.
CERN – LCG (start-up phase).

Overview of SAM

Spitfire – Security Mechanism
Request path: HTTP + SSL request with client certificate → Servlet Container (SSLServletSocketFactory, TrustManager) → Security Servlet → Authorization Module → Translator Servlet → Connection Pool → RDBMS.
–Is the certificate signed by a trusted CA? (checked against the Trusted CAs store)
–Has the certificate been revoked? (checked against the Revoked Certs repository)
–Does the user specify a role? If not, find the default role from the Role repository; is the role ok?
–Map the role to a connection id via the connection mappings, then request a connection ID from the Connection Pool.
A simplified sketch of this decision logic follows below.
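This is a minimal, illustrative rendering of the checks in the diagram; none of the class or method names below are the real Spitfire API.

```java
import java.security.cert.X509Certificate;
import java.util.Map;
import java.util.Set;

// Illustrative sketch of the authorization flow shown on the slide; not Spitfire code.
public class AuthorizationSketch {
    private final Set<String> trustedCAs;            // "Trusted CAs"
    private final Set<String> revokedSerials;        // "Revoked certs repository"
    private final Map<String, String> roleToConnId;  // "Connection mappings"
    private final String defaultRole;                // from the "Role repository"

    public AuthorizationSketch(Set<String> trustedCAs, Set<String> revokedSerials,
                               Map<String, String> roleToConnId, String defaultRole) {
        this.trustedCAs = trustedCAs;
        this.revokedSerials = revokedSerials;
        this.roleToConnId = roleToConnId;
        this.defaultRole = defaultRole;
    }

    /** Returns a database connection id for this request, or throws if not authorized. */
    public String authorize(X509Certificate clientCert, String requestedRole) {
        // Is the certificate signed by a trusted CA?
        if (!trustedCAs.contains(clientCert.getIssuerX500Principal().getName())) {
            throw new SecurityException("certificate not signed by a trusted CA");
        }
        // Has the certificate been revoked?
        if (revokedSerials.contains(clientCert.getSerialNumber().toString())) {
            throw new SecurityException("certificate has been revoked");
        }
        // Does the user specify a role? If not, fall back to the default role.
        String role = (requestedRole != null) ? requestedRole : defaultRole;
        // Role ok? Map the role to a connection id.
        String connId = roleToConnId.get(role);
        if (connId == null) {
            throw new SecurityException("role not authorized: " + role);
        }
        return connId; // the Translator Servlet would use this to obtain a pooled connection
    }
}
```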

Optor – replica optimiser simulation
Simulates a prototype Grid: inputs are site policies and experiment data files.
Replication algorithm (sketched in code below):
–Files are always replicated to the local storage.
–If necessary, the oldest files are deleted.
–Even this basic replication algorithm significantly reduces network traffic and program running times.
New economics-based algorithms are under investigation.
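The slide only states the policy, so the following is a minimal sketch of that oldest-file-eviction behaviour, not OptorSim code; class names and the fixed file size are invented for illustration.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of the basic replication policy described on the slide:
// always replicate to local storage, deleting the oldest files when space runs out.
public class ReplicaCacheSketch {
    private final long capacityMB;
    private final long fileSizeMB;
    private long usedMB = 0;
    private final Deque<String> arrivalOrder = new ArrayDeque<>(); // oldest first
    private final Set<String> local = new HashSet<>();

    public ReplicaCacheSketch(long capacityMB, long fileSizeMB) {
        this.capacityMB = capacityMB;
        this.fileSizeMB = fileSizeMB;
    }

    /** Access a file: if not held locally, replicate it, evicting the oldest files if needed. */
    public boolean access(String fileName) {
        if (local.contains(fileName)) {
            return true; // local hit: no network traffic
        }
        // If necessary, delete the oldest files to make room.
        while (usedMB + fileSizeMB > capacityMB && !arrivalOrder.isEmpty()) {
            String oldest = arrivalOrder.pollFirst();
            local.remove(oldest);
            usedMB -= fileSizeMB;
        }
        // Always replicate the requested file to local storage.
        local.add(fileName);
        arrivalOrder.addLast(fileName);
        usedMB += fileSizeMB;
        return false; // remote fetch: counts as network traffic
    }
}
```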

Prototypes
Tools: Java Analysis Studio over TCP/IP
–Instantaneous CPU usage
–Scalable architecture
–Individual node info
(Screenshots: real world vs. simulated world)
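The slides do not show how the per-node figures reach the display; the sketch below is just one plausible way a node could push load readings over TCP (host, port and message format are invented for this example).

```java
import java.io.PrintWriter;
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import java.net.InetAddress;
import java.net.Socket;

// Purely illustrative: a node pushing load figures over TCP to a monitoring display.
public class NodeMonitorSketch {
    public static void main(String[] args) throws Exception {
        String host = args.length > 0 ? args[0] : "localhost";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 9090;
        String node = InetAddress.getLocalHost().getHostName();
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        try (Socket socket = new Socket(host, port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            while (true) {
                // System load average as a stand-in for "instantaneous CPU usage"
                // (may be -1 on platforms where it is unavailable).
                double load = os.getSystemLoadAverage();
                out.printf("%s load=%.2f%n", node, load);
                Thread.sleep(5000);
            }
        }
    }
}
```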

Glasgow Investment in Computing Infrastructure
–Long tradition; significant departmental investment
–£100,000 refurbishment (just completed)
–Long-term commitment (LHC era ~15 years)
–Strong System Management Team – underpinning role
–New Grid Data Management Group – fundamental to Grid development
–ATLAS/CDF/LHCb software
–Alliances with Glasgow Computing Science, Edinburgh, IBM

Summary (to be updated)
Grids are (already) becoming a reality
Mutual interest: the ScotGRID example
Glasgow emphasis on:
–DataGrid core development
–Grid data management
–CERN+UK lead
–Multidisciplinary approach
–University + regional basis
–Applications: ATLAS, CDF, LHCb
–Large distributed databases – a common problem and challenge (CDF, LHC, genes, proteins)
(Images: detector for the ALICE experiment, detector for the LHCb experiment, ScotGRID)