eScience and Particle Physics. Roger Barlow, eScience showcase, May 1st 2007

Particle physics yesterday: particle physics has pushed the computing envelope ever since the early days, though back then the envelope was smaller.

Particle physics today: we have millions of extremely complicated events to interpret.

Particle physics tomorrow: the LHC will give us trillions of much more complicated events to interpret.

What is our computing? The basic tasks are fairly simple: find tracks from a set of points, find particle momenta from track curvature, and so on. The data are well constrained and homogeneous. But we need to do this on a big scale, for hundreds of millions of events.
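As an aside, not from the original slides: the "momenta from track curvature" step is essentially the textbook relation p_T ≈ 0.3 B R, with B in tesla, R in metres and p_T in GeV/c. A minimal sketch with made-up numbers:

    # Illustrative sketch (not from the slides): transverse momentum from
    # track curvature in a solenoidal field, p_T [GeV/c] ~ 0.3 * B[T] * R[m].
    def pt_from_curvature(b_field_tesla, radius_m):
        """Transverse momentum in GeV/c for a unit-charge track."""
        return 0.3 * b_field_tesla * radius_m

    # Hypothetical example: a 2 T field and a 5 m radius of curvature.
    print(f"p_T ~ {pt_from_curvature(2.0, 5.0):.1f} GeV/c")   # p_T ~ 3.0 GeV/c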

Big is beautiful: if all the data from the LHC in a year were written to CDs, and the CDs put in a pile, that pile would be 20 km high. Commodity computing saved us in the 1990s, when PCs got cheap. But that is not enough.
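A rough back-of-envelope check of that figure, assuming roughly 10 petabytes of LHC data per year, ~700 MB per CD and ~1.2 mm per disc (all assumed values, not taken from the slides):

    # Back-of-envelope check with assumed figures (not from the slides).
    data_per_year_bytes = 10e15      # ~10 petabytes per year
    cd_capacity_bytes = 700e6        # ~700 MB per CD
    cd_thickness_m = 1.2e-3          # ~1.2 mm per disc in a pile

    n_cds = data_per_year_bytes / cd_capacity_bytes
    pile_height_km = n_cds * cd_thickness_m / 1000.0
    print(f"{n_cds:.1e} CDs, pile ~{pile_height_km:.0f} km high")
    # -> 1.4e+07 CDs, pile ~17 km high: the same order as the 20 km quoted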

The Grid: data analysis can only be achieved as a world-wide exercise using all available computing resources. This needs a Grid: the UK GridPP project, and EGEE across Europe.

What is a Grid?
Answer 1 (for journalists): the Grid is a virtual supercomputer.
Answer 2 (for politicians): computing coming out of a wall socket, just like electric power.
Answer 3 (for the rest of us): the Web is computers talking to computers, saying 'Give me a certain document'. The Grid is computers talking to computers; they can say anything.

Issues:
Many computers are available in computer centres, and many people want to use them.
How do the people know the computers are there, and what facilities they provide?
How do the computer centres know which users are allowed to do what, and that users are who they say they are?
The traditional solution, every user having an account and password at every centre, does not scale to large numbers of computers and users.

Solution – Grid certificates:
User details are signed, using the RSA algorithm, by a trusted Certificate Authority.
This proves that whoever presents the certificate is who they say they are (a modern analogue of the royal seal).
The user needs only one Grid certificate, giving access everywhere.
Led (in academia) by UK/European particle physicists.
Example subject: /C=UK /O=eScience /OU=Manchester /L=HEP /CN=roger john barlow, issued by the UK CA.
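For illustration only (not part of the original slides): this is roughly how a grid service might read the distinguished name out of a user certificate, here using the third-party Python 'cryptography' package and a hypothetical file name.

    # Illustrative sketch: read an X.509 user certificate and print its
    # subject and issuer distinguished names (file name is hypothetical).
    from cryptography import x509

    with open("usercert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    print(cert.subject.rfc4514_string())   # e.g. CN=...,OU=Manchester,O=eScience,C=UK
    print(cert.issuer.rfc4514_string())    # the issuing Certificate Authority (the UK CA)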

Solution – VOMS:
Users join 'Virtual Organisations' (VOs); centres negotiate with VOs for usage rights.
The UK VOMS system is run from Manchester (Physics for GridPP, Computing for NGS): 500 users, 15 VOs, and growing.
[Diagram: the VOMS system mediates between the user, the VO and the centre.]
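Purely as an illustration (the VO names and policy table below are invented, not the real GridPP/NGS agreements): the essence of VO-based authorisation is that a centre grants rights to a VO rather than to individual users, and checks the VO attribute presented with a job.

    # Invented example of a centre's per-VO policy table and lookup.
    SITE_POLICY = {
        "atlas":  {"max_jobs": 400, "storage_tb": 100},
        "babar":  {"max_jobs": 200, "storage_tb": 20},
        "biomed": {"max_jobs": 50,  "storage_tb": 5},
    }

    def authorise(vo_name):
        """Return the limits negotiated with the user's VO, or refuse."""
        policy = SITE_POLICY.get(vo_name)
        if policy is None:
            raise PermissionError(f"VO '{vo_name}' has no agreement with this centre")
        return policy

    print(authorise("atlas"))   # {'max_jobs': 400, 'storage_tb': 100}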

Solution – pool accounts:
Developed and implemented by Manchester physics.
Neither user nor centre wants individual accounts, but a single shared account ('griduser') would lead to jobs deleting each other's files.
So generic accounts are created (ATLAS001, ATLAS002, ...); each user is assigned one and can run jobs under it.
The account is linked to the certificate name, so there is an audit trail for antisocial behaviour.
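A minimal sketch of the pool-account idea (the account names and data structures are hypothetical, not the Manchester implementation): each new certificate DN is leased the next free generic account, and the DN-to-account mapping is kept as the audit trail.

    # Hypothetical pool-account mapper: lease one generic account per
    # certificate DN and keep the mapping as an audit trail.
    POOL = [f"ATLAS{n:03d}" for n in range(1, 101)]   # ATLAS001 ... ATLAS100
    dn_to_account = {}

    def map_user(dn):
        """Return the pool account leased to this certificate DN."""
        if dn in dn_to_account:                       # already mapped
            return dn_to_account[dn]
        used = set(dn_to_account.values())
        free = next((a for a in POOL if a not in used), None)
        if free is None:
            raise RuntimeError("pool of generic accounts exhausted")
        dn_to_account[dn] = free
        return free

    print(map_user("/C=UK/O=eScience/OU=Manchester/L=HEP/CN=roger john barlow"))
    # -> ATLAS001; the same DN always maps back to the same account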

Solution – GridSite:
GridSite was developed at Manchester to manage websites using grid credentials.
It is now also used to 'gridify' Web Services used on the grid.
The GridPP website is maintained at Manchester using GridSite.

Facility – Tier 2:
Dual 2.8 GHz Xeon nodes, ½ petabyte of storage, 10 Gbit/s network.
Bought as a faculty investment in eScience.
Housed in the Reynolds House machine room; managed through particle physics.

Results: working and delivering CPU cycles, with high uptime and reliability.

Who's using it?
Manchester LHC experiments
Non-Manchester LHC experiments
Manchester non-LHC experiments
Non-Manchester non-LHC experiments
Non-particle-physics users

Biomed: drug design for combating bird flu & other diseases.

ATLAS:
ATLAS trigger tests – large-scale software tests cannot be done at CERN, where only ~10 computers are available, so the tests run at Manchester (400 CPUs) instead.
ATLAS monitoring and calibration – the detectors need frequent calibrating so that raw signals are correctly converted into useful co-ordinates, but the computers at CERN are dedicated to the actual DAQ.
Solution: ship the raw data to Manchester, process it there, and ship the calibration data back. Successful large-scale tests show this is possible.
[Diagram: detector, trigger and DAQ feeding data storage at CERN, with calibration data shipped to Manchester (2000 CPUs) and monitored.]

BaBar:
An experiment running at the Stanford Linear Accelerator Center studying the difference between matter and antimatter.
Data selection: copy files from SLAC to Manchester (~5 TB), select ~200 different streams using different criteria, then ship the files of the separate streams back to SLAC, whose own computers are overloaded with other processing tasks.
We anticipate a direct financial return if this is successful (a common fund rebate to STFC).
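The stream-selection step can be pictured as routing each event into every stream whose criteria it passes; a toy sketch follows (the stream names and criteria are invented, and a real skim would have ~200 streams, not two):

    # Toy sketch of stream selection; names and criteria are invented.
    STREAMS = {
        "two_lepton":  lambda ev: ev["n_leptons"] >= 2,
        "high_energy": lambda ev: ev["energy_gev"] > 5.0,
    }

    def select_streams(event):
        """Return the list of streams this event should be written to."""
        return [name for name, passes in STREAMS.items() if passes(event)]

    print(select_streams({"n_leptons": 2, "energy_gev": 7.3}))
    # -> ['two_lepton', 'high_energy']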

Financial benefits (past):
BaBarGrid          R011454   PPARC   £138K
Testbed            R011857   PPARC   £10K
Grid Security      R011409   PPARC   £461K
EGEE               R013652   EU      £112K
Tier2 operations   R011411   PPARC   £311K

Financial benefits (future):
GridPP2+   PPARC   ~£200K
GridPP3    PPARC   £1.8M
EGEE3      EU      £112K
These grants have been applied for; we expect approval, though not at the full amount requested.

Summary:
Particle physics uses eScience, and eScience benefits from particle physics.
Manchester is leading in ideas and computer power, thanks to bright people and strong support.
The benefits come in international recognition and research income.
Long may it continue!