LHC Computing at RAL PPD
Dave Newbold, RAL PPD / University of Bristol
The LHC computing challenge – PPD and the Grid – Computing for physics – PPD added value

Audit 2007: 18th September

The LHC Computing Challenge
[Figure: interaction rates spanning many orders of magnitude – the SM Higgs signal vs all interactions in a single p-p beam crossing]
– Search for rare events against a huge statistical background
– LHC dataset ~ 5 Petabytes / yr
– A completely new computing regime
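The dataset scale quoted on the slide can be sanity-checked with simple arithmetic. The event size and recording rate below are illustrative assumptions, not figures from this talk (only the ~5 Petabytes/yr total appears on the slide):

```python
# Order-of-magnitude estimate of an LHC experiment's annual raw dataset.
# EVENT_SIZE_MB and TRIGGER_RATE_HZ are assumed, illustrative values.
EVENT_SIZE_MB = 1.5        # assumed size of one recorded event, in MB
TRIGGER_RATE_HZ = 200      # assumed events written to storage per second
SECONDS_PER_YEAR = 1e7     # a typical accelerator 'live' year

raw_pb_per_year = EVENT_SIZE_MB * TRIGGER_RATE_HZ * SECONDS_PER_YEAR / 1e9
print(f"raw data per experiment: ~{raw_pb_per_year:.1f} PB/yr")
```

With four experiments plus derived datasets, totals of a few Petabytes per year follow naturally from numbers of this order.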

Meeting the Challenge
Scale of the problem
– The largest scientific computing project in history – by far
– Outstrips the capabilities of any one institute, technically and politically
– Computing is now a worldwide activity, away from the accelerator lab
A new approach to computing
– The Grid: the key enabling technology for 21st-century HEP computing
– Makes large remote computers usable as 'generic utilities' – securely
– Coordinated through the CERN-sponsored LCG project
The computing system
– Built on a backbone of ~10 internationally recognised 'Tier-1 centres', which safeguard the raw data, process it on demand, and serve it to physicists
– The RAL Tier-1 caters for all four major LHC experiments (most do not)
– University groups (~15 in the UK) typically provide 'Tier-2 centres'

PPD Computing Strategy
Goals for LHC computing:
– Support the UK LHC programme with large-scale computing resources
– Develop and support the basic LHC computing infrastructure
– Maintain expertise, and experts, in modern computing and software
– Provide leadership and coordination for UK HEP computing
Practical deliverables:
– Tier-1 centre: supporting the international LHC community
– Local (Tier-2) computing resources for UK physics analysis
– Leadership and coordination of international Grid projects, including a technical role in Grid software development
– Leadership of LHC experiment computing and software projects
– Interface between the LHC experiments and computing resource providers, ensuring that generic resources are useful for physics
These goals are addressed most effectively by the UK national lab.

PPD in the GridPP Project
UK national Grid effort
– 2000: the UK HEP community proposes a collective computing project; PPD provided the initial stimulus and leadership for this effort
– The resulting GridPP project is now entering its third phase, with ~£70M of PPARC / EU funding over ten years
– GridPP is a leading partner in all areas of LHC computing
PPD role
– The leading contributor to the project from the start
– Project Management Board is 50% RAL (18 seats: 5 PPD, 4 e-Science)
– Chair of the Deployment Board (D. Kelsey)
– Chair and Associate Chair of the User Board (G. Patrick, D. Newbold)
– Grid Middleware coordinator (R. Middleton)
– Documentation officer (S. Burke)
PPD gives the project a stable, neutral focal point
– It brokers, and enables, the technical contributions of UK institutes

GridPP in Context
– PPD has a central role in every one of these areas

Grid Research: Highlights
International Grid middleware projects
– 'Grid machinery' is largely provided by EU-funded software projects
– PPD is the contact point for the European projects, GridPP, and the UK HEP community
– Working on concrete software deliverables, usable beyond HEP (S. Fisher et al): scalable monitoring tools, a vital component of a realistic Grid architecture
Security and policy
– The real 'machinery' behind the Grid is political: brokering and agreeing the sharing of 'foreign' resources worldwide
– PPD has led the construction and agreement of worldwide security policy, and also works on Grid security at a practical, hands-on level
Deployment
– The art and science of making a huge, dispersed system work
– A PPD history of 'young guns' with a strong international profile, whose expertise directly supports the LHC physics effort (S. Burke, S. Traylen)
PPD is a highly effective environment for (senior) computing specialists
– The lab's international reputation allows us to take an early role in new projects
– PPD can support and facilitate career development in specialist areas
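The 'scalable monitoring' idea mentioned above can be sketched in miniature: producers at each site publish rows describing their state, and consumers query across all sites with SQL. This is a toy in-memory model of the relational-monitoring concept, not the API of any real Grid monitoring service:

```python
# Toy sketch of relational Grid monitoring: sites publish status rows,
# consumers (e.g. a resource broker) query them declaratively.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE site_status (site TEXT, free_cpus INTEGER)")

# Producers at each site publish their current state (example values).
db.executemany("INSERT INTO site_status VALUES (?, ?)",
               [("RAL-Tier1", 800), ("Bristol-Tier2", 120)])

# A consumer asks a question spanning every site that has published.
rows = db.execute(
    "SELECT site FROM site_status WHERE free_cpus > 500").fetchall()
print(rows)
```

The appeal of the relational approach is exactly this decoupling: producers need not know which questions will be asked, and consumers need not know which sites exist.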

Expt. Computing: PPD Role
Physicists' view of the Grid
– "How can it help me extract results, make discoveries?"
– PPD's key role is to make the system useful for physics – pragmatically
The 'missing pieces'
– The Grid is a generic system; it must be adapted and customised for physics
– Requires interfaces, documentation, and physicist-oriented tools, e.g. physically meaningful cataloguing / description of data
– Needs people with expertise in both physics and modern computing: scientists working directly within the experiments
Computing for physics
– Tier-1 / 2 centres are a generic resource and need operators / support; senior ops / support staff are trained physicists (C. Brew, M. Bly)
– PPD provides the 'glue' that makes the centres work for physics
– The UK is confident that the RAL Tier-1 will deliver, come what may: demonstrated success of the (first) BaBar Tier-A centre (T. Adye, C. Brew)
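The 'physically meaningful cataloguing' point can be illustrated with a toy model: a logical file name (LFN) that a physicist understands maps to the physical replicas the Grid actually stores. The names and paths below are invented for illustration; this is not the interface of any real catalogue service:

```python
# Toy logical-file catalogue: one physicist-facing name, many replicas.
catalogue = {
    "lfn:/grid/cms/csa06/run1234/reco.root": [
        "srm://tier1.example.ac.uk/castor/cms/reco_1234.root",  # assumed
        "srm://tier0.example.ch/castor/cms/reco_1234.root",     # assumed
    ],
}

def replicas(lfn: str) -> list:
    """Return all known physical copies of a logical file."""
    return catalogue.get(lfn, [])

n = len(replicas("lfn:/grid/cms/csa06/run1234/reco.root"))
print(f"replicas found: {n}")
```

A physicist asks for data by its physics meaning (experiment, run, processing pass); the catalogue and job system decide which copy to read.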

Expt. Software: Highlights
Joint LHCb / ATLAS 'Grid user interface' tool – a "GridPP product"
– The UK coordinator is a PPD physicist (G. Patrick); >800 real users internationally
– This is how most people will do physics on a day-to-day basis
[Screenshot: the tool's interface – job details, logical folders, job monitoring, log window, job builder, scriptor]
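The 'job builder' idea behind such a user interface can be sketched as follows: the physicist describes a job once, and the tool renders it into the description language the Grid consumes. The class and method names here (GridJob, to_jdl) are hypothetical, invented for this sketch rather than taken from the real tool:

```python
# Minimal, hypothetical sketch of a Grid job builder: a high-level job
# description rendered into a simple JDL-style text for submission.
from dataclasses import dataclass, field

@dataclass
class GridJob:
    executable: str
    arguments: list = field(default_factory=list)
    input_data: list = field(default_factory=list)

    def to_jdl(self) -> str:
        """Render the job as a simple JDL-style description."""
        args = " ".join(self.arguments)
        lines = [f'Executable = "{self.executable}";',
                 f'Arguments = "{args}";']
        if self.input_data:
            files = ", ".join(f'"{f}"' for f in self.input_data)
            lines.append(f"InputData = {{{files}}};")
        return "\n".join(lines)

# Example job: names and paths are illustrative only.
job = GridJob("analysis.exe", ["--events", "1000"],
              ["lfn:/grid/lhcb/dst/run123.dst"])
print(job.to_jdl())
```

The value of the abstraction is that the same high-level description could be rendered for a local batch farm or the Grid without the physicist rewriting anything.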

Expt. Operations: Highlights
CMS CSA06 (Computing, Software, Analysis) – D. Newbold steered the computing model
– A non-stop, month-long test of realistic data flows and processing
– Not everything went according to plan (of course); RAL / PPD flexibility was tested

Summary
A wide portfolio of LHC computing activity
– Original, world-class research in new computing techniques
– Organisation and leadership of Grid research and computing for physics
– Practical delivery of computing resources, useful for physics
The PPD role
– A focal point for LHC computing, inside and outside the UK
– A welcome collaborator for universities, facilitating UK Grid research
PPD added value
– The contact point for physicists, e-Science specialists, and resource providers
– The interface to large-scale facilities, providing access for UK HEP
– A 'trusted partner' in international computing, representing the UK
– A critical mass of recognised computing / software expertise
The real work starts now!
– The PPD role will become ever more vital as we approach LHC start-up