LHC Computing – the 3rd Decade
Jamie Shiers, LHC OPN meeting, October 2010

The 3rd Decade of LHC Computing
We are now entering the 3rd decade of LHC Computing, marked by the successful use of the Worldwide LHC Computing Grid for extended data taking since the restart of the LHC at the end of March 2010 – see the July 2010 Economist article.
What are the issues and challenges of this decade, in particular from a network viewpoint?

Evolution
We heard at the Data Access and Management Jamboree in Amsterdam that modifications to the experiments' data models are likely.
In a nutshell: exploit the success of the network; more dynamic caching; more data- / network-oriented approaches (see the toy sketch below).
This might well be accompanied by increased network demands, at least for some Tier2s. (But you've heard this before…)
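A minimal, purely illustrative sketch of the "more dynamic caching" idea: a site pulls a file over the WAN on first access and serves it from a local cache thereafter, evicting the least recently used entries when the cache is full. The file names and cache capacity are hypothetical; this is not any experiment's actual data-management code.

```python
# Toy sketch of dynamic caching with least-recently-used eviction.
from collections import OrderedDict

class DynamicFileCache:
    def __init__(self, capacity: int = 2):
        self.capacity = capacity
        self._cache: "OrderedDict[str, bytes]" = OrderedDict()

    def _fetch_remote(self, lfn: str) -> bytes:
        # Stand-in for a WAN transfer from whichever site holds the file.
        print(f"WAN fetch: {lfn}")
        return b"<file contents>"

    def read(self, lfn: str) -> bytes:
        if lfn in self._cache:
            self._cache.move_to_end(lfn)          # mark as most recently used
            print(f"cache hit:  {lfn}")
            return self._cache[lfn]
        data = self._fetch_remote(lfn)
        self._cache[lfn] = data
        if len(self._cache) > self.capacity:      # evict least recently used
            evicted, _ = self._cache.popitem(last=False)
            print(f"evicted:    {evicted}")
        return data

# Hypothetical access pattern: repeated reads avoid repeated WAN transfers.
cache = DynamicFileCache(capacity=2)
for name in ("/lhc/run2010/file_A", "/lhc/run2010/file_B",
             "/lhc/run2010/file_A", "/lhc/run2010/file_C"):
    cache.read(name)
```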

The First Decade
Started at CHEP 1992 in Annecy, France, where significant focus was on the challenges of the SSC and LHC – plus increasing focus on "industry standards" versus HEP-specific solutions.
Led to several years of R&D – object-oriented analysis and design, object-oriented languages and databases – and production use towards the end of the decade.
Co-existed with wide-scale LEP exploitation and a revolution in the IT world: the Internet explosion, commodity PCs, the Web.
It ended with the elaboration of possible models for LHC Computing – the "MONARC proposal".

The MONARC model
The MONARC project tried to define a set of viable models for LHC computing.
It proposed a hierarchical model with a small number of regional centres at the national level plus a larger number of local centres – Universities or Institutes.
This model – consisting of a Tier0, roughly 10 Tier1s and some 100 Tier2s – is the basis of today's production environment.
N.B. MONARC foresaw optional airfreight as an alternative to costly and low-bandwidth networking (622 Mbit/s or less…) – a rough comparison follows the figure below.

The MONARC model vs today
[Figure: the MONARC hierarchy – desktops at the bottom, University/Institute centres (~n×10^6 MIPS, 100 TByte, tape robot), regional centres such as FNAL/BNL (200 TByte, robot) and CERN (2000 TByte, robot) at the top, interconnected by 622 Mbit/s links.]
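A back-of-the-envelope sketch of why airfreight stayed on the table against 622 Mbit/s links. The dataset sizes, the ~50% usable-throughput factor and the 48-hour shipping time below are illustrative assumptions, not figures from MONARC or this talk.

```python
# Back-of-the-envelope comparison: a 622 Mbit/s WAN link vs. shipping tapes.
# All dataset sizes, the efficiency factor and the shipping time are assumed.

LINK_MBPS = 622          # MONARC-era link speed, megabits per second
EFFICIENCY = 0.5         # assume ~50% usable throughput after overheads
SHIPPING_HOURS = 48      # assumed door-to-door airfreight time

def transfer_hours(dataset_tb: float) -> float:
    """Hours needed to move `dataset_tb` terabytes over the link."""
    bits = dataset_tb * 1e12 * 8                  # TB -> bits (decimal TB)
    seconds = bits / (LINK_MBPS * 1e6 * EFFICIENCY)
    return seconds / 3600

for size_tb in (1, 10, 100):                      # hypothetical dataset sizes
    hours = transfer_hours(size_tb)
    faster = "network" if hours < SHIPPING_HOURS else "airfreight"
    print(f"{size_tb:>4} TB: {hours:7.1f} h over the link "
          f"(vs. ~{SHIPPING_HOURS} h by air) -> {faster} wins")
```

Under these assumptions a 1 TB dataset moves in a few hours, but at the 100 TB scale the link needs roughly a month, which is why shipping media remained a serious alternative at the time.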

Enter the Grid
Around the turn of the millennium, cracks were beginning to appear in the solutions proposed by the various R&D projects – and adopted at the 100 TB–1 PB scale by experiments at several labs across the world – making major data and software migrations necessary.
At the same time, Ian Foster et al. were evangelizing a new model for distributed computing.
The HEP bit: CERN was the lead partner in a series of EU-funded projects, and (W)LCG was born.

The Second Decade
Several generations of grid R&D and deployment projects: in Europe, EDG followed by EGEE I, II and III (€100M of investment from the EU), plus partner projects in other areas of the world.
The 1st half of the decade included "data challenges" run by the experiments, testing components of their computing models and specific services.
The 2nd half: a series of "service challenges" that contributed to the ramp-up of the global service, to be ready well prior to planned data taking – strong focus on network issues, but also on end-to-end service: usability of the global system by the experiments.
Moving targets: computing models, experiment frameworks and underlying middleware and services all developed concurrently…

The Story So Far…
1990s: R&D, pilot strategies used in production
2000s: data & service challenges, production deployment & hardening
▶ 2010s: service – production exploitation, plus changes to reflect experience from production and evolution in the IT world

Service
IMHO it is inevitable that the models will become more network-centric.
This means that any concerns with the current situation risk being aggravated – and should be resolved as soon as possible.
A particular concern – for a long while – has been the response to and resolution of network problems.

Requirements
N.B. these are general Tx–Ty needs – not limited to those connections based on the OPN.
We need involvement of network experts early on in problem handling.
It is understood that these problems are often complex and involve multiple parties – it is precisely for this reason that "the network experts" should become involved early (knowledge and contacts).
We need ownership of these problems – someone, or a team, who will follow up until the issue is resolved and perform the appropriate post-mortem.

Summary
There are other talks in this meeting where monitoring, Tier2 support and other key issues will be addressed.
I did not see anything explicit regarding the above key service issues, which is why I chose to focus on them.
There has been a clear increase in network reliance over the past two decades of LHC computing.
To be successful in the 3rd, these chronic service issues need to be resolved. Period.