
A brief review of the conference Computing in High Energy and Nuclear Physics - CHEP 2009 (Prague, 21-27 March 2009). Andrey Y Shevel

CHEP 2009 program
- Online computing (28)
- Event Processing (36)
- Software Components, Tools and Databases (40)
- Hardware and Computing Fabrics (18)
- Grid Middleware and Networking Technologies (43)
- Distributed Processing and Analysis (35)
- Collaborative Tools (8)
- Posters (553)
Conference site: http://www.particle.cz/conferences/chep2009/
Registered participants: 738

Summary (for Online computing)
- Huge amount of computing in the online field
- All experiments have set up frameworks
- Huge software development effort in the online field
- Cosmic-ray runs proved that the online systems are working
- How to exploit multicore CPUs is still an open question

Event processing - 1

Event processing - 2

Software Components, Tools and Databases - 1

Software Components, Tools and Databases - 2

Hardware and Computing Fabrics - 1

Hardware and Computing Fabrics - 2
- Use of SSDs to speed up disk I/O (see the measurement sketch below)
- Use of virtual machines (Xen)
- Advanced storage technology in the ALICE DAQ (especially since all data are in ROOT format)
- SL4 supported until Oct 2010, but do not delay the move to SL5
- Lustre is the favourite distributed filesystem in HEP (evaluated at FNAL)
- Cooling, HPSS, HEP-SPEC2006
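As a minimal illustration of the SSD point above (my own sketch, not taken from any talk): one way to compare raw sequential-read throughput of an SSD against a conventional disk is simply to time a large read from a file stored on the device under test. The file path is a placeholder, and repeated runs are distorted by the OS page cache.

```python
# Minimal sketch: estimate sequential-read throughput of a storage device.
# The path below is a placeholder; point it at a large file that resides on
# the device (SSD or HDD) being measured. Repeated runs hit the OS page
# cache, so drop caches or use a fresh file between measurements.
import time

def read_throughput(path, block_size=8 * 1024 * 1024):
    """Read the whole file in large blocks and return the rate in MB/s."""
    total_bytes = 0
    start = time.time()
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            total_bytes += len(block)
    elapsed = time.time() - start
    return total_bytes / (1024 * 1024) / elapsed

if __name__ == "__main__":
    print("%.1f MB/s" % read_throughput("/data/testfile"))  # placeholder path
```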

Grid middleware and networking technologies - 1

Grid middleware and networking technologies - 2
- Interoperation (EGEE, NorduGrid, OSG): CRAB (CMS) via gLite to ARC (NorduGrid); PanDA (ATLAS) to ...
- Monitoring: network monitoring, job status monitoring
- Data management (PhEDEx, LFC, FTS, etc.)
- Security
- Clouds vs Grids; European Grid Initiative (EGI)
- IPv4 addresses until 2011? IPv6 is 'almost' unavoidable, but how to guarantee the transition? (a simple readiness check is sketched below)
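To make the IPv6 point concrete (my own sketch, not from the talks): a first-order readiness check is whether a host can open an IPv6 TCP connection to a given service at all. The target host below is only an example; any dual-stack server would do.

```python
# Minimal sketch: check whether this host can reach a service over IPv6,
# as a first readiness test during an IPv4 -> IPv6 transition.
import socket

def can_reach_over_ipv6(host, port, timeout=5.0):
    """Return True if at least one IPv6 address of 'host' accepts a TCP connection."""
    try:
        infos = socket.getaddrinfo(host, port, socket.AF_INET6, socket.SOCK_STREAM)
    except socket.gaierror:
        return False  # no AAAA record or no IPv6 name resolution available
    for family, socktype, proto, _canonname, sockaddr in infos:
        s = socket.socket(family, socktype, proto)
        s.settimeout(timeout)
        try:
            s.connect(sockaddr)
            return True
        except OSError:
            continue
        finally:
            s.close()
    return False

if __name__ == "__main__":
    # Example target only; replace with the service you care about.
    print(can_reach_over_ipv6("www.cern.ch", 80))
```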

Grid middleware and networking technologies - 3 - Shevel’s mark

Grid middleware and networking technologies - 4 - Shevel’s mark

Distributed Processing & Analysis 1
Main magic words: XRootD, Ganga, PanDA, PROOF, ARC, DIRAC, GAUDI

Distributed Processing & Analysis 2

Distributed Processing & Analysis 3

Distributed Processing & Analysis 4 - Shevel’s mark

Distributed Processing & Analysis 5

Distributed Processing & Analysis 6 - Shevel’s mark

Collaborative Tools 1
- CMS Centres Worldwide: a New Collaborative Infrastructure
- EVO (Enabling Virtual Organizations): High Definition Videoconferencing for High Energy Physics
- DIRAC Secure Web User Interface
- Indico Central – Events Organization, Ergonomics and Collaboration Tools

LHC authority opinion

Sergio Bertolucci conclusion

LCG Architecture

Main aim of the LCG

What are the Tiers?

How many countries, sites, etc.

CONCLUSION: my own opinion, based on personal discussions at the conference
- A lot of developers are involved in the Grid infrastructure.
- The Grids ... not all expectations have been met by the Grid; there are more problems than were expected at the beginning (around 2002).
- Does every physicist really need to use the Grid? It depends: if you cannot avoid using the Grid, then use it; if you can avoid it, do not use it. Be aware of the trade-offs!

Grid at PNPI
Conditions to keep Grid cluster (aka Tier 2/3) facilities at PNPI:
- Good network connectivity (I am afraid around 10 Gbit/s is mandatory). Remark: right now we really have less than 100 Mbit/s.
- A paid team who can help end users.
- A real need to use it on site, i.e. participation in a collaboration-scale computing activity.
Can one use the Grid if PNPI does not have Grid facilities on site? Yes, it is possible, with lower requirements on the network connectivity. Network connectivity is the main point (see the back-of-the-envelope estimate below).
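A back-of-the-envelope estimate of why the link speed dominates this discussion (my own illustration; the 10 TB dataset size and the 70% effective link efficiency are assumptions, not numbers from the slides):

```python
# Back-of-the-envelope transfer-time estimate for a dataset of a given size.
# Dataset size and link efficiency are illustrative assumptions.
def transfer_days(dataset_tb, link_mbit_s, efficiency=0.7):
    """Days needed to move dataset_tb terabytes over a link of link_mbit_s Mbit/s.

    'efficiency' roughly accounts for protocol overhead and shared use of the link.
    """
    bits = dataset_tb * 1e12 * 8                      # dataset size in bits
    seconds = bits / (link_mbit_s * 1e6 * efficiency)  # time at effective rate
    return seconds / 86400.0

if __name__ == "__main__":
    for mbit in (100, 10000):  # ~100 Mbit/s today vs the 10 Gbit/s target
        print("%6d Mbit/s -> %5.1f days for a 10 TB dataset"
              % (mbit, transfer_days(10, mbit)))
```

With these assumptions, moving 10 TB takes on the order of two weeks at 100 Mbit/s but well under a day at 10 Gbit/s, which is why the 10 Gbit/s requirement is hard to avoid for collaboration-scale computing.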

Spare slide 1

Spare slide 2: virtualization

Spare slide 3: Grids and/or (?) Clouds

Last spare slide: new technology - the Segway, an amazing vehicle