
Regional Computing Centre for Particle Physics, Institute of Physics AS CR (FZU): TIER2 of LCG (LHC Computing Grid)
M. Lokajicek, 29.8.2011, Dell Presentation

Outline
– Mission of the centre
– General infrastructure (electricity, cooling, network)
– Capacities and services
– People and financing
– Conclusion

Mission of the centre
Who are we?
– Regional Computing Centre for Particle Physics, Institute of Physics of the Academy of Sciences of the Czech Republic, Prague
– Basic research in particle physics, solid state physics and optics
What do we do?
– Computing support, in a grid environment, for big international particle physics, nuclear physics and astroparticle physics experiments: D0, ATLAS, ALICE, STAR, AUGER, Belle (CESNET); WLCG TIER2 centre
– Solid state physics computing
– From the computing point of view: high throughput computing (HTC), processing of large data samples, chaotic data analysis (by physicists), parallel computing
Who are our users?
– Collaborating scientists from institutes of the Academy of Sciences of the Czech Republic, Charles University, Czech Technical University and others
– Big experiments (via the grid environment), individual members of the international experiments, local physicists
CESNET (Czech research network provider)
– NGI (National Grid Initiative) within EGI (European Grid Initiative)
– Provides dedicated point-to-point network lines

Computing Centre Infrastructure
Network connections
– 1 Gb/s to the European GEANT2 network
– 10 Gb/s to CESNET
– CESNET optical end-to-end (E2E) lines: FNAL, TAIPEI, GRIDKA (FZK Karlsruhe), BNL; FZÚ, UK, ČVUT, ÚJF Řež
Total electric power
– For computers: 200 kVA UPS, engine generator 380 kVA
– Air cooling: 2x55 kW
– Water cooling in STULZ units: 2x88 kVA
– Total 290 kVA (N+1)
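(A rough consistency check of the cooling total; this is my arithmetic, not the slide's, and it assumes all four cooling units count toward the quoted total and that kW and kVA are comparable at a power factor near 1:)

$2 \times 55\,\mathrm{kW} + 2 \times 88\,\mathrm{kVA} = 286\,\mathrm{kVA} \approx 290\,\mathrm{kVA}$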

Contribution to Tier2s in ATLAS + ALICE
(Accounting portal: idsite/accounting/CESGA/country_view.html)
– Long-term slide in our relative share since 2006, when regular financing was unavailable
– From 2008, regular financing from grants (while all other Tier2s increased their capacity substantially)

RAW Capacities at FZU
[Table of per-experiment capacities (columns: HEPSPEC2006, %, TB disk, %; rows: D0, ATLAS, ALICE); most values were lost in transcription. Surviving fragments: "ATLAS (16 MFF) 77", "ALICE (60 Řež) 21".]

Center for Particle Physics in Prague
– Delivering substantial capacities to D0: the second largest contribution to the experiment among external laboratories; ATLAS 2%, ALICE 7% (very good numbers); plus other experiments
– AUGER VO created in Prague
– Support for Tier3 centres at Prague institutes and universities
– Today's capacity: 23 kHEPSPEC2006, … cores, 1.7 PB of disk space
– LCG capacities: … cores, 100 PB disk, >140 centres worldwide; Prague ranks among the larger centres
[Plot: running jobs for ATLAS, D0, ALICE and AUGER]
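(A back-of-the-envelope share using only the figures above, not stated on the slide: Prague's disk is $1.7\,\mathrm{PB}/100\,\mathrm{PB} = 1.7\%$ of the WLCG total, versus an even split of $100\%/140 \approx 0.7\%$ per centre, which is consistent with Prague ranking among the larger centres.)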

D0 Monte Carlo production and physics analysis
D0 MC 2010
– MC simulations in external laboratories: FZU produced 486M events, the 2nd largest share after IN2P3
– 1300 concurrent jobs (2M jobs/year), 55 TB sent to FNAL
Physics analysis
– PhD theses done in Prague
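(A plausibility check of these throughput figures; my own estimate, assuming the 1300 job slots were kept busy year-round:)

$\frac{1300 \times 8760\,\mathrm{h}}{2 \times 10^{6}\,\mathrm{jobs}} \approx 5.7\ \mathrm{h/job}$

That is an average job length of roughly six hours, a reasonable scale for MC production jobs.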

Observation of a Centrality-Dependent Dijet Asymmetry in Lead-Lead Collisions at √s_NN = 2.76 TeV with the ATLAS Detector at the LHC
Colleagues from Charles University (2 of the 4 main authors) performed the computing at our farm.

Operation, Financing
Operation
– The centre is operated by SAVT (the computing department of FZU), together with other computing services for FZU
– Our services are run by 3-4 FTE
– A contact person from each experiment stays in close contact with the operations staff
Financing
– FZU: construction, power infrastructure (UPS, generator) and cooling, 20 MCZK in 2003/4; operation and electricity
– HEP research grants: computing capacities; regular financing from 2008 (MoU with LCG), 6 MCZK per year

Conclusion
– Long-term activity, started in 2001 with the European grid project EDG
– Most personnel are IT professionals
– Successful operation for the experiments: institute support for farm construction and operation; grants for computing capacities
– Visible grid and non-grid computing services for HEP experiments and for collaborating Czech institutes and physicists
– Substantial support to our physicists and students competing in physics analyses

10:00 Introductions/Welcome: Prague Regional Computing Centre for Particle Physics
10:15-10:45 Jan Švec: FZU Computing Centre
10:45-12:30 Roger Goff:
– HPC Technology
– Future x86-64 Processor Technology
– Green HPC
– Co-processor Technology
– Storage Technology (SSDs, LHC file system solutions)
12:30-13:00 Visit of the Centre