Computing development projects – GRIDS
M. Turala, The Henryk Niewodniczanski Institute of Nuclear Physics PAN and the Academic Computing Center Cyfronet AGH, Kraków
Warszawa, 25 February 2005

Outline:
- computing requirements of the future HEP experiments
- HEP world-wide computing models and related grid projects
- Polish computing projects: PIONIER and GRIDS
- Polish participation in the LHC Computing Grid (LCG) project

LHC data rate and filtering – data preselection in real time:
- many different physics processes
- several levels of filtering
- high efficiency for events of interest
- total reduction factor of about 10^7
Trigger cascade: 40 MHz (1000 TB/sec equivalent) -> Level 1, special hardware -> 75 kHz (75 GB/sec, fully digitised) -> Level 2, embedded processors/farm -> 5 kHz (5 GB/sec) -> Level 3, farm of commodity CPUs -> 100 Hz (100 MB/sec) -> Data Recording & Offline Analysis
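As a quick cross-check of the quoted reduction factor, the short Python sketch below (my own illustration, not part of the original slides; the rates are the ones from the cascade above) divides the data rate at the detector output by the rate written to storage and recovers the factor of about 10^7.

```python
# Sanity check of the trigger-cascade reduction factor quoted above.
# The data rates are taken directly from the slide.

TB_S, GB_S, MB_S = 1e12, 1e9, 1e6  # bytes per second

cascade = [
    ("detector output (40 MHz equivalent)",          1000 * TB_S),
    ("after Level 1 - special hardware (75 kHz)",      75 * GB_S),
    ("after Level 2 - embedded processors (5 kHz)",     5 * GB_S),
    ("after Level 3 - commodity CPU farm (100 Hz)",   100 * MB_S),
]

for stage, rate in cascade:
    print(f"{stage:48s} {rate:10.2e} B/s")

reduction = cascade[0][1] / cascade[-1][1]
print(f"total reduction factor: {reduction:.0e}")  # -> 1e+07, as on the slide
```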

Data rate for LHC p-p events – typical parameters:
- nominal interaction rate: ~10^9 events/s (design luminosity 10^34 /cm^2s, collision rate 40 MHz)
- registration rate: ~100 events/s (270 events/s)
- event size: ~1 MByte/event (2 MByte/event)
- running time: ~10^7 s/year
- raw data volume: ~2 PetaByte/year/experiment
- Monte Carlo: ~1 PetaByte/year/experiment
The rate and volume of HEP data doubles every 12 months! Already today the BaBar, Belle, CDF and D0 experiments produce 1 TB/day.
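The ~2 PB/year figure follows from multiplying the registration rate, the event size and the running time; the snippet below is a small illustrative calculation (mine, not from the talk) that reproduces the order of magnitude for both the nominal and the bracketed parameter values.

```python
# Back-of-the-envelope check of the yearly raw data volume per experiment.

def raw_volume_pb(rate_hz, event_size_mb, running_time_s=1e7):
    """Yearly raw data volume in petabytes."""
    return rate_hz * event_size_mb * 1e6 * running_time_s / 1e15

print(raw_volume_pb(100, 1.0))  # ~1 PB/year with the nominal figures
print(raw_volume_pb(270, 1.0))  # ~2.7 PB/year with the higher registration rate
print(raw_volume_pb(100, 2.0))  # ~2 PB/year with the larger event size
```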

Data analysis scheme (data-flow diagram for one experiment, from M. Delfino): Detector -> Event Filter (selection & reconstruction) -> Raw data -> Event Reconstruction -> Event Summary Data -> Batch Physics Analysis -> Processed Data -> Interactive Data Analysis by thousands of scientists, with Event Simulation feeding in as well. Indicative figures from the diagram: data rates from ~100-200 MB/sec up to 64 GB/sec, and 0.1 to 1 GB/sec for analysis; data volumes of 1 PB/year, 500 TB and 200 TB/year at successive stages; CPU capacities of 35K, 250K and 350K SI95.

Multi-tier model of data analysis

LHC computing model (cloud): the LHC Computing Centre at CERN in the middle, surrounded by national Tier 1 centres (Germany, USA – FermiLab, USA – Brookhaven, UK, France, Italy, NL, ...), Tier 2 centres, laboratories and universities (Lab a, Uni a, ...), down to physics department and desktop resources.
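As a reading aid, here is a minimal sketch (my own, with the tier members copied from the slide) of the hierarchy the cloud model describes, from the CERN centre down to department desktops.

```python
# Illustrative outline of the multi-tier LHC computing model shown above.

lhc_computing_model = {
    "Tier 0": ["CERN - The LHC Computing Centre"],
    "Tier 1": ["Germany", "USA (FermiLab)", "USA (Brookhaven)",
               "UK", "France", "Italy", "NL"],
    "Tier 2": ["regional centres serving laboratories and universities"],
    "Tier 3+": ["physics department clusters and desktops"],
}

for tier, members in lhc_computing_model.items():
    print(f"{tier}: {', '.join(members)}")
```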

ICFA Network Task Force (1998): required network bandwidth (Mbps) – a 100–1000x bandwidth increase foreseen. See the ICFA-NTF Requirements Report.

LHC computing – specifications for Tier 0 and Tier 1 (table): per-experiment figures for ALICE, ATLAS, CMS and LHCb covering CPU (kSI95), disk pool (TB), automated tape (TB), shelf tape (TB), tape I/O (MB/s) and cost (MCHF) for the CERN Tier 0, and the corresponding quantities, the number of Tier 1 centres and the average cost per centre for the Tier 1s.

Development of Grid projects

EU FP5 Grid Projects (EU funding: 58 M), from M. Lemke at CGW04:
- Infrastructure: DataTag
- Computing: EuroGrid, DataGrid, Damien
- Tools and Middleware: GridLab, GRIP
- Applications: EGSO, CrossGrid, BioGrid, FlowGrid, Moses, COG, GEMSS, Grace, Mammogrid, OpenMolGrid, Selene
- P2P / ASP / Webservices: P2People, ASP-BP, GRIA, MMAPS, GRASP, GRIP, WEBSI
- Clustering: GridStart

Strong Polish participation in FP5 Grid research projects (from M. Lemke at CGW04):
- 2 Polish-led projects (out of 12): CrossGrid (CYFRONET Cracow, ICM Warsaw, PSNC Poznan, INP Cracow, INS Warsaw) and GridLab (PSNC Poznan)
- significant share of funding going to Poland versus EU25: 9.96% of FP5 IST Grid research funding and 5% of the wider FP5 IST Grid project funding, against 3.8% of GDP and 8.8% of population
(Map of CrossGrid partners.)

CrossGrid testbeds: 16 sites in 10 countries, with about 200 processors and 4 TB of disk storage (site map). Testbeds for development, production, testing, tutorials and external users. Middleware: from EDG 1.2 to LCG. Last week CrossGrid successfully concluded its final review.

CrossGrid applications:
- Medical: blood flow simulation, supporting vascular surgeons in the treatment of arteriosclerosis
- Flood: flood prediction and simulation based on weather forecasts and geographical data
- Physics: distributed data mining in high energy physics, supporting the LHC collider experiments at CERN
- Meteo/Pollution: large-scale weather forecasting combined with air pollution modelling (for various pollutants)

Grid for real-time data filtering: studies of a possible use of remote computing farms for event filtering; in 2004, beam test data was shipped from CERN to Cracow and back, in real time.

LHC Computing Grid project (LCG)
Objectives: design, prototyping and implementation of the computing environment for the LHC experiments (Monte Carlo production, reconstruction and data analysis), covering infrastructure, middleware and operations (virtual organisations).
Schedule: phase 1 (2002-2005, ~50 MCHF) – R&D and prototyping (up to 30% of the final size); phase 2 (2006-2008) – preparation of a Technical Design Report and Memoranda of Understanding, deployment (2007).
Coordination: the Grid Deployment Board – representatives of the world HEP community – supervises LCG grid deployment and testing.

Computing resources – Dec. 2004 (from F. Gagliardi at CGW04): in EGEE-0 (LCG-2), 91 sites, >9000 CPUs and ~5 PB of storage (map of countries providing resources and countries anticipating joining EGEE/LCG). Three Polish institutions are involved – ACC Cyfronet Cracow, ICM Warsaw and PSNC Poznan – with Polish investment in the local infrastructure and EGEE supporting the operations.

Polish participation in the LCG project – Polish Tier 2
INP / ACC Cyfronet Cracow:
- resources (plans for 2004): 128 processors (50%); storage: disk ~10 TB, tape (UniTree) ~10 TB (?)
- manpower: engineers/physicists, ~1 FTE + 2 FTE (EGEE)
- ATLAS data challenges – qualified in 2002
INS / ICM Warsaw:
- resources (plans for 2004): 128 processors (50%); storage: disk ~10 TB, tape ~10 TB
- manpower: engineers/physicists, ~1 FTE + 2 FTE (EGEE)
- connected to the LCG-1 world-wide testbed in September 2003

Polish networking – PIONIER (from the PSNC report to ICFA ICIC, February, by M. Przybylski):
- 5200 km of fibre installed, connecting 21 MAN centres; multi-lambda connections planned
- good connectivity of HEP centres to the MANs: IFJ PAN to MAN Cracow – 100 Mb/s -> 1 Gb/s; INS to MAN Warsaw – 155 Mb/s
- external links towards GEANT, Stockholm and Prague (network map)
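To put these link speeds into perspective, here is a small illustrative calculation (my own, assuming ideal throughput with no protocol overhead) of how long moving 1 TB of data would take over each of the quoted links.

```python
# Transfer time for 1 TB over the link speeds quoted above (idealised).

def transfer_hours(volume_tb, link_mbps):
    bits = volume_tb * 1e12 * 8              # data volume in bits
    return bits / (link_mbps * 1e6) / 3600   # seconds -> hours

for link_mbps in (100, 155, 1000):           # Mb/s, as on the slide
    print(f"{link_mbps:5d} Mb/s: {transfer_hours(1, link_mbps):5.1f} h per TB")
```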

PC Linux cluster at ACC Cyfronet (CrossGrid – LCG-1):
- 4 nodes (1U): 2x PIII 1 GHz, 512 MB RAM, 40 GB HDD, 2x Fast Ethernet 100 Mb/s
- 23 nodes (1U): 2x Xeon 2.4 GHz, 1 GB RAM, 40 GB HDD, Ethernet 100 Mb/s + 1 Gb/s
- HP ProCurve switch: 40 ports at 100 Mb/s, 1 port at 1 Gb/s (uplink to the Internet)
- monitoring: 1U unit with KVM (keyboard, touch pad, LCD)
Last year 40 nodes with IA-64 processors were added; for 2005, investments of 140 Linux 32-bit processors and additional disk storage (TB) are planned.
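A trivial tally of the node list above (my own sketch, using only figures from the slide) gives the overall size of the original cluster before the IA-64 extension.

```python
# Tally of the ACC Cyfronet PC Linux cluster described above.

node_groups = [
    {"nodes": 4,  "cpus_per_node": 2, "cpu": "PIII 1 GHz",   "ram_gb_per_node": 0.5},
    {"nodes": 23, "cpus_per_node": 2, "cpu": "Xeon 2.4 GHz", "ram_gb_per_node": 1.0},
]

total_cpus = sum(g["nodes"] * g["cpus_per_node"] for g in node_groups)
total_ram  = sum(g["nodes"] * g["ram_gb_per_node"] for g in node_groups)
print(f"{total_cpus} CPUs and {total_ram:.0f} GB of RAM in total")  # 54 CPUs, 25 GB
```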

ACC Cyfronet in LCG-1. Sept. 2003: sites taking part in the initial LCG service (shown as red dots on a world map, including Kraków, Poland and Karlsruhe, Germany). This was the very first really running global computing and data grid, covering participants on three continents: small test clusters at 14 institutions and a grid middleware package (mainly parts of EDG and VDT) formed a global grid testbed (from K-P. Mickel at CGW03).

Linux cluster at INS/ICM (CrossGrid – EGEE – LCG), from K. Nawrocki.
Present state – cluster at the Warsaw University (Physics Department):
- Worker Nodes: 10 CPUs (Athlon 1.7 GHz)
- Storage Element: ~0.5 TB
- Network: 155 Mb/s
- LCG 2.3.0, registered in the LCG Test Zone
Near future (to be ready in June 2005) – cluster at the Warsaw University (ICM):
- Worker Nodes: 64-bit CPUs
- Storage Element: ~9 TB
- Network: 1 Gb/s (PIONIER)

PC Linux cluster at ACC Cyfronet (CrossGrid – EGEE – LCG-1): LCG cluster usage statistics for 2004 – CPU time and walltime, given in hours and in seconds, for the ATLAS, ALICE and LHCb experiments.

ATLAS Data Challenge status (from L. Robertson at C-RRB, Oct. 2004):
- ATLAS DC2: ~1350 kSI2k-months, ~120,000 jobs, ~10 million events fully simulated (Geant4), ~27 TB
- DC2 Phase I started at the beginning of July and is finishing now
- 3 grids were used: LCG (~70 sites, up to 7600 CPUs), NorduGrid (22 sites, ~3280 CPUs (800), ~14 TB) and Grid3 (28 sites, ~2000 CPUs)
- share of the production: LCG 41%, NorduGrid 30%, Grid3 29%
- all 3 grids have been proven usable for a real production

Polish LHC Tier 2 – future (from the report to the LCG GDB, 2004). In response to the LCG MoU draft document, and using data from the PASTA report, plans for the Polish Tier 2 infrastructure have been prepared; they are summarized in a table covering CPU (kSI2000), disk for LHC (TBytes), tape for LHC (TBytes), WAN bandwidth (Mbit/s) and manpower (FTE). It is planned that in the next few years the LCG resources will grow incrementally, mainly through local investments; a step is expected around 2007, when the matter of LHC computing funding should finally be resolved.

Thank you for your attention