1  ATLAS-VALENCIA COORDINATION MEETING, 9th May 2008
José Salt: Tier-2 Report
1. Present Status of the ES-ATLAS-T2
2. Computing-Analysis Issues: Overview

2  Resources: Present Status and Growth Profile
Ramp-up of Tier-2 resources (after the LHC rescheduling); numbers are cumulative.
Evolution of all ATLAS Tier-2 resources according to the estimates made by the ATLAS CB (October 2006): a strong increase of resources, with the Spanish ATLAS Tier-2 assuming a 5% contribution to the whole effort.
Present resources of the Spanish ATLAS Tier-2 (May 2008):
              IFAE   UAM   IFIC   TOTAL
CPU (kSI2k)     60   144    132     336
Disk (TB)       30    38     36     104

3  Resources: Present Status and Growth Profile (cont.)
              IFAE   UAM   IFIC   T-2
CPU (kSI2k)    170    75    306   556
Disk (TB)       64    60    162   286
Foreseen for 2008 / pledged:
              IFAE      UAM       IFIC      T-2
CPU (kSI2k)   230/219   219/219   438/438   887/876
Disk (TB)      94/97     98/97    198/194   390/388
New acquisitions are in progress to reach the pledged resources:
– IFAE: CPU equipment already at PIC; disk: 2 SUN Thumpers
– IFIC: CPU and disk in the purchase phase
– UAM: CPU and disk in the purchase phase
Accounting values are normalized according to the WLCG recommendations.

4  Human Resources of the Spanish ATLAS Tier-2
UAM:
– José del Peso (PL)
– Juanjo Pardo (*)
– Luís Muñoz
– Pablo Fernández (*T.MEC)
IFAE:
– Andreu Pacheco (PL)
– Jordi Nadal (*)
– Carlos Borrego (*)
– Marc Campos
– Hegoi Garaitogandia (NIKHEF)
IFIC:
– José Salt (PL and Coordinator)
– Javier Sánchez (Tier-2 Operation Coordination)
– Santiago González (Tier-3)
– Alvaro Fernández (EGEE)
– Mohammed Kaci (*T.MEC)
– Luis March (*)
– Farida Fassi (FR Tier-1)
– Gabriel Amorós (EGEE)
– Alejandro Lamas (*)
– Roger Vives (Ph.D. student)
– Elena Oliver (Ph.D. student)
– Miguel Villaplana (Ph.D. student)
Total effort: 14 FTEs
(*) means paid by the Tier-2 project

5  Upgrade of the IFIC Computing Room (new area)
– Increase the surface from 90 m² to 150 m²: done
– Upgrade the UPS from 50 kVA to 250 kVA: to be delivered
– Install 70 lines of 16 A (3 lines per rack): pending
– Increase the power available to the building (20 kV electrical transformer, diesel generator, low-voltage distribution, ...): public tendering in progress
– Change the air conditioning (air impulsion through the technical floor): waiting for the economic proposal
– New racks: buying process under way
– Redistribution of all machines located at the Computer Center: pending
Execution is in progress and the room will be ready by Summer 2008.

6  End-User Interaction Scheme
[Diagram: Tier-2 and Tier-3 activities as seen by end users: Infrastructure Maintenance, Production of Simulated Data (public and private), User Support, Data Management, Distributed Analysis, Interactive Analysis, Local Batch Facility]

7  MC Production
MC production in 2008:
– Since January 2008, ATLAS has migrated to the PanDA executor.
– Production efficiency has benefited: the average efficiency at T2-ES was 50% and is now 75-80%.
– The T2-ES contribution was 1.2% in the first quarter of 2008 (essentially due to two downtimes of PIC).
MC production in 2006 and 2007:
– Production in T2-ES followed the same trend as LCG/EGEE overall (good performance of the ES-ATLAS-T2).
– The average ES-ATLAS-T2 contribution to the total data production in LCG/EGEE was 2.8% in 2006-2007 (note that about 250 centres/institutes participate, 10 of them Tier-1s).

8  ATLAS Grid Computing User Support @ IFIC
Activity:
– To organize the ATLAS user support in Grid computing (site user support started at the end of 2006).
– To solve questions, doubts and getting-started issues.
– To provide periodic tutorials and hands-on sessions on software tools for analysis, simulation, etc.
– Maintenance of the related Twiki: https://twiki.ific.uv.es/twiki/bin/view/Atlas/AtlasTier2
Since 17 April 2008: transfer of responsibility (Farida Fassi, Mohammed Kaci, Elena Oliver, Miguel Villaplana).

9  Data Movement Management (DMM)
Distributed data management exercises at the Tier-2, cosmic-ray data taking (Tier-1 => Tier-2):
1) End of Sept. 2007: M4 exercise (ESD & CBNT)
2) End of Oct. to beginning of Nov. 2007: M5 exercise (ESD & CBNT)
3) 12 Feb. 2008: FDR-1 (fdr08_run1, AOD)
4) 25 Feb. 2008: CCRC08 (ccrc08_t0, AOD)
[Plots: DDM activity at the Tier-2 in February 2008, showing data transfer (GB/day) and the number of completed file transfers per day, with the CCRC08, M5 and IFIC test periods marked]

10  Data Movement Management (DMM)
Data stored at the Spanish ATLAS Tier-2 (plots: data at IFIC, UAM and IFAE):
Tier-2      Total data (TB)   AODs (TB)   AOD contribution
IFAE              3.6             0.9          25%
UAM              11.4             4.0          35%
IFICDISK         22.7             9.8          43%
More info: http://ific.uv.es/atlas-t2-es
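As a quick illustration of what the "AOD contribution" column means, the sketch below (Python, used purely as a calculator; the figures are taken from the table above) recomputes it as the AOD volume divided by the total volume stored at each site:

    # Recompute the "AOD contribution" column from the table above, assuming it
    # is the fraction of each site's stored volume that consists of AODs.
    data_tb = {                 # site: (total data in TB, AODs in TB)
        "IFAE": (3.6, 0.9),
        "UAM": (11.4, 4.0),
        "IFICDISK": (22.7, 9.8),
    }
    for site, (total, aod) in data_tb.items():
        print(f"{site}: {aod / total:.0%} of the stored data are AODs")
    # Prints roughly 25%, 35% and 43%, matching the table.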

11  Requested Data @ IFIC in 2007-2008
2007-02-28, Salvador Marti (silicon):
misal1_csc11.005145.PythiaZmumu.digit.RDO.v12003103_tid003850
2007-02-28, Salvador Marti (silicon):
misal1_csc11.005001.pythia_minbias.digit.RDO.v12003104_tid004278
2007-03-20, Salvador Marti (silicon):
misal1_csc11.005200.T1_McAtNlo_Jimmy.digit.RDO.v12003104_tid004279
misal1_csc11.005200.T1_McAtNlo_Jimmy.digit.RDO.v12003101_tid003494
2007-10-03, Susana Cabrera (Tier-3):
trig1_misal1_mc12.005200.T1_McAtNlo_Jimmy.recon.AOD.v12000601_tid005997
2007-10-27, Salvador Marti (silicon):
misal1_csc11.005300.PythiaH130zz4l.digit.RDO.v12003103_tid004087
2007-12-12, Susana Cabrera (Tier-3):
trig1_misal1_csc11.005009.J0_pythia_jetjet.recon.AOD.v12000604_tid010586
trig1_misal1_csc11.005010.J1_pythia_jetjet.recon.AOD.v12000601_tid006001
trig1_misal1_mc12.005011.J2_pythia_jetjet.recon.AOD.v12000604_tid008266
trig1_misal1_mc12.005012.J3_pythia_jetjet.recon.AOD.v12000604_tid008274
trig1_misal1_csc11.005013.J4_pythia_jetjet.recon.AOD.v12000605_tid009524
trig1_misal1_mc12.005014.J5_pythia_jetjet.recon.AOD.v12000604_tid008270
trig1_misal1_mc12.005015.J6_pythia_jetjet.recon.AOD.v12000604_tid008264
trig1_misal1_csc11.005016.J7_pythia_jetjet.recon.AOD.v12000601_tid006005
2008-03-28, Susana Cabrera (Tier-3):
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.005204.TTbar_FullHad_McAtNlo_Jimmyv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.005500.AcerMC_Wtv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.005501.AcerMC_schanv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.005502.AcerMC_tchanv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008240.AlpgenJimmyWenuNp2_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008241.AlpgenJimmyWenuNp3_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008242.AlpgenJimmyWenuNp4_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008243.AlpgenJimmyWenuNp5_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008244.AlpgenJimmyWmunuNp2_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008245.AlpgenJimmyWmunuNp3_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008246.AlpgenJimmyWmunuNp4_pt20_filt3jetv12000601.001

12  Requested Data @ IFIC in 2007-2008 (cont.)
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008247.AlpgenJimmyWmunuNp5_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008248.AlpgenJimmyWtaunuNp2_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008249.AlpgenJimmyWtaunuNp3_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008250.AlpgenJimmyWtaunuNp4_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_mc12.008251.AlpgenJimmyWtaunuNp5_pt20_filt3jetv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_csc11.005281.AcerMCbbWv12000601.001
user.top.TopView121302_MuidTau1p3p.trig1_misal1_csc11.005987.WZ_Herwigv12000601.001
trig1_misal1_mc12.008132.AlpgenJimmyZeeNp2LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008133.AlpgenJimmyZeeNp3LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008134.AlpgenJimmyZeeNp4LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008135.AlpgenJimmyZeeNp5LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008144.AlpgenJimmyZmumuNp2LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008145.AlpgenJimmyZmumuNp3LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008146.AlpgenJimmyZmumuNp4LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008147.AlpgenJimmyZmumuNp5LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008156.AlpgenJimmyZtautauNp2LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008157.AlpgenJimmyZtautauNp3LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008158.AlpgenJimmyZtautauNp4LooseCut.recon.AOD.v12000601
trig1_misal1_mc12.008159.AlpgenJimmyZtautauNp5LooseCut.recon.AOD.v12000601
2008-05-05, Susana Cabrera (Tier-3):
trig1_misal1_mc12.006281.AlpgenJimmyWbbNp1.recon.AOD.v12000605
trig1_misal1_mc12.006282.AlpgenJimmyWbbNp2.recon.AOD.v12000605
trig1_misal1_mc12.006283.AlpgenJimmyWbbNp3.recon.AOD.v12000605
trig1_misal1_mc12.005500.AcerMC_Wt.merge.AOD.v12000605_tid010295
trig1_misal1_mc12.005501.AcerMC_schan.merge.AOD.v12000605
trig1_misal1_mc12.005502.AcerMC_tchan.merge.AOD.v12000605

13  Networking and Data Transfer
Network connectivity is provided by the Spanish NREN, RedIRIS:
– Connection at 1 Gbps to the university backbone
– 10 Gbps among the RedIRIS PoPs in Valencia, Madrid and Catalonia
ATLAS collaboration:
– More than 9 PB (over 10 million files) transferred among the Tiers in the last 6 months
– The required ATLAS link rate between the Tier-1 and its Tier-2s is 50 MB/s (400 Mbps) in a real data-taking scenario
Data transfer between the Spanish Tier-1 and Tier-2: 250 Mbps reached using GridFTP from T1 to T2 (CCRC'08).
Data transfer from CASTOR (IFIC) for a TICAL private production: a plateau of 720 Mbps was sustained for about 20 hours (4 March 2008). High rates are possible.
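The quoted rates can be cross-checked with a small illustrative calculation (Python used as a calculator; the only assumption is the usual conversion 1 MByte/s = 8 Mbps, all other numbers are taken from the slide):

    # Cross-check of the transfer rates quoted above (1 MByte/s = 8 Mbps).
    required_mbytes_per_s = 50                    # ATLAS T1 -> T2 requirement
    required_mbps = required_mbytes_per_s * 8     # 400 Mbps, as stated above
    print(f"Required T1 -> T2 rate: {required_mbps} Mbps")

    achieved_mbps = {
        "CCRC'08 T1 -> T2 (GridFTP)": 250,
        "CASTOR (IFIC) private production": 720,  # ~20 h plateau
    }
    for label, mbps in achieved_mbps.items():
        print(f"{label}: {mbps} Mbps ({mbps / required_mbps:.0%} of the requirement)")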

14  Analysis Use Case #1
AANT ntuple generation from a given dataset to analyse events in semileptonic channels for a top-polarization study (Roger):
– Challenge: process 595,550 events
– Distributed analysis is needed
– GANGA and TopView are used
– Several sites involved: IFIC (mainly), Lyon, FZK, CYF
– Good performance of IFIC

15  Analysis Use Case #2
Analysis of AODs for the process Z_H -> t tbar (Little Higgs); physics analysis done by Miguel and Elena:
– 4 ROOT files x 5000 events generated locally
– 30 AOD files x 250 reconstructed events on the Grid (20,000 events)
– The whole process takes 5 days per AOD file
– Strategy: test with a small number of events; once problems are fixed, jobs are sent to Tier-2s in 12 countries
– Time saved: 400 days running locally vs. 14 days using the Grid (see the sketch below)
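A rough illustration of the gain quoted in the last bullet (the 400-day and 14-day totals are taken directly from the slide, not re-derived):

    # Approximate speedup of the Grid-based analysis over local running,
    # using the wall-clock totals quoted on the slide.
    local_days = 400    # running everything locally
    grid_days = 14      # the same work spread over Tier-2s in 12 countries
    print(f"Speedup: about {local_days / grid_days:.0f}x")   # roughly 29x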

16  Last Grid-use report @ WLCG Workshop

17  Tier-2 Availability & Reliability Report
Monthly reports for January, February and March 2008:
– Very good availability and reliability percentages; well placed in the ranking (about 60 Tier-2s in the list)

18  FDR & CCRC'08
[Plot: weekly throughput, reaching 2.1 GB/s out of CERN]

19  ADC Operations Shifts
Three people from the Spanish Tier-2 (out of 25 in total) contribute to the ATLAS Distributed Computing operations shifts:
– L. March (IFIC), J. Nadal (IFAE) and M. Campos (IFAE)
Shifts involve different activities:
– MC production, DDM services, Distributed Analysis and Tier-0
These coordinated shifts started at the end of January 2008 and have been official ATLAS computing shifts since then.

20  ADC Operations Shifts (cont.)
– 2 shifters per day: one from the USA/Canada and one from Europe.
– Together the shifts cover 15 hours per day across the USA/Canada and Europe time zones; shifts in Europe run from 9 am to 5 pm.
– Coverage will be increased to 24 hours per day with shifters from Asia and the Far East.
More info: https://twiki.cern.ch/twiki/bin/view/Atlas/ADCoS

21  Tier-1 Re-processing
L. March has agreed to act as the liaison between ATLAS offline software commissioning and the Tier-1s for the re-processing of cosmic-ray data.
– Started during the last ATLAS Software Week (15-29 Feb. 2008).
– Work and responsibilities are under study.
More info:
https://twiki.cern.ch/twiki/bin/view/Atlas/T1Reprocessing
https://twiki.cern.ch/twiki/bin/view/Atlas/AtlasOfflineCommissioning
https://twiki.cern.ch/twiki/bin/view/Atlas/CosmicCommissioningReconstructionStatus

22  Computing-Analysis Issues
To identify the datasets and data streams:
– Updates due to the start-up at 10 TeV
– Early physics (at CERN)
User space on the Grid:
– Under discussion in the ATLAS computing community
– The Tier-2 is a collaboration infrastructure, so user disk space will be constrained by other concurrent activities (MC production)
– To put IFIC local resources in place: discuss the funding model across the different ATLAS projects (IFIC)
To extract a few analysis 'workflows' so that user support does not absorb too much effort.
To identify, among the different analyses, which level of priority should be assigned to the resources.

23  Computing-Analysis Issues (cont.)
User support:
– To provide problem/solution statistics
– To establish limits on the kinds of problems to be solved
Computing-Physics Analysis Board:
– Needed to follow up the open issues and action lists between meetings
– Composition: 2-3 people from computing plus 2-3 from analysis
– To establish coordination with the UAM and IFAE groups

24  BACKUP SLIDES

25  Data Movement Management (DMM)
Data management is one of the main activities of the Spanish federated ATLAS Tier-2. AOD physics data are stored at these geographically distributed sites.
Main data management activities:
– Data monitoring at the federated Tier-2: http://ific.uv.es/atlas-t2-es
– Cleaning of old and unnecessary data
– Consistency checks and clean-up (catalogue entries and/or file-size mismatches); a sketch of this check follows below
– Data replication to the Tier-2 for Tier-2 users (using the DQ2 client and subscriptions)
Coordination with the Tier-1:
– Bi-weekly Tier-1/Tier-2 coordination meetings where Tier-1/Tier-2 data management issues are discussed
– A collaboration project between Tier-1 and Tier-2 to monitor some ATLAS/Grid services at both sites
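The consistency check mentioned above can be pictured with a minimal sketch: compare the file sizes registered in a catalogue dump against what is actually on disk. This is only an illustration under assumed inputs (the dump format, file layout and paths are hypothetical); the real procedure relied on the DQ2 catalogues and the site storage system.

    # Illustrative consistency check: catalogue entries vs. files on disk.
    import os

    def load_catalogue(dump_path):
        """Read 'logical_file_name size_in_bytes' lines into a dict."""
        entries = {}
        with open(dump_path) as dump:
            for line in dump:
                if not line.strip():
                    continue
                lfn, size = line.split()
                entries[lfn] = int(size)
        return entries

    def find_inconsistencies(catalogue, storage_root):
        """Return files that are missing on disk or whose size disagrees."""
        problems = []
        for lfn, expected_size in catalogue.items():
            path = os.path.join(storage_root, lfn)
            if not os.path.exists(path):
                problems.append((lfn, "missing on disk"))
            elif os.path.getsize(path) != expected_size:
                problems.append((lfn, "size mismatch"))
        return problems

    if __name__ == "__main__":
        catalogue = load_catalogue("catalogue_dump.txt")       # hypothetical dump file
        for lfn, problem in find_inconsistencies(catalogue, "/storage/atlas"):
            print(lfn, problem)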

26  Project Management (PM)
Main objective: to achieve a more federative structure.
Organization and management of the project is the responsibility of the subproject leaders (SLs):
– The SLs constitute the Project Management Board (PMB): A. Pacheco (IFAE), J. del Peso (UAM) and J. Salt (IFIC)
– The PMB chairman is the coordinator of the project: J. Salt (Tier-2 Coordinator)
– PMB meetings every 4-5 weeks
Tier-2 operation meetings: virtual, biweekly.
Face-to-face Tier-2 meetings every 6 months:
– February 2006: Valencia
– October 2006: Madrid
– May 2007: Barcelona
– November 2007: Valencia, with a prior T1-T2 coordination meeting
– May 2008: foreseen in Madrid

27  Distributed Analysis System (DAS)
Spanish physicists are doing physics analysis with GANGA on data distributed around the world; the tool is installed in our Tier-2 infrastructure.
GANGA is an easy-to-use front-end for job definition, management and submission. Users interact with it via:
– Its own Python shell (command line)
– A graphical user interface (GUI)
In our case the jobs are sent to the LCG/EGEE Grid flavour. We are running performance tests with:
– The LCG Resource Broker (RB)
– The gLite Workload Management System (WMS), the new RB from EGEE
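For illustration, a job definition in the GANGA Python shell of that period might look roughly like the sketch below. The class and attribute names (Athena, DQ2Dataset, DQ2JobSplitter, LCG) follow the GangaAtlas conventions of the time but are quoted from memory and may differ between versions; the job-options file name is hypothetical and the input dataset is one of those listed on slide 11.

    # Rough sketch of an analysis job in the GANGA Python shell
    # (GangaAtlas-style names; exact attributes may vary between versions).
    j = Job()
    j.name = 'aod-analysis-test'
    j.application = Athena()                      # run an Athena analysis
    j.application.option_file = 'AnalysisSkeleton_jobOptions.py'   # hypothetical job options
    j.inputdata = DQ2Dataset()                    # read an ATLAS DQ2 dataset
    j.inputdata.dataset = 'trig1_misal1_mc12.005200.T1_McAtNlo_Jimmy.recon.AOD.v12000601_tid005997'
    j.splitter = DQ2JobSplitter(numsubjobs=30)    # split the job over the input files
    j.backend = LCG()                             # submit to the LCG/EGEE Grid
    j.submit()

From the same shell the job and its subjobs can then be monitored and their output retrieved once they complete.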

28  Data Management
More info: http://gridui02.usatlas.bnl.gov:25880/server/pandamon/query?mode=listM5_Tier1

29  Data Movement Management (DMM)
Results from FDR-1 (12 Feb. 2008), per Tier-2 site (datasets and total files in the datasets):
IFAE: 2652
IFICDISK: 4487
UAM: 2652
More info: http://gridui02.usatlas.bnl.gov:25880/server/pandamon/query?mode=listFDR#PIC

