Physicists' experience of the EGEE/LCG infrastructure usage for CMS job submission. Natalia Ilina (ITEP Moscow), NEC'2007.

Outline
- What do physicists need to know about the grid?
- Overview of existing frameworks for job submission in CMS
- Specifics of grid tutorials for physicists

What do physicists need to know about the grid?
1. Basic terminology: UI, CE, SE, WN, RB, VO…
- UI (User Interface): the machine where the user's certificate is installed. From the UI the user can be authenticated to use Grid resources, submit a job for execution on a CE, list the resources suitable for executing a job, replicate and copy files, cancel jobs, retrieve the output and show the status of submitted jobs.
- CE (Computing Element): a Grid batch queue, identified by a pair of the form <hostname:port>/<batch queue name>.
- SE (Storage Element): provides uniform access and services to large storage spaces. The SE may control large disk arrays, mass storage systems (MSS) and the like.
2. Operations with the user certificate (see the example commands below).
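A minimal sketch of the certificate operations in point 2, assuming a standard gLite/LCG UI with the VOMS client tools installed (exact options may differ between sites):

  voms-proxy-init -voms cms    # create a short-lived proxy with CMS VO membership
  voms-proxy-info -all         # inspect the proxy: identity, VO attributes, time left
  voms-proxy-destroy           # remove the proxy when finished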

3. The full chain of job submission on the grid (illustrated with CRAB commands below):
- find information about the needed data
- write the configuration file describing the task
- organise the most convenient way of saving the output
- possibly recognise a "bad" CE in order to save time
- submit the jobs
- check their status
- kill and resubmit jobs if necessary
- retrieve and analyse the output
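As a sketch of how this chain looks in practice with CRAB (command names as in the CRAB documentation of that period; the job ranges are illustrative):

  crab -create        # read the configuration file and create the task
  crab -submit        # submit all created jobs to the grid
  crab -status        # check the status of the submitted jobs
  crab -kill 3,5      # kill selected jobs, e.g. those stuck on a "bad" CE
  crab -resubmit 3,5  # resubmit the killed or failed jobs
  crab -getoutput     # retrieve the output files of finished jobs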

DBS data discovery page for CMS (very useful!)
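As an aside on what DBS returns: a CMS dataset is referred to by a path of the form

  /<PrimaryDataset>/<ProcessedDataset>/<DataTier>

and it is this path, for the DY_mumu_10 sample used here, that is carried into the ASAP and CRAB configuration files shown on the following slides.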

CMS frameworks for job submission: ASAP and CRAB (from a physicist's point of view)
The aim is to simplify the creation and submission of analysis jobs to the grid environment. The purpose is to allow users without specific knowledge of the grid infrastructure to access and analyse remote data as easily as in a local environment, hiding the complexity of the distributed computational services.
Available tools:
- ASAP (ARDA Support for CMS Analysis Processing)
- CRAB (CMS Remote Analysis Builder), the more popular one among CMS physicists

Example of a configuration file for ASAP:

# specify directory to store tasks
jobdir = /afs/cern.ch/user/n/nilina/ASAP
# store output at SE
store_output = 1
output_se = srm.cern.ch
output_se_path = /castor/cern.ch/user/n/nilina
# specify grid to submit to (lcg, my)
grid = lcg
# specify dataset
dbs_version = 2
primary_dataset = DY_mumu_10
tier = GEN-SIM-DIGI-RECO
processed_dataset = CMSSW_1_3_1-Spring
# pset file (CMSSW configuration file)
pset_file = Z2muons.cfg
output_files = MUONS.dat
events_required = 5000
events_per_job = 1000
# specify minimum time requirements for the job
min_wall_clock_time = 100
min_cpu_time = 100

Example of a configuration file for CRAB:

[CRAB]
jobtype = cmssw
scheduler = glitecoll

[CMSSW]
datasetpath = /DY_mumu_10/CMSSW_1_3_1-Spring/GEN-SIM-DIGI-RECO
pset = test.cfg
total_number_of_events =
events_per_job = 1000
output_file = MUONS.dat
use_dbs_2 = 1

[USER]
copy_data = 1
storage_element = srm.cern.ch
storage_path = /castor/cern.ch/user/n/nilina/Z2tau
return_data = 1
use_central_bossDB = 0
use_boss_rt = 0

[EDG]
rb = CERN
proxy_server = myproxy.cern.ch
virtual_organization = cms
retry_count = 0
lcg_catalog_type = lfc
lfc_host = lfc-cms-test.cern.ch
lfc_home = /grid/cms
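Assuming this configuration is saved as crab.cfg (the filename is only illustrative; it is also CRAB's default), the task would then be created and submitted with:

  crab -create -cfg crab.cfg
  crab -submit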

CRAB: ready-made configuration files are already available

ASAP_Monitor: a very nice feature. The user can delegate responsibility for tasks to the ASAP server, which processes and monitors the user's jobs and performs resubmissions in case of failure.

Grid tutorials for physicists: our experience
We already have first experience from training courses in Russia (JINR, ITEP).
Programme of the tutorial at ITEP (April 2007):
Lecture part:
1. "Short introduction to LCG/EGEE"
2. "CMS user jobs submission with the usage of ASAP"
Practical part:
"CMS user jobs submission with the usage of ASAP"

Specifics of grid tutorials for physicists
- Physicists do not need to know all the details of the LCG infrastructure, only the main points.
- Physicists need to get their results (output files) in the easiest way and in a short time.
- All practical steps of job submission should be clarified for the participants, from data finding to result retrieval.
- The tutorial should have a practical part, to give the participants their first hands-on experience (so participants should already have their grid certificates in advance!)

Summary
- The LHC start-up is in the near future, so physicists should be able to do their analysis with the Grid.
- Physicists do not need to understand the details of the LCG infrastructure (in contrast to developers), but the full chain of job submission should be made clear to them.
- There are tools in CMS (CRAB, ASAP) which hide this complexity from users.
- The corresponding tutorials for physicists are needed.