Using the Grid for the ILC: Mokka and Marlin on the Grid. ILC Software Meeting, Cambridge 2006.

Presentation transcript:

Using the Grid for the ILC: Mokka and Marlin on the Grid
Dennis Martsch, ILC Software Meeting, Cambridge, 4 April 2006

Motivation
- MC simulations are essential for analysis tool development and optimization studies.
- A "divide and conquer" strategy speeds up computing.
- Combining multiple computing environments into one array of computer centers is called a Grid.
- In contrast to a batch farm, a Grid is NOT a regional computing facility.
- All activities on a Grid go through a "Grid middleware", so different systems can be combined behind a common interface.
- The resulting computing capability is more powerful than any single batch system.

How things work

Mokka & Marlin
Mokka:
- Outbound connectivity required (for the DB)
- G4 datasets on the SE
- Output: tarball with the slcio, steering and log files
Marlin:
- Needed libraries in a tarball on the SE
- Output: reconstructed slcio file; tarball with the log and steering files
Common to both:
- Binaries and input files are on the SE
- Steering files are generated locally and sent at job submission
- Easy to get other software working on the Grid (a hypothetical job description is sketched below)
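To make the submission step concrete: with the LCG-2 middleware of that era, such a job would typically be described by a small JDL file and submitted from a User Interface machine. The sketch below is illustrative only; the file names (mokka.jdl, runscript.sh, mokka.steer) are hypothetical and not taken from the talk.

    Executable          = "runscript.sh";
    Arguments           = "mokka.steer";
    StdOutput           = "stdout.log";
    StdError            = "stderr.log";
    InputSandbox        = {"runscript.sh", "mokka.steer"};
    OutputSandbox       = {"stdout.log", "stderr.log"};
    VirtualOrganisation = "ilc";

It would then be submitted and tracked with the standard LCG-2 commands:

    edg-job-submit -o jobids mokka.jdl
    edg-job-status -i jobids
    edg-job-get-output -i jobids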

Runscript
- Display some info about the WN
- Copy all necessary files from the SE onto the WN
- Set the environment
- Run the binary (either Mokka or Marlin)
- Put the output files on the SE
(A minimal sketch of such a script follows.)
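The talk does not show the script itself, so the following is only a minimal sketch of a runscript implementing these five steps. All LFNs and file names are hypothetical; lcg-cp and lcg-cr are the standard LCG data-management commands of the period.

    #!/bin/sh
    # 1. Display some info about the worker node
    hostname; uname -a; df -h .

    # 2. Copy the software tarball (binary + libraries) from the SE
    lcg-cp --vo ilc lfn:/grid/ilc/sw/mokka.tar.gz file:$PWD/mokka.tar.gz
    tar xzf mokka.tar.gz

    # 3. Set the environment
    export LD_LIBRARY_PATH=$PWD/lib:$LD_LIBRARY_PATH

    # 4. Run the binary with the steering file shipped in the input sandbox
    ./Mokka mokka.steer

    # 5. Pack the output and register the tarball on the SE
    tar czf output.tar.gz *.slcio *.steer *.log
    lcg-cr --vo ilc -l lfn:/grid/ilc/mc/output.tar.gz file:$PWD/output.tar.gz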

Status (Mokka)
- Mokka was successfully tested:
  - 10k Z0-pole events took ~8 hours (10 jobs of 1000 events each)
  - 1k ttbar events took ~9 hours (10 jobs of 100 events each)
- Events for 8 different detector configurations are on Grid storage.
- The files are available via our online database at ilcsoft.desy.de.

Status (Marlin)
- Marlin was also successfully tested:
  - 10k Z-pole events took ~8 hours (1 job)
- Reconstructed events for all simulated events are on Grid storage.
- There are files for all the different reconstruction chains.
- Not yet available via our database.

Available data
- Simulated data for:
  - e+e- -> ttbar (3000 evts)
  - e+e- -> W+W- (3000 evts)
  - e+e- -> cb (3000 evts)
  - e+e- -> dus (3000 evts)
  - e+e- -> Zh(120) (3000 evts)
- Available at 500 GeV and 1 TeV
- Additionally there are files at the Z pole (10k evts)
- 8 different detector configurations

How to access the data
- Requirements:
  - access to a machine with the LCG software package installed, or an SL installation with access to AFS (see grid.desy.de)
  - a valid grid certificate for the VO "ilc"
  - an LFN from the MC database
(A proxy example is sketched below.)
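Before any of the data-management commands will work, a Grid proxy for the VO has to be created. On an LCG installation of that period this would look roughly as follows (whether grid-proxy-init or voms-proxy-init is used depends on the middleware version installed):

    # create a VOMS proxy for the ilc VO
    voms-proxy-init --voms ilc
    # inspect the proxy and its remaining lifetime
    voms-proxy-info --all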

How to access the data (cont.)
- Browse our database at ilcsoft.desy.de.
- Get the LFNs for the files you're interested in (a replica-listing example is sketched below).
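Given an LFN from the database, the registered replicas can also be listed before copying anything; the LFN below is a made-up placeholder:

    lcg-lr --vo ilc lfn:/grid/ilc/mc/ttbar_500gev/files_001.tar.gz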

How to access the data (cont.)
- Use the LCG commands to retrieve files from storage:
  lcg-cp --vo ilc lfn:<LFN from DB> file:<absolute path>
(A worked example follows.)
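A concrete invocation might look like this (the LFN and the local path are made-up placeholders):

    lcg-cp --vo ilc lfn:/grid/ilc/mc/ttbar_500gev/files_001.tar.gz \
           file:/home/user/data/files_001.tar.gz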

Summary & Outlook
- Using the Grid is a promising way to do simulation and reconstruction (fast, reliable and easy to use).
- Mokka & Marlin are working fine on the Grid.
- ~500 GB of data were simulated at the beginning of 2006; reconstruction of these data in 2 different chains was done quite fast.
- Visit http://grid.desy.de for more information.