JIM Deployment for the CDF Experiment

M. Burgon-Lyon 1, A. Baranowski 2, V. Bartsch 3, S. Belforte 4, G. Garzoglio 2, R. Herber 2, R. Illingworth 2, R. Kennedy 2, U. Kerzel 5, A. Kreymer 2, M. Leslie 3, L. Loebel-Carpenter 2, A. Lyon 2, W. Merritt 2, F. Ratnikov 6, R. St. Denis 1, A. Sill 7, S. Stonjek 2,3, I. Terekhov 2, J. Trumbo 2, S. Veseli 2, S. White 2

1 University of Glasgow, 2 Fermi National Accelerator Laboratory, 3 University of Oxford, 4 Istituto Nazionale di Fisica Nucleare, 5 Universität Karlsruhe, 6 Rutgers University, 7 Texas Tech University

[Diagram: the CDF Grid. Jobs go from the JIM client to a JIM submission site; the JIM broker dispatches each job to one of several execution sites, each running JIM execution and monitoring services alongside a SAM station and, at most sites, a DCAF. All SAM stations are served by the central SAM database at FNAL.]

Introduction

The CDF (Collider Detector at Fermilab) experiment is in the process of distributing its computing infrastructure to numerous sites worldwide. The initial target of 25% of computer processing offsite by June 2004 has been achieved. JIM deployment will help achieve the second milestone of 50%, an estimated 4.5 THz, by June. The software used for this task comprises a mature data handling system called SAM (Sequential Access to data via Metadata); DCAF (Decentralised CDF Analysis Farm) for local job queuing and execution; and JIM (Job and Information Management), used to collect and distribute jobs to SAM stations and DCAF farms.

Components of CDF Grid

The diagram above shows how the elements of the CDF Grid fit together. Users currently submit jobs from their terminals to DCAF, which uses SAM to transfer files. Once JIM is fully deployed, users will be encouraged to submit their jobs through the JIM client software, though the old interface may still be used for JIM submissions. The JIM client passes the job to a submission site for queuing. After communicating with the broker, the submission site sends the job to an execution site, which may host a DCAF. The job is then executed, using SAM to transfer files and DCAF or the local batch system (e.g. PBS) to run it. A schematic sketch of this flow follows.
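To make that flow concrete, here is a toy model in Python. It is illustrative only: JIM's real interfaces are not described on this poster, and every name below, including the least-loaded brokering rule, is invented for this sketch.

    from dataclasses import dataclass

    # All names here are invented for illustration; they are not JIM's API.

    @dataclass
    class Job:
        user: str
        executable: str
        dataset: str      # resolved to actual files by SAM at the execution site

    @dataclass
    class ExecutionSite:
        name: str
        has_dcaf: bool    # sites without a DCAF fall back to a local batch system
        load: float       # fraction of the farm currently busy

    def broker_select(sites: list[ExecutionSite]) -> ExecutionSite:
        """Schematic stand-in for the JIM broker: pick the least-loaded site."""
        return min(sites, key=lambda s: s.load)

    def submit(job: Job, sites: list[ExecutionSite]) -> str:
        """Client -> submission site -> broker -> execution site."""
        site = broker_select(sites)
        batch = "DCAF" if site.has_dcaf else "local batch system (e.g. PBS)"
        return f"job for {job.user} queued at {site.name} via {batch}"

    print(submit(Job("alice", "cdfsim", "mc-dataset-01"),
                 [ExecutionSite("FNAL", True, 0.9),
                  ExecutionSite("Oxford", True, 0.4),
                  ExecutionSite("Karlsruhe", False, 0.2)]))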
Monte Carlo Production

Earlier this year the JIM development team focused on Monte Carlo (MC) production for the D0 experiment; the D0 success rate for MC is now over 99%. A script that builds a tarball from the CDF software environment has been used to run CDF MC on D0 computing facilities at Wisconsin, first manually and then as a JIM submission. The packaging step is sketched below.
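A minimal sketch of that packaging step, assuming a hypothetical environment root and archive name (the actual CDF script, its paths and its options are not given on this poster):

    import tarfile
    from pathlib import Path

    # Both paths are assumptions made for this sketch.
    ENV_ROOT = Path("/home/cdfsoft/releases/5.3.1")  # assumed environment root
    TARBALL = Path("cdf-env-5.3.1.tar.gz")           # assumed archive name

    def pack_environment(env_root: Path, tarball: Path) -> None:
        """Bundle the environment so a worker node needs no pre-installed CDF code."""
        with tarfile.open(tarball, "w:gz") as tar:
            tar.add(env_root, arcname=env_root.name)

    def unpack_environment(tarball: Path, dest: Path) -> None:
        """Run at the execution site before the job starts."""
        with tarfile.open(tarball, "r:gz") as tar:
            tar.extractall(path=dest)

    if __name__ == "__main__":
        pack_environment(ENV_ROOT, TARBALL)
        unpack_environment(TARBALL, Path("scratch"))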
Thus the CDF software environment can be transferred around the grid, preventing problems with differing code versions at execution sites and ensuring that shared resources can be used fully for CDF jobs without application version issues.

Simplification of the JIM installation and upgrade procedure

SAM station installation has been vastly simplified by the creation of a script. The once lengthy and difficult process can now be completed within a couple of hours, largely unattended. Simplifying the installation procedure was a crucial step in allowing the quick roll-out of SAM, a critical element of the CDF Grid software set-up. A similar script is under development to provide the same ease of installation for JIM, coupled with efforts by the developers to reduce product tailoring requirements. A new product that installs and tailors many of the JIM components has also been developed.

Challenges and future work

The most challenging step has been the tailoring of local batch systems. Individual execution sites have been tailored successfully with expert help; however, the process is not yet easy enough to reproduce for widespread deployment. Investigations into Grid3 are underway, and the possibility of using a combination of JIM and Grid3 components is under consideration.

JIM Web Pages

[Screen shots: job submissions to the CDF Oxford JIM site over the past two weeks; the main JIM monitoring page; a section from the JIM installation manual.]

The job monitoring pages enable users to download the output of their completed jobs using a web browser. For problem resolution these pages are used in conjunction with SAM TV, a web monitoring tool that displays information on each file, project and site used by the SAM data handling system.
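As a sketch of what browser-based output retrieval amounts to programmatically (the monitoring endpoint and job identifier below are invented; the poster gives neither):

    import urllib.request

    # Endpoint and job id are assumptions made for this sketch; the poster
    # only states that completed job output is downloadable via a browser.
    MONITOR_URL = "http://jim-monitor.example.org/jobs"  # assumed endpoint
    JOB_ID = "cdf_12345"                                 # assumed job id

    def fetch_job_output(job_id: str, dest: str) -> None:
        """Save the archived output of a completed job to a local file."""
        url = f"{MONITOR_URL}/{job_id}/output.tar.gz"
        with urllib.request.urlopen(url) as response, open(dest, "wb") as out:
            out.write(response.read())

    if __name__ == "__main__":
        fetch_job_output(JOB_ID, "job_output.tar.gz")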