CCF, the French Computing Center: software work at CCF for PHENIX. Frédéric Fleuret (LLR), July 2002.

Presentation transcript:

CCF: the French Computing Center
Software work at the French Computing Center (CCF)
PHENIX Collaboration Meeting, July 2002
Frédéric Fleuret, LLR - Ecole Polytechnique

French Computing Center (IN2P3)
- IN2P3 = Institut National de Physique Nucléaire et de Physique des Particules, an institute of the CNRS (~26,000 people, scientists; roughly comparable to the American NSF).
- IN2P3 itself: ~3,300 people, 1,700 permanent scientists, organised in 18 laboratories plus 1 Computing Center (~45 people).
- The IN2P3 Computing Centre was created to meet the needs of these laboratories. It must be capable of making a very large amount of data (several hundred terabytes) readily available in a secure manner, distributed between disks and tapes managed by robots. Computing power must be rapidly extensible and data access must be as transparent as possible.

French Computing Center (IN2P3)
Computing power: Linux machine status (12/01)
- 12 PC III Linux machines, 2 processors/machine (500 MHz), 256 MB memory
- 96 PC III Linux machines, 2 processors/machine (750 MHz), 1024 MB memory
- 93 PC III Linux machines, 2 processors/machine (1000 MHz), 1024 MB memory
Today: ~10,000 SpecInt95
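A quick tally of the farm described above, sketched in Python; the machine counts, clock speeds and memory sizes are copied from the slide, so this is just arithmetic, not an official inventory:

```python
# Back-of-the-envelope tally of the CCF Linux farm listed above.
# Machine counts, CPUs per machine, clock speeds and memory sizes are
# copied from the slide; this is not an official inventory.

farm = [
    # (machines, processors per machine, clock in MHz, memory per machine in MB)
    (12, 2,  500,  256),
    (96, 2,  750, 1024),
    (93, 2, 1000, 1024),
]

total_machines = sum(m for m, _, _, _ in farm)
total_cpus     = sum(m * p for m, p, _, _ in farm)
total_mem_gb   = sum(m * mem for m, _, _, mem in farm) / 1024.0

print(f"machines: {total_machines}, CPUs: {total_cpus}, memory: {total_mem_gb:.0f} GB")
# -> machines: 201, CPUs: 402, memory: 192 GB
```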

French Computing Center (IN2P3)
File storage:
- HPSS: cartridges, 6 cartridge-mounting silos, capacity 720 TB on line.
File transfer:
- bbftp is software designed to quickly transfer files across a wide area network. It was written for the BaBar experiment to transfer big files (more than 2 GB) between SLAC (California) and the IN2P3 Computing Center (Lyon, France).
(Figure labels from the slide: 1.3 TB, 7 TB, 50 GB cartridges, 400 cartridges.)
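As an illustration of how such transfers might be scripted, here is a minimal Python sketch driving bbftp from the command line. The server host name, the HPSS destination path and the exact bbftp options used here (-u, -p, -e) are assumptions for illustration and should be checked against the local bbftp documentation:

```python
# Minimal sketch of scripting bulk bbftp transfers towards HPSS at CCF.
# The host name, HPSS path and bbftp options (-u, -p, -e) are assumptions
# for illustration; check them against the local bbftp manual.
import subprocess

REMOTE_HOST = "ccbbftp.in2p3.fr"             # hypothetical bbftp server
HPSS_DIR = "/hpss/in2p3.fr/group/phenix"     # hypothetical HPSS directory

def bbftp_put(local_file, user, streams=5):
    """Push one file to HPSS through bbftp (sketch only)."""
    cmd = [
        "bbftp",
        "-u", user,                          # remote account
        "-p", str(streams),                  # number of parallel streams
        "-e", f"put {local_file} {HPSS_DIR}/{local_file}",
        REMOTE_HOST,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for segment in ["udst_seg_0001.root", "udst_seg_0002.root"]:  # example names
        bbftp_put(segment, user="phenix")
```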

CCF HPSS storage (July 02)
PHENIX: low usage so far...

  Group      Space (GB)
  --------   ----------
  Agape             373
  Aleph             155
  Alice           1,547
  AMS               558
  Archeops          156
  Atlas           2,503
  Babar          98,498
  Clas            1,116
  CMSF            2,472
  D0              8,119
  Delphi              0
  Eros            2,034
  Heral             477
  Indra               0
  LHCb            2,020
  Nusol             367
  Pauger         19,008
  Phenix            205
  Snovae            192
  Virgo           3,635
  --------   ----------
  TOTAL         162,239
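A quick sanity check of the table above in Python; note that the rows listed on the slide add up to somewhat less than the quoted TOTAL, so a few groups are presumably not shown:

```python
# Share of the CCF HPSS space per group (July 02), using the table above.
usage_gb = {
    "Agape": 373, "Aleph": 155, "Alice": 1547, "AMS": 558,
    "Archeops": 156, "Atlas": 2503, "Babar": 98498, "Clas": 1116,
    "CMSF": 2472, "D0": 8119, "Delphi": 0, "Eros": 2034,
    "Heral": 477, "Indra": 0, "LHCb": 2020, "Nusol": 367,
    "Pauger": 19008, "Phenix": 205, "Snovae": 192, "Virgo": 3635,
}
QUOTED_TOTAL = 162239  # GB, as printed on the slide

listed = sum(usage_gb.values())
print(f"sum of listed rows: {listed} GB (quoted total: {QUOTED_TOTAL} GB)")
print(f"PHENIX share of the quoted total: {100 * usage_gb['Phenix'] / QUOTED_TOTAL:.2f} %")
# -> sum of listed rows: 143435 GB; PHENIX share: ~0.13 %
```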

CCF HPSS: PHENIX usage
- Large storage capability (today, up to 50 TB for PHENIX).
- Low usage so far...
(Diagram labels on the slide: CCF 200 GB, CCF 8 TB, CCF 100 TB.)

CCF network
- IN (US → CCF): 1.2 - 2.0 MB/s
- OUT (CCF → US): ~1.5 MB/s
Routes:
- BNL → CCF: ~1.2 MB/s
- CCF → FNAL: ~1.5 MB/s
- SLAC → CCF: ~2 MB/s
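To put these rates in perspective, a line of rough arithmetic (illustrative only; sustained rates vary in practice):

```python
# Rough transfer-time arithmetic for the WAN rates quoted above
# (illustrative only; sustained rates vary in practice).

def days_to_transfer(size_tb, rate_mb_per_s):
    size_mb = size_tb * 1024 * 1024
    return size_mb / rate_mb_per_s / 86400.0

for rate in (1.2, 1.5, 2.0):   # MB/s, as quoted on the slide
    print(f"1 TB at {rate} MB/s -> {days_to_transfer(1.0, rate):.1f} days")
# 1 TB takes roughly 6-10 days at these rates.
```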

The CCF « guru »: Albert Romana
Code implementation:
- Run the complete simulation/reconstruction chain, both interactively and in batch at CCF.
Code development:
- Database for files stored at CCF
- Tutorials & script to run the PHENIX code (run_phnx)
- Started working on nanoDSTs for muons

HPSS file database at CCF (Albert Romana)
- Gets information about files stored in HPSS
- Updated automatically when storing files in HPSS
- Accessible on the web
- Ready to store files in HPSS at CCF
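For illustration, a file catalogue of this kind could be as simple as a single table keyed on the HPSS path; the sketch below uses SQLite with a purely hypothetical schema (this is not the actual database running at CCF):

```python
# Hypothetical sketch of an HPSS file catalogue: one table keyed on the
# HPSS path, filled automatically after each transfer. Schema and field
# names are illustrative only.
import sqlite3
import time

conn = sqlite3.connect("hpss_catalog.db")
conn.execute("""
CREATE TABLE IF NOT EXISTS hpss_files (
    hpss_path  TEXT PRIMARY KEY,   -- full path of the file in HPSS
    size_bytes INTEGER,            -- size as stored
    run_number INTEGER,            -- PHENIX run the file belongs to
    file_type  TEXT,               -- e.g. 'uDST', 'nanoDST', 'PRDF'
    stored_at  REAL                -- epoch time of the transfer
)""")

def register(path, size, run, ftype):
    """Record one stored file; would be called right after each HPSS put."""
    conn.execute("INSERT OR REPLACE INTO hpss_files VALUES (?, ?, ?, ?, ?)",
                 (path, size, run, ftype, time.time()))
    conn.commit()

register("/hpss/in2p3.fr/group/phenix/udst_seg_0001.root",
         512_000_000, 12345, "uDST")
```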

CCF tutorials (Raphaël Granier)
- 1 tutorial on the Muon Reconstruction Code.
- 2 tutorials on the PHENIX Muon Software (based on Yajun Mao's tutorial):
  - a tutorial to run the muon code at CCF
  - a tutorial to run the muon code at RCF

run_phnx at CCF (Albert)
- Runs any of the PHENIX programs: genp (Pythia), genh (Hijing), pisa, resp, reco, dst.
- Syntax: run_phnx genp 1000
- By default, uses the scripts, macros and files installed by Raphaël in his tutorial.
- Allows programs to be executed either interactively or in batch.
- Upgrade version: use Java and create panels to run run_phnx.
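For illustration, a wrapper with the run_phnx syntax shown above could be sketched as follows; the program names come from the slide, while the per-program script names are placeholders, not the actual CCF setup:

```python
#!/usr/bin/env python
# Sketch of a run_phnx-style wrapper, e.g. "run_phnx genp 1000".
# The program list matches the slide; the per-program scripts are
# placeholders, not the real scripts installed at CCF.
import subprocess
import sys

PROGRAMS = {
    "genp": "run_genp.csh",   # PYTHIA event generation
    "genh": "run_genh.csh",   # HIJING event generation
    "pisa": "run_pisa.csh",   # PISA (GEANT) simulation
    "resp": "run_resp.csh",   # detector response
    "reco": "run_reco.csh",   # reconstruction
    "dst":  "run_dst.csh",    # DST production
}

def run(program, nevents):
    """Run one PHENIX program interactively with the requested number of
    events; a batch mode would hand the same command to the CCF batch
    system instead of running it directly."""
    subprocess.run(["csh", PROGRAMS[program], str(nevents)], check=True)

if __name__ == "__main__":
    prog, nevt = sys.argv[1], int(sys.argv[2])   # e.g. run_phnx genp 1000
    run(prog, nevt)
```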

CCF data simulation: Hijing (Geun-Beom Kim)
- Adapted to CCF
- Ready to produce Hijing events at CCF

  c************************************************
  c  PISA INPUT INTERFACE TO HIJING 1.35 (F77 Syntax)
  c************************************************
  $hijing_inp
   nruns = 1000, elgev = 200.0, ref = 'CMS',
   chproj = 'A', chtarg = 'A',
   n1 = 197, iz1 = 79, n2 = 197, iz2 = 79,
   bmin = 0.0, bmax = 0.0,
   iseed = 1, iseed_skip = 0,
   jet_trigger = 0, pthard = 0.0,
   nthetamin = 90.0, nthetamax = 90.0,
   sthetamin = 143.0, sthetamax = 171.0,
   ppmin = 0.0,
  $end
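A small helper like the one below could write such a $hijing_inp card per job, each with its own event count and random seed; the parameter names are copied from the card above, the helper itself is hypothetical:

```python
# Hypothetical helper that writes a $hijing_inp namelist like the one above,
# so each simulation job gets its own event count and random seed.
# Parameter names are copied from the slide; defaults are illustrative.

def write_hijing_input(path, nruns=1000, elgev=200.0, iseed=1):
    params = {
        "nruns": nruns, "elgev": elgev, "ref": "'CMS'",
        "chproj": "'A'", "chtarg": "'A'",
        "n1": 197, "iz1": 79, "n2": 197, "iz2": 79,   # Au+Au
        "bmin": 0.0, "bmax": 0.0,
        "iseed": iseed, "iseed_skip": 0,
        "jet_trigger": 0, "pthard": 0.0,
        "nthetamin": 90.0, "nthetamax": 90.0,
        "sthetamin": 143.0, "sthetamax": 171.0,
        "ppmin": 0.0,
    }
    with open(path, "w") as f:
        f.write(" $hijing_inp\n")
        for key, value in params.items():
            f.write(f"  {key} = {value},\n")
        f.write(" $end\n")

write_hijing_input("hijing_job_001.inp", nruns=500, iseed=42)
```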

CCF data simulation: Pythia (Frédéric)
- Adapted to CCF
- Ready to produce Pythia events at CCF

  !============= CONTROL DATA CARDS FOR PYTHIA INITIALIZATION ===========*
  !
  'MSEL=0'        ! turn OFF global process selection
  'MSTP(51)=5005' ! structure function for GRV LO94
  'MSTP(52)=2'    ! structure function for GRV LO94
  'MSUB(86)=1'    ! g+g -> J/psi+g turned ON
  'MSUB(87)=1'    ! g+g -> chi0_c+g turned ON
  'MSUB(88)=1'    ! g+g -> chi1_c+g turned ON
  'MSUB(89)=1'    ! g+g -> chi2_c+g turned ON
  'MDME(721,1)=0' ! J/psi -> ee turned OFF
  'MDME(722,1)=1' ! J/psi -> mumu turned ON
  'MDME(723,1)=0' ! J/psi -> random turned OFF
  'MRLU(1)= '     ! starting random number
  !
  $pyth_par tnumevt = 100 sqrts = 200. $end
  !
  !================ CONTROL DATA CARDS FOR GEOMETRICAL SELECTION =======*
  $nmlglo
   thmi_no = 9.   thma_no = 37.   no_on = .false.
   thmi_su = 143. thma_su = 171.  su_on = .true.
  $end
  !
  !================ CONTROL DATA FOR SIGNAL DESCRIPTION ================*
  $nmlsig
   isig1 = 443  iprod1 = 13   nprod1 = 1,1
   isig2 = 443  iprod2 = -13  nprod2 = 1,1
   mi = 0.,100.
  $end
  !
  !================ CONTROL DATA CARDS FOR OUTPUTS SELECTION ===========*

(Slide plots: generated J/psi dimuons and reconstructed dimuons.)

CCF data processing
- PRDF → DSTs → μDSTs: proof of principle (Raphaël)
- μDSTs → muon nanoDSTs (Geun-Beom & Frédéric):
  - Copied 4600 μDST segments from RHIC disks, events with at least one MuID road (filtered by Atsushi Taketani).
  - Used bbftp: able to transfer directly to HPSS-CCF.
  - Produced the nanoDST output; merged nanoDST ~ 200 MB, ~… events total (~35,000 with ≥ 2 muons).

CCF data processing (continued)
- μDSTs → muon nanoDSTs (Geun-Beom & Frédéric), selection cuts:
  - 2 ≤ ntracks ≤ 4 (90% of the entire statistics)
  - -38 cm < event Z vertex < 38 cm
  - track pT > 1 GeV/c
- Ready to reconstruct at CCF.
(Slide plot from the QM poster (Hiroki Sato); red: opposite sign, blue: like sign, 0.27 GeV.)
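Written out as code, the selection above amounts to a simple event filter; the sketch below uses hypothetical attribute names (the actual selection runs in the C++/ROOT nanoDST code), and applying the pT cut to every track is an assumption:

```python
# The nanoDST selection cuts above as a small event filter.
# Attribute names are hypothetical; the real cuts run in the C++/ROOT
# muon nanoDST code. Applying the pT cut to every track is an assumption.

def passes_selection(event):
    """2 <= ntracks <= 4, |z_vertex| < 38 cm, track pT > 1 GeV/c."""
    tracks = event["tracks"]
    if not (2 <= len(tracks) <= 4):
        return False
    if not (-38.0 < event["z_vertex"] < 38.0):        # cm
        return False
    return all(trk["pt"] > 1.0 for trk in tracks)     # GeV/c

# Toy example: one event with two muon candidates
event = {"z_vertex": 5.2, "tracks": [{"pt": 1.4}, {"pt": 2.1}]}
print(passes_selection(event))   # True
```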

Conclusion: what we have done
- File transfer
- Tutorials for using the muon code at CCF & RCF
- Simulations
- Reconstruction
- First nanoDST pass for muons

Prospects
- Ready to help produce nDSTs, uDSTs, DSTs...
- Ready to help produce Monte Carlo data for signal and background analysis.
- More work on muon analysis.