Grid development at University of Manchester

Hardware architecture:
- 1 Compute Element and 10 Worker Nodes

Software architecture:
- EasyGrid to submit jobs to the grid
- BaBar software installed on AFS
- LCG grid software
- Linux platform

Important: web page with all available information:

Dr James Cunha Werner

EasyGrid: Analysis Submission to the Grid

Single command: ./easygrid dataset_name
- Performs handler management and job submission
- Configurable to meet the user's requirements
- Software based on a state machine (see the sketch below):
  - Verify that skimData is available; if not, run BbkDatasetTCL to generate it. Each file becomes one job.
  - Verify whether any handlers are pending. If not, generate submission scripts (gera.c) with edg-job-submit and ClassAds, and execute them, nested to apply the submission policy and optimisation. If handlers are pending, check job status.
  - When all jobs have ended, recover the results into the user's folder.

(Prototype)
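As a rough illustration of this loop, a minimal Python sketch follows. This is not EasyGrid's actual code: edg-job-submit, edg-job-status, edg-job-get-output, BbkDatasetTCL, and the ClassAd job description are the tools named on the slide (or their EDG companions), while the skimData listing format, the run_analysis.sh wrapper, and all file names are assumptions made for illustration.

import os
import subprocess

def skimdata_files(dataset):
    # Slide: "Verify skimData available: if not available perform
    # BbkDatasetTCL to generate skimData."  Listing file name is assumed.
    listing = dataset + ".skimData"
    if not os.path.exists(listing):
        subprocess.run(["BbkDatasetTCL", dataset], check=True)
    with open(listing) as f:
        return [line.strip() for line in f if line.strip()]

def write_jdl(input_file, jdl_path):
    # Generate a ClassAd job description -- the role the slide assigns to
    # gera.c.  run_analysis.sh is a hypothetical analysis wrapper script.
    with open(jdl_path, "w") as jdl:
        jdl.write('Executable = "run_analysis.sh";\n')
        jdl.write('Arguments = "%s";\n' % input_file)
        jdl.write('StdOutput = "job.out"; StdError = "job.err";\n')
        jdl.write('OutputSandbox = {"job.out", "job.err"};\n')

def easygrid(dataset):
    # Slide: "Each file will be a job."
    job_ids = []
    for i, input_file in enumerate(skimdata_files(dataset)):
        jdl = "job_%d.jdl" % i
        write_jdl(input_file, jdl)
        out = subprocess.run(["edg-job-submit", jdl], check=True,
                             capture_output=True, text=True)
        # The broker prints the job identifier on the last output line.
        job_ids.append(out.stdout.strip().splitlines()[-1])
    # The real state machine polls until every job has ended; for brevity
    # this sketch queries status once, then recovers the output.
    for job_id in job_ids:
        subprocess.run(["edg-job-status", job_id], check=True)
        subprocess.run(["edg-job-get-output", job_id], check=True)

if __name__ == "__main__":
    easygrid("my_dataset")   # equivalent of: ./easygrid dataset_name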

Benchmarks: behaviour of particles in the BaBar Electromagnetic Calorimeter (EMC)

The different behaviours of electrons, hadrons, and muons can be distinguished. Performing this analysis on a single computer running 24 hours a day takes 7 days; using 10 CPUs in parallel, accessed via the grid, it took only 8 hours. 18,700,000 events!

Pi± N Pi0 decays, with N = 1, 2, 3, and 4

Invariant masses of pairs of gammas from Pi0 decay, as measured by the EMC, produce a mass peak at 135 MeV (the peak in the plot). All other combinations are spread randomly across all energies (background). The dataset contained 81,700,000 events and took 4 days to run in production with 26 jobs in parallel; on a single computer it would take more than 3 months (26 jobs x 4 days is roughly 104 CPU-days).
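The peak position follows from the two-photon invariant mass. A minimal sketch of the calculation, assuming massless photons with EMC-measured energies e1, e2 (in MeV) and opening angle theta (in radians); the example values are illustrative, not taken from the slide:

import math

def gamma_pair_mass(e1, e2, theta):
    # m^2 = 2 * e1 * e2 * (1 - cos(theta)) for a pair of massless photons
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(theta)))

# A symmetric pi0 -> gamma gamma decay (e1 = e2 = 100 MeV) reconstructs at
# the pi0 mass when the opening angle satisfies m = 2 * E * sin(theta / 2).
theta = 2.0 * math.asin(134.98 / 200.0)
print(gamma_pair_mass(100.0, 100.0, theta))   # ~134.98 MeV

Combinations of gammas that do not come from the same Pi0 give no preferred mass, which is why they spread out as the flat background under the 135 MeV peak.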

Resonance reconstruction

Note the change in resolution between the plots. This is a consequence of the very small branching ratio of the 4 Pi0 decay mode: far fewer events survive, so the statistics are much poorer.