ATLAS and GridPP
GridPP Collaboration Meeting, Edinburgh, 5th November 2001
RWL Jones, Lancaster University

ATLAS Needs
- Long term, ATLAS needs a fully Grid-enabled reconstruction, analysis and simulation environment
- Short term, the first ATLAS priority is a Monte Carlo production system, building towards the full system
- ATLAS has an agreed program of Data Challenges (based on MC data) to develop and test the computing model

Data Challenge 0
- Runs from October to December 2001
- Continuity test of the MC code chain
- Only modest samples of 10^5 events, essentially all in flat-file format
- All the Data Challenges will be run on Linux systems; compilers are distributed with the code if the correct version is not already installed locally

Data Challenge 1
- Runs in the first half of 2002
- Several sets of 10^7 events (high-level trigger studies, physics analysis)
- Intend to generate and store 8 TB in the UK, with 1-2 TB in Objectivity
- Will use the M9 DataGrid deliverables and as many other Grid tools as time permits
- Tests of distributed reconstruction and analysis
- Tests of database technologies

Data Challenge 2
- Runs in the first half of 2003
- Will generate several samples of 10^8 events, mainly in OO databases
- Full use of Testbed 1 and Grid tools
- Complexity and scalability tests of the distributed computing system
- Large-scale distributed physics analysis using Grid tools, calibration and alignment

LHC Computing Model (Cloud)
[Diagram: the LHC computing cloud, with CERN at the centre; Tier 1 centres in Germany, the USA (FermiLab and Brookhaven), the UK, France, Italy and NL; Tier 2 centres; laboratories and universities; and physics-department desktops at the edge]

Implications of the Cloud Model
- Internal: need cost sharing between global regions within the collaboration
- External (on Grid services): need authentication, accounting and priorities on the basis of experiment/region/team/local region/user (see the sketch below)
- Note: the NW believes this is a good model for Tier-2 resources as well
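The layered accounting/priority requirement above can be pictured as nested shares. The following is a minimal, purely illustrative sketch assuming a simple multiplicative share model; the class, the field names and all numbers are invented for the example and are not an agreed GridPP or DataGrid scheme.

```python
# Illustrative only: nested experiment/region/team shares of the kind the cloud
# model implies for accounting and priorities. All names and numbers are invented.
from dataclasses import dataclass, field


@dataclass
class Allocation:
    share: float                      # fraction of the parent allocation
    children: dict = field(default_factory=dict)


policy = Allocation(1.0, {
    "atlas": Allocation(0.5, {
        "uk": Allocation(0.3, {"lancaster": Allocation(0.4)}),
        "cern": Allocation(0.4),
    }),
    "lhcb": Allocation(0.5),
})


def effective_share(*path):
    """Multiply the shares along an experiment/region/team path."""
    node, fraction = policy, policy.share
    for key in path:
        node = node.children[key]
        fraction *= node.share
    return fraction


if __name__ == "__main__":
    # ATLAS share x UK share x Lancaster share = 0.5 * 0.3 * 0.4
    print(effective_share("atlas", "uk", "lancaster"))
```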

ATLAS Software
- Late in moving to OO because the Physics TDR etc. were given a high priority
- Generation and reconstruction are now done in the C++/OO Athena framework
- Detector simulation is still in transition to OO/C++/Geant4; DC1 will still use Geant3
- Athena shares a common framework (Gaudi) with LHCb

Simulation software for DC1
- Particle-level simulation: ATHENA GeneratorModules (C++, Linux); Pythia 6 plus code dedicated to B-physics, with PYJETS converted to HepMC; EvtGen (BaBar package) to follow later
- Detector simulation: Dice (slug + Geant3, Fortran), producing GENZ and KINE banks in ZEBRA format
- ATHENA fast detector simulation and reconstruction (C++): reads GENZ and KINE, converts to HepMC, produces ntuples
- Atlfast++ reads HepMC and produces ntuples

Requirement Capture
- Extensive use-case studies: "ATLAS Grid Use Cases and Requirements", 15/X/01
- Many more could be developed, especially in the monitoring areas
- Short-term use cases centred on immediate MC production needs
- Obvious overlaps with LHCb: joint projects
- Three main projects defined: "Proposed ATLAS UK Grid Projects", 26/X/01

Grid User Interface for Athena
- Completely common project with LHCb
- Obtains resource estimates and applies quota and security policies
- Queries the installation tools: is the correct software installed? Install it if not
- Job submission guided by the resource broker
- Run-time monitoring and job deletion
- Output to MSS and bookkeeping update (the flow is sketched below)
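The following sketch strings the steps above together. It is a toy illustration, not the actual interface (which was still being defined jointly with LHCb): the StubSite class, the function names and the job-options dictionary are all assumptions made for the example.

```python
# Toy sketch of the Athena Grid user-interface flow on this slide.
# All classes, methods and option names are invented placeholders.

class StubSite:
    """Stands in for a Grid site selected by the resource broker."""
    def __init__(self, name, releases):
        self.name, self.releases = name, set(releases)

    def install_release(self, release):
        print(f"installing release {release} at {self.name}")
        self.releases.add(release)

    def submit(self, job_options):
        print(f"submitting Athena job {job_options} to {self.name}")
        return {"site": self.name, "output": ["ntuple.root"]}


def submit_athena_job(job_options, cpu_estimate, cpu_quota, sites):
    # 1. Resource estimate and quota/security policy check
    if cpu_estimate > cpu_quota:
        raise RuntimeError("job exceeds CPU quota")

    # 2. Trivial stand-in for the resource broker: take the first site offered
    site = sites[0]
    release = job_options["release"]
    if release not in site.releases:      # query the installation tools
        site.install_release(release)     # install if not present

    # 3. Submit the job (run-time monitoring and deletion omitted from the sketch)
    job = site.submit(job_options)

    # 4. Output to MSS and bookkeeping update, represented here by prints
    print("copying", job["output"], "to mass storage")
    print("updating the bookkeeping database")
    return job


if __name__ == "__main__":
    submit_athena_job({"release": "2.4.1", "events": 1000},
                      cpu_estimate=10, cpu_quota=100,
                      sites=[StubSite("site-a", releases=["2.0.0"])])
```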

Installation Tools
- Tools to automatically generate installation kits, deploy them using Grid tools and install them at remote sites via a Grid job (see the sketch below)
- Should be integrated with a remote autodetection service for installed software
- Initial versions should cope with pre-built libraries and executables
- Should later deploy the development environment
- ATLAS and LHCb build environments are converging on CMT: some commonality here
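As an illustration of the autodetection-plus-kit idea, the sketch below checks whether a release is present at a site and, if not, fetches and unpacks a pre-built kit. The install root, kit naming scheme and download URL are invented for the example; the real tooling was still to be defined.

```python
# Sketch only: detect an installed release, otherwise install from a pre-built kit.
# The paths, kit names and URL are placeholders, not a real ATLAS distribution point.
import os
import subprocess


def release_installed(release, install_root="/opt/atlas"):
    """Return True if the requested software release is already present locally."""
    return os.path.isdir(os.path.join(install_root, release))


def install_kit(release, kit_url_base="http://example.org/atlas-kits",
                install_root="/opt/atlas"):
    """Fetch and unpack a pre-built kit (libraries and executables only)."""
    kit = f"atlas-kit-{release}.tar.gz"
    subprocess.run(["wget", f"{kit_url_base}/{kit}"], check=True)
    os.makedirs(install_root, exist_ok=True)
    subprocess.run(["tar", "-xzf", kit, "-C", install_root], check=True)


def ensure_release(release):
    """What a Grid installation job might do on arrival at a remote site."""
    if not release_installed(release):
        install_kit(release)


if __name__ == "__main__":
    print("release 2.4.1 installed?", release_installed("2.4.1"))
```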

MC Production System
- For DC1, will use the existing MC production system (Geant3), integrated with the M9 tools
- (Aside: M9/WP8 validation and DC kit development proceed in parallel)
- Decomposition of the MC system into components: Monte Carlo job submission, bookkeeping services, metadata catalogue services, monitoring and quality-control tools (illustrated below)
- Bookkeeping and data-management projects are already ongoing; will work in close collaboration, with a good link to US projects
- Close link with Ganga developments
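To make the component decomposition concrete, here is one possible set of interfaces, written as Python protocols. The interface and method names are illustrative assumptions, not an agreed ATLAS/LHCb design.

```python
# Hypothetical component interfaces mirroring the decomposition above.
# Names and signatures are illustrative only.
from typing import Protocol


class JobSubmission(Protocol):
    def submit(self, job_script: str, site: str) -> str: ...   # returns a job id
    def status(self, job_id: str) -> str: ...


class Bookkeeping(Protocol):
    def record_job(self, job_id: str, metadata: dict) -> None: ...


class MetadataCatalogue(Protocol):
    def register_dataset(self, name: str, files: list) -> None: ...
    def query(self, selection: dict) -> list: ...


class Monitoring(Protocol):
    def report(self, job_id: str, quantity: str, value: float) -> None: ...
```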

- Allow regional management of large productions
- Job script and steering generated
- Remote installation performed as required
- Production site chosen by the resource broker
- Generate events and store them locally
- Write the log to the web
- Copy data to the local/regional store through the interface with Magda (data management)
- Copy data from local storage to the remote MSS
- Update the bookkeeping database (the full lifecycle is sketched below)
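A hedged sketch of that production-job lifecycle follows. The service objects (broker, magda, mss, bookkeeping, weblog) are duck-typed placeholders for the real components, Magda included, and every method name is an assumption made for the illustration.

```python
# Sketch of the DC1 production-job lifecycle listed above. The services passed in
# are placeholders; none of the method names correspond to a real API.

def run_production_slice(task, broker, magda, mss, bookkeeping, weblog):
    """One production slice, managed regionally as described on the slide."""
    # Job script and steering generated for this slice of the production
    script = task.make_job_script()

    # Production site chosen by the resource broker; remote installation if required
    site = broker.choose_site(task.requirements)
    site.ensure_software(task.release)

    # Generate events and store them locally at the production site
    output = site.run(script)

    # Write the production log to the web
    weblog.publish(task.name, output.logfile)

    # Copy data to the local/regional store via the Magda data-management interface
    magda.register(output.files, location=site.local_store)

    # Copy data from local storage to the remote mass store, then update bookkeeping
    mss.archive(output.files)
    bookkeeping.record(task.name, output.metadata)
```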

Manpower allocation:

Work Area      PMB Allocation (FTE)    Previously Allocated (FTE)    Total Allocation (FTE)
ATLAS/LHCb
ATLAS
LHCb

- This will just allow us to cover the three projects
- Additional manpower must be found for monitoring tasks, for testing the computing model in DC2, and for the simple running of the Data Challenges

WP8 M9 Validation
- WP8 M9 validation is now beginning
- Glasgow and Lancaster (RAL?) are involved in the ATLAS M9 validation
- The validation exercises the tools using the ATLAS kit
- The software used is behind the current version; this is likely to be the case in all future tests (it decouples software changes from tool tests)
- A previous test of MC production using Grid tools was a success
- DC1 validation (essentially of the ATLAS code): Glasgow and Lancaster involved (Lancaster is working on tests of standard generation and reconstruction quantities to be deployed as part of the kit; see the sketch below); Cambridge to contribute
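As an indication of what such standard-quantity tests might look like, here is a minimal sketch that compares a few summary quantities from a test run against reference values. The quantity names, reference values and tolerances are invented for the illustration; the real tests were still being developed at Lancaster.

```python
# Illustration of a kit validation check: compare standard generation/reconstruction
# quantities from a test run against reference values. All numbers are invented.

REFERENCE = {                         # quantity: (reference value, tolerance)
    "mean_n_tracks": (42.0, 0.5),
    "mean_jet_et_gev": (55.0, 1.0),
}


def validate(measured):
    """Return (quantity, ok) pairs for a dictionary of test-run summary values."""
    results = []
    for name, (ref, tol) in REFERENCE.items():
        ok = abs(measured.get(name, float("nan")) - ref) <= tol
        results.append((name, ok))
    return results


if __name__ == "__main__":
    test_run = {"mean_n_tracks": 41.8, "mean_jet_et_gev": 57.3}
    for name, ok in validate(test_run):
        print(name, "OK" if ok else "FAILED")
```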