DØ GridPP Plans – SAM, Grid, Ceiling Wax and Things
Iain Bertram, Lancaster University
Monday 5 November 2001

Slide 2: SAM, DØ, and the Grid
- DØ Basics
- What is SAM?
  - History
  - Current Deployment
  - Collaborators
- SAM, Grid and Future Developments
  - Overview of Plans
  - UK Plans
  - CDF and SAM

Slide 3: The DØ Experiment
- Detector data:
  - 1,000,000 channels
  - Event size: 250 KB
  - Event rate: ~50 Hz
  - On-line data rate: 12 MB/s
  - Estimated 2-year totals (incl. processing and analysis):
    - 1 x 10^9 events
    - ~0.5 PB
- Monte Carlo data:
  - 5 remote processing centres
  - Estimated ~300 TB in 2 years
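As a back-of-the-envelope check, these figures hang together; a minimal sketch in Python, assuming 1 KB = 10^3 bytes and taking the 10^9-event total as given (the implied live time is my inference, not a figure from the slide):

```python
# Sanity check of the DØ Run II data-volume figures quoted above.
event_size_b  = 250e3          # 250 KB per event
event_rate_hz = 50             # ~50 Hz to tape

online_rate = event_size_b * event_rate_hz
print(f"on-line rate: {online_rate/1e6:.1f} MB/s")   # ~12.5 MB/s, matching the quoted 12 MB/s

n_events  = 1e9                # estimated 2-year total
raw_bytes = n_events * event_size_b
print(f"raw data: {raw_bytes/1e15:.2f} PB")          # 0.25 PB of raw data

# Live time needed at 50 Hz to log 1e9 events:
live_days = n_events / event_rate_hz / 86400
print(f"live time: {live_days:.0f} days")            # ~231 days out of ~730

# The quoted ~0.5 PB total then implies roughly another 0.25 PB of
# processed and analysis data on top of the raw sample.
```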

Slide 4: Collaboration
- ~500 physicists
- 72 institutions
- 18 countries

Slide 5: (figure only; no text in transcript)

Slide 6: SAM (DØ & FNAL Project)
- SAM is Sequential Access to data via Meta-data.
- The project started in 1997 to handle DØ's needs for a Run II data system.
- SAM is a data grid:
  - No fully functional grid currently exists.
  - SAM already has many grid functionalities:
    - Stations: logical collections of computers, networks, and storage
    - Transparent access and transport of data between stations
    - Data cataloguing and replica management
    - Fabric management
    - Job submission (on the local station only)
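A minimal sketch of the station and replica-management idea in Python. All names here (ReplicaCatalog, Station, the file names) are illustrative inventions to show the concept, not SAM's actual interfaces:

```python
# Illustrative model of SAM-style stations and replica management.
# These classes are hypothetical sketches of the concepts on the
# slide, not SAM's real API.

class ReplicaCatalog:
    """Maps a logical file name to the stations holding a replica."""
    def __init__(self):
        self.replicas = {}          # file name -> set of station names

    def register(self, filename, station):
        self.replicas.setdefault(filename, set()).add(station)

    def locate(self, filename):
        return self.replicas.get(filename, set())

class Station:
    """A logical collection of computers, networks, and storage."""
    def __init__(self, name, catalog):
        self.name, self.catalog, self.cache = name, catalog, set()

    def fetch(self, filename):
        # Transparent access: serve from the local cache if possible,
        # otherwise replicate from any station that has the file.
        if filename in self.cache:
            return f"{filename} served from {self.name} cache"
        sources = self.catalog.locate(filename) - {self.name}
        if not sources:
            raise FileNotFoundError(filename)
        self.cache.add(filename)
        self.catalog.register(filename, self.name)
        return f"{filename} replicated from {next(iter(sources))} to {self.name}"

catalog = ReplicaCatalog()
fnal, lancaster = Station("FNAL", catalog), Station("Lancaster", catalog)
fnal.cache.add("raw_run42.dat"); catalog.register("raw_run42.dat", "FNAL")
print(lancaster.fetch("raw_run42.dat"))   # replicated from FNAL
print(lancaster.fetch("raw_run42.dat"))   # now served from the local cache
```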

Slide 7: Deployment (map of deployment sites; figure only)

Slide 8: Deployment II
(Diagram: FNAL stations – Central Analysis, Datalogger, Reco-farm, ClueD0, and others – on the LAN, backed by the mass storage system (MSS) and connected to remote stations over the WAN.)
- An interconnected network of primary cache stations, communicating and replicating data where it is needed.
- Current active stations:
  - FNAL (several)
  - Lyon, FR (IN2P3)
  - Amsterdam, NL (NIKHEF)
  - Lancaster, UK
  - Imperial College, UK
  - Others in the US

Slide 9: Statistics
- Number of registered users: 360
- Data in the system: 25 TB (160k files)
- Accessing > 3 TB a day (goal: 13 TB/day)
- Fully integrated into the DØ analysis framework

Slide 10: SAM Collaborators
- PPDG – Particle Physics Data Grid (DØ participation)
- Condor
- Globus
- Fermilab CMS Computing Group
- iVDGL – International Virtual Data-Grid Laboratory
- IGMB – InterGrid Management Board

Slide 11: Future Plans
SAM is an operational grid:
- An ideal platform for demonstrating grid technologies on a timescale of two years
- Modular design allows integration of modern grid tools
- An ideal testing ground for LHC-scale experiments
  - Full-scale test of grid middleware

Slide 12: DØ Goals
- DØ is fully committed to making SAM a fully functional grid on a timescale of two years.
- DØ is committed to using standard grid tools wherever possible.
- DØ has committed, and continues to commit, significant resources to the grid.

Slide 13: General Goals
1. Use of standard middleware to promote interoperability (see the sketch after this list):
   1. Globus security infrastructure, interoperating with the Fermilab Kerberos security infrastructure
   2. GridFTP as one of the supported file-transfer protocols
   3. Globus job submission
   4. Condor (and extensions) job submission
   5. Publish the availability and status of SAM station resources
   6. Publish a catalogue of data files and their replicas, using standards (or emerging standards) from PPDG and DataGrid
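To make goals 2 and 3 concrete, a sketch of the kind of operations these tools expose. The globus-url-copy and globus-job-run clients are real Globus Toolkit commands; the hostnames, paths, and the reco executable are hypothetical, and valid GSI credentials are assumed:

```python
# Illustration of goals 2-3: a GridFTP transfer and a Globus job
# submission, driven from Python. Hosts and paths are invented.
import subprocess

# Goal 2 - GridFTP: copy a file from a gsiftp server to local disk,
# authenticated via the Globus Security Infrastructure.
subprocess.run([
    "globus-url-copy",
    "gsiftp://sam-station.example.ac.uk/cache/raw_run42.dat",
    "file:///scratch/raw_run42.dat",
], check=True)

# Goal 3 - Globus job submission: run an executable on a remote
# gatekeeper under the same GSI credentials.
subprocess.run([
    "globus-job-run", "gatekeeper.example.ac.uk",
    "/usr/local/d0/bin/reco", "raw_run42.dat",
], check=True)
```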

Slide 14: General Goals II
2. Additional grid functionality for job specification, submission, and tracking:
   1. Use of full Condor services for migration and checkpointing of jobs, as far as is possible with DØ software and the DØ software framework. This may require work on both the Condor software and the DØ framework to achieve full functionality.
   2. Incrementally building an enhanced job-specification language and job-submission services that ensure co-location of job execution and data files, and that reliably execute a chain of job processing with dependencies between job steps. The first step is expected to be work with the Condor team to provide for the specification and execution of a directed acyclic graph (DAG) of jobs, using an extended version of the DAGMan product that CMS is testing for its MC job execution. (A sketch of such a DAG follows below.)
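A minimal sketch of what a DAGMan job chain looks like: a .dag file names the jobs (each with its own Condor submit description) and their dependencies. The three-stage MC chain and file names are illustrative, not from the slide; the JOB/PARENT...CHILD syntax is DAGMan's own:

```python
# Minimal sketch of a DAGMan job chain for an MC production pass.
# The stages and file names are invented for illustration.

dag = """\
JOB generate  generate.sub
JOB simulate  simulate.sub
JOB recon     recon.sub
PARENT generate CHILD simulate
PARENT simulate CHILD recon
"""

with open("mc_chain.dag", "w") as f:
    f.write(dag)

# Submitted with:  condor_submit_dag mc_chain.dag
# DAGMan then runs each stage only after its parents have succeeded.
```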

Slide 15: General Goals III
3. Enhancing monitoring and diagnostic capabilities:
   1. Extensions to the existing system of logging all activity in the system to both local and central log files, as demanded by robustness requirements and the increased use of the system. (A sketch of this pattern follows below.)
   2. Incorporation of the emerging Grid Monitoring Architecture and monitoring tools. Little exists on this at present; the work will involve collaborating with other grid projects and participating in Global Grid Forum working groups.
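A minimal sketch of the local-plus-central logging pattern in Python, assuming a syslog collector as the central sink. The logger name, file name, and collector address are illustrative; SAM's actual logging machinery is not shown:

```python
# Sketch of logging every activity to both a local file and a
# central collector. Targets here are placeholders.
import logging, logging.handlers

log = logging.getLogger("sam.station")
log.setLevel(logging.INFO)

# Local log file on the station itself.
log.addHandler(logging.FileHandler("station.log"))

# Forward the same records to a central collector over syslog
# ("localhost" stands in for the real central log host).
log.addHandler(logging.handlers.SysLogHandler(address=("localhost", 514)))

log.info("file raw_run42.dat delivered to project analysis-7")
```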

Slide 16: Proposed Applications
- Monte Carlo production system:
  - Upgrade the distributed Monte Carlo production system as a short-term use case for demonstrating, incrementally, the essential components of the grid. In particular, this involves demonstrating: transparent job submission to a number of DØ processing centres (SAM stations); reliable execution of jobs; transparent data access and data storage; and an interface that lets users understand and monitor the state of their MC requests and of the resultant MC jobs and data that satisfy them.

Slide 17: Proposed Applications
- General user applications:
  - Demonstrate analysis of data products (both MC and detector data) on desktop systems in the UK, using an enhanced version of the DØ SAM grid system that incrementally incorporates grid middleware components. This will not only demonstrate active use of a grid for analysis, but will eventually also demonstrate interoperability between the DØ grid and other emerging grid testbeds.

Slide 18: Milestones (table/figure only; no text in transcript)

Slide 19: CDF and SAM
- Rick St Denis's talk:
  - CDF to use SAM for data access
  - UK setting up test facilities
  - Combined DØ and CDF proposal

Slide 20: Proposal
- Request: 6 FTEs
  - Four of the FTEs will work on integrating core grid functionality into SAM (this talk)
  - One FTE for DØ applications (MC and analysis)
  - One FTE to integrate CDF software with SAM

Slide 21: Conclusions
- SAM is an operational grid.
- It offers a great opportunity for testing grid middleware.
- Goal: a fully functional test-bed in two years.