Proposal for a DØ Remote Analysis Model (DØRAM)
DØ IB Meeting, Feb. 14, 2002, Jae Yu


Slide 1: Proposal for a DØ Remote Analysis Model (DØRAM)
– Introduction
– Partial Workshop Results
– DØRAM Architecture
– Requirements for Regional Analysis Centers
– Working Group Suggestions and Comments
– What Do I Think We Should Do?
– Conclusions
DØ IB, Feb. 14, 2002, Jae Yu

Slide 2: DØRACE Strategy
Remote analysis system setups, categorized by functionality:
– Desktop only
– A modest analysis server
– Linux installation
– UPS/UPD installation and deployment
– External package installation via UPS/UPD (CERNLIB, KAI-lib, Root)
– Download and install a DØ release (tar-ball for ease of initial setup? use of existing utilities for latest release download)
– Installation of CVS
– Code development
– KAI C++ compiler
– SAM station setup
Phases: Phase 0 Preparation; Phase I Rootuple Analysis; Phase II Executables; Phase III Code Development; Phase IV Data Delivery

Slide 3: [status chart; only the labels "Progressive" and "Partially Updated" survive in the transcript]

Slide 4: Why do we need a DØRAM?
Total Run IIa data sizes:
– 350 TB for RAW
– TB for Reco + root
– 1.4×10^9 events total
At the fully optimized 10 sec/event reconstruction → 1.4×10^10 seconds for one full reprocessing pass:
– Takes one full year with 500 machines
– Takes a few months to transfer the raw data, even over a dedicated gigabit network
A centralized system will do a lot of good but is not sufficient (the DØ analysis model proposal should be complemented with DØRAM):
– Need to allow remote locations to work on analysis efficiently
– Sociological benefit within the institutes: a strong HEP presence at the home institutions
Regional Analysis Centers should be established to exploit the remote analysis environment.
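The reprocessing and transfer estimates on this slide can be checked with back-of-envelope arithmetic. This is only a sanity-check sketch using the slide's own figures (1.4×10^9 events, 10 s/event, 500 machines, 350 TB RAW, a 1 Gb/s dedicated link), assuming perfect parallelism and full link utilization:

```python
# Back-of-envelope check of the slide's reprocessing and transfer estimates.
SECONDS_PER_DAY = 86400

def reprocessing_days(n_events, sec_per_event, n_machines):
    """Wall-clock days for one full reprocessing pass, perfectly parallel."""
    return n_events * sec_per_event / n_machines / SECONDS_PER_DAY

def transfer_days(data_bytes, link_bits_per_sec):
    """Days to move a dataset over a dedicated link at full utilization."""
    return data_bytes * 8 / link_bits_per_sec / SECONDS_PER_DAY

# 1.4e9 events at 10 s/event on 500 machines -> roughly one full year
print(round(reprocessing_days(1.4e9, 10, 500)))   # 324 days
# 350 TB of RAW over a dedicated 1 Gb/s link -> about a month at best
print(round(transfer_days(350e12, 1e9)))          # 32 days
```

The "few months" transfer figure on the slide is consistent with this ideal-case month once realistic sustained throughput (well below line rate in 2002) is folded in.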

Slide 5: Proposed DØRAM Architecture
[architecture diagram]
– Central Analysis Center (CAC) at FNAL
– Regional Analysis Centers (RACs), each storing and processing 10–20% of all data
– Institutional Analysis Centers (IACs)
– Desktop Analysis Stations (DAS)
Legend: normal interaction/communication paths; occasional interaction/communication paths.
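The tiered data flow in the diagram can be sketched as a simple lookup: a request first tries the local tier's holdings and walks up (DAS/IAC → RAC → CAC), so the CAC is contacted only occasionally. The center names and dataset IDs below are purely illustrative, not part of the proposal:

```python
# Illustrative sketch of the DØRAM tier hierarchy (hypothetical names/IDs).
class Center:
    def __init__(self, name, datasets, parent=None):
        self.name = name
        self.datasets = set(datasets)  # dataset IDs held locally
        self.parent = parent           # next tier up (IAC -> RAC -> CAC)

    def locate(self, dataset):
        """Walk up the tiers until a center holding the dataset is found."""
        if dataset in self.datasets:
            return self.name
        if self.parent is not None:
            return self.parent.locate(dataset)
        raise LookupError(f"{dataset} not held anywhere")

cac = Center("CAC@FNAL", {"raw-001", "raw-002", "tmb-001", "tmb-002"})
rac = Center("RAC-Karlsruhe", {"tmb-001"}, parent=cac)  # its 10-20% sample
iac = Center("IAC-UTA", set(), parent=rac)

print(iac.locate("tmb-001"))  # served by the regional center
print(iac.locate("raw-002"))  # occasional fallback to the CAC
```

This mirrors the legend: most traffic stays on the normal IAC–RAC path, with the RAC–CAC path used only when data falls outside the regional sample.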

Slide 6: Regional Analysis Centers
– A few selected sites that satisfy certain requirements
– Provide almost the same level of service as FNAL to a few institutional analysis centers
– Analyses carried out within the regional center network:
  – Store 10–20% of the data, a statistically random sample, permanently for self-sustaining analysis
  – Most of the analyses performed on these samples within the regional network
  – Refine the analyses using the smaller but unbiased data set
– Some policies are necessary for approving access to the full data set
  – When the entire data set is needed → the underlying DØGrid architecture provides access to the remaining data

Slide 7: Regional Analysis Center Requirements
Become a mini-CAC in services to IACs and DAS:
– Sufficient computing infrastructure
  – Large bandwidth (gigabit or better) to handle requests from IACs
  – Sufficient storage space to hold 10–20% of the data permanently, expandable to accommodate data growth (>30 TB just for Run IIa RAW data)
  – Sufficient CPU resources to serve regional and institutional analysis requests and reprocessing
– Located preferably so as to avoid unnecessary overlap in network traffic
– Software distribution and support
  – Mirror copy of the CVS database for synchronized updates between RACs and the CAC
  – Keep the relevant copies of databases
  – Act as a SAM service station providing data and rootuples to IACs and DAS
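The ">30 TB" storage floor follows directly from the Run IIa numbers on slide 4. A one-line sanity check (the 350 TB RAW size and the 10–20% fraction are the slides' own figures):

```python
# Sanity check: a 10-20% permanent RAW sample at each RAC, Run IIa only.
RAW_TB = 350.0  # total Run IIa RAW data size from slide 4

def rac_storage_tb(fraction):
    """Storage needed at one RAC for a given fraction of the RAW data."""
    return RAW_TB * fraction

low, high = rac_storage_tb(0.10), rac_storage_tb(0.20)
print(f"RAC RAW sample: {low:.0f}-{high:.0f} TB")  # 35-70 TB, above the >30 TB floor
```

Reco, root-tuple, and database copies come on top of this, which is why the requirement stresses expandability.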

Slide 8: Suggestions and Comments from the Working Group
Data characteristics:
– Specialized data sets, in addition to some service sets for reprocessing, should be allowed
– Some level of replication should be allowed
– Keep a full copy of Thumbnails for self-sufficient analyses
Reprocessing capability:
– Consistency of reconstructed data must be ensured
– Reprocessing must be centrally organized
– Bookkeeping of reprocessing
Two-staged approach:
– Before full gridification → copy of all data kept in the CAC
– After full gridification → data fully distributed within the network; data sets may be mutually exclusive

Slide 9: RAC Candidates and Next Steps
In Europe, some institutions are already working to become RACs:
– Karlsruhe (Germany)
– NIKHEF (Netherlands)
– IN2P3, Lyon (France)
We need more participation from the US and other regions.
Agreed to form a group to formulate the RAC concept more systematically → write up a document within 1–2 months covering:
– Functions
– Services
– Requirements
– Etc.

Slide 10: What Do I Think We Should Do?
– Most of the students and postdocs are at FNAL, so it is important to provide them with sufficient computing and cache resources for their analyses → the current suggestion for backend analysis clusters should be built!!
– In the meantime, we should select a few sites as RACs and prepare sufficient hardware and infrastructure
  – My rough map scan gives FNAL + 3 RACs in the US, plus a few in Europe, South America, and Asia
– The software effort for the Grid should proceed as fast as we can, to complement the hardware
  – We cannot afford to spend time on test beds
  – Our setups should be both the test bed and the actual Grid
– Form a working group to determine the number of RAC sites, their requirements, and data characteristics, and to select the RACs within the next couple of months

Slide 11: Conclusions
– DØ must prepare for the large data set era
– Need to expedite analyses in a timely fashion
– Need to distribute data sets throughout the collaboration
– Requested to provide a summary within 2 months for inclusion in the DØ Analysis Computing Plan document
– Establishing regional analysis centers will be the first step toward the DØ Grid → by the end of Run IIa (2–3 years)