Discussion on Software Agreements and Computing MoU

Slide 1: Agenda
1. Announcements (J. Harvey, 15')
2. LHC Computing Review
   - Summary of issues raised in the software panel (P. Mato, 30')
   - Proposal for "software agreements" and discussion (J. Harvey, 30')
   - Planning for computing infrastructure (F. Harris, 30')
   COFFEE
3. Preparation & testing of software before release (F. Ranjard, 20')
4. Migration status (SICBDST and BRUNEL) (M. Cattaneo, 20')
5. Status of SICbmc (A. Jacholkowska, 20')
6. Status of Monte Carlo production (E. van Herwijnen, 10')
   - followed by a discussion of the MC production procedure

Slide 2: Announcements
- LHC Computing Review
  - Panels 1 and 2 are preparing reports
  - Panel 3 will then discuss them at the end of June
  - April 5th: presentation of the baseline computing model
  - May 8th: resource planning for software and hardware
  - Today: technical input on software issues, manpower needs and responsibilities, and preparation of the computing infrastructure
- GRID Computing
  - New buzzword
  - HEP community-wide activity
  - EU proposal

Slide 3: Announcements - GEANT4
- Training day held on April 27th
  - 15 participants
  - overview of the basic contents and functionality of Geant4
  - great success; feedback is very positive
  - a new training session could be organized on more specific topics (geometry, physics processes), perhaps with a morning session and an afternoon "hands-on"
- Progress on integration of GEANT4 with GAUDI (GiGa, Vanya Belyaev)
  - GiGa is working on Linux (not yet on NT)
  - not yet installed in the release area
- GEANT4Examples package available, which allows the Geant4 examples to be run under CMT
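For illustration, a package that runs the Geant4 examples through CMT would declare its dependencies in a `requirements` file along these lines. This is only a sketch: the package name, version tags and source path are hypothetical, and the actual GEANT4Examples configuration may differ.

```
# Hypothetical CMT requirements file for building a Geant4 example
package MyG4Example

# Depend on Geant4 (and GiGa for GAUDI integration); versions are placeholders
use Geant4  v*
use GiGa    v*

# Build the example as an application from its sources
application exampleN01 ../src/*.cc
```

With such a file in place, CMT generates the build configuration and environment setup for the package in the usual way.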

Slide 4: Announcements - Conferences
- NSS, Oct 15-20, Lyon (Detectors, Electronics, Software)
  - Event Building workshop on Oct 20th
- VII International Workshop on Advanced Computing and Analysis Techniques in Physics Research, ACAT2000 (formerly AIHENP)
  - Fermilab, October
  - Topics: Artificial Intelligence, Innovative Software Algorithms and Tools, Symbolic Problem Solving, and Large Scale Computing in High Energy Physics, Astrophysics, Accelerator Physics and Nuclear Physics

Slide 5: Announcements - CERN School of Computing
- Will be held at the Hotel Golden Coast in Marathon, Greece, from Sunday 17th September to Saturday 30th September
- Deadline for applications has been extended until 15 June 2000
- Details of how to apply:
- Themes
  - Distributed computing
  - OO design and implementation
  - Storage and software for data analysis

Slide 6: Announcements - FOCUS questions

1. How does your experiment define "frozen OS" support?
Frozen support means the possibility of running existing experiment software without modification (e.g. no changes to the compiler, libraries, file system, or interfaces to the stager and to tapes). Services used by the experiment, such as the stager, LSF, etc., should be maintained in such a way as to allow the experiment software to continue working. Essential bug fixes should be applied to the CERN software libraries, but no new features are requested.

Slide 7: Announcements - FOCUS questions

2. Does your experiment agree with the RISC decommissioning timetable presented by Manuel Delfino at FOCUS 17? (RSPLUS/RSBATCH: no physicist access in 2001, removal end 2001. HPPLUS: freeze OS end 2001, removal end 2003. CERNLIB: last maintenance release early 2002. PAW replacement in production mid-2001.)
No major problems with the RISC decommissioning schedule: our software is currently supported on Linux, WNT and AIX. We have stopped using RSPLUS/RSBATCH for production but still use it for debugging FORTRAN code, owing to the superior quality of the debugger compared to ddd on Linux. CERNLIB: no new functionality is needed, but the current version should be compiled for future versions of the Linux and Windows OS/compilers. We are actively migrating our software to C++ and expect to have completed much of the migration by the end of 2001, but we cannot predict when the last line of FORTRAN will have been removed from our production software. We request support for PAW until an OO alternative is officially supported by IT division.

Slide 8: Announcements - FOCUS questions

3. What are your experiment's requirements until 2005 concerning import/export of data?
Currently we import Monte Carlo data on DLT 7000 tapes from Rutherford and on Redwood tapes from Lyon, as well as via the network. Transfers via the network will increase in importance in future. Our current plan is to produce the following volumes of Monte Carlo data in the home labs, to be transferred to CERN preferably by network: TB TB TB TB TB. In 2005 these numbers increase by two orders of magnitude in both directions (import of simulated data and export of real data).
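As a sanity check on such transfer plans, the sustained network rate implied by a yearly data volume is easy to estimate. The sketch below does the arithmetic; the 20 TB/year figure is purely an illustrative assumption, not a number from the slide.

```python
def sustained_rate_mbit_s(tb_per_year: float) -> float:
    """Average network rate (Mbit/s) needed to move tb_per_year
    terabytes spread evenly over one 365-day year."""
    bits = tb_per_year * 1e12 * 8      # decimal TB -> bits
    seconds = 365 * 24 * 3600          # seconds in a year
    return bits / seconds / 1e6        # bits/s -> Mbit/s

# Illustrative: 20 TB/year needs roughly 5 Mbit/s sustained, so a
# two-orders-of-magnitude increase pushes the requirement into the
# hundreds of Mbit/s range.
print(sustained_rate_mbit_s(20))
print(sustained_rate_mbit_s(2000))
```

The point of the exercise is that "two orders of magnitude" in volume translates directly into two orders of magnitude in sustained bandwidth, before any allowance for peaks, retransmissions or protocol overhead.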

Slide 9: FOCUS questions (continued)

4. How critical is archiving for your experiment?
We make no use of pubarch. Some users archive their data on tapes, others in HSM. It is important that these data are safeguarded in any future migration.

5. What is the 3-year requirement for remote backup?
We do not use remote backup for any of our systems.