IT Review of the Year 2002
FOCUS, 5 December 2002
Wolfgang von Rüden
Thanks to all contributors from the IT groups.

Architectures and Data Challenges – ADC (1)
Linux:
– established a "Linux certification coordination body" to formally involve the main user groups and streamline the certification process
– implemented the Linux release policy:
– certified a CERN Linux major release (major changes, six months to certify)
– certified a CERN Linux minor release (nine weeks to certify)
– central installation service for farm and desktop machines

CERN Linux certification
Timeline:
– Red Hat 7.3 release
– CLUG recommendation to certify 7.3, not 8.x
– start of the certification
– planned end date (delayed by missing libraries; IT & LHCb requested more time)
– actual certification
New: Linux certification coordination group
– formal participation of divisions and experiments
– defines the user environments to certify
– takes the decisions
A good start, but iterations are needed to map out all user requirements.

Proposal: lifetime planning for the older CERN Linux releases
○ the two previous CERN Linux releases are now obsoleted by the newly certified one
○ one of them is still needed on LXBATCH and LXPLUS6 until May 2003, but only for running existing code; its shutdown was already agreed at FOCUS (but DTF asks for a delay)
○ the other was never used in batch production; desktops are being migrated
○ Rundown schedule proposal:
– feature freeze: no more application software updates (ASIS)
– end of support: no more security updates, no new installations; existing machines need to be upgraded or individually secured

Objectives and scope of the "Linux certification coordination committee":
– to monitor the Linux certification process and adjust it as necessary
– to implement the CERN Linux certification process; members of the LCCC (*) will
– jointly take decisions on which vendor release to certify
– jointly declare the start and end of certifications
– individually ensure that the interests of the CERN Linux user communities they are representing are taken into account in the certification
– to decide on CERN-wide Linux issues, after identifying and consulting with stakeholders outside of this committee
(*) no real name yet

Experiences with the new group
○ three meetings so far:
– general mandate, ways of working
– compilers for the release being certified
– planned certification, status update (and the decision to delay)
○ taken up quickly, collaborative atmosphere, fulfils a real need
○ varying "depths" of requirements, from explicit package lists to "something like 6.1.1"
○ planned: monitor progress of the gcc 3.2 migration and decide by January on a formal "mini-certification" if things go wrong
○ planned: Red Hat 8.1 certification in spring

Current Membership
– Bruce Barnett (ATLAS online)
– Eric Cano (CMS online)
– Marco Cattaneo (LHCb)
– Günter Duckeck (ATLAS offline)
– Benigno Gobbo (non-LHC experiments)
– Jan Iven (IT services catchall, chair)
– Nicolas de Metz-Noblat (PS/CO)
– Jarek Polok (Desktop, secretary)
– Fons Rademakers (ALICE)
– Tim Smith (IT PLUS/BATCH)
– Stephan Wynhoff (CMS offline)
– Alastair Bland (SL/CO)
Missing:
– link to CLUG (CLUG chair?)

Architectures and Data Challenges – ADC (2)
Testbeds:
– The testbeds at CERN have been the main place for testing and verification, and CERN is the de facto control centre of all production and test activities.
– In spring, EDG passed the project review successfully using the testbeds installed, configured and maintained at CERN.
– Based on the EDG v1.1.2 middleware, a first production-style test was run by the ATLAS collaboration in close cooperation with CERN's support for the EDG testbeds. The goals set for the test were reached ahead of time and could be extended.

Architectures and Data Challenges – ADC (3)
Data Challenges:
– During the year several Computing Data Challenges were organised to stress-test the different components of our architecture (network, disk servers, tape servers, CASTOR, etc.), e.g.:
– ATLAS → DAQ complexity and communication
– ALICE → DAQ, 1.8 GB/s event building, 350 MB/s disk I/O
– IT → disk server performance and stability at 1 GB/s load, disk MTBF long-term tests, high-throughput CASTOR and networking, new tape technology
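As a rough illustration (not from the original slides), the quoted data-challenge rates translate into the following daily volumes if sustained around the clock; the 24-hour assumption is mine:

    # Back-of-envelope volumes implied by the quoted data-challenge rates,
    # assuming each rate is sustained for a full 24 hours (decimal units).
    SECONDS_PER_DAY = 24 * 3600
    rates_mb_per_s = {
        "ALICE event building": 1800,   # 1.8 GB/s
        "ALICE disk I/O":        350,   # 350 MB/s
        "IT disk-server load":  1000,   # 1 GB/s
    }
    for name, rate in rates_mb_per_s.items():
        tb_per_day = rate * SECONDS_PER_DAY / 1e6   # MB -> TB
        print(f"{name}: {rate} MB/s  ->  ~{tb_per_day:.0f} TB/day")

At these rates the event-building test alone would correspond to roughly 155 TB per day.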

Applications for Physics and Infrastructure – API (1)
Geant4
– Detector simulation for the LHC experiments
– Physics verification progressing well: comparison with test-beam data
– CMS plans to use Geant4 for the 2003 data challenge
– Other users: HARP, BaBar (1 billion events produced), space, medical, ...
– Delta review done (October), with positive feedback
– Collaboration & User Workshops at CERN in October and November 2002
– Major new developments: new models in hadronic physics (cascade etc.), extensions in electromagnetic physics, physics lists for major use cases, cuts by region, importance biasing and scoring for radiation studies

Applications for Physics and Infrastructure – API (2)
LCG Applications Area participation
– after the SC2 moratorium stopping new developments for Anaphe, a consolidated release was made in October
– the Anaphe team members have been assigned to projects in the LCG applications area
– API participates in the LCG applications area projects: software process and infrastructure (SPI), persistency framework (POOL), core libraries and services (CLS), physicist interface (PI), and ROOT
CERNLIB
– "last ever" release done in summer 2002
– support until summer 2003, still following platform/compiler decisions at CERN

Supported Platforms 2003

Platform        O/S       C/C++           FORTRAN
Linux/Intel     RH 7.3    gcc → 3.2       g77 → 3.2
Solaris         8         SC 5.3 → 5.4

Footnote: Geant4 is supported on Windows 2000/XP with VC++ 6.0 SP5, CERNLIB with Visual Fortran 6.1.

Number of Requests for Support (chart: requests per quarter)

Controls – CO
CERN-wide support services continue for:
– National Instruments (LabVIEW), PVSS-II, OPC, CAN bus
– PVSS contract extended to all CERN projects (except ST)
Support for the LHC Joint COntrols Project (JCOP)
– Gas Control System: good progress with engineering & UNICOS collaboration
– JCOP Framework release 1.2 and a well-defined future road map
– Detector Safety System project started for LHC experiment equipment safety
– Progress with controls for racks, sub-racks and data interchange
Controls for the Fixed Target programme
– COMPASS, HARP (now terminated), NA48, NA60

Communication Services – CS
– End of the campus FDDI backbone (x.y)
– General network backbone made fully redundant
– Kept pace with the physics data challenges in the computer centre
– openlab: reception of 10 GE equipment
– Opening of the 622 Mb/s protected production link to Chicago
– Opening of the 2.5 Gb/s link to Chicago
– New record for long-distance file transfer CERN - TRIUMF
– Technical network backbone available in all LHC surface areas
– GSM/radio antenna installed in the first LHC half-octant
– Upgrade of the accelerator network has started
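For scale (an illustrative calculation, not part of the slide), the nominal time to move a 1 TB dataset over the two Chicago links, ignoring protocol overhead and competing traffic:

    # Nominal transfer time for a 1 TB dataset over each transatlantic link,
    # at full line rate (decimal units, no protocol overhead assumed).
    dataset_tb = 1.0
    links_mbit_per_s = {"622 Mb/s link": 622, "2.5 Gb/s link": 2500}
    for name, mbit in links_mbit_per_s.items():
        seconds = dataset_tb * 8e6 / mbit        # 1 TB = 8e6 Mbit
        print(f"{name}: ~{seconds / 3600:.1f} h for {dataset_tb} TB")

That is roughly 3.6 hours on the 622 Mb/s link versus under an hour on the 2.5 Gb/s link.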

Databases – DB
POOL project activities
– SC2 RTAG: leadership and participation
COMPASS / HARP migration
– circa 100 MB/s
Grid Data Management
– EDG milestones → preparation for LCG
Oracle cluster for physics
– 2-node Sun cluster: detector DBs etc.
EDMS replacement (Sun-based)
– joint cluster with the main DB server
DB market survey → new contract amendment
– covers CERN staff & registered users
– deployment & support issues being worked out

Data Services – DS (1)
AFS: the migration of all servers to OpenAFS has been completed, eliminating two major bugs seen by clients ("disappearing" files).
Backup: after extensive analysis and testing, we recommend merging the separate Legato and AFS backup services into a single service based on the IBM Tivoli Storage Manager already in use.

Data Services – DS (2)
Managed Storage: The CASTOR system now handles the bulk of the physics data at CERN, with over 1 Petabyte stored in 7 million files. It smoothly managed 3 months of COMPASS data taking at 45 MB/s and is currently being set up for the ALICE data challenge 4 (DC4), when it should run for a continuous week at over 200 MB/s.
Symmetric splitting of tapes and drives into two separate robotic clusters was completed. Twenty high-performance, high-capacity STK 9940B tape drives were purchased and are being used in the ALICE DC4. We have almost completed the migration of the required Redwood cartridges to STK 9940A or B media ahead of the end of support at the end of 2002.
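The figures quoted above imply the following rough quantities (an illustrative calculation with decimal units and assumed sustained rates, not measured values):

    # Rough quantities implied by the CASTOR figures quoted above.
    total_pb, n_files = 1.0, 7_000_000
    avg_file_mb = total_pb * 1e9 / n_files        # PB -> MB
    print(f"average file size: ~{avg_file_mb:.0f} MB")

    compass_tb = 45 * 90 * 86400 / 1e6            # 45 MB/s for ~90 days
    print(f"COMPASS (3 months at 45 MB/s): ~{compass_tb:.0f} TB")

    alice_dc4_tb = 200 * 7 * 86400 / 1e6          # 200 MB/s for one week
    print(f"ALICE DC4 target (1 week at 200 MB/s): ~{alice_dc4_tb:.0f} TB")

That is an average file size of about 143 MB, roughly 350 TB of COMPASS data over the three months, and about 120 TB for the planned ALICE DC4 week.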

Fabric Infrastructure and Operations – FIO
Maintained and developed services as required
– LXBATCH/LXPLUS, LSF, Remedy
Completed the RISC reduction programme
– including consolidation of multiple independent Linux clusters into a single LXBATCH service
Built the foundation for future Tier0/Tier1 services
– improved system monitoring in collaboration with EDG/WP4, and evaluation of PVSS as a component of the monitoring architecture
– improved system management tools; move to Linux standards and modularity; prototyping of EDG/WP4 tools
– developed hardware management procedures for large clusters
– converted the B513 tape vault to a machine room

Internet Services – IS
Mail and Collaborative Services
– New mail infrastructure tested, validated and deployed
– Calendar, web interface, encrypted sessions and better spam filtering available on the new service
– Service in production in December
Web Services
– Registration interface improved, with access to log files, permission management and quota management
– Solaris cluster migrated to Linux PCs; web site data consolidated on AFS
Windows Services
– Project and divisional data migrated into the DFS tree
– More than 4700 W2K PCs remotely managed and secured
– VPN pilot service deployed
– Windows XP pilot service prepared

Product Support – PS
In agreement with the users, in particular CMS, Solaris 8 is now certified and SUNDEV has been upgraded to this version this week.
The CVS service is adding more clients and now serves 5 services, with several more under test; tests are underway on access models and speeds.
The migration of EUCLID client platforms to Windows is underway, with a completion date in the summer. No problems so far; users appreciate the jump in performance (a factor of about 3). The decision has been taken to migrate the EUCLID server from Alpha to Windows, starting in the second half of the year.
The Market Survey to re-tender the Desktop Support Contract (which includes support of computing for several experiments) has started. There will be particular emphasis on service levels rather than manpower. First discussions are planned with users of the contract.

User Services – US
New IT Web site
Infrastructure for rental PCs
New IT user support model (as of 1st Sept.)
– Higher first-time resolution rate
– User feedback (see chart)

Major Projects with IT participation (1)
EU DataGrid
– Successful EU review in March, major progress (ATLAS & CMS stress test)
– Close collaboration with US Grid projects, the LHC experiments and LCG
– High visibility and acceptance within the EU
EU DataTAG
– New record for long-distance file transfer CERN - TRIUMF
LCG
– Project now well on the rails, with SC2, PEB and GDB
– Major progress in all sub-areas
– Review processes with the LHCC and C-RRB in place
– ~50 new people arrived
– Funding for phase I still under discussion

Major Projects with IT participation (2)
JCOP
– Diverse range of sub-projects progressing well
– External review planned for early 2003
CERN openlab for DataGrid applications
– Enterasys, HP and Intel have joined
– opencluster components are starting to arrive

Other major events
– CERN School of Computing: excellent feedback
– EU proposal for EGEE under preparation
– New medium/long-term plan assures support for LHC computing (to be accepted by Council next week); adds 20 FTEs for long-term LHC support
– Introduction of P+M+I Work Package planning
– Integrated planning for all services and projects, covering personnel, materials and industrial services
– Allows resources to be moved, within limits, between materials and personnel
– Managing this plan in the future is as important as setting it up in the first place
– For 2003: the system management contract will be gradually in-sourced

Conclusions
– 2002 was again a very busy year with a lot of achievements
– From IT's view, relations with the users have improved further, also thanks to FOCUS
– The efforts made by many users have allowed us to streamline the services and provide more for less!
– The financial situation is very tight for the coming years
– CERN's computing activities are highly recognised by the CERN Council, the EU, US funding agencies and industry
– We have enormous challenges in front of us and a very diversified programme. Stay focussed!

Last, but not least …
A big THANK YOU to J. Altaber and M. Delfino