Offline Software Status
Jan. 30, 2009
David Lawrence, JLab

Repository Activity

Changes to Simulation Geometry
- "geomC" is the new baseline for the CDC
- Complete update of the collimator cave (currently with "review2008" suffix in repository)
  - Magnets + fields
  - Pair spectrometer
  - Collimator
- Tagger microscope moved to behind fixed array
- Active collimator detail
- 40-stave Start Counter rumored to exist

Changes to Reconstruction
- Development on Kalman filter (Simon)
- Development on least-squares global track fitter (Mark)
- (Re)adoption of standard units
- DHelicalFit class merging multiple helical fitters
- Detector element numbering (plan)
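To illustrate the kind of work a class like DHelicalFit consolidates, here is a minimal sketch of an algebraic least-squares circle fit in the bend plane (a Kasa-style fit), the usual first stage of a helical track fit. The `CircleFit`/`Hit` names and interface are hypothetical, not actual GlueX code.

```cpp
// Sketch: least-squares circle fit of 2D hits (Kasa method).
// Linearizes x^2 + y^2 = 2*x0*x + 2*y0*y + (r^2 - x0^2 - y0^2)
// and solves the centered normal equations for the circle center.
#include <cmath>
#include <vector>

struct Hit { double x, y; };

struct CircleFit {
    double x0 = 0, y0 = 0, r = 0; // fitted center and radius

    bool Fit(const std::vector<Hit>& hits) {
        if (hits.size() < 3) return false;
        double Sx=0, Sy=0, Sz=0, Sxx=0, Syy=0, Sxy=0, Sxz=0, Syz=0;
        for (const Hit& h : hits) {
            const double z = h.x*h.x + h.y*h.y;
            Sx += h.x;  Sy += h.y;  Sz += z;
            Sxx += h.x*h.x;  Syy += h.y*h.y;  Sxy += h.x*h.y;
            Sxz += h.x*z;    Syz += h.y*z;
        }
        const double n = static_cast<double>(hits.size());
        // Centered second moments (eliminates the constant term)
        const double Mxx = Sxx - Sx*Sx/n, Myy = Syy - Sy*Sy/n;
        const double Mxy = Sxy - Sx*Sy/n;
        const double Mxz = Sxz - Sx*Sz/n, Myz = Syz - Sy*Sz/n;
        const double det = Mxx*Myy - Mxy*Mxy;
        if (std::fabs(det) < 1e-12) return false; // hits collinear
        x0 = 0.5*(Mxz*Myy - Myz*Mxy)/det;
        y0 = 0.5*(Myz*Mxx - Mxz*Mxy)/det;
        const double c = (Sz - 2.0*(x0*Sx + y0*Sy))/n; // = r^2 - x0^2 - y0^2
        r = std::sqrt(c + x0*x0 + y0*y0);
        return true;
    }
};
```

A full helix fit would add the dip-angle fit in the s-z view; an iterative Kalman filter would then refine this seed while accounting for material and field non-uniformity.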

JLab Computing Resources
Batch farm:
- Until recently, Hall-D had no allocation on the JLab computer farm. We now have a guaranteed 4.4%.
Wireless networking:
- Major policy changes on the JLab wireless network:
  - Non-managed laptops will be outside of the JLab firewall
  - All ports (protocols) disabled by default, with only a few open
  - Computers registered on the network through a web page (may or may not require a WEP key)
  - Computer-to-computer communications will be blocked
Disk space:
- Currently, the Hall-D work disk is 1.5 TB (only 51% used)
- Tight budgets resulted in smaller-than-usual increases this year, so only ~14 TB of usable space could be added over the whole Physics Division (work disk + cache disk)
- We requested 1 TB to 1.5 TB be allocated for Hall-D work disk space

ACAT Conference
- SIMD/vectorization
  - Single Instruction, Multiple Data
  - AltiVec, MMX, SSE, 3DNow!, ...
- llvm-g++ compiler (non-commercial Apple project)
  - Drop-in replacement for g++
  - Link-time optimization (-O4) has potential speed increases of 20%
- Parameterization of magnetic field map
  - Current GlueX 2D map is 2.5 MB
  - Expect 3D map to be at least 100x larger
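As a concrete illustration of the SIMD idea above (one instruction operating on multiple data lanes), the generic sketch below adds four floats at once using SSE intrinsics. It is not code from the GlueX repository, just the textbook pattern the ACAT talks referred to.

```cpp
// Sketch: one SSE instruction adds four float lanes simultaneously.
#include <xmmintrin.h> // SSE intrinsics

inline void simd_add4(const float* a, const float* b, float* out) {
    __m128 va = _mm_loadu_ps(a);            // load 4 floats into a 128-bit register
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb)); // single vector add, 4 results
}
```

Compilers (including llvm-g++) can often auto-vectorize simple loops into exactly this form; the intrinsic version just makes the data-parallel structure explicit.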

Data Rates in the 12 GeV Era

Experiment      | Front-End DAQ Rate | Event Size | L1 Trigger Rate | Bandwidth to Mass Storage
GlueX (JLab)    | 3 GB/s             | 15 kB      | 200 kHz         | 300 MB/s
CLAS12 (JLab)   | 100 MB/s           | 20 kB      | 10 kHz          | 100 MB/s
ALICE (LHC)     | 500 GB/s           | 2.5 MB     | 200 kHz         | 200 MB/s
ATLAS (LHC)     | 113 GB/s           | 1.5 MB     | 75 kHz          | 300 MB/s
CMS (LHC)       | 200 GB/s           | 1 MB       | 100 kHz         | 100 MB/s
LHCb (LHC)      | 40 GB/s            | 40 kB      | 1 MHz           | 100 MB/s
STAR (BNL)      | 8 GB/s             | 80 MB      | 100 Hz          | 30 MB/s
PHENIX (BNL)    | 900 MB/s           | ~60 kB     | ~15 kHz         | 450 MB/s

Sources: CHEP2007 talk; Sylvain Chapelin, private comm.; NIM A499 (Mar.); CHEP2006 talk, Martin L. Purschke.

Available Projects

The following projects are not currently being worked on. All of these could be developed primarily offsite. All projects are expected to require at least 6 months of work, some of which can be done by a student or post-doc.

- Magnetic field simulation studies
- Full-featured event viewer
- Calibrations/conditions database
- MC acceptance studies for specific physics channels
- Monte Carlo Meister
- Online monitoring system
- Detector alignment