Updates… DAQ TDR status; 2003, 2004… Upcoming Italian deadlines; Tier1 status (Alessia). G. Bagliesi, Catania, 13/9/02.


11/9: 468 pages.

DAQ TDR schedule:
- September 13th: give draft to Annual Reviewers
- Draft V.2, October 1st: to be given to LHCC (during CR)
- Draft V.3, November 4th: distributed before CPT week
- Final review during the CPT week (4-10/11)
- Final version: December 15th; submission to LHCC
Updated version on-line:

2003:
- Validation of OSCAR (Geant4)
- New persistency (POOL?)
- Development of analysis tools
- Test-beam activities; joint ECAL-TRK test-beam
- Integration software development…
- Preparation for the 5% data challenge (see next slides)

Data challenge 5% (CMS IN/xxx)

5% Data challenge (DC04): February-April 2004. The 5% refers to a fraction of the final, 100% full-luminosity computing configuration. As described below, that is about 25% of the complexity required for initial LHC running. The subsequent challenges will take place in 2005 and 2006, and are scaled in turn to be 50% and 100% of LHC turn-on complexity. For simplicity, we refer to these challenges henceforth as DC04, DC05 and DC06.

25 Hz * 3600 seconds * 20 hours * 1 month = ~50 million events

Prerequisites:
1. The Simulation phase must start approximately 7 months before the challenge, i.e. July 2003.
   a. OSCAR/GEANT4 must be fully validated by this time.
   b. 50 kSI95 of CPU power must be available to CMS from this time for the remainder of 2003.
   c. The POOL persistency software must be sufficiently mature.
2. Storage capacity in the range of TB (uncompressed) will be required. The Simulated data need not reside at CERN, but the Digitized data will need to be available at CERN for the data challenge proper.
   a. The Simulated data (~100 TB) will need to be kept, presumably at the main simulation sites or at associated T1 centers. The media costs must be paid by somebody.
   b. The Digitized data must be transferred to CERN and stored, at least for the duration of the data challenge, plus say four to six months.
   c. The transfer rate of digitized data to CERN must be of the order of 1 TB/day for a continuous period of 2 months (Dec 03-Jan 04) from the ensemble of Tier centers. (Significant effort may be required here to ensure transparent access for at least the critical servers.)
3. A model of dataset assignment and flow of data must be designed. With or without Grid tools being used in this phase, the data flows must be optimized.
4. Initial LCG Global Grid Service (LCG-1) availability, at least for partial activities.
5. Integration of the CMS "Production Tools" into the Grid-deployed environment.
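As a sanity check, the event count and the sustained network rate implied by the slide can be verified with a few lines of arithmetic. This is a sketch under stated assumptions: a 25 Hz acquisition rate, a 20 h/day duty cycle and a 30-day month (the rate digit is garbled in the transcript, so 25 Hz is inferred from the quoted ~50 million events), plus the 1 TB/day transfer figure given above.

```python
# Back-of-envelope check of the DC04 numbers quoted above.
# Assumed inputs: 25 Hz rate, 20 h/day, 30-day month (see lead-in).
rate_hz = 25
seconds_per_hour = 3600
hours_per_day = 20
days_per_month = 30

events = rate_hz * seconds_per_hour * hours_per_day * days_per_month
print(f"{events:,} events")  # 54,000,000 events -> "~50 million", as quoted

# Sustained network rate needed to ship 1 TB/day of digitized data to CERN:
tb_per_day = 1.0
megabits_per_tb = 8e6          # 1 TB = 1e12 bytes = 8e12 bits = 8e6 Mb
mbps = tb_per_day * megabits_per_tb / 86400
print(f"~{mbps:.0f} Mb/s sustained")  # ~93 Mb/s
```

So the "1 TB/day for two months" requirement corresponds to roughly a sustained 93 Mb/s aggregate flow from the Tier centers to CERN.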

Physics TDR
The hot period comes later, but work to form the working groups starts right away…
Two volumes are foreseen:
Vol. 1) Detector response, reconstruction of physics objects, calibrations, detector parameterization - maps the current work (and organization!) of the PRS groups
Vol. 2) High-level analyses, like Higgs, Supersymmetry, Exotic physics, etc. - the new, additional work
CMS Week (23-27/9): (re)organization of the CMS physics and working groups in view of the Physics TDR: possibly postponed to December…

Italian software and computing deadlines
- 24/9: TISB organizational meeting, Tuesday evening at CERN (CMS Week)
- 11/10: TISB, Firenze
- 17/10: first LCG PIB meeting (??)
- 8/11: TISB, Firenze
- 11-12/11: Gruppo I session dedicated to LHC computing, Roma
- 18/11 (during that week…): II CMS Italia Workshop on computing and software, Napoli