Presentation transcript:

1 Timescales: Construction, LCR, MICE beam monitoring counters + DAQ (my understanding)
[Timeline chart, Jan 07 to Sep 07: ISIS shutdown; ISIS beam tuning; MICE beam; external work; MICE detectors TOF0, TOF1, Calorimeter, Cerenkov, SciFi1; start of MICE installation; hall available for detector installation; MICE work.]

2 Phases
[Timeline chart, Jan 07 to Sep 07, showing the phases: LCR installation; hall installation; tuning; data taking; MICE work.]

3 Assumptions
- A month is 20 working days.
- Installation work is eight hours per day (a quick arithmetic sketch of what this implies follows this slide).
- The numbers are guesses: I would like feedback on the numbers you think are correct as well as those you think are wrong (either way).
- Some of the items included are not strictly computing or DAQ, but they have an impact on them.
- Conflicts are direct, meaning the two jobs cannot be done simultaneously whatever manpower is available.
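As a quick illustration of what these assumptions imply, the minimal sketch below converts the working-day figures into months and installation hours; the 120-day figure is the "time available" quoted on the LCR timings slides later in this talk.

```python
# Working-time arithmetic implied by the assumptions above.
WORKING_DAYS_PER_MONTH = 20   # "A month is 20 working days"
HOURS_PER_DAY = 8             # "Installation work is eight hours per day"
TIME_AVAILABLE_DAYS = 120     # "Time available" from the LCR timings slides

months_available = TIME_AVAILABLE_DAYS / WORKING_DAYS_PER_MONTH
installation_hours = TIME_AVAILABLE_DAYS * HOURS_PER_DAY

print(f"{TIME_AVAILABLE_DAYS} working days is about {months_available:.0f} months")
print(f"Installation hours available: {installation_hours}")
```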

4 LCR Installation
Tasks
- Install cabinets and racks
- Install networking and test transfers (a throughput-test sketch follows this slide)
- Purchase and install the DAQ computer
- Purchase and install local storage
- Commission the DAQ computer and storage
- Install sub-detector electronics
- Install and commission slow controls
- Transfer simulated data to tape at Atlas
- Analyse simulated data at Tier-1 and Tier-2
- Fast analysis of beam-tuning data at Tier-1
Inputs
- LCR complete, including power and A/C
- LCR layout (rack space); internal cable runs; cable runs from the hall
- Installation timings
- MICE DAQ system V1.0
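For the "install networking and test transfers" task, a transfer test boils down to timing a copy of a known-size file and comparing the achieved rate with what the later transfer tasks assume. The sketch below is a minimal illustration only; the scp command, host name and paths are placeholders, not the actual MICE transfer tooling.

```python
import subprocess
import time

def timed_transfer(local_path: str, remote_dest: str, size_bytes: int) -> float:
    """Copy one test file and return the achieved throughput in MB/s."""
    start = time.time()
    subprocess.run(["scp", local_path, remote_dest], check=True)
    elapsed = time.time() - start
    return size_bytes / elapsed / 1e6

# Hypothetical usage (placeholder host and path):
# rate = timed_transfer("testfile.dat", "user@tier1.example.ac.uk:/data/", 10**9)
# print(f"Achieved throughput: {rate:.1f} MB/s")
```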

5 LCR Installation
Conflicts
- Cables from sub-detectors require the detectors to have been installed
Outputs
- LCR set up: all power, racks and A/C
- DAQ computer and local storage
- Network connection to Tier-1 and Tier-2
- Data tapes written at Atlas
- Analysis and fast analysis of simulated data
- Working DAQ
- Beam counters in position, surveyed and read out
- MICE DAQ V2.0

6 Hall Installation
Tasks
- Install sub-detectors (including cable runs to the LCR)
- Install beam detectors
Conflicts
- Work on the hall infrastructure
Outputs
- Command and control communications with the LCR
- Working detectors and beam monitors
Inputs

7 Tuning
Tasks
- Set up counters with beam
- Read out counters
- Tune the beam at the initial setting (using the fast grid), and at other settings as possible
- Integrate detectors into the DAQ: TOF, Cerenkov, Calorimeter, scintillating-fibre tracker
- Detector pedestals (and calibration); a minimal pedestal sketch follows this slide
Conflicts
- Sub-detector integration vs beam tuning
- Detector installation in the hall vs beam tuning
Outputs
- A running experiment
Inputs
- LCR set up: all power, racks and A/C
- DAQ computer and local storage
- Network connection to Tier-1 and Tier-2
- Data tapes written at Atlas
- Working DAQ
- Beam counters in position and surveyed
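The "detector pedestals (and calibration)" task amounts to recording beam-off (or random-trigger) events and taking the mean and RMS of each channel's raw ADC values. The sketch below illustrates the idea only; the event layout is hypothetical and not the real MICE/DATE data format.

```python
from statistics import mean, pstdev

def pedestals(events):
    """events: one dict per event, mapping channel id -> raw ADC count.
    Returns channel id -> (pedestal mean, pedestal RMS)."""
    per_channel = {}
    for event in events:
        for channel, adc in event.items():
            per_channel.setdefault(channel, []).append(adc)
    return {ch: (mean(vals), pstdev(vals)) for ch, vals in per_channel.items()}

# Hypothetical beam-off events:
# peds = pedestals([{0: 101, 1: 98}, {0: 99, 1: 102}, {0: 100, 1: 100}])
# -> {0: (100.0, 0.82), 1: (100.0, 1.63)}  approximately
```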

8 Data taking
Tasks
- Take data
Conflicts
- ?
Outputs
- Cooling measurement
Inputs
- All detectors read out
- Real data written to tape at Atlas
- Data analysed at Tier-1 and Tier-2
- Fast turn-around analysis exercised

9 LCR timings
Time available: 120 days
Infrastructure
- Fast Ethernet connection to Atlas: negotiation started; some funds will be needed
- Install cabinets and racks: 5 days
- Test transfers: 5 days
- Write to tape at Atlas: 10 days
- Test transfers from RAL to the Tier-2s (London, South, North, Scot; Fermilab, Sofia): 5 days per Tier-2 (a transfer-time estimator is sketched after this slide)
Status
- Plans for analysis at the Tier-2s communicated to GridPP
- Archiving of data fits into the other-experiments envelope
- Requirements for fast turn-around analysis at Tier-1 communicated to GridPP
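The tape-writing and Tier-2 transfer durations above can be cross-checked with a simple estimate of how long a given data volume takes at a given sustained rate, using the eight-hour working day from the assumptions slide. The sketch below is illustrative only; the volume and rate are inputs, not numbers taken from this talk.

```python
def transfer_days(volume_gb, rate_mb_per_s, hours_per_day=8.0):
    """Working days needed to move volume_gb at a sustained rate_mb_per_s."""
    seconds = volume_gb * 1000.0 / rate_mb_per_s   # GB -> MB, then divide by MB/s
    return seconds / 3600.0 / hours_per_day

# Illustrative numbers only: 2 TB at a sustained 10 MB/s
# print(f"{transfer_days(2000, 10):.1f} working days")   # ~6.9 days
```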

10 LCR timings
Time available: 120 days
Server and disks
- Procurement (data storage £33,132, server £11,862; Clustervision quotes, July '06) and delivery (~ x weeks): 15 days + 20 days
- Server installation, storage installation and system burn-in: 5 days
- MICE software installation: 1 day
Analysis
- Analyse simulated data at the Tier-2s (London, South, North, Scot; Fermilab, Sofia): 2.5 days per Tier-2
- Fast analysis of simulated tuning data at Tier-1 (dummy): 5 days

11 LCR timings
Time available: 120 days
- Install and read out sub-detector electronics (scintillating fibre, TOF, Calorimeter, Cerenkov): ? days
- Install and commission slow controls: ? days
- Install beam-counter read-out electronics: ? days
- Total: ? days
Notes
- The beam counters need to be done 4 weeks before the rest of the detectors (a constraint-check sketch follows this slide)
- Slow controls will need an earlier end date
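Once real end dates replace the question marks above, the two ordering notes can be checked mechanically: beam counters at least four weeks before the rest of the sub-detector electronics, and slow controls earlier than the general electronics end date. The sketch below is a minimal illustration with placeholder dates, not a real schedule.

```python
from datetime import date, timedelta

def check_schedule(beam_counters_end, slow_controls_end, sub_detector_end):
    """Return a list of violated constraints (an empty list means the plan is consistent)."""
    problems = []
    if beam_counters_end > sub_detector_end - timedelta(weeks=4):
        problems.append("beam counters finish less than 4 weeks before the other detectors")
    if slow_controls_end >= sub_detector_end:
        problems.append("slow controls do not have an earlier end date")
    return problems

# Placeholder dates, not a real schedule:
# print(check_schedule(date(2007, 5, 1), date(2007, 6, 15), date(2007, 7, 1)))  # -> []
```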

12 Hall timings
Time available: ?
Questions
- Is there any estimate of when things will become available?
- Is there any way of influencing the order of work, and do we need to?

13 Tuning timings
Time available: 20 days
Tasks
- Set up counters with beam; read out counters
- Tune the beam at the initial setting (using the fast grid)
- Integrate detectors into the DAQ (TOF, Cerenkov, Calorimeter, scintillating-fibre tracker); detector pedestals (and calibration)
Timings: 5 days + 10 days, plus 5 days contingency

14 Conclusion
- My guess is that there is time to do what we need; the problem is likely to be that many detectors arrive towards the end of the period.
- My worry is that I have forgotten something.
- I believe the hardware/infrastructure for the DAQ is on track (see above).
- I do not understand the software/controls infrastructure well enough to be confident that everything will come together.
[Slide graphic: "10 Tons", "I'm sure I've forgotten something".]