MICE Computing and Software


MICE Computing and Software Chris Rogers, ASTeC, Rutherford Appleton Laboratory

Computing and Software Aims
The MICE software and computing project aims to:
- Read out the detectors
- Convert electronics signals to physics parameters
- Provide a Monte Carlo model
- Provide online physics outputs: online monitoring, online reconstruction, online event display
- Provide controls interfaces to, and monitoring of, hardware
- Provide support services, e.g. web services and data curation
- Provide online feedback with physics data, e.g. phase-space distributions at each detector in real time
- Provide reconstructed data for analysis within 24 hours of data taking

Process Diagram

S/w & Computing Organisation
- Online: responsible for MICE Local Control Room (MLCR) systems, including the DAQ
- Infrastructure: responsible for the computing "glue"
- Offline: responsible for developing physics tools
- Controls and Monitoring: responsible for slow control of hardware

Staffing
- Paul Smith has now taken over from Ian Taylor in the online role
  - Making great progress at getting to grips with the MLCR systems
  - A lot to learn; a lot is hearsay and folklore
- We are lacking sysadmin staff
  - Missing someone local to RAL to look after PPD web services
  - Missing someone local to RAL to look after the control room
  - Seek to fill the hole; estimated as a 50% task
- We now have a project plan in place for all but the online group
  - Online is in pretty good shape for development tasks
  - Need to make a second pass to check that prioritisations are appropriate
  - Will hold an informal external review in the coming months

Infrastructure
Chris Rogers, ASTeC, Rutherford Appleton Laboratory

Infrastructure WBS

Webservice Downtimes
- Micewww downtime due to a failed upgrade of a cluster
  - ~6 hours, during UK office hours
- CDB downtimes due to a memory leak in the webservice software
  - Need to find and kill the leak
  - Until then, we will implement an automated restart of the software every night (not a big deal, but needs to be done)
  - ~120 hours, unacceptably high
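The interim restart workaround could be driven by a small watchdog run nightly from cron. A minimal sketch, assuming hypothetical details throughout: the service name, the memory limit, and the use of a `/proc/<pid>/status` check are illustrations, not the actual CDB deployment:

```python
import subprocess

def rss_kb(status_text):
    """Extract the resident set size (VmRSS, in kB) from the
    contents of a /proc/<pid>/status file."""
    for line in status_text.splitlines():
        if line.startswith("VmRSS:"):
            return int(line.split()[1])
    return 0

def should_restart(status_text, limit_kb=500_000):
    """True once the leaking process has grown past the limit."""
    return rss_kb(status_text) > limit_kb

def restart_service(name="cdb-webservice"):
    # Hypothetical init-script name: restart unconditionally each
    # night, or whenever should_restart() fires, until the
    # underlying leak is found and fixed.
    subprocess.call(["service", name, "restart"])
```

Checking the actual memory usage, rather than restarting blindly, keeps the number of restarts (and the associated downtime windows) to a minimum.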

GRID
- Resolved issues with MAUS failing to install on GRID nodes
  - Caused by problems with the distributed file system
- Have now tried twice to run MAUS reconstruction reprocessing jobs
  - Both attempts failed: memory leaks! Up to the MAUS developers to fix this; it is becoming a "thing"
- Data archiving
  - Missing the second lot of spectrometer solenoid mapping data
- Data movement
  - Slowly progressing towards automation
  - Slowly progressing to SL6.4
- MC
  - Need a physics group representative to provide support for this role
  - Priority lowered until we can find this person
- Overall, concerned by the slow progress of the GRID work
  - Needs prioritisation over generic control room work

CDB
- Working towards a C API for cooling channel parameters
  - Stores e.g. magnet currents
  - Provides a more convenient interface for controls software
  - Works around a multithreading issue in the Python API
- Providing a table for defining reconstruction job parameters
  - e.g. random seed, Monte Carlo data set definitions
  - Would be great to see either of these in deployment
- Plans
  - Next big job is to define a table that stores metadata: is the data analyzable? A list of boolean flags, e.g. was ISIS okay? Did the data move to the GRID?
  - Also seek further feedback on the CDB search tools: http://cdb.mice.rl.ac.uk/cdbviewer/
- Need the controls software to work, and be used correctly, to fill the CDB!
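The proposed metadata table amounts to a set of boolean quality flags per run, with the run analyzable only when every flag passes. A sketch of that idea; the flag names below are hypothetical placeholders, since the real schema is still to be defined:

```python
# Hypothetical per-run metadata record. The real CDB table schema
# has not yet been defined; these flag names are illustrative only.
RUN_FLAGS = {
    "isis_ok": True,        # Was ISIS running okay during the run?
    "moved_to_grid": True,  # Did the raw data reach the GRID?
    "daq_ok": True,         # Did the DAQ complete without errors?
}

def is_analyzable(flags):
    """A run is analyzable only when all of its quality flags pass."""
    return all(flags.values())
```

Keeping the flags as independent booleans means new quality criteria can be added later without touching existing records or the analysis-selection logic.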

CDB Viewer

Web Services (IIT)
- Plan a new web page "look and feel"
  - Tabs for working groups etc. across the top
  - Limited number of direct links on the front page
  - A nice rotating image

SLA
- Last time, we proposed a close date of mid-February for this work
- So far, that schedule looks to be holding...

Schedule
- At the moment, we are holding schedule
- Expect all currently planned work to be finished by February, as forecast at CM38
- Don't have a detailed breakdown here