Calibrations
Emily Maher, Cheryl Patrick, Aaron McGowan, Chris Marshall, Laza Rakotondravohitra, Heather Ray
April 8, 2014

Calibration Overview
(Calibration data-flow diagram; labels only:) RawDigits (Spill Data, Ped Data, LI Data); Pedestals, PMT Gains, FEB Consts; SupDigits (Mapper Data, Tracked Data); Rock μ's; S2S, PLEX, Timing, Energy Scale, Attenuation (iterate); MC; Calibrated Data; Reconstructed Data; Analyzed Data; Veto Wall.

Calibrations Overview
– Determine calibration constants
– Load these calibration constants into the database
– Run separate scripts that execute Gaudi jobs to calculate constants for the pedestal, gain, strip-to-strip, timing, MEU and veto wall calibrations
– Also run a separate rock muon reconstruction job that is required for the strip-to-strip, timing, MEU and veto calibrations (see the ordering sketch after this list)
– Each calibration is assigned to a different person
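As an illustration of the ordering constraint above (rock muon reconstruction before the strip-to-strip, timing, MEU and veto calibrations), here is a minimal Python sketch of a driver loop. The step names and the run_step helper are hypothetical; the real work is done by the scripts listed on the following slides.

# Minimal sketch (not the real driver): order the calibration steps so that
# rock muon reconstruction runs before the calibrations that need it.
CALIBRATIONS = ["pedestals", "gains", "rock_muon_reco",
                "strip_to_strip", "timing", "meu", "veto"]

# Calibrations that cannot start until rock muon reconstruction has finished.
NEEDS_ROCK_MUONS = {"strip_to_strip", "timing", "meu", "veto"}

def run_step(name):
    # Hypothetical placeholder: in practice each step is a separate script
    # that submits a Gaudi job (see the scripts table below).
    print("running", name)

def run_chain():
    done = set()
    for step in CALIBRATIONS:
        if step in NEEDS_ROCK_MUONS and "rock_muon_reco" not in done:
            raise RuntimeError(step + " requires rock muon reconstruction first")
        run_step(step)
        done.add(step)

if __name__ == "__main__":
    run_chain()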

Calibration Inputs

Calibration      Input                                 Input location
Pedestals        RawDigits                             /minerva/data/rawdigits
Gains            RawDigits                             /minerva/data/rawdigits
Strip to strip   SupDigits, rock muon reconstruction   /minerva/data/supdigits
Timing           SupDigits, rock muon reconstruction   /minerva/data/supdigits
MEU              SupDigits, rock muon reconstruction   /minerva/data/supdigits
Veto             SupDigits, rock muon reconstruction   /minerva/data/supdigits

Calibrations Scripts

Calibration                Script                          Script location
Pedestals                                                  Cal/CalibrationTools/scripts/pedestal_calibration
Gains                      Submit_LI_runs.pl               Tools/ProductionScripts/data_scripts
Rock muon reconstruction   submitGridJobs.py               Cal/RockMuonCalibration/script/processing;
                                                           Tools/SystemTests/options/CalibProcessing/RockMuonReconstruction.opts
Strip to strip             make_s2s.sh                     Cal/StripCal/scripts
Timing                     runTimeCal.py                   Cal/CalibrationTools/scripts/timing_calibration
MEU                        Make_meu_mc_hist_files.py,      Cal/CalibrationTools/scripts/meu_calibration
                           Make_meu_rock_muon_files.py
Veto                       Draw_overlays_dbOutput.C        Cal/CalibrationTools/scripts/veto_timing_calibration
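For use in a driver like the sketch above, the table could be captured as a simple mapping. The script names and paths are copied from the table; the dictionary itself and its keys are an illustrative assumption, and how each script is actually invoked is not shown on the slide.

# Calibration -> (script, location in the release), taken from the table above.
# The pedestal row lists only a directory, so its script entry is left as None.
CALIBRATION_SCRIPTS = {
    "pedestals":      (None, "Cal/CalibrationTools/scripts/pedestal_calibration"),
    "gains":          ("Submit_LI_runs.pl", "Tools/ProductionScripts/data_scripts"),
    "rock_muon_reco": ("submitGridJobs.py", "Cal/RockMuonCalibration/script/processing"),
    "strip_to_strip": ("make_s2s.sh", "Cal/StripCal/scripts"),
    "timing":         ("runTimeCal.py", "Cal/CalibrationTools/scripts/timing_calibration"),
    "meu":            (("Make_meu_mc_hist_files.py", "Make_meu_rock_muon_files.py"),
                       "Cal/CalibrationTools/scripts/meu_calibration"),
    "veto":           ("Draw_overlays_dbOutput.C", "Cal/CalibrationTools/scripts/veto_timing_calibration"),
}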

Calibrations Output

Calibration                Output                                  Output location
Pedestals                  Text files of pedestal tables           /minerva/data/calibration/ped_tables
Gains                      Text files of gains tables              /minerva/data/calibration/gain_tables
Rock muon reconstruction   ROOT ntuples
Strip to strip             Text file of strip-to-strip constants   /minerva/data/calibration/s2s_tables
Timing                     Text files of timing constants          /minerva/data/calibration/timecal_tables
MEU                        Text files of MEU constants             /minerva/data/calibration/meu
Veto                       Text files of veto timing constants     /minerva/data/calibration/veto_timing_tables

All files are written to a temporary location and then moved to the official staging area /minerva/data/calibration (a minimal sketch of this staging step follows).
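A minimal Python sketch of the write-to-temp-then-move convention noted above. The temporary directory name, the CSV layout and the ped_tables destination are assumptions; only the official staging area path comes from the slide.

import shutil
from pathlib import Path

TEMP_AREA = Path("/minerva/data/calibration/tmp")   # assumed temporary location
OFFICIAL_AREA = Path("/minerva/data/calibration")   # official staging area (from slide)

def stage_table(table_name, rows, subdir="ped_tables"):
    """Write a calibration table to the temp area, then move it into place."""
    dest_dir = OFFICIAL_AREA / subdir
    TEMP_AREA.mkdir(parents=True, exist_ok=True)
    dest_dir.mkdir(parents=True, exist_ok=True)
    tmp_file = TEMP_AREA / table_name
    with open(tmp_file, "w") as f:
        for row in rows:
            f.write(",".join(str(x) for x in row) + "\n")
    # Move only after the file is fully written, so the official area never
    # contains a partially written table.
    shutil.move(str(tmp_file), str(dest_dir / table_name))

# Example: stage_table("pedestals_run12345.csv", [(1, 402.5), (2, 398.7)])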

Database Uploads
Once the constants are staged, each calibrator uploads their files to the database website at: i/app/GUI/fileBrowser
Then one person is in charge of loading all of these tables into the database using the same web site.

One process in detail: Gains
– submit_LI_runs.pl is run manually as the shared account on minervagpvm0x.
– It launches process_max_pe_run.pl, one instance per run.
– Each instance submits a grid job for its run (the job runs as minervacal), executing OfflinePedSup and MaxPEGainAlg (see the sketch below).
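A minimal sketch of that per-run fan-out, assuming process_max_pe_run.pl takes the run number as its only argument (the real argument list is not shown on the slide):

import subprocess

# Illustrative run numbers; in practice these come from the LI run list.
RUNS = [12340, 12341, 12342]

def submit_all(runs):
    for run in runs:
        # One process_max_pe_run.pl instance per run; each instance in turn
        # submits a grid job that executes OfflinePedSup and MaxPEGainAlg.
        subprocess.check_call(["perl", "process_max_pe_run.pl", str(run)])

if __name__ == "__main__":
    submit_all(RUNS)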

Typical grid job stats (condor logs)
– Image size of job: 450 kB (from condor logs; a log-scanning sketch follows below)
– Run Remote Usage: Usr around 06:00:00 to 08:00:00, Sys around 00:10:00 to 00:15:00
– Proxy provided by cpatrick, as for keep-up
– The job is listed in the condor queue as running as minervacal – so why are the files created by minervadat?
– Command: jobsub --opportunistic -g -r v10r6p12 -i -t -L -f -dMAXPE
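Those numbers are read off the HTCondor job logs; a rough Python sketch of extracting them is below. The exact log-line formats assumed here may differ between HTCondor versions, so treat the regular expressions as assumptions.

import re

# Assumed log-line formats: "Image size of job updated: <kB>" and
# "Usr <d> <hh:mm:ss>, Sys <d> <hh:mm:ss>  -  Run Remote Usage".
IMAGE_RE = re.compile(r"Image size of job updated:\s*(\d+)")
USAGE_RE = re.compile(r"Usr \d+ ([\d:]+), Sys \d+ ([\d:]+).*Run Remote Usage")

def summarize(log_path):
    image_kb, usr, sys_ = None, None, None
    with open(log_path) as f:
        for line in f:
            m = IMAGE_RE.search(line)
            if m:
                image_kb = int(m.group(1))
            m = USAGE_RE.search(line)
            if m:
                usr, sys_ = m.group(1), m.group(2)
    return image_kb, usr, sys_

# Example: print(summarize("gains_job_12340.log"))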

Gains algorithm inputs

Gain algorithm outputs
All outputs to BlueArc: top directory /minerva/data/users/$ENV{USER}/data_processing
– Output directory can be overridden when calling the script
– Subdirectory structure is hardcoded in the script
Files include:
– Gain table (CSV file) (~1.5 MB)
– Histograms (ROOT file) (~50 kB)
– Problem channel list (CSV file) (~40 kB)
– Tuned HV data file (CSV file) (~40 kB)
– Log file (~100 kB)
– Customized options file (~5 kB)
Files are NOT written to SAM – instead the gain table is written (manually, after a manual quality check) to the conditions database (a small output check is sketched below).
Files named in red are owned by minervadat, those in blue by minervacal.
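As a small aid to the manual quality check mentioned above, the sketch below confirms that the expected gains outputs exist and are not suspiciously small before the gain table is uploaded. The glob patterns and size thresholds are assumptions; only the approximate file sizes come from the slide.

import glob
import os

# Expected outputs: (assumed glob pattern, minimum plausible size in bytes).
EXPECTED = {
    "gain table (CSV)":     ("*gain*table*.csv", 500_000),      # ~1.5 MB expected
    "histograms (ROOT)":    ("*.root", 10_000),                 # ~50 kB expected
    "problem channel list": ("*problem*channel*.csv", 5_000),   # ~40 kB expected
    "log file":             ("*.log", 10_000),                  # ~100 kB expected
}

def check_outputs(out_dir):
    ok = True
    for label, (pattern, min_bytes) in EXPECTED.items():
        matches = glob.glob(os.path.join(out_dir, pattern))
        if not matches:
            print("MISSING:", label)
            ok = False
        elif max(os.path.getsize(m) for m in matches) < min_bytes:
            print("TOO SMALL:", label)
            ok = False
    return ok

# Example: check_outputs("/minerva/data/users/<user>/data_processing")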