Ganga in the ATLAS Full Dress Rehearsal
Karl Harrison, University of Birmingham
Birmingham, 4th June 2008
Ganga in the ATLAS Full Dress Rehearsal - Birmingham, 4th June 2008 - Karl Harrison, University of Birmingham

- The ATLAS Full Dress Rehearsal (FDR) involves loading simulated event samples in bytestream format onto SubFarm disks at the detector pit, and treating the data in exactly the same way as experimental data.

"By doing this test, we're trying to practice everything - from data coming off the experiment all the way to shipping the data around and analysing it - under conditions just as they will be when real data-taking begins."
"[These tests are] essential, almost as much as ensuring the detector itself actually works."
Dave Charlton, FDR coordinator, in ATLAS e-News (15 January 2008)

4th June 2008 (2/4) FDR-1 and FDR-2

The Full Dress Rehearsal is performed in two stages; for full details see the FDR Twiki page.

FDR-1:
- Eight runs at low luminosity (10^31 cm^-2 s^-1) played through several times during the week of 4th February 2008
- Reprocessing and analysis in subsequent months

"Was it a success? The question is a bit like asking whether it is successful when you bring your luxury automobile to the auto mechanic and are told that, although the engine basically works, it needs a large number of parts and adjustments before it will be road worthy."
Michael Wilson, in ATLAS e-News (10 March 2008)

FDR-2:
- Data to be played through during the week of 2nd June 2008
- Analysis to follow

4th June 2008 (3/4) UK analysis experience in FDR-1

- UK analysis experience in FDR-1 is summarised in a short report
- Use was made of AMI (ATLAS Metadata Interface), DQ2 (the data-management system) and Ganga
  - In general these worked reasonably well
- The main problems were seen when using Ganga:
  - FDR-1 datasets were small, but more than one could not be analysed in a single Ganga job
  - There was a high probability of job failure if no destination site was specified
  - Causes of job failure were difficult to understand
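The two Ganga limitations above suggest a natural workaround pattern: expand a multi-dataset analysis request into one job description per dataset, and always pin each job to an explicit destination site. The sketch below illustrates that pattern in plain Python; the dictionary layout, dataset names and site name are hypothetical stand-ins, not the real Ganga API (which uses `Job`, `Athena` and `DQ2Dataset` objects).

```python
# Workaround sketch for two FDR-1 issues seen with Ganga:
# (1) only one dataset could be analysed per job, so a multi-dataset
#     request is expanded into one job description per dataset;
# (2) jobs failed more often with no destination site, so every
#     description carries an explicit site.
# All names below are illustrative, not the real Ganga API.

def expand_to_jobs(datasets, destination_site):
    """Return one job description per dataset, each pinned to a site."""
    jobs = []
    for ds in datasets:
        jobs.append({
            "application": "Athena",          # ATLAS analysis framework
            "inputdata": ds,                  # exactly one dataset per job
            "backend": "LCG",
            "destination": destination_site,  # always set explicitly
        })
    return jobs

fdr1_datasets = [
    "fdr08_run1.0003050.StreamEgamma.merge.AOD",  # hypothetical names
    "fdr08_run1.0003051.StreamMuon.merge.AOD",
]
jobs = expand_to_jobs(fdr1_datasets, "UKI-SOUTHGRID-BHAM-HEP")
```

Submitting many small jobs this way trades bookkeeping overhead for the ability to analyse several datasets in one pass, which is roughly what users had to do by hand during FDR-1.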

4th June 2008 (4/4) Analysis goals for FDR-2

- FDR-2 aims to allow more stringent testing of the ATLAS analysis model
- Distribution with more realistic data placement at Tier-2 sites
  - UK Tier-2 sites will each have 50% of the data for a single trigger stream (Muon, Egamma, MinBias, Jet)
  - For FDR-1 each site had 100% of the data
- Test of the ability to find data with this more realistic placement
- Analysis involving Derived Physics Data (DPD)
  - There was no central DPD production for FDR-1
- Tag-based analysis
- Full trigger tests
- Access to luminosity information

Several of these place requirements on Ganga.
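Under the FDR-2 placement described above, finding data means first working out which sites hold the trigger stream a dataset belongs to. The sketch below shows that look-up in plain Python; the stream-to-site assignments are invented for illustration and are not the real FDR-2 layout (in practice this information would come from DQ2/AMI catalogue queries).

```python
# Illustrative look-up of where FDR-2 data for a given trigger stream
# might be found, following the placement model on the slide: each UK
# Tier-2 site holds 50% of the data for exactly one trigger stream.
# The site assignments here are hypothetical.

STREAM_SITES = {
    "Egamma":  ["UKI-SOUTHGRID-BHAM-HEP", "UKI-SOUTHGRID-OX-HEP"],
    "Muon":    ["UKI-NORTHGRID-MAN-HEP",  "UKI-NORTHGRID-LANCS-HEP"],
    "MinBias": ["UKI-SCOTGRID-GLASGOW",   "UKI-SCOTGRID-ECDF"],
    "Jet":     ["UKI-LT2-QMUL",           "UKI-LT2-RHUL"],
}

def sites_for_dataset(dataset_name):
    """Guess candidate sites from the trigger stream in a dataset name."""
    for stream, sites in STREAM_SITES.items():
        if f"Stream{stream}" in dataset_name:
            return sites
    return []  # unknown stream: fall back to a full catalogue query
```

Because each site holds only half a stream, a job still has to be steered to a site that actually hosts the required files, which is one of the requirements this placement model puts on Ganga.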