CrossGrid Workshop, Kraków, 5-6 November 2001
Distributed Data Analysis in HEP
Piotr Malecki, Institute of Nuclear Physics, Kawiory 26A, 30-055 Kraków, Poland

See also: Celso Martinez Rivero, Instituto de Fisica de Cantabria, Santander; presentation at the Brussels negotiations, 24 October.

Contents:
- High Energy Physics for the Grid: HEP = events, HEP = people; what events, what people
- Subtasks (of Task 1.3):
  - interactive access to distributed databases (1.3.1)
  - data-mining techniques (1.3.2)
  - integration with experiments, user interfaces (1.3.3)
  - near-real-time applications (1.3.4)
- CrossGrid vs DataGrid, complementary:
  - interactive (CG) vs non-interactive access/analyses
  - file-level (DG) vs object-level (CG)
  - emphasis on integration and user-friendly interfaces
- TIME to start!

What events?
Collisions of particles: a patient continuation of Ernest Rutherford's experiment of 1911. LHC missiles and targets: 7+7 TeV, from protons to Pb nuclei; from 2 up to very many debris particles per collision, recorded in millions of channels of tracking detectors and calorimeters.

REAL EVENTS:
- produced at CERN at a 40 MHz rate
- propagate through the detector systems at the speed of light
- buffered and filtered in a multi-level trigger system (subtask ? - listen to K. Korcyl)
- required reduction rate: 6-7 orders of magnitude
- volume of one event: ~n Mbyte (n is small)
- initial processing of recorded data: ~100 M events/y
- rule of thumb: reconstruction takes only a few percent of the corresponding simulation time

ARTIFICIAL EVENTS:
- produced at any place by Monte Carlo techniques
- propagation through the detector systems takes from nn minutes to 24 hours per event (700 MHz Pentium III ≈ 28 SI95)!
- modern HEP experiments do not understand raw data without the corresponding MC samples
- MC samples should be 5-10 times larger
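To make the quoted rates and volumes concrete, here is a back-of-the-envelope check in Python. Only the 40 MHz and 100 Hz rates come from the slide; the ~1 MB event size and the ~10^7 s of live time per year are assumptions added for illustration.

```python
# Illustrative arithmetic only; event size and yearly live time are assumed.
crossing_rate_hz = 40e6       # LHC bunch-crossing rate (from the slide)
recorded_rate_hz = 100.0      # rate written to storage after the trigger (slide)
event_size_mb = 1.0           # "volume of one event ~ n Mbyte" -> assume ~1 MB
live_seconds_per_year = 1e7   # commonly assumed accelerator live time per year

reduction = crossing_rate_hz / recorded_rate_hz
print(f"crossing-rate reduction: ~{reduction:.0e}")
# ~4e5 at the bunch-crossing level; counting the ~20 overlapping proton-proton
# interactions per crossing brings the interaction-level reduction close to
# the 6-7 orders of magnitude quoted above.

raw_pb_per_year = recorded_rate_hz * event_size_mb * live_seconds_per_year / 1e9
print(f"recorded raw data: ~{raw_pb_per_year:.1f} PB/year per experiment")
```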

What people?
LHC users in Europe: 267 institutes, 4603 users.
LHC users outside Europe: 208 institutes, 1632 users.
CrossGrid partners: (shown on the original slide)

Subtask 1.3.1: Interactive Distributed Data Access
CrossGrid focuses on the preparation of interactive applications for physics analyses, carried out by distributed physics groups working on distributed databases. Hence CG plans to optimise the use of distributed databases and, in particular, to test and verify the use of:
- database management systems, OO or O/R
- ROOT or other HEP-specific solutions
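As one possible illustration of object-level, interactive access to data spread over several servers, the sketch below chains remote ROOT files into a single logical dataset. This is not the subtask's agreed design: the hostnames, file paths, tree name and branch are invented, and a ROOT installation with remote file access is assumed.

```python
# Minimal sketch, assuming PyROOT and remote (rootd/xrootd-style) file access.
# Server names, paths, the "Events" tree and the "pt" branch are hypothetical.
import ROOT

chain = ROOT.TChain("Events")                                # one logical dataset
chain.Add("root://serverA.example.org//data/run0001.root")   # file held at site A
chain.Add("root://serverB.example.org//data/run0002.root")   # file held at site B

print("events visible to the user:", chain.GetEntries())

# Object-level selection: only the entries and branches needed by the query
# are read, instead of copying whole files to the user's machine.
n_pass = chain.Draw("pt", "pt > 20", "goff")                 # "goff" = no graphics
print("events passing the interactive cut:", n_pass)
```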

Subtask 1.3.2: Data-Mining Techniques on the Grid
This task is closely related to the previous one. Should the user job be performed at each database server, or be a task duplicated among many Grid machines? A tool based on the ANN (artificial neural network) technique is reported to be appropriate for data mining and has already been studied by CSIC.
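The "run the job at each database server" alternative can be caricatured in a few lines of Python: the (toy) classifier is shipped to every site holding data, evaluated locally, and only small summaries travel back. The single-neuron "network", its weights and the per-site datasets are all invented for illustration; a real ANN such as the one studied by CSIC would take the place of score().

```python
# Toy sketch of "move the algorithm to the data"; all numbers are made up.
import math

def score(event, weights, bias):
    """Single-neuron stand-in for an ANN: weighted sum of features + sigmoid."""
    s = sum(w * x for w, x in zip(weights, event)) + bias
    return 1.0 / (1.0 + math.exp(-s))

weights, bias = [0.8, -0.5, 1.2], -0.3       # hypothetical trained parameters

# Pretend each site holds its local chunk of the distributed database.
site_data = {
    "siteA": [[1.0, 0.2, 0.5], [0.1, 0.9, 0.0]],
    "siteB": [[0.7, 0.7, 0.7]],
}

# Each site evaluates the classifier on its own data and returns only a count,
# so the bulky event data never leaves the site.
summaries = {site: sum(score(ev, weights, bias) > 0.5 for ev in events)
             for site, events in site_data.items()}
print("selected events per site:", summaries)
print("total selected:", sum(summaries.values()))
```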

Subtask 1.3.3: Integration and Deployment
CG not only concentrates on interactive analyses of distributed data but also declares the development of comfortable working conditions for users: an integrated, common interface to the various experimental frameworks, also e.g. through the use of WP3 portal tools.
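An "integrated common interface to various experimental frameworks" is essentially an adapter layer: the portal talks to one small API, and per-experiment adapters translate it into each framework's own calls. The sketch below only shows that shape; the class and method names are invented and do not correspond to any experiment's actual software.

```python
# Hypothetical adapter layer; class and method names are invented for illustration.
class ExperimentAdapter:
    """What a common CrossGrid-style user interface would require of a framework."""
    def open_dataset(self, name): raise NotImplementedError
    def run_analysis(self, dataset, selection): raise NotImplementedError

class AtlasAdapter(ExperimentAdapter):
    def open_dataset(self, name):
        return f"atlas-dataset:{name}"            # would call ATLAS tools here
    def run_analysis(self, dataset, selection):
        return f"ATLAS job on {dataset} selecting {selection}"

class CmsAdapter(ExperimentAdapter):
    def open_dataset(self, name):
        return f"cms-dataset:{name}"              # would call CMS tools here
    def run_analysis(self, dataset, selection):
        return f"CMS job on {dataset} selecting {selection}"

def portal_submit(adapter, dataset_name, selection):
    """The portal (e.g. built with WP3 tools) only ever sees the common API."""
    dataset = adapter.open_dataset(dataset_name)
    return adapter.run_analysis(dataset, selection)

print(portal_submit(AtlasAdapter(), "higgs-candidates", "pt > 20"))
print(portal_submit(CmsAdapter(), "higgs-candidates", "pt > 20"))
```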

Subtask 1.3.4: Application to the High-Level Trigger
- level 1, special hardware: 40 MHz (40 TB/s) → 75 kHz (75 GB/s)
- level 2, embedded processors: 75 kHz (75 GB/s) → 5 kHz (5 GB/s)
- level 3, PCs: 5 kHz (5 GB/s) → 100 Hz (100 MB/s)
- 100 Hz (100 MB/s) to data recording & offline analysis
A challenging extension of the GRID concept to near-synchronous applications!
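The rejection factor each trigger level must deliver follows directly from the rates listed above; the short calculation below uses only those numbers.

```python
# Per-level rejection factors, computed from the rates quoted on the slide.
levels = [
    ("level 1 (special hardware)",    40e6),   # 40 MHz input
    ("level 2 (embedded processors)", 75e3),   # 75 kHz input
    ("level 3 (PCs)",                  5e3),   # 5 kHz input
    ("data recording",                100.0),  # 100 Hz written to storage
]
for (name, rate_in), (_, rate_out) in zip(levels, levels[1:]):
    print(f"{name}: {rate_in:,.0f} Hz -> {rate_out:,.0f} Hz "
          f"(rejection ~{rate_in / rate_out:,.0f}x)")
```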

Subtask DISSEMINATION: JUST STARTED.