Benchmarking ATLAS applications — Franco Brasolin (INFN Bologna), Alessandro De Salvo (INFN Roma1) — May 8, 2008

Presentation transcript:

Benchmarking ATLAS applications
Franco Brasolin – INFN Bologna, Alessandro De Salvo – INFN Roma1
May 8, 2008

Outline
- The ATLAS benchmark suite
- Benchmark execution
- Accessing the test results
- Conclusions and next steps

The ATLAS benchmark suite
- The goal is to define an ATLAS benchmark system based on the experiment software
  - To obtain more realistic measurements of CPU power, based on the experiment code and real jobs
- The ATLAS tests are based on an existing infrastructure
  - KitValidation
  - Global KitValidation portal

The test infrastructure (1/2)
- KitValidation (KV)
  - KitValidation has been developed to test the ATLAS software
  - It can execute custom tests, defined by the software developers and included in the experiment software releases
  - Used by the community since 2003 to test the ATLAS software installation
    - Grid installation
    - End-user installation on desktops, laptops, …
    - Generic pre-release testing
- Global KitValidation (GKV) portal
  - KitValidation can send the test results to a web service (GKV), if the corresponding feature is enabled
  - Only a small amount of information is sent to the portal (see the sketch below)
    - Machine info (CPU type, CPU speed, memory size, …)
    - Test status (OK, FAILED)
    - Test logfiles and XML job info
  - The test results can be selected through the GKV web interface search engine
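To make the reporting step concrete, the sketch below shows how a test record of this kind could be posted to a results portal over HTTP. The endpoint URL, field names, and payload layout are illustrative assumptions, not the actual GKV interface.

```python
# Minimal sketch of publishing a KV-style test record to a results portal.
# The URL and field names below are illustrative assumptions, not the real GKV API.
import json
import platform
import urllib.request

def post_test_result(portal_url, status, logfile_path, tag):
    """Send machine info, the test status and the logfile to a (hypothetical) portal."""
    with open(logfile_path, "rb") as f:
        log_text = f.read().decode(errors="replace")

    record = {
        "tag": tag,                      # e.g. a benchmark session label
        "cpu": platform.processor(),     # machine info: CPU type
        "machine": platform.machine(),
        "status": status,                # "OK" or "FAILED"
        "log": log_text,                 # test logfile contents
    }
    req = urllib.request.Request(
        portal_url,
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example (hypothetical endpoint):
# post_test_result("https://gkv.example.org/api/results", "OK", "kv.log", "my_benchmark")
```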

The test infrastructure (2/2)
- The benchmark results are obtained by dynamically parsing the logfiles of the KV tests (see the sketch below)
- Timings are obtained from the ATLAS software framework (Athena)
  - Using the Athena ChronoStatSvc timing service for full job timing
  - Using specific timers from the Athena routines, currently available only for G4 simulation
- Almost no bias is introduced when using the framework timing services
  - ChronoStatSvc measures the actual job running time, excluding the basic initialization and finalization of the framework
  - Specific timing services can be added to fine-tune the measurements
- The benchmark results are collected in GKV
  - All users can view the benchmark results
  - Only power users have full access to the information contained in the GKV pages, using their personal certificate
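As an illustration of the log-parsing step, the sketch below extracts a total job time from a KV logfile. The regular expression and the exact format of the ChronoStatSvc summary line are assumptions made for the example; a production parser would match the actual Athena output.

```python
# Sketch of pulling a job timing out of a KV test logfile.
# The ChronoStatSvc line format assumed here is illustrative; real Athena
# output may differ, and the production parser matches that instead.
import re

TIMING_RE = re.compile(r"ChronoStatSvc.*?Time\s*=\s*([\d.]+)\s*\[?s\]?", re.IGNORECASE)

def job_time_seconds(logfile_path):
    """Return the first ChronoStatSvc-style timing found in the log, or None."""
    with open(logfile_path, errors="replace") as f:
        for line in f:
            match = TIMING_RE.search(line)
            if match:
                return float(match.group(1))
    return None

# Example:
# t = job_time_seconds("KitValidation.log")
# print(f"total job time: {t} s" if t is not None else "no timing found")
```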

Benchmarks execution
- The test execution is performed by using a script
- The sw-mgr script is able to
  - Install and test a specific ATLAS software release
  - Test an existing ATLAS software installation
- Multiple concurrent tests are also possible, with an arbitrary number of threads (see the sketch below)
  - Useful when testing multi-core, multi-CPU machines at full load
- Example of a test of an existing release installation at CERN:

  > VO_ATLAS_SW_DIR=/afs/cern.ch/project/gd/apps/atlas/slc3
  > sw-mgr --test l $VO_ATLAS_SW_DIR/software/ \
      -p $VO_ATLAS_SW_DIR/prod/releases/rel_14-0 -o -P AtlasOffline \
      -T release -t _i686_slc4_gcc34 -O opt -m 3.24 \
      --no-tag --tthreads 4 --kvpost --kvpost-tag "CERN_benchmark_AMD22_4t" \
      --kv-cache

- Every user is allowed to perform such tests and publish the information to the GKV portal
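The effect of the --tthreads option can be pictured as launching several identical test jobs in parallel and timing each one, as in the hedged sketch below; the command line passed to it is a placeholder, not the exact sw-mgr invocation.

```python
# Sketch of a full-load benchmark run: launch N identical test jobs in
# parallel (one per thread/core) and report each job's wall-clock time.
# The command string below is a placeholder, not the exact sw-mgr invocation.
import subprocess
import time
from concurrent.futures import ThreadPoolExecutor

def run_one(job_id, cmd):
    start = time.time()
    subprocess.run(cmd, shell=True, check=False)
    return job_id, time.time() - start

def run_full_load(cmd, n_threads=4):
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        futures = [pool.submit(run_one, i, cmd) for i in range(n_threads)]
        for fut in futures:
            job_id, elapsed = fut.result()
            print(f"job {job_id}: {elapsed:.1f} s wall-clock")

# Example (placeholder command):
# run_full_load("sw-mgr --test ... --kv-cache", n_threads=4)
```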

Accessing the test results
- The test results may be browsed via the GKV portal

Preliminary Benchmarking Results for ATLAS

  lxbench01   Intel(R) Xeon(TM) CPU 2.80GHz, 2 GB RAM
  lxbench02   Intel(R) Xeon(TM) CPU 2.80GHz, 4 GB RAM
  lxbench03   Dual-Core AMD Opteron(tm) Processor 275
  lxbench04   Intel(R) Xeon(R) CPU 2.66GHz
  lxbench05   Intel(R) Xeon(R) CPU 3.00GHz
  lxbench06   Dual-Core AMD Opteron(tm) Processor 2218
  lxbench07   Intel(R) Xeon(R) CPU 2.33GHz

Preliminary results obtained by running with release (SLC3, gcc 3.2.3, 32 bits) over different processors at CERN

Conclusions
- The ATLAS benchmarking suite is in place and functional
  - All the results are available from the GKV portal
  - Every user may contribute to the tests, even non-ATLAS people, by simply fetching the test script with wget and executing it
- An initial set of tests has been defined and performed on several architectures, both in SLC3 compatibility mode and in SLC4 native mode
- The preliminary results are encouraging, but some work still needs to be done (see below)
- Next steps and improvements
  - Provide more user-friendly pre-compiled scripts and examples to run the benchmarks for non-ATLAS users
  - Add to GKV the ability to present a summary of the benchmark sessions
  - Define a final set of tests to be performed in the benchmark sessions
  - It would be highly desirable to have a compatible set of tests (physics channels) among the experiments, if possible