ClaRA Stress Test. V. Gyurjyan, S. Mancilla. Thomas Jefferson National Accelerator Facility.

Presentation transcript:

Page 1: ClaRA Stress Test. V. Gyurjyan, S. Mancilla.

Page 2: ClaRA Components. [Diagram] A Platform (cloud controller) and an Orchestrator coordinate DPEs; each DPE hosts containers (C), and each container hosts services (S).
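The component hierarchy on this slide can be sketched in Python. This is an illustrative model only; the class and method names below are assumptions, not the actual ClaRA API.

```python
# Illustrative model of the ClaRA runtime hierarchy (not the real ClaRA API):
# a Platform (cloud controller) registers DPEs, each DPE hosts containers,
# each container hosts services, and an orchestrator links deployed services
# into a processing chain.

class Service:
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def execute(self, data):
        return self.fn(data)

class Container:
    def __init__(self, name):
        self.name, self.services = name, {}

    def deploy(self, service):
        self.services[service.name] = service

class DPE:
    """Data Processing Environment: one runtime per node."""
    def __init__(self, host):
        self.host, self.containers = host, {}

    def start_container(self, name):
        c = Container(name)
        self.containers[name] = c
        return c

class Platform:
    """Cloud controller: a registry of running DPEs."""
    def __init__(self):
        self.dpes = {}

    def register(self, dpe):
        self.dpes[dpe.host] = dpe

class Orchestrator:
    """Runs data through an ordered chain of deployed services."""
    def __init__(self, chain):
        self.chain = chain

    def run(self, data):
        for s in self.chain:
            data = s.execute(data)
        return data

# Wire a two-service chain on one DPE and run an event through it.
platform = Platform()
dpe = DPE("node1")
platform.register(dpe)
container = dpe.start_container("clas12")
s1 = Service("double", lambda x: 2 * x)
s2 = Service("inc", lambda x: x + 1)
container.deploy(s1)
container.deploy(s2)
result = Orchestrator([s1, s2]).run(5)
```

The point of the sketch is the separation of concerns: DPEs and containers only host services, while composition into an application lives entirely in the orchestrator.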

Page 3: [Chart] Stress-test results on a 16-core hyper-threaded node (no I/O).

Page 4: Batch Deployment.

Page 5: Batch job submission.

    <![CDATA[
    setenv CLARA_SERVICES /group/clas12/ClaraServices;
    $CLARA_SERVICES/bin/clara-dpe -host claradm-ib
    ]]>
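A job-submission fragment like the one above could be generated rather than hand-written. The helper below is a hypothetical sketch: the `<Command>` tag name is illustrative and not necessarily the JLab batch system's actual schema; only the CDATA payload comes from the slide.

```python
# Hypothetical generator for a batch-submission XML fragment.
# Assumption: the command is wrapped in a <Command> element with a CDATA
# section, as on the slide; the surrounding schema is not shown there.

def render_dpe_job(clara_services, host):
    """Render the shell command that starts a ClaRA DPE, wrapped in CDATA."""
    cmd = (f"setenv CLARA_SERVICES {clara_services}; "
           f"$CLARA_SERVICES/bin/clara-dpe -host {host}")
    return f"<Command><![CDATA[ {cmd} ]]></Command>"

xml = render_dpe_job("/group/clas12/ClaraServices", "claradm-ib")
```

Generating the fragment keeps the installation path and DPE host in one place when many farm jobs are submitted.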

Page 6: Batch queues. Common queue; exclusive queue: CentOS, multi-core, 12 processing nodes.

Page 7: Single Data-stream Application. [Diagram] The executive node runs the ClaRA master DPE with administrative services, an application orchestrator (AO), and reader (R) / writer (W) services attached to persistent storage; farm nodes 1..N each run the service chain S1, S2, ..., Sn.
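The single data-stream pattern on this slide can be sketched as a generator pipeline. This is an illustrative sketch, not ClaRA code: one reader pulls events from storage, a chain of services transforms each event, and one writer persists the results.

```python
# Sketch of the single data-stream pattern (illustrative, not ClaRA code):
# R -> S1 -> ... -> Sn -> W over one stream of events.

def reader(events):
    """R: stream events from persistent storage (here, a list)."""
    yield from events

def make_service(fn):
    """Si: wrap a per-event transformation as a streaming stage."""
    def service(stream):
        for event in stream:
            yield fn(event)
    return service

def writer(stream, sink):
    """W: persist processed events (here, append to a list)."""
    sink.extend(stream)

raw = [1, 2, 3]          # stand-in for events in persistent storage
out = []                 # stand-in for the output store
s1 = make_service(lambda e: e * 10)
s2 = make_service(lambda e: e + 1)
writer(s2(s1(reader(raw))), out)   # compose R -> S1 -> S2 -> W
```

Because every stage is a generator, one event flows through the whole chain at a time, which mirrors how a streaming service chain avoids buffering the full data set.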

Page 8: Single Data-stream Application. CLAS12 reconstruction on the JLab batch farm.

Page 9: Multiple Data-stream Application. [Diagram] As on page 7, the executive node runs the ClaRA master DPE, administrative services, the AO, and R/W services, but with a DS stage and persistent storage attached to each stream, so farm nodes 1..N each read, process (S1..Sn), and write their own data stream.
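The multiple data-stream pattern can be sketched by fanning the input out into independent streams, each processed by its own copy of the service chain. The thread pool below is an illustrative stand-in for the farm nodes; it is not how ClaRA itself schedules work.

```python
# Sketch of the multiple data-stream pattern (illustrative): several
# independent streams, each run through the same S1 -> S2 chain in parallel.
from concurrent.futures import ThreadPoolExecutor

def chain(event):
    """S1 -> S2 applied to one event: multiply by 10, then add 1."""
    return (event * 10) + 1

# One event stream per (simulated) farm node.
streams = [[1, 2], [3, 4], [5, 6]]

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(lambda s: [chain(e) for e in s], streams))
```

Each stream keeps its own input and output, matching the per-node persistent storage shown on the slide, so the streams never contend for a shared writer.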