EGI High-Throughput Compute

Content
- Motivation and driving considerations about the service
- Service architecture and interfaces: overview
- How the user can access the service (e.g. REST, GUI, CLIs)
- Service options and attributes
- Acceptable Usage Policy (AUP)
- Access policy and business model
- Use cases
- Documentation/tutorials/information
11/14/2019

Motivation
“Execute thousands of computational tasks to analyze large datasets”
The EGI High-Throughput Compute (HTC) service gives users access to large amounts of computing resources for submitting hundreds or thousands of computational tasks. It is provided by a distributed network of computing centres, accessible via standard interfaces and membership of a virtual organisation. EGI offers more than 1,000,000 cores of installed capacity, supporting over 1.6 million computing jobs per day.

Main features
- Access to high-quality computing resources.
- Integrated monitoring and accounting tools providing information about availability and resource consumption.
- Workload and data management tools to manage all computational tasks.
- Large amounts of processing capacity over long periods of time.
- Faster results for your research.
- Resources shared among users, enabling collaborative research.

Generic service architecture (interfacing directly with CEs)
[Diagram: the user client (User Interface) submits a job and its sandbox to a CE node (ARC CE or HTCondor-CE), which runs it on a worker node.]
Note: in the case of the DIRAC WMS, the CEs sit behind the WMS.

ARC CE Architecture
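As a sketch of what a user submits to an ARC CE, a minimal job description in ARC's xRSL language might look like the following; the executable and file names are illustrative, not part of any real workload:

```
&(executable="analyse.sh")
 (arguments="input.dat")
 (inputFiles=("input.dat" ""))
 (outputFiles=("results.tar.gz" ""))
 (stdout="stdout.txt")
 (stderr="stderr.txt")
 (jobName="htc-demo")
 (cpuTime="2 hours")
```

A description like this is typically submitted with `arcsub` from the ARC client tools and monitored with `arcstat`; an empty string as the second element of an `inputFiles` pair means the file is uploaded from the client machine.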

HTCondor-CE Architecture
[Diagram: the HTCondor-CE accepts incoming jobs and routes them to the local batch system.]
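For comparison, a minimal HTCondor submit description for the same kind of single-core task could look like this; all file names are hypothetical:

```
# Illustrative submit description for a single-core HTC task
universe   = vanilla
executable = analyse.sh
arguments  = input.dat
transfer_input_files = input.dat
output     = job.out
error      = job.err
log        = job.log
request_cpus   = 1
request_memory = 2 GB
queue
```

Locally such a file is submitted with `condor_submit`; against a remote HTCondor-CE endpoint, the OSG documentation linked below describes how to test the CE, for example with `condor_ce_trace`.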

DIRAC WMS Architecture
DIRAC4EGI instance: https://dirac.egi.eu/DIRAC/
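With DIRAC, the user does not address a CE directly; a job description is handed to the WMS, which matches it to a CE behind the scenes. A minimal sketch in DIRAC's JDL dialect, with illustrative file names:

```
JobName       = "htc-demo";
Executable    = "analyse.sh";
Arguments     = "input.dat";
StdOutput     = "std.out";
StdError      = "std.err";
InputSandbox  = {"analyse.sh", "input.dat"};
OutputSandbox = {"std.out", "std.err"};
```

A file like this would be submitted with `dirac-wms-job-submit` from an installed DIRAC client, assuming a valid proxy for the user's virtual organisation.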

Service access
- ARC User Interface (UI): http://www.nordugrid.org/documents/arc-ui.pdf
- HTCondor-CE: https://opensciencegrid.org/docs/compute-element/install-htcondor-ce/
- DIRAC WMS: https://dirac.readthedocs.io/en/latest/AdministratorGuide/Systems/WorkloadManagement/
- https://agenda.ct.infn.it/event/895/
- CREAM CE guides: https://buildmedia.readthedocs.org/media/pdf/cream-guide/latest/cream-guide.pdf and https://cream-guide.readthedocs.io/en/latest/User_Guide.html

Service options
- Base High-Throughput Compute: allows the execution of large numbers of independent or loosely coupled computing tasks. Limited parallel and multi-threaded computing can be supported as well.
- MPI High-Throughput Compute: allows parallel computing, with support for the MPI protocol and libraries.
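For the MPI option, the job description must request multiple cores. As a sketch in ARC xRSL (the wrapper script name and runtime environment value are assumptions; sites publish their own supported MPI runtime environments):

```
&(executable="run_mpi.sh")
 (count=16)
 (countpernode=8)
 (runtimeEnvironment="ENV/MPI/OPENMPI")
 (cpuTime="4 hours")
```

Here `count` requests the total number of cores and `countpernode` how many of them must sit on each node.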

Service attributes
- Number of CPU hours
- Guaranteed number of CPU cores
- Amount of RAM per CPU core (GB)
- Parallelism (threads)
- Access type: opportunistic or reserved
- Start of service (date) and duration (days)
- Other technical requirements

Acceptable Usage Policy
- EGI AUP: https://documents.egi.eu/document/2623
- Further VO-level AUPs may also apply

Access policies and funding models
- Policy-based access
- Market-driven access
Payment model:
- Sponsored use for policy-based access (free up to a certain quota)
- Pay-as-you-go for market-driven access
Prices available at: https://wiki.egi.eu/wiki/EGI_Pay-for-Use_PoC:Service/Price_Overview

HTC use cases
[Two use cases linked as online PDFs]

More HTC use cases
- Clean methane from recycled carbon dioxide [Chemistry]
- Studying the atmospheric composition in Bulgaria [Atmospheric Sciences]
- Satellite observations help to monitor air quality in China [Atmospheric Sciences]
- Self-fertilization can improve resilience to extinction [Biodiversity]
- Old coal heaps can be valuable ecosystems [Biodiversity]

EOSC-hub Service Catalogue https://www.eosc-hub.eu/catalogue