FutureGrid UAB Meeting XSEDE13 San Diego July 24 2013.



Basic Status
FutureGrid has been running for 3 years – 322 projects; 1874 users
Funding is available through September 30, 2014, with a No Cost Extension that can be submitted in mid-August (45 days prior to the formal expiration of the grant)
Participated in Computer Science activities (call for white papers and presentation to the CISE director)
Participated in OCI solicitations
Pursuing GENI collaborations

Technology
OpenStack is becoming the best open source virtual machine management environment
– Also more reliable than previous versions of OpenStack and Eucalyptus
– Nimbus switched to an OpenStack core with projects like Phantom
– In the past Nimbus was essential as the only reliable open source VM manager
XSEDE integration has made major progress; 80% complete
These improvements will allow much greater focus on TestbedaaS software
Solicitations motivated adding “On-ramp” capabilities: develop code on FutureGrid
– Burst or shift to other cloud or HPC systems (CloudMesh)

Assumptions
“Democratic” support of clouds and HPC is likely to be important
As a testbed, offer bare metal or clouds on a given node
Run HPC systems with tools similar to clouds, so HPC bursting as well as cloud bursting
Define images by templates that can be built for different HPC and cloud environments
Education integration is important (MOOCs)

Integrate MOOC Technology
We are building MOOC lessons to describe core FutureGrid capabilities
– Come to the 5pm OGF MOOC BOF
Will especially help educational uses
– 28 semester-long classes: 563+ students – Cloud Computing, Distributed Systems, Scientific Computing and Data Analytics
– 3 one-week summer schools: 390+ students – Big Data, Cloudy View of Computing (for HBCUs), Science Clouds
– 7 one- to three-day workshops/tutorials: 238 students
Science Cloud Summer School available in MOOC format
First high-level software: IP-over-P2P (IPOP)
Overview and details of FutureGrid: how to get a project, use HPC, and use OpenStack

Online MOOCs
Science Cloud MOOC repository
FutureGrid MOOCs
– A MOOC that will use FutureGrid for class laboratories (for advanced students in the IU online Data Science masters degree)
– MOOC Introduction to FutureGrid that can be used by all classes and tutorials on FutureGrid
Currently use Google Course Builder: Google Apps + YouTube
– Built as a collection of modular ~10 minute lessons

Recent FutureGrid Software Efforts
Gregor von Laszewski, Geoffrey C. Fox, Indiana University

Selected List of Services Offered
Cloud PaaS: Hadoop, Iterative MapReduce, HDFS, HBase, Swift Object Store
IaaS: Nimbus, Eucalyptus, OpenStack, ViNe
GridaaS: Genesis, Unicore, SAGA, Globus
HPCaaS: MPI, OpenMP, CUDA
TestbedaaS: Infrastructure (Inca, Ganglia); Provisioning (RAIN, CloudMesh); VMs (Phantom, CloudMesh); Experiments (Pegasus, Precip); Accounting (FG, XSEDE)

FutureGrid Testbed-aaS and User on-Ramp

Information Services I
Information Services
– Message-based Information System (SDSC, TACC): GLUE2, Inca, Ganglia; candidate for XSEDE after a FutureGrid test
– CloudMesh CloudMetrics: accounting integration (XSEDE); all events logged for OpenStack, Eucalyptus, Nimbus
– Inca: service monitoring including history event sampling
– Others: Ganglia, Nagios
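The message-based information system above can be sketched as a small publish/subscribe bus that services push monitoring events through. Everything here (class name, topic names, event fields) is invented for illustration and is not the actual FutureGrid implementation:

```python
from collections import defaultdict
from typing import Callable

class InfoBus:
    """Minimal in-memory stand-in for a message-based information system.

    Services publish monitoring events to a topic; subscribers (e.g. an
    accounting or Inca-style monitor) receive them.
    """
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Register a handler to be called for every event on this topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver the event to every handler subscribed to the topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Example: an accounting service logging VM lifecycle events
bus = InfoBus()
log = []
bus.subscribe("vm.events", log.append)
bus.publish("vm.events", {"service": "openstack", "action": "boot", "vm": "b-001"})
```

In the real system the bus would be a broker (e.g. AMQP, as mentioned later for Rain provisioning) rather than in-process callbacks.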

Information Services II
CloudMesh CloudMetrics
– Report
– Portal
– CLI: cm> generate report
– API: generate_report
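A report generator in the spirit of the CloudMetrics `generate report` command might look like the sketch below; the metric names and the layout are invented for illustration and do not reflect the real CloudMesh output:

```python
def generate_report(metrics: dict) -> str:
    """Render a minimal plain-text metrics report.

    `metrics` maps a metric name to its value; rows are sorted by name
    and aligned in two columns.
    """
    width = max(len(name) for name in metrics)
    lines = [f"{name:<{width}}  {value:>8}"
             for name, value in sorted(metrics.items())]
    return "\n".join(lines)

report = generate_report({"vms_launched": 42, "active_projects": 7})
print(report)
```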

XSEDE Integration
New features
– Project request via XSEDE, initiated via the XSEDE Portal
– Projects will be reviewed via POPS
– Accounts and projects will be created on FG
– FG summary metrics will be reported back to XSEDE
Changes
– XSEDE: new POPS testbeds object; short-lived projects
– FG: simplified metrics for XSEDE (FG has more account information than XSEDE handles; users with more need can go to the FG portal, API, or command line tool)
– Ongoing: determination of the metric – fixed charge by day, or wall clock time for VMs used & managed
Planned features
– Explore TAS integration
– Multiple metrics
– Multiple resources
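The wall-clock-time metric under discussion can be computed from paired start/stop VM events, as in this sketch; the event tuple format is invented here, not FutureGrid's actual accounting schema:

```python
from datetime import datetime

def vm_wall_clock_hours(events) -> float:
    """Sum wall-clock hours across VMs from (vm_id, action, iso_timestamp)
    event tuples, pairing each 'start' with the following 'stop'."""
    starts = {}
    total = 0.0
    for vm, action, ts in events:
        t = datetime.fromisoformat(ts)
        if action == "start":
            starts[vm] = t
        elif action == "stop" and vm in starts:
            total += (t - starts.pop(vm)).total_seconds() / 3600.0
    return total

events = [
    ("vm1", "start", "2013-07-24T08:00:00"),
    ("vm1", "stop",  "2013-07-24T10:30:00"),
]
print(vm_wall_clock_hours(events))  # 2.5
```

A fixed charge by day would instead count the distinct calendar days each VM was active, which is why the choice of metric matters for short-lived projects.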

FG Partner Cloud Tools
Phantom – management of VMs
– Multiple clouds
– Fault tolerant
– On demand provisioning
– Sensors
– Euca2ools++
PRECIP – Pegasus Repeatable Experiments for the Cloud in Python
– Extends VM management tools with: run shell script on VM; copy files to VM
– Managed via Condor

Dynamic Resourcing Capabilities underlying FutureGrid User-Ramp
Cloud/HPC bursting: move workload (images/jobs) to other clouds (or HPC clusters) in case your current resource gets over-utilized
– Users do this; providers do this; schedulers do this
Resource (Cloud/HPC) shifting, or dynamic resource provisioning: add more resources to a cloud or HPC capability from resources that are not used or are underutilized
– Now doing this by hand
– We are automating this – PhD thesis
– We want to integrate this with cloud bursting
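The bursting policy described above reduces to a simple decision: if the current resource is over a utilization threshold, pick the least-loaded alternative. This sketch illustrates the policy only; the threshold and resource names are made up, and a real scheduler would weigh far more signals:

```python
def choose_burst_target(utilization: dict, threshold: float = 0.9):
    """Return the least-utilized alternative resource when the current
    one exceeds `threshold`, or None if no shift is needed.

    `utilization` maps resource name -> load fraction in [0, 1], and
    must contain a "current" entry for the resource the workload is on.
    """
    if utilization["current"] <= threshold:
        return None  # current resource still has headroom
    others = {k: v for k, v in utilization.items() if k != "current"}
    return min(others, key=others.get)

print(choose_burst_target(
    {"current": 0.95, "sierra-openstack": 0.4, "hpc-cluster": 0.7}
))  # sierra-openstack
```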

CloudMesh Requirements
– Support shifting and bursting
– Support User-Ramp
– Support general commercial/academic cloud federation
– Bare metal and cloud (later) provisioning
– Extensible architecture with a plugin mechanism
– Security
Initial release capabilities
– Delivers an API, services, a command line, and a command shell that support the tasks needed to conduct provisioning and shifting
– Uniform API to multiple clouds via their native protocols
– Important for scalability tests; EC2-compatible tools and libraries are not enough (experience from FG)
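A uniform API over native per-cloud protocols is typically built as an adapter registry: each plugin speaks its backend's native API, while callers see one interface. The class and method names below are invented for this sketch and are not CloudMesh's actual plugin API:

```python
class CloudAdapter:
    """Base class for per-cloud plugins; each subclass would talk its
    backend's native protocol, while callers see one uniform interface."""
    def boot(self, image: str) -> str:
        raise NotImplementedError

class OpenStackAdapter(CloudAdapter):
    def boot(self, image: str) -> str:
        # A real adapter would call the native OpenStack API here.
        return f"openstack:{image}"

class EC2Adapter(CloudAdapter):
    def boot(self, image: str) -> str:
        # A real adapter would call the native EC2 API here.
        return f"ec2:{image}"

# Plugin registry: new clouds are supported by registering an adapter.
REGISTRY = {"openstack": OpenStackAdapter(), "ec2": EC2Adapter()}

def boot(cloud: str, image: str) -> str:
    """Uniform entry point: dispatch to the named cloud's adapter."""
    return REGISTRY[cloud].boot(image)

print(boot("openstack", "ubuntu-12.04"))  # openstack:ubuntu-12.04
```

Going through native protocols (rather than an EC2-compatibility layer) is what lets such a design expose backend-specific features needed for scalability tests.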

CloudMesh Architecture

Rain
Current features
– Manages images on VMs & bare metal – templated images
– Uses low-level client libraries – important for testing
– Command shell
– Moving of resources – Eucalyptus, OpenStack, HPC
Under development
– Provisioning via AMQP
– Provisioning multiple clusters – provisioning inventory for FG, provisioning monitor
– Provisioning command shell plugins
– Provisioning metrics
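The "templated images" idea above amounts to one image description rendered with per-target settings. The template fields and target values in this sketch are invented; they illustrate the approach, not Rain's actual image format:

```python
import string

# One image description, rendered for different deployment targets.
IMAGE_TEMPLATE = string.Template("base=$base packages=$packages kernel=$kernel")

# Per-target overrides (e.g. a virtual kernel for clouds).
TARGET_SETTINGS = {
    "baremetal": {"kernel": "3.2-generic"},
    "openstack": {"kernel": "3.2-virtual"},
}

def render_image(target: str) -> str:
    """Render the shared image template for one target environment."""
    settings = {"base": "ubuntu-12.04", "packages": "openmpi",
                **TARGET_SETTINGS[target]}
    return IMAGE_TEMPLATE.substitute(settings)

print(render_image("openstack"))
# base=ubuntu-12.04 packages=openmpi kernel=3.2-virtual
```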

CloudMesh: Example of Moving a Service

CloudMesh: Command Line Interface invoking dynamic provisioning
$ cm
FutureGrid - Cloud Mesh Shell
[ASCII-art “Cloud Mesh” banner]
cm> help
Documented commands (type help <topic>):
========================================
EOF    dot2  graphviz  inventory  open     project  quit    timer  verbose
clear  edit  help      keys       pause    py       rst     use    version
cloud  exec  info      man        plugins  q        script  var    vm
cm> provision b-001 openstack
Also: REST interface and Python API

Next Steps: CloudMesh
CloudMesh software
– First release end of August
– Deploy on FutureGrid
– Provide documentation
– Develop an intelligent scheduler (PhD thesis)
– Integrate with Chef (part of another thesis)
– Other bare-metal provisioners: OpenStack
– Extend User On-Ramp features
Other frameworks can use CloudMesh – e.g. Phantom, PRECIP

Acknowledgement
Sponsor
– This material is based upon work supported in part by the National Science Foundation under Grant No.
Citation
– Fox, G., G. von Laszewski, et al., “FutureGrid – a reconfigurable testbed for Cloud, HPC and Grid Computing”, Contemporary High Performance Computing: From Petascale toward Exascale, April, Editor J. Vetter. [pdf]
Credits
– CloudMesh, Rain: Indiana University
– Inca: SDSC
– Precip: ISI
– Phantom: UC