Advanced Research Computing Projects & Services at U-M

New Advanced Research Computing Projects & Services at U-M
Sharon Broude Geva, Director of Advanced Research Computing (ARC)
University of Michigan
sgeva@umich.edu | http://arc.umich.edu
2015 CASC Fall Meeting, 10/15/2015

Advanced Research Computing

U-M Data Science Initiative
U-M investment of $100M over the next 5 years:
Hire 35 new faculty over the next four years and engage existing faculty across campus
Support interdisciplinary data-related research initiatives and foster new methodological approaches to big data
Provide new educational opportunities for students pursuing careers in data science
Expand U-M's research computing capacity
Strengthen data management, storage, analytics, and training resources
1500+ registrants for the kick-off symposium last week

U-M Data Science Initiative
Faculty affiliates, challenge grants, graduate certificate, industry engagement
Data Science infrastructure build and operations
Data Science consultants and training

CC*DNI Award: MI-OSiRIS
PI: Shawn McKee (U-M Physics, ARC)
Co-PIs: Swany (IU), Gossman (WSU), Merz (MSU)
$4.9 million (NSF)
Will provide a distributed, multi-institutional storage infrastructure that allows researchers at any of the three campuses to read, write, manage, and share data from their computing facility locations
Goal: provide transparent, high-performance access to the same storage infrastructure from well-connected locations on any of the three campuses

CC*DNI Award: MI-OSiRIS
Will include network discovery, monitoring, and management tools, plus creative use of Ceph features
Users get customized data interfaces for their multi-institutional data needs
Seamless rebalancing and expansion of storage
Data sharing, archiving, security, and life-cycle management implemented and maintained with a single distributed service
The data infrastructure view for each research domain can be optimized
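
Since the design builds on Ceph, here is a minimal sketch of what programmatic object I/O against a Ceph pool can look like using the python-rados bindings; the config path, pool name, and object name are hypothetical, not MI-OSiRIS specifics:

```python
# Minimal sketch: object I/O against a Ceph pool via python-rados.
# The conffile path, pool name ('physics'), and object name are
# hypothetical placeholders, not actual MI-OSiRIS values.
import rados

cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
cluster.connect()

# One pool per research domain, as in the figure below
ioctx = cluster.open_ioctx('physics')
ioctx.write_full('dataset-001', b'simulation output bytes')
data = ioctx.read('dataset-001')  # same object, readable from any campus

ioctx.close()
cluster.shutdown()
```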

[Figure] MI-OSiRIS research domain interfaces, showing the software-defined storage layer that we leverage for data life-cycle management and research domain customization. Ceph pool colors correspond to research domains.

CC*DNI Award: MI-OSiRIS
Center for Network and Storage-Enabled Collaborative Computational Science
Build and operations at U-M

MRI Award: ConFlux
PI: Karthik Duraisamy (Aerospace Engineering)
$3.5 million ($2.42M NSF + $1.04M cost share)
Designed to enable HPC simulations to interface with large datasets while running
Refines complex physics-based models with Big Data techniques (for cardiovascular disease; turbulence; clouds, rainfall, and climate; dark matter and dark energy; material property prediction)
CPUs + GPUs, large memory, ultra-fast interconnect, 3 PB of disk
Hardware optimized for machine learning
Plan to expand availability to researchers and schools outside the grant team
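
To make the "refine physics-based models with data" pattern concrete, here is an illustrative Python sketch (not ConFlux's actual software stack): a regressor learns the discrepancy between a cheap physics-based model and higher-fidelity data, then corrects the model's predictions. All functions and data below are synthetic.

```python
# Illustrative only: correct a cheap physics-based model with a regressor
# trained on its discrepancy from higher-fidelity data (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def low_fidelity_model(x):
    """Stand-in for a cheap physics-based approximation."""
    return np.sin(x).ravel()

# Stand-in for high-fidelity simulation output at training points
X_train = np.linspace(0, 10, 50).reshape(-1, 1)
y_hifi = np.sin(X_train).ravel() + 0.1 * np.cos(3 * X_train).ravel()

# Learn the discrepancy between the cheap model and the data
residual = y_hifi - low_fidelity_model(X_train)
corrector = RandomForestRegressor(n_estimators=100).fit(X_train, residual)

# Corrected prediction = physics model + learned correction
X_new = np.array([[2.5]])
y_corrected = low_fidelity_model(X_new) + corrector.predict(X_new)
```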

MRI Award: ConFlux
Center for Data-Driven Computational Physics
Technical design, build, and operations

Turbo: High Performance Research Storage
Isilon scalable storage for U-M researchers
Configured to be easily shareable with on-campus resources such as the Flux HPC cluster, as well as with off-campus systems and collaborators
Performance sufficient for both I/O-intensive operations and bulk file access, allowing researchers to work with data in place and avoid excessive data staging
Primary tiered storage: hybrid with SSDs
Tuned for large files (1 MB or greater) but capable of handling small files (e.g., documents and spreadsheets)

Turbo: High Performance Research Storage
NFSv3 and NFSv4 access (Linux, OS X Mavericks and Yosemite, Windows 7+)
NFSv4 with Kerberos access (Linux and Mavericks)
Two security levels: regulated and/or sensitive data (PHI only with Kerberos), and non-sensitive data
Globus available for volumes that do not contain PHI, for sharing and hosting data for external collaborators and institutions
Currently 1 PB usable, replicated to 1 PB for disaster recovery
Cost: $19.20 / TB / month (optional daily snapshots available at no cost)
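
For the Globus-enabled volumes, scripted sharing with an external collaborator might look like the following sketch using the Globus Python SDK (globus_sdk); the client ID, endpoint UUIDs, and paths are placeholders, not real Turbo values:

```python
# Minimal sketch: transfer data off a Globus-enabled volume with globus_sdk.
# CLIENT_ID and the endpoint UUIDs below are placeholders you would replace
# with values from your own app registration and endpoint list.
import globus_sdk

CLIENT_ID = "<your-native-app-client-id>"
TURBO_ENDPOINT = "<turbo-endpoint-uuid>"
COLLAB_ENDPOINT = "<collaborator-endpoint-uuid>"

# Interactive native-app login to obtain a transfer token
auth = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth.oauth2_start_flow()
print("Log in at:", auth.oauth2_get_authorize_url())
tokens = auth.oauth2_exchange_code_for_tokens(input("Auth code: "))
transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]

# Submit a recursive directory transfer from Turbo to the collaborator
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token)
)
task = globus_sdk.TransferData(tc, TURBO_ENDPOINT, COLLAB_ENDPOINT,
                               label="Share Turbo dataset")
task.add_item("/mygroup/dataset/", "/incoming/dataset/", recursive=True)
print("Task ID:", tc.submit_transfer(task)["task_id"])
```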

Turbo: High Performance Research Storage
Technical design, build, and operations