Panel: Building the NRP Ecosystem


Panel: Building the NRP Ecosystem
Dan Stanzione
National Research Platform Meeting, August 2018
2/21/2019

Stampede 2
Funded by NSF as a renewal of the original Stampede project.
The largest XSEDE resource (and the largest university-based system).
Follows the legacy of success of the first machine: a supercomputer for a *broad* range of workloads, large and small.
Installed without ever having a break in service, in the same footprint.

TACC at a Glance
Personnel: 160 staff (~70 PhD)
Facilities: 12 MW data center capacity. Two office buildings, three data centers, two visualization facilities, and a chilling plant.
Systems and Services: ~35,000 users in ~3,000 projects on fifteen production platforms; 200+ data collections in 60+ PB.
Hikari: 380V DC green computing system, a partnership with NEDO and NTT. 10k Haswell cores. HVDC and solar (partial). Support for the container ecosystem.
HPC: Stampede-2, Lonestar 5, Hikari
Data: Wrangler
Vis/ML: Maverick
Cloud/Interactive: Chameleon, Jetstream, Roundup
Storage: Stockyard, Corral, Ranch
Experimental: Fabric, Catapult, etc.
[Speaker notes] Talk a little about what we mean by BIG INNOVATIVE SYSTEMS. Then, spend the rest of my time talking about how we are HELPING PEOPLE GO FAST. Cloud systems weave all this together. We also have testbed systems online or coming in the next 6 months: Altera FPGA, POWER/NVIDIA, HP Apollo powered by solar HVDC.

And more to come. . .

THE EVOLUTION OF A CYBERINFRASTRUCTURE
Ten years ago, cyberinfrastructure was largely about building the hardware and networks to support large-scale science. Today, it is about new interfaces that support data analysis, collaboration and sharing, and reproducibility, as well as easy access to simulation.
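As a purely illustrative sketch of that interface shift, a science gateway might let a researcher describe a simulation run as a small structured request, which the gateway then translates into a site-specific batch script. The `JobRequest` helper and its field names below are hypothetical, not any specific TACC or XSEDE API:

```python
from dataclasses import dataclass, field

@dataclass
class JobRequest:
    """Hypothetical gateway job description: the user states *what* to run,
    and the gateway renders the *how* (a site-specific batch script)."""
    app: str                       # registered application name
    inputs: dict = field(default_factory=dict)
    nodes: int = 1
    max_runtime_minutes: int = 60

def to_batch_script(req: JobRequest) -> str:
    """Render the request as a minimal Slurm-style script (sketch only)."""
    lines = [
        "#!/bin/bash",
        f"#SBATCH --nodes={req.nodes}",
        f"#SBATCH --time={req.max_runtime_minutes}",
        f"{req.app} " + " ".join(f"--{k}={v}" for k, v in req.inputs.items()),
    ]
    return "\n".join(lines)

# Example: a four-node run of a (placeholder) application.
req = JobRequest(app="namd", inputs={"config": "apoa1.namd"}, nodes=4)
script = to_batch_script(req)
```

The point of the sketch is the division of labor: users express intent once, and the same request can target different back-end systems without the user rewriting batch scripts.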


Opportunities for NRP

Opportunities for NRP
Reproducibility, and the impact of data reuse, are largely non-existent factors today, and we must improve upon them. This doesn't require scale to make progress, but it does require integration to achieve.
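One low-cost step toward the reproducibility and data-reuse goals above is publishing a manifest of checksums and environment details alongside a dataset, so a later user can verify they are reusing exactly the bytes the original analysis saw. A minimal sketch, with placeholder file names:

```python
import hashlib
import json
import platform
import sys
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large datasets need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(paths):
    """Checksum each data file and capture the software environment."""
    return {
        "files": {str(p): sha256_of(Path(p)) for p in paths},
        "python": sys.version.split()[0],
        "platform": platform.platform(),
    }

# Example: write a manifest for one (placeholder) data file.
Path("results.csv").write_text("x,y\n1,2\n")
manifest = build_manifest(["results.csv"])
Path("MANIFEST.json").write_text(json.dumps(manifest, indent=2))
```

Note this is the "integration, not scale" point in miniature: the checksum itself is trivial, but it only pays off if repositories, gateways, and compute sites agree to carry the manifest along with the data.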

The Role of National Centers
We focus, perhaps too much, on the physical infrastructure of the platform. The thing I can't scale is "help down the hall". The network is *necessary* to enable remote resources, collaboration, data sharing, etc., but it is not *sufficient*. A more tightly integrated collection of *people* will be key to NRP success, and to future science.