What is campus bridging and what is XSEDE doing about it?


The beginnings of Campus Bridging as a concept…. In early 2009, the National Science Foundation's (NSF) Advisory Committee for Cyberinfrastructure (ACCI) charged six task forces; one of those was the Task Force on Campus Bridging.

Cyberinfrastructure consists of computational systems, data and information management, advanced instruments, visualization environments, and people, all linked together by software and advanced networks to improve scholarly productivity and enable knowledge breakthroughs and discoveries not otherwise possible.

The goal of campus bridging is to enable virtual proximity:
– the seamlessly integrated use of a scientist or engineer's personal cyberinfrastructure; cyberinfrastructure on the scientist's campus; cyberinfrastructure at other campuses; and cyberinfrastructure at the regional, national, and international levels, as if all of it were proximate to the scientist.
– When working within the context of a Virtual Organization (VO), the goal of campus bridging is to make the 'virtual' aspect of the organization irrelevant (or helpful) to the work of the VO.


From Welch, V., Sheppard, R., Lingwall, M.J., and Stewart, C.A. Current structure and past history of US cyberinfrastructure (data set and figures).

(In)adequacy of Research CI

Key observations from the ACCI Campus Bridging Task Force (paraphrased):
– Aggregate US cyberinfrastructure is inadequate to meet needs
– Existing CI is not optimally utilized
– Many of the challenges have to do with the existing state of software, security, and policy
– Some reasonable choices well executed now are better than perfect solutions implemented later

From: NSF Advisory Committee for Cyberinfrastructure Task Force on Campus Bridging. Final Report. March 2011.

XSEDE campus bridging vision & strategy

– Promote better and easier use of the nation's aggregate CI resources, via XSEDE and campus bridging tools
– Campus bridging is more a mindset that should inform most of what XSEDE does than a specific set of software modules within the overall set of XSEDE services and products
– Campus bridging can be thought of as a very technical and broad approach to usability
– Our goal is to work with the various groups in XSEDE (particularly XAUS, Campus Champions, and Documentation / Training) to align activities and communications so that XSEDE collectively works in a way that achieves the goals set in the campus bridging area
– Efforts will be conscientiously targeted at Data, HPC, and HTC – probably in that order
– Strategy: conscientiously make a small number of reasoned choices, pursue them with diligence, and reap economies of scale (if things go right) or clear learning experiences (otherwise)

XSEDE, its components, 'within' and 'beyond' XSEDE

XSEDE is "the most advanced, powerful, and robust collection of integrated advanced digital resources and services in the world. It is a single virtual system that scientists can use to interactively share computing resources, data, and expertise." The XSEDE grant award from NSF funds its core organizing, support, and management entity.

Service Providers (part of the SP Forum):
– 'Level 1' Service Providers meet all XSEDE integration requirements... access through the XSEDE allocation process. [e.g. Kraken and Stampede]
– 'Level 2' Service Providers make one or more digital services accessible via XSEDE services and interfaces, sharing one or more digital services with the XSEDE community along with the organization's local users. [IU Quarry and Cornell Red Cloud are examples]
– 'Level 3' Service Providers are the most loosely coupled within the XSEDE Federation; they will advertise the characteristics of one or more digital services via XSEDE mechanisms. [The CI for the Ocean Observatory Initiative is a possible example]

Challenges regarding campus bridging

– It's not a specific thing. You can't point to a 'campus bridge' the way you can a supercomputer.
– There is no such thing as a 'campus bridger' the way there is a Campus Champion. It may make sense to talk about a 'bridged resource.'
– It's more a mindset toward a particular form of technical interoperability and usability than it is a specific thing.
– The hardest thing about campus bridging: explaining a set of use cases that affects several types of XSEDE activities as campus bridging, rather than having a single thing to point to.
– The second hardest thing: getting colleagues to abandon the idea that groups interested in campus bridging are XSEDE Service Provider wannabes.

XSEDE and systems engineering

XSEDE is "an organization that delivers a series of instantiations of services to the US research community." Development of these instantiations follows a particular systems engineering methodology called architecture-centric systems engineering:

Use case descriptions => use case quality attribute scenarios => Level 3 decomposition documents (UML) => and then on to code and documents!

Campus Bridging use cases

– UCCB 1.0. InCommon-based authentication: consistent use of community-accepted authentication mechanisms
– UCCB 2.0. Economies of scale in training and usability
– UCCB 3.0. Long-term remote interactive graphic session
– UCCB 4.0. Use of data resources from campus on XSEDE, or from XSEDE at a campus (two approaches – one of which we hear about from Ian Foster in the next talk)
– UCCB 5.0. Support for distributed workflows spanning XSEDE and campus-based data, computational, and/or visualization resources
– UCCB 6.0. Shared use of computational facilities mediated or facilitated by XSEDE
– UCCB 7.0. Access to resources on a service-for-money basis (___ on demand)
– CB Prerequisite. XSEDE-wide unified trouble ticket handling

Economies of scale in training and usability from more consistency in cluster configurations

In reality, the four cluster admins depicted here (each advocating a different tool) are all right. Experienced cluster admins learned their tools while those tools were still developing, so the tool each sysadmin knows best is the one that lets that sysadmin do their work best.
– The only way to develop consistency is to provide installers that make their work easier
– The XSEDE architecture group is developing installers for file management tools
– XSEDE campus bridging is developing Rocks Rolls cluster build setups (and documentation)

Economies of scale in training and outreach

(Image from TeraGrid EOT – Education, Outreach, and Training – news.)
– Consistency in system setups – local systems becoming more like XSEDE – should also lead to economies of scale in training
– Materials and trainer expertise will be more easily transportable and extensible
– The campus bridging group plans to work very closely with the Campus Champions

Shared Virtual Compute Facilities

SVCF – a virtual cluster independent of XSEDE:
– Can we provide tools that will create authentication screens that look and work like the XSEDE login?
– Doing this requires supporting multiple authentication mechanisms
– Remember: not everyone wants to have an XSEDE label on their organization!

SVCF – accepting jobs from XSEDE:
– Requires the ability for SVCFs to accept jobs from (and trust) XSEDE
– Requires the ability for XSEDE to trust SVCFs
– Requires trouble ticket exchange and security notification / response processes
– This sort of SVCF may be a type of entity that one could meaningfully call a 'bridged resource.'

This use case has High Performance Computing and High Throughput Computing as two variants. The Open Science Grid is the solution to the High Throughput variant of this use case!
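To give a concrete flavor of the High Throughput variant: work submitted through the Open Science Grid is typically described with an HTCondor submit file. A minimal sketch follows (the executable and file names here are hypothetical, for illustration only):

```
# Minimal HTCondor submit description (hypothetical job and file names)
universe                = vanilla
executable              = analyze.sh       # script shipped to the execute node
arguments               = input_0.dat
transfer_input_files    = input_0.dat
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
output                  = job_0.out
error                   = job_0.err
log                     = job.log
queue
```

The point of the sketch is that the job description is site-independent: the same file can run on a campus cluster or an OSG resource, which is exactly the kind of interoperability campus bridging aims for.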


Campus Bridging GFFS pilot program

– Texas A&M – use of the GFFS with the Brazos cluster
– Kansas – data transfer within the Great Plains network and with Indiana University
– CUNY – spanning campus to XSEDE resources
– University of Miami – data sharing within campus and across the WAN

The pilot projects are currently working with XSEDE Operations to prepare the Genesis II infrastructure for beta usage.

Five-year goals

– At the end of five years, have implemented and socialized a community vision of an integrated national cyberinfrastructure with XSEDE as a critical component – but just a component
– Working with XSEDE and the community as a whole, implement the technology required to support as many of the use cases described here as possible

Next deliverables and next steps

– Paper in the proceedings!
– Template for hybrid system description – a generalization of the TACC / IU templates
– Podcast
– Some real progress on the pilots
– Rocks Rolls – first visible results likely mid-PY2
– Continued work with the Open Science Grid
– Campus Bridging objectives go into the prioritization process with everyone else's priorities, and we arrive at some estimate of what is feasible over the next four years

XSEDE Campus Bridging Staff

– Craig Stewart (reports to Scott Lathrop in Scott's role leading outreach)
– Jim Ferguson (works 25% on campus bridging)
– Therese Miller – IU overall project lead for XSEDE activities (will be aiding, particularly as regards the Campus Champions)
– Rich Knepper – Manager, Core Services, RT/PTI (0.25 FTE starting next year)
– 1.0 FTE for "Rocks Rolling" for PY2

Please cite as: Stewart, C.A., R. Knepper, J.W. Ferguson, F. Bachmann, I. Foster, A. Grimshaw, V. Hazlewood and D. Lifka. What is campus bridging and what is XSEDE doing about it? Presentation. Presented at: XSEDE12 (Chicago, IL, Jul 2012). All slides (except where explicitly noted) are copyright 2012 by the Trustees of Indiana University, and this content is released under the Creative Commons Attribution 3.0 Unported license.

Thanks

– Campus Champions – whose input has already shaped campus bridging activities greatly
– All XSEDE staff
– Guy Almes, Von Welch, Patrick Dreher, Jim Pepin, Dave Jent, Stan Ahalt, Bill Barnett, Therese Miller, Malinda Lingwall, Maria Morris
– Gabrielle Allen, Jennifer Schopf, Ed Seidel, and all of the NSF program officers involved in the campus bridging task force activities
– All of the IU Research Technologies and Pervasive Technology Institute staff who have contributed to this entire 2+ year process
– Special thanks to the CASC members who have participated in one of n information-gathering exercises (where n is large)
– NSF for funding support (Awards , , , , ; this material and ongoing work supported by Award )
– Funding support provided by the Lilly Endowment and the Indiana University Pervasive Technology Institute
– Any opinions presented here are those of the presenter or collective opinions of members of the Task Force on Campus Bridging, and do not necessarily represent the opinions of the National Science Foundation or any other funding agencies