Cyberinfrastructure and Internet2

Presentation transcript:

Cyberinfrastructure and Internet2
Eric Boyd, Deputy Technology Officer, Internet2

What is Cyberinfrastructure (CI)? A strategic orientation supported by NSF that calls for large-scale public investment to encourage the evolution of widely distributed computing via the telecommunications network. The goal is to deploy the combined capacity of multiple sites to support the advance of current research, initially in science and engineering.

General Session Thursday: Cyberinfrastructure: The Way Forward
Francine Berman, San Diego Supercomputer Center (moderator)
Paul Avery, University of Florida
Thomas Knab, Case Western Reserve University
Alan Whitney, MIT Haystack Observatory
Eric Boyd, Internet2

The Distributed CI Computer (diagram): instrumentation, human support, education and outreach, policy and funding, security and access, display and visualization, data sets and storage, and computation, all interconnected. Behind all the functions is the Network; the Network is the backplane for the Distributed CI Computer.

The Network is the Backplane for the Distributed CI Computer (diagram): the same functional components as the previous slide, with the network highlighted as the backplane connecting them all.

Challenge and Opportunity
Challenge: the R&E community thinks of CI primarily in terms of building distributed computing clusters.
Opportunity: the network is a key component of CI, and Internet2 is leading the development of solutions for the network component of CI.

CI Requirements
Data storage
Robust campus infrastructure
Security and authorization
IT support for local and remote resources
Network performance monitoring tools
Network resources to meet demand spikes

LHC epitomizes the CI Challenge

Current Situation
The Large Hadron Collider (LHC) at CERN will go operational in 2008.
Over 68 U.S. universities and national laboratories are poised to receive data.
More than 1,500 scientists are waiting for this data.
Are campus, regional, and national networks ready for the task?

CERN Tier 0: Raw Data

Data flows down from CERN through the tiers:
CERN Tier 0: raw data
Tier 1 (12 orgs, including FNAL and BNL): shared data storage and reduction
US Tier 2 (15 orgs: CMS 7, ATLAS 6-7): provides data to Tier 3
US Tier 3 (68 orgs): scientists request data
US Tier 4 (1,500 US scientists): scientists analyze data
Tier 4 represents an estimated 1,500 US scientists, or 6,000 worldwide.

The same tier structure with the interconnecting networks (this is the final build, including local networking):
CERN Tier 0 (raw data) connects to Tier 1 via the LHCOPN.
Tier 1 (12 orgs: FNAL, BNL; shared data storage and reduction) connects to US Tier 2 via GEANT-ESnet-Internet2.
US Tier 2 (15 orgs: CMS 7, ATLAS 6-7; provides data to Tier 3) connects to US Tier 3 via Internet2 and its connectors.
US Tier 3 (68 orgs; scientists request data) connects to US Tier 4 (1,500 US scientists; scientists analyze data) via local infrastructure.
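To make the data path in the diagram above explicit, here is a minimal sketch (not from the original slides) that models the tier chain and the network segment crossed at each hop; the tier names and segment labels are taken from the slide, everything else is illustrative.

```python
# Each entry: (tier, role, network segment used to reach the next tier down).
LHC_DATA_PATH = [
    ("CERN Tier 0", "raw data", "LHCOPN"),
    ("Tier 1 (12 orgs, e.g. FNAL, BNL)", "shared data storage and reduction", "GEANT-ESnet-Internet2"),
    ("US Tier 2 (15 orgs)", "provides data to Tier 3", "Internet2 / connectors"),
    ("US Tier 3 (68 orgs)", "scientists request data", "local campus infrastructure"),
    ("US Tier 4 (~1500 US scientists)", "scientists analyze data", None),
]

# Print the chain, showing which network carries data toward the next tier.
for tier, role, next_hop in LHC_DATA_PATH:
    hop = f" --[{next_hop}]-->" if next_hop else ""
    print(f"{tier}: {role}{hop}")
```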

Peak Flow Network Requirements (network usage requirements):
Tier 0 to Tier 1 (CERN, over the LHCOPN): requires 10-40 Gbps.
Tier 1 to Tier 2 (over GEANT-ESnet-Internet2): requires 10-20 Gbps.
Tier 1 or Tier 2 to Tier 3 (over Internet2/connectors and local infrastructure): estimated to require 1.6 Gbps per transfer (2 TB in 3 hours). Please note that transfers will occur on a regular basis.
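The 1.6 Gbps figure follows from simple arithmetic; the sketch below reproduces it, assuming "2 TB" is read as 2 x 2^40 bytes (reading it as 2 x 10^12 bytes gives roughly 1.5 Gbps instead).

```python
# Back-of-the-envelope check of the Tier 1/2 -> Tier 3 transfer requirement:
# moving 2 TB in 3 hours implies roughly 1.6 Gbps of sustained throughput.
bytes_to_move = 2 * 2**40        # 2 TB, read here as binary terabytes (assumption)
transfer_window_s = 3 * 3600     # 3 hours in seconds

required_gbps = bytes_to_move * 8 / transfer_window_s / 1e9
print(f"Required sustained rate: {required_gbps:.2f} Gbps")  # ~1.63 Gbps
```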

Science Network Requirements Aggregation Summary (slide courtesy of ESnet). Columns: science driver (science areas / facilities), end-to-end reliability, connectivity, 2006 end-to-end bandwidth, 2010 end-to-end bandwidth, traffic characteristics, network services.
Advanced Light Source: connectivity to DOE sites, US universities, and industry; 1 TB/day (300 Mbps) in 2006, 5 TB/day (1.5 Gbps) in 2010; bulk data and remote control traffic; guaranteed bandwidth, PKI / Grid.
Bioinformatics: 625 Mbps in 2006 (12.5 Gbps in two years), 250 Gbps in 2010; point-to-multipoint traffic; high-speed multicast.
Chemistry / Combustion: 10s of gigabits per second in 2010.
Climate Science: international connectivity; 5 PB per year (5 Gbps) in 2010.
High Energy Physics (LHC): 99.95+% reliability (less than 4 hrs/year downtime); connectivity to US Tier1 (DOE), US Tier2 (universities), and international sites (Europe, Canada); 10 Gbps in 2006, 60 to 80 Gbps in 2010 (30-40 Gbps per US Tier1); traffic isolation.

Science Network Requirements Aggregation Summary, continued (slide courtesy of ESnet).
Magnetic Fusion Energy: 99.999% reliability (impossible without full redundancy); connectivity to DOE sites, US universities, and industry; 200+ Mbps in 2006, 1 Gbps in 2010; bulk data and remote control traffic; guaranteed bandwidth, guaranteed QoS, deadline scheduling.
NERSC: international connectivity; 10 Gbps in 2006, 20 to 40 Gbps in 2010; deadline scheduling, PKI / Grid.
NLCF: backbone bandwidth parity in both 2006 and 2010.
Nuclear Physics (RHIC): 12 Gbps in 2006, 70 Gbps in 2010.
Spallation Neutron Source: high reliability (24x7 operation); 640 Mbps in 2006, 2 Gbps in 2010.

CI Components
Applications (bulk transport, two-way interactive video, real-time communications, …) call on the network cyberinfrastructure, which consists of performance infrastructure and tools, middleware, a control plane, and the Phoebus library, running over measurement nodes, the network, and control plane nodes.
Middleware (federated trust) = Shibboleth, Grouper, Signet
Performance framework = perfSONAR
Performance tools = BWCTL, NDT, OWAMP
Control plane = DRAGON
Library = Phoebus

Internet2 Network CI Software
Dynamic circuit control infrastructure: DRAGON (with ISI, MAX), OSCARS (with ESnet)
Middleware (federated trust infrastructure): Shibboleth, Signet, Grouper, COmanage
Performance monitoring infrastructure: perfSONAR (with ESnet, GEANT2 JRA1, RNP, many others), BWCTL, NDT, OWAMP, Thrulay
Distributed system infrastructure: Topology Service (with University of Delaware), Distributed Lookup Service (with University of Delaware, PSNC)
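As a concrete illustration of how the performance tools listed above are typically exercised, the sketch below simply shells out to the OWAMP and BWCTL command-line clients. It is a minimal example, assuming both tools are installed on the local host; perf.example.edu is a placeholder hostname, not an endpoint from the slides.

```python
import subprocess

# Placeholder measurement endpoint; substitute a real OWAMP/BWCTL-capable host.
TARGET = "perf.example.edu"

def one_way_latency(host: str) -> str:
    """Run an OWAMP one-way latency test against `host` using the owping client."""
    result = subprocess.run(["owping", host], capture_output=True, text=True, check=True)
    return result.stdout

def throughput(host: str) -> str:
    """Run a BWCTL-mediated throughput test with `host` as the far side (-c)."""
    result = subprocess.run(["bwctl", "-c", host], capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    print(one_way_latency(TARGET))
    print(throughput(TARGET))
```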

Internet2 Network CI Standardization
Dynamic circuit control protocol (IDC): DICE-Control, GLIF
Measurement schema / protocol: OGF NMWG, IETF IPPM, perfSONAR Consortium
Middleware arena: Liberty Alliance, OASIS, possible emerging corporate consortium
Topology schema / protocol: OGF NML-WG, DICE-Control

Internet2’s CI Vision
Be a networking cyber-service provider
Be a trust cyber-service provider
Be a CI technology developer

Internet2’s CI Position
Backbone network provider
Federated trust infrastructure provider
Forum for collaboration by members of the R&E community
This position gives Internet2 a unique vision and strategy for cyberinfrastructure.

Internet2’s CI Definition: Components
Supercomputing / cycles / computational
Supercomputing / storage (non-volatile)
Analysis / visualization
Interconnecting networks (campuses, regionals, backbones)
Network cyberinfrastructure software

Internet2’s CI Audience
Application software
Instrumentation / remote instruments / sensors
Data sets

Internet2’s CI Constituencies
Collaborators
University members
Regional networks
Regional CI organizations
High performance computing centers
Federal partners
International partners
CI integrators
Examples: collaborators (TeraGrid, Open Science Grid); university members (discipline group researchers, e.g. LHC and eVLBI, and IT organizations); regional CI organizations (NYS Grid); CI integrators (TeraGrid).

Early Thoughts: Internet2’s CI Strategy (1)
Requirements are informed by our membership; the agenda is set by our governance mechanisms.
Offer, and in some cases develop, services and technology that are key components of a coherent CI software suite.
For CI to work, it has to be a workable end-to-end system; Internet2 is emphasizing a systems approach to CI.
Internet2 is offering new services such as the Internet2 Network, InCommon, and the VO Service Center.
Internet2 is developing and offering new technologies such as GridShib and perfSONAR.
Internet2 may do systems integration work, assembling open source communication tools into a common veneer.

Early Thoughts: Internet2’s CI Strategy (2)
Play the role of community CI coordinator, convening community conversations.
Partner with other community coordinators (e.g. TeraGrid, EDUCAUSE).
Play a convening function in order to facilitate the development, use, and dissemination of CI (e.g. the Bridging the Gap workshop).
Take a lead in international outreach efforts at several different layers of CI.
Work with campuses to build valuable CI.
Facilitate conversations among various federal agencies (e.g. DOE, NSF, NIH), each of which is developing its own CI, and present a consistent vision back to the campuses.

Internet2’s CI Tactics
Target campus, national, and international audiences: integrate campus CI into regional, national, and international CI; target application-community CI (quasi-national).
Enable effective use of authorized resources, regardless of where they exist.
Enable integration of new resources as they become available.
Facilitate interoperability of multiple, autonomous CI providers.
Take a “toolkit” approach, but make sure it still looks like a wall jack to the end user.
Push for best practices for campuses, covering what to do and how to do it, so the community learns as a whole and avoids reinventing the wheel.
Contribute to the support structure for use of CI: open source CI software, centers of excellence for various needs, training.

Questions? Eric Boyd eboyd@internet2.edu