Slide 1: International Networking and Cyberinfrastructure
Douglas Gatchell, Program Director, International Networking
National Science Foundation, USA
December 10, 2003
– CISE Reorganization Status
– International Networking Solicitation
– Cyberinfrastructure
Slide 2: International Networking
New solicitation priorities:
– Enable and enhance communications, collaboration, and cooperation by connecting aggregation points
– Enhance connectivity to new regions
– Support connections between shared cyberinfrastructure
– Cooperate to support domain-specific CI
Slide 3: NSF-Supported International Links
– TransPAC: Tokyo-LA, Tokyo-Chicago
– Euro-Link: Chicago-Amsterdam/CERN
– NaukaNet: Chicago-Moscow & China
– AMPATH: Miami-Rio & others
Slide 4: TransPAC
Slide 5: TransLight Lambdas
European lambdas to US:
– 8 GigEs Amsterdam-Chicago
– 8 GigEs London-Chicago
Canadian lambdas to US:
– 8 GigEs Chicago-Canada-NYC
– 8 GigEs Chicago-Canada-Seattle
US lambdas to Europe:
– 4 GigEs Chicago-Amsterdam
– 3 GigEs Chicago-CERN
European lambdas:
– 8 GigEs Amsterdam-CERN
– 2 GigEs Prague-Amsterdam
– 2 GigEs Stockholm-Amsterdam
– 8 GigEs London-Amsterdam
TransPAC lambda (yellow):
– 1 GigE Chicago-Tokyo
IEEAF lambdas (blue):
– 8 GigEs NYC-Amsterdam
– 8 GigEs Seattle-Tokyo
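For convenience, the same lambda inventory can be tallied in a few lines of code. This is only an illustrative sketch: the GigE counts are copied from the list above, and the group names follow the slide rather than any official TransLight data format.

    # Tally the TransLight lambdas listed on Slide 5, grouped as on the slide.
    # GigE counts are copied from the bullet list above.
    translight_lambdas = {
        "European lambdas to US": {"Amsterdam-Chicago": 8, "London-Chicago": 8},
        "Canadian lambdas to US": {"Chicago-Canada-NYC": 8, "Chicago-Canada-Seattle": 8},
        "US lambdas to Europe": {"Chicago-Amsterdam": 4, "Chicago-CERN": 3},
        "European lambdas": {"Amsterdam-CERN": 8, "Prague-Amsterdam": 2,
                             "Stockholm-Amsterdam": 2, "London-Amsterdam": 8},
        "TransPAC lambda": {"Chicago-Tokyo": 1},
        "IEEAF lambdas": {"NYC-Amsterdam": 8, "Seattle-Tokyo": 8},
    }

    for group, links in translight_lambdas.items():
        print(f"{group}: {sum(links.values())} GigEs across {len(links)} links")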
Slide 6: AMPATH
Slide 7: GLORIAD
Slide 8: Cyberinfrastructure Characteristics
Community-focused:
– virtual organizations
– distributed
– collaborative
Scale and scope:
– multidisciplinary
– international
– supporting data- and compute-intensive applications
– high-end to desktop
– heterogeneous
Common technology & policy platform(s):
– interoperability
– supports the characteristics above
Slide 9: Evolution of the Computational Infrastructure
[Timeline: Prior Computing Investments and NSF Networking, to Supercomputer Centers (SDSC, NCSA, PSC, CTC), to PACI (NPACI and Alliance), to Terascale (TCS, DTF, ETF), to Cyberinfrastructure]
Slide 10: Integrated CI System Meeting the Needs of a Community of Communities
[Layered diagram:
– Applications (Environmental Science, High Energy Physics, Proteomics/Genomics, …): domain-specific cybertools (software)
– Grid services & middleware, development tools & libraries: shared cybertools (software)
– Hardware: distributed resources (computation, communication, storage, etc.)
– Cross-cutting: Education and Training; Discovery & Innovation]
Slide 11: Cyberinfrastructure consists of …
– Computational engines (supercomputers, clusters, workstations, small processors, …)
– Mass storage (disk drives, tapes, …)
– Networking (including wireless, distributed, ubiquitous)
– Digital libraries/databases
– Sensors/effectors
– Software (operating systems, middleware, domain-specific tools/platforms for building applications)
– Services (education, training, consulting, user assistance)
All working together in an integrated fashion.
Slide 12: In Ten Years, CI Will Be…
– rich in resources, comprehensive in functionality, and ubiquitous;
– easily usable by all scientists and engineers, from students to emeriti;
– accessible anywhere, anytime needed, by authenticated users;
– interoperable, extendable, flexible, tailorable, and robust;
– funded by multiple agencies, states, campuses, and organizations;
– supported and utilized by educational programs at all levels.
Slide 13: Technical Challenges (Computer Science and Engineering Broadly)
– How to build the components? Networks, processors, storage devices, sensors, software
– How to shape the technical architecture? Pervasive, many cyberinfrastructures, constantly evolving/changing capabilities
– How to customize CI to particular S&E domains?
Slide 14: Cyberinfrastructure Early Adopters
– Network for Earthquake Engineering Simulation (NEES)
– National Ecological Observatory Network (NEON)
– Biomedical Informatics Research Network (BIRN)
– Extensible Terascale Facility (ETF)
Slide 15: TeraGrid (ETF) Configuration
Slide 16: Extensible Terascale Facility
Slide 18: 100 Mbps to 100 Million Homes
NSF-Funded Research Project (10/03) - $7.5M
Stanford, Berkeley, CMU, Rice, Fraser Research, Internet2
Scope:
– Economics
– Technologies (backbone and access)
– Protocols
Requires a redesign of the access, metropolitan, and backbone networks of the Internet.
Applications?
Slide 19: Scaling – Homes
TODAY: 500 Kbps x 10 million homes
FUTURE: 100 Mbps (x200 per home) x 100 million homes (x10 homes)
Scale by a factor of 2000!
1 million homes connected at 100 Mbps == 100 Tbps!!
At the network core, petabits per second are required.
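The arithmetic behind that factor of 2000 is easy to verify. The short sketch below (Python, used purely for illustration, with all figures taken from the slide) reproduces the scale-up and the aggregate-bandwidth estimate.

    # Back-of-the-envelope check of the scaling figures on Slide 19.
    today_rate_bps  = 500e3    # 500 Kbps per home today
    today_homes     = 10e6     # 10 million homes today
    future_rate_bps = 100e6    # 100 Mbps per home in the future scenario
    future_homes    = 100e6    # 100 million homes in the future scenario

    per_home_factor = future_rate_bps / today_rate_bps   # 200x faster per home
    homes_factor    = future_homes / today_homes         # 10x more homes
    print(per_home_factor * homes_factor)                # 2000x overall scale-up

    # Even 1 million homes at 100 Mbps is 100 Tbps of aggregate access bandwidth...
    print(1e6 * future_rate_bps / 1e12, "Tbps")          # 100.0 Tbps
    # ...so 100 million such homes imply on the order of petabits per second at the core.
    print(future_homes * future_rate_bps / 1e15, "Pbps") # 10.0 Pbps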
Slide 20: NSF & Cyberinfrastructure
Douglas Gatchell, International Networking Program Director
NSF: National Science Foundation
CISE: Directorate for Computer and Information Science and Engineering
SCI: Division of Shared Cyberinfrastructure