Applications Requiring an Experimental Optical Network
Invited Keynote, I-Light Applications Workshop, Indiana University-Purdue University Indianapolis, December 4, 2002
Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies
Harry E. Gruber Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD
Closing in on the Dream: A High Performance Collaboration Grid
SIGGRAPH 89 "Science by Satellite," Illinois-Boston, AT&T & Sun
"Using satellite technology…demo of what it might be like to have high-speed fiber-optic links between advanced computers in two different geographic locations." –Al Gore, Senator, Chair, US Senate Subcommittee on Science, Technology and Space
"What we really have to do is eliminate distance between individuals who want to interact with other people and with other computers." –Larry Smarr, Director, National Center for Supercomputing Applications, UIUC
Source: Maxine Brown, EVL, UIC
I-WAY: Information Wide Area Year, Supercomputing 95
The First National 155 Mbps Research Network
–65 Science Projects
–Into the San Diego Convention Center
I-WAY Featured:
–Networked Visualization Application Demonstrations
–Large-Scale Immersive Displays
–I-Soft Programming Environment
[Images: UIC's CitySpace and Cellular Semiotics]
Alliance 1997: Collaborative Video Production via Tele-Immersion and Virtual Director
Alliance Project Linking CAVE, ImmersaDesk, Power Wall, and Workstation
Donna Cox, Bob Patterson, Stuart Levy, Glen Wheless; UIC
iGrid 2002, September 24-26, 2002, Amsterdam, The Netherlands
Fifteen Countries/Locations Proposing 28 Demonstrations: Canada, CERN, France, Germany, Greece, Italy, Japan, The Netherlands, Singapore, Spain, Sweden, Taiwan, United Kingdom, United States
Applications Demonstrated: Art, Bioinformatics, Chemistry, Cosmology, Cultural Heritage, Education, High-Definition Media Streaming, Manufacturing, Medicine, Neuroscience, Physics, Tele-science
Grid Technologies: Grid Middleware, Data Management/Replication Grids, Visualization Grids, Computational Grids, Access Grids, Grid Portals
Sponsors: HP, IBM, Cisco, Philips, Level(3), Glimmerglass, etc.
iGrid 2002 Sustained 1-3 Gigabit/s; Total Available Bandwidth Between Chicago and Amsterdam Was 30 Gigabit/s
The Move to Data-Intensive Science & Engineering: e-Science Community Resources
[Images: ATLAS, LHC, Sloan Digital Sky Survey, ALMA]
Why Optical Networks Are Emerging as the 21st Century Driver for the Grid
Scientific American, January 2001
A LambdaGrid Will Be the Backbone for an e-Science Network
[Diagram: layered architecture: Apps, Middleware, Clusters, Dynamically Allocated Lightpaths, Switch Fabrics, Physical; a Control Plane and Monitoring span the layers]
Source: Joe Mambretti, NU
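To make the "dynamically allocated lightpaths" layer concrete, here is a minimal sketch, in Python, of how application middleware might ask a LambdaGrid control plane for a dedicated lambda. All names (LightpathRequest, ControlPlane, allocate) are hypothetical illustrations, not an API from the talk:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LightpathRequest:
    """What an application hands to the control plane (hypothetical)."""
    src: str             # source cluster endpoint
    dst: str             # destination cluster endpoint
    bandwidth_gbps: int  # capacity of the requested lambda
    duration_s: int      # how long the dedicated path is held

class ControlPlane:
    """Toy allocator: hands out free wavelengths on a single link."""
    def __init__(self, wavelengths: int = 32):
        self.free = list(range(wavelengths))

    def allocate(self, req: LightpathRequest) -> Optional[int]:
        if not self.free:
            return None         # no lambda available; the app must wait
        return self.free.pop()  # identifier of the dedicated lightpath

# Middleware reserves a 10 Gb/s lambda between two clusters for an hour
cp = ControlPlane()
lam = cp.allocate(LightpathRequest("ucsd-cluster", "uic-cluster", 10, 3600))
print("allocated lambda:", lam)
```

The point of the architecture is that this allocation happens on application timescales, rather than through weeks of manual circuit provisioning.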
NSF Defines Three Classes of Networks Beyond the Commodity Internet
Production Networks (e.g. Internet2):
–High-Performance Networks
–Reach All US Researchers
–24/7 Reliable
Experimental Networks:
–Trials of Cutting-Edge High-Performance Networks
–Deliver Advanced Application Needs Unsupported by Production Networks
–Robust Enough to Support Application-Dictated Development: Software Application Toolkits, Middleware, Computing and Networking
Research Networks:
–Smaller-Scale Network Prototypes
–Enable Basic Scientific and Engineering Network Research
–Testing of Component Technologies, Protocols, Network Architectures
–Not Expected to Be Persistent
–Not Expected to Support Production Applications
Local and Regional Lambda Experimental Networks Are Achievable and Practical
Several GigaPOPs and States Are Building:
–Multi-Lambda Metropolitan Experimental Networks
–Lighting up Their Own Dark Fiber (I-WIRE, I-Light, CENIC CalREN-XD)
–With Hundreds of Lambdas by 2010
OptIPuter Funded to Research the LambdaGrid:
–Middleware and Control Plane
–Application Driven
Substantial State and Local Funds Can Be Heavily Leveraged by an NSF Experimental Networks Program:
–Cross-Country Interconnection (National Light Rail)
–Persistent Support of Emerging Experimental Networks
–First NSF Workshop: UIC, December 2001
–Second NSF Workshop: UCI, May 2002
–NSF RFP Expected by Winter 2003
The Next S-Curves of Networking: Exponential Technology Growth
[Chart: technology penetration (0-100%) vs. time; each technology S-curve rises through Research, Experimental/Early Adopters, and Production/Mass Market phases. The ~1990s networking S-curve ran from the Gigabit Testbeds through the Connections Program to Internet2 Abilene; the next S-curve runs from DWDM Experimental Networks to Lambda Grids.]
Cal-(IT)²: An Integrated Approach to the Future Internet
UC San Diego & UC Irvine Faculty Working in Multidisciplinary Teams With Students, Industry, and the Community
The State's $100 M Creates Unique Buildings, Equipment, and Laboratories
Data-Intensive Scientific Applications Require Experimental Optical Networks
Large Data Challenges in Neuro and Earth Sciences:
–Each Data Object is 3D and Gigabytes in Size
–Data are Generated and Stored in Distributed Archives
–Research is Carried Out on a Federated Repository
Requirements:
–Computing: PC Clusters
–Communications: Dedicated Lambdas Over Fiber
–Data: Large Peer-to-Peer Lambda-Attached Storage
–Visualization: Collaborative Volume Algorithms
Response: the OptIPuter Research Project
The Biomedical Informatics Research Network: a Multi-Scale Brain Imaging Federated Repository
BIRN Test-beds: Multiscale Mouse Models of Disease, Human Brain Morphometrics, and FIRST BIRN (10-Site Project for fMRI of Schizophrenics)
NIH Plans to Expand to Other Organs and Many Laboratories
Microscopy Imaging of Neural Tissue
Images: Marketta Bobik; Francisco Capani & Eric Bushong
–Confocal image of a sagittal section through rat cortex, triple-labeled for glial fibrillary acidic protein (blue), neurofilaments (green), and actin (red)
–Projection of a series of optical sections through a Purkinje neuron, revealing both the overall morphology (red) and the dendritic spines (green)
Interactive Visual Analysis of Large Datasets -- East Pacific Rise Seafloor Topography Scripps Institution of Oceanography Visualization Center
Tidal Wave Threat Analysis Using Lake Tahoe Bathymetry Scripps Institution of Oceanography Visualization Center Graham Kent, SIO
SIO Uses the Visualization Center to Teach a Wide Variety of Graduate Classes: Geodesy; Gravity and Geomagnetism; Planetary Physics; Radar and Sonar Interferometry; Seismology; Tectonics; Time Series Analysis
[Image: Multiple Interactive Views of Seismic Epicenter and Topography Databases (Deborah Kilb & Frank Vernon, SIO)]
NSF's EarthScope Rollout Over 14 Years, Starting With Existing Broadband Stations
NSF Experimental Network Research Project: The OptIPuter
Driven by Large Neuroscience and Earth Science Data:
–NIH Biomedical Informatics Research Network
–NSF EarthScope (UCSD SIO)
Removing Bandwidth as a Constraint:
–Links Computing, Storage, Visualization and Networking
–Software and Systems Integration Research Agenda
NSF Large Information Technology Research Proposal:
–UCSD and UIC Lead Campuses
–USC, UCI, SDSU, NW Partnering Campuses
–Industrial Partners: IBM, Telcordia/SAIC, CENIC, Chiaro Networks, IXIA
PI: Larry Smarr; Funded at $13.5M Over Five Years
–Start Date October 1, 2002
From Supercomputers to Supernetworks: Changing the Grid Design Point
The TeraGrid is Optimized for Computing:
–1024-Node IA-64 Linux Cluster
–Assume 1 GigE per Node = 1 Terabit/s of Cluster I/O
–Grid Optical Connection: 4 x 10 Gigabit/s Lambdas = 40 Gigabit/s
–Optical Connections are Only ~4% of Bisection Bandwidth
The OptIPuter is Optimized for Bandwidth:
–32-Node IA-64 Linux Cluster
–Assume 1 GigE per Node = 32 Gigabit/s of Cluster I/O
–Grid Optical Connection: 4 x 10 GigE = 40 Gigabit/s
–Optical Connections are Over 100% of Bisection Bandwidth
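A quick back-of-the-envelope check of the bisection-bandwidth figures above, as a Python sketch; the node counts and link speeds are taken directly from the slide:

```python
def bisection_ratio(nodes, gbps_per_node, wan_lambdas, gbps_per_lambda):
    """Ratio of grid optical connectivity to aggregate cluster I/O."""
    cluster_io = nodes * gbps_per_node            # total cluster I/O, Gb/s
    wan_capacity = wan_lambdas * gbps_per_lambda  # optical WAN capacity, Gb/s
    return wan_capacity / cluster_io

# TeraGrid design point: 1024 nodes x 1 GigE vs. 4 x 10 Gb/s lambdas
print(f"TeraGrid:  {bisection_ratio(1024, 1, 4, 10):.0%}")  # ~4%

# OptIPuter design point: 32 nodes x 1 GigE vs. 4 x 10 Gb/s lambdas
print(f"OptIPuter: {bisection_ratio(32, 1, 4, 10):.0%}")    # 125%
```

The design inversion is the whole point: in the OptIPuter, the wide-area optical connection exceeds the cluster's entire I/O capability, so the network stops being the bottleneck.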
OptIPuter Inspiration: Node of a 2009 PetaFLOPS Supercomputer
[Diagram: multiple VLIW/RISC cores (24 GFLOPS at 6 GHz each), each with a 96 MB second-level cache, connected over 160 GB/s, 64-byte-wide coherent links to a crossbar; 4 GB of highly interleaved DRAM at 640 GB/s; the node attaches to a multi-lambda all-optical network (AON)]
Source: Steve Wallach, Supercomputing 2000 Keynote
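As a rough sanity check on the projected node balance, here is a sketch using only the figures on the slide (the derived quantities are our reading of the diagram, not numbers stated in the talk):

```python
# Figures from the slide (Wallach, SC2000 keynote projection)
core_gflops = 24   # per-core peak, GFLOPS
clock_ghz = 6      # projected clock rate, GHz
link_gbs = 160     # per-core coherence bandwidth, GB/s
dram_gbs = 640     # crossbar-to-DRAM bandwidth, GB/s

# Flops issued per clock cycle per core
print("flops/cycle:", core_gflops / clock_ghz)                 # 4.0

# Machine balance: bytes of cache bandwidth per peak flop
print("bytes/flop (cache):", round(link_gbs / core_gflops, 1)) # 6.7

# Cores the DRAM crossbar can feed at full link rate
print("cores at full link rate:", dram_gbs // link_gbs)        # 4
```

The last figure is consistent with the 4-CPUs-per-die packaging on the next slide.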
Global Architecture of a 2009 COTS PetaFLOPS System
[Diagram: multi-die, multi-processor boxes (4 CPUs per die) interconnected through an all-optical switch to I/O and the LAN/WAN; 10 meters of fiber = 50 nanoseconds of delay]
Source: Steve Wallach, Supercomputing 2000 Keynote
OptIPuter NSF Proposal Partnered with National Experts and Infrastructure
[Map: OptIPuter campuses (UCSD/SDSU, UCI, USC, UIC, NU) linked via CENIC, Pacific Light Rail, TeraGrid DTFnet, and StarLight in Chicago to Seattle, Portland, San Francisco, Los Angeles, San Diego (SDSC), NCSA, PSC, Atlanta, and NYC, and internationally to CA*net4 (Vancouver), SURFnet, CERN, AMPATH, and the Asia Pacific]
Source: Tom DeFanti and Maxine Brown, UIC
OptIPuter LambdaGrid Enabled by Chiaro Networking Router
[Diagram: a Chiaro Enstara router/switch interconnecting Medical Imaging and Microscopy; Chemistry, Engineering, Arts; the San Diego Supercomputer Center; and the Scripps Institution of Oceanography. Traffic patterns: Cluster–Disk, Disk–Disk, Viz–Disk, DB–Cluster, Cluster–Cluster]
Image Source: Phil Papadopoulos, SDSC
The UCSD OptIPuter Deployment: The OptIPuter Experimental UCSD Campus Optical Network
[Map, ½-mile scale: SIO (Earth Sciences), SDSC, SDSC Annex, CRCA (Arts), Phys. Sci - Keck (Chemistry), SOM (Medicine), JSOE (Engineering), Preuss (High School), and 6th College (Undergrad College) nodes; Phase I, Fall 02, and Phase II, 2003; a Chiaro Router and a production router at the collocation point, with a link to CENIC]
Source: Phil Papadopoulos, SDSC; Greg Hidley, Cal-(IT)²
Planned Chicago Metro Electronic Switching OptIPuter Laboratory
[Diagram: a 16-processor McKinley cluster at the University of Illinois at Chicago and a 16-processor Montecito/Chivano cluster at Northwestern, linked by metro GE and 10GE (16x1 GE, 16x10 GE) to StarLight (10x1 GE + 1x10GE) and onward over national and international GE/10GE links]
Nationals: Illinois, California, Wisconsin, Indiana, Abilene, FedNets, Washington, Pennsylvania…
Internationals: Canada, Holland, CERN, GTRN, AmPATH, Asia…
Source: Tom DeFanti
Metro Optically Linked Visualization Walls with Industrial Partners Set Stage for Federal Grant
Driven by SensorNets Data:
–Real-Time Seismic
–Environmental Monitoring
–Distributed Collaboration
–Emergency Response
Linked UCSD and SDSU Control Rooms over 44 Miles of Cox Fiber
–Dedication March 4, 2002
Partners: Cox, Panoram, SAIC, SGI, IBM, TeraBurst Networks, SD Telecom Council
NTT Super High Definition Video Over Internet2 (NTT 4Kx2K = 8 Megapixels)
StarLight in Chicago to USC in Los Angeles
SHD = 4xHDTV = 16xDVD
Applications: Astronomy, Mathematics, Entertainment
OptIPanel: 5x3 Grid of 1280x1024-Pixel LCD Panels Driven by a 16-PC Cluster
Resolution = 6400x3072 Pixels, or ~3000x1500 Pixels in Autostereo
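The aggregate resolution follows directly from the tile geometry; a quick Python sketch (autostereo is assumed here to roughly halve each dimension, which is consistent with the slide's figure but not stated on it):

```python
def tiled_resolution(cols, rows, panel_w, panel_h):
    """Aggregate pixel resolution of a tiled display wall."""
    return cols * panel_w, rows * panel_h

w, h = tiled_resolution(5, 3, 1280, 1024)
print(f"mono: {w}x{h} = {w * h / 1e6:.1f} Mpixels")  # 6400x3072 = 19.7 Mpixels

# Autostereo interleaves left/right views, roughly halving each dimension
print(f"autostereo: ~{w // 2}x{h // 2}")             # ~3200x1536 (slide: ~3000x1500)
```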
The Continuum at EVL and TRECC: OptIPuter Amplified Work Environment
Components: Passive Stereo Display, AccessGrid, Digital White Board, Tiled Display
Source: Tom DeFanti, Electronic Visualization Lab, UIC
OptIPuter Transforms Individual Laboratory Visualization, Computation, & Analysis Facilities: The Preuss School UCSD OptIPuter Facility
GeoWall: Fast Polygon and Volume Rendering with Stereographics
3D Applications:
–Earth Science: GeoFusion GeoMatrix Toolkit
–Underground Earth Science: Rob Mellors and Eric Frost, SDSU
–Neuroscience: SDSC Volume Explorer (Dave Nadeau, SDSC, BIRN)
–Anatomy: Visible Human Project (NLM, Brooks AFB); SDSC Volume Explorer