Slide 1
Applications Requiring An Experimental Optical Network
Invited Keynote, I-Light Applications Workshop, Indiana University-Purdue University Indianapolis, December 4, 2002
Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies; Harry E. Gruber Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD
Slide 2
Closing in on the Dream: A High Performance Collaboration Grid
SIGGRAPH '89 "Science by Satellite," Illinois to Boston, AT&T & Sun
"Using satellite technology...demo of what it might be like to have high-speed fiber-optic links between advanced computers in two different geographic locations."
(Al Gore, Senator, Chair, US Senate Subcommittee on Science, Technology and Space)
"What we really have to do is eliminate distance between individuals who want to interact with other people and with other computers."
(Larry Smarr, Director, National Center for Supercomputing Applications, UIUC)
Source: Maxine Brown, EVL, UIC
http://sunsite.lanet.lv/ftp/sun-info/sunflash/1989/Aug/08.21.89.tele.video
Slide 3
I-WAY: Information Wide Area Year, Supercomputing '95
The First National 155 Mbps Research Network:
- 65 Science Projects
- Into the San Diego Convention Center
I-WAY Featured:
- Networked Visualization Application Demonstrations (e.g., CitySpace, Cellular Semiotics)
- Large-Scale Immersive Displays
- I-Soft Programming Environment
UIC
http://archive.ncsa.uiuc.edu/General/Training/SC95/GII.HPCC.html
Slide 4
Alliance 1997: Collaborative Video Production via Tele-Immersion and Virtual Director
Alliance Project Linking CAVE, ImmersaDesk, Power Wall, and Workstation
Donna Cox, Bob Patterson, Stuart Levy, Glen Wheless
www.ncsa.uiuc.edu/People/cox/
UIC
Slide 5
iGrid 2002, September 24-26, 2002, Amsterdam, The Netherlands
www.startap.net/igrid2002
Fifteen Countries/Locations Proposing 28 Demonstrations: Canada, CERN, France, Germany, Greece, Italy, Japan, The Netherlands, Singapore, Spain, Sweden, Taiwan, United Kingdom, United States
Applications Demonstrated: Art, Bioinformatics, Chemistry, Cosmology, Cultural Heritage, Education, High-Definition Media Streaming, Manufacturing, Medicine, Neuroscience, Physics, Tele-science
Grid Technologies: Grid Middleware, Data Management/Replication Grids, Visualization Grids, Computational Grids, Access Grids, Grid Portals
Sponsors: HP, IBM, Cisco, Philips, Level(3), Glimmerglass, etc.
UIC
Slide 6
iGrid 2002 Was Sustaining 1-3 Gigabit/s; Total Available Bandwidth Between Chicago and Amsterdam Was 30 Gigabit/s
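For scale (an editorial aside, not from the deck), a quick calculation of the utilization these numbers imply:

```python
# Editorial aside: fraction of the Chicago-Amsterdam capacity that
# sustained iGrid 2002 demo traffic actually used (numbers from the slide).
sustained_gbps = (1.0, 3.0)   # sustained demo traffic, Gb/s
available_gbps = 30.0         # total provisioned lambda capacity, Gb/s

for s in sustained_gbps:
    print(f"{s:.0f} Gb/s sustained = {s / available_gbps:.0%} of capacity")
# -> 3% and 10%: even flagship demos left most of the lambdas dark.
```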
Slide 7
The Move to Data-Intensive Science & Engineering: e-Science
Community Resources: ATLAS, Sloan Digital Sky Survey, LHC, ALMA
Slide 8
Why Optical Networks Are Emerging as the 21st Century Driver for the Grid
Scientific American, January 2001
9
CONTROLPLANECONTROLPLANE Clusters Dynamically Allocated Lightpaths Switch Fabrics Physical Monitoring Apps Middleware A LambdaGrid Will Be the Backbone for an e-Science Network Source: Joe Mambretti, NU
Slide 10
NSF Defines Three Classes of Networks Beyond the Commodity Internet
Production Networks (e.g., Internet2):
- High-Performance Networks
- Reach All US Researchers
- 24/7 Reliable
Experimental Networks:
- Trials of Cutting-Edge High-Performance Networks
- Deliver Advanced Application Needs Unsupported by Production Networks
- Robust Enough to Support Application-Dictated Development: Software Application Toolkits, Middleware, Computing and Networking
Research Networks:
- Smaller-Scale Network Prototypes
- Enable Basic Scientific and Engineering Network Research
- Testing of Component Technologies, Protocols, Network Architectures
- Not Expected to Be Persistent or to Support Production Applications
www.evl.uic.edu/activity/NSF/index.html
Slide 11
Local and Regional Lambda Experimental Networks Are Achievable and Practical
Several GigaPOPs and States Are Building:
- Multi-Lambda Metropolitan Experimental Networks
- Lighting Up Their Own Dark Fiber (I-WIRE, I-Light, CENIC CalREN-XD)
- With Hundreds of Lambdas by 2010
OptIPuter Funded to Research LambdaGrid:
- Middleware and Control Plane
- Application Driven
Substantial State and Local Funds Can Be Heavily Leveraged by an NSF Experimental Networks Program:
- Cross-Country Inter-Connection (National Light Rail)
- Persistent Support of Emerging Experimental Networks
- First NSF Workshop, UIC, December 2001
- Second NSF Workshop, UCI, May 2002
- Expected NSF RFP by Winter 2003
Slide 12
The Next S-Curves of Networking: Exponential Technology Growth
[Chart: technology-penetration S-curves (0-100%) over time, each passing from Research to Experimental/Early Adopters to Production/Mass Market. The Gigabit Testbeds and Connections Program (~1990s) matured into Internet2 Abilene (2000); DWDM Experimental Networks lead to Lambda Grids (2010)]
Slide 13
Cal-(IT)²: An Integrated Approach to the Future Internet
www.calit2.net
220 UC San Diego & UC Irvine Faculty Working in Multidisciplinary Teams With Students, Industry, and the Community
The State's $100 M Creates Unique Buildings, Equipment, and Laboratories
Slide 14
Data-Intensive Scientific Applications Require Experimental Optical Networks
Large Data Challenges in Neuro and Earth Sciences:
- Each Data Object is 3D and Gigabytes in Size
- Data Are Generated and Stored in Distributed Archives
- Research Is Carried Out on a Federated Repository
Requirements:
- Computing: PC Clusters
- Communications: Dedicated Lambdas Over Fiber
- Data: Large Peer-to-Peer Lambda-Attached Storage
- Visualization: Collaborative Volume Algorithms
Response: The OptIPuter Research Project
Slide 15
The Biomedical Informatics Research Network: a Multi-Scale Brain Imaging Federated Repository
BIRN Test-beds: Multiscale Mouse Models of Disease, Human Brain Morphometrics, and FIRST BIRN (a 10-site project for fMRI studies of schizophrenia)
NIH Plans to Expand to Other Organs and Many Laboratories
Slide 16
Microscopy Imaging of Neural Tissue
Confocal image of a sagittal section through rat cortex, triple-labeled for glial fibrillary acidic protein (blue), neurofilaments (green), and actin (red)
Projection of a series of optical sections through a Purkinje neuron, revealing both the overall morphology (red) and the dendritic spines (green)
Images: Marketta Bobik; Francisco Capani & Eric Bushong
http://ncmir.ucsd.edu/gallery.html
Slide 17
Interactive Visual Analysis of Large Datasets: East Pacific Rise Seafloor Topography
Scripps Institution of Oceanography Visualization Center
http://siovizcenter.ucsd.edu/library/gallery/shoot1/index.shtml
Slide 18
Tidal Wave Threat Analysis Using Lake Tahoe Bathymetry http://siovizcenter.ucsd.edu/library/gallery/shoot1/index.shtml Scripps Institution of Oceanography Visualization Center Graham Kent, SIO
Slide 19
SIO Uses the Visualization Center to Teach a Wide Variety of Graduate Classes:
- Geodesy
- Gravity and Geomagnetism
- Planetary Physics
- Radar and Sonar Interferometry
- Seismology
- Tectonics
- Time Series Analysis
Multiple Interactive Views of Seismic Epicenter and Topography Databases
Deborah Kilb & Frank Vernon, SIO
http://siovizcenter.ucsd.edu/library/gallery/shoot2/index.shtml
Slide 20
NSF's EarthScope: Rollout Over 14 Years, Starting With Existing Broadband Stations
Slide 21
NSF Experimental Network Research Project: The OptIPuter
Driven by Large Neuroscience and Earth Science Data:
- NIH Biomedical Informatics Research Network
- NSF EarthScope (UCSD SIO)
Removing Bandwidth as a Constraint:
- Links Computing, Storage, Visualization, and Networking
- Software and Systems Integration Research Agenda
NSF Large Information Technology Research Proposal:
- UCSD and UIC Lead Campuses
- USC, UCI, SDSU, NW Partnering Campuses
- Industrial Partners: IBM, Telcordia/SAIC, CENIC, Chiaro Networks, IXIA
PI: Larry Smarr; Funded at $13.5M Over Five Years, Start Date October 1, 2002
www.calit2.net/news/2002/9-25-optiputer.html
Slide 22
From SuperComputers to SuperNetworks: Changing the Grid Design Point
The TeraGrid is Optimized for Computing:
- 1024-Node IA-64 Linux Cluster
- Assume 1 GigE per Node = 1 Terabit/s I/O
- Grid Optical Connection of 4x10Gig Lambdas = 40 Gigabit/s
- Optical Connections Are Only ~4% of Bisection Bandwidth
The OptIPuter is Optimized for Bandwidth:
- 32-Node IA-64 Linux Cluster
- Assume 1 GigE per Node = 32 Gigabit/s I/O
- Grid Optical Connection of 4x10GigE = 40 Gigabit/s
- Optical Connections Are Over 100% of Bisection Bandwidth
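To make the design-point contrast concrete, here is an illustrative sketch (not from the slides) of the two ratios quoted above, computed as WAN optical capacity divided by the cluster's aggregate NIC bandwidth:

```python
# Illustrative check of the slide's bisection-bandwidth ratios:
# WAN optical capacity / aggregate node I/O bandwidth.
def wan_to_bisection_ratio(nodes: int, gige_per_node: float = 1.0,
                           wan_gbps: float = 40.0) -> float:
    aggregate_io_gbps = nodes * gige_per_node
    return wan_gbps / aggregate_io_gbps

print(f"TeraGrid : {wan_to_bisection_ratio(1024):.1%}")  # ~3.9% ("only 4%")
print(f"OptIPuter: {wan_to_bisection_ratio(32):.1%}")    # 125%  ("over 100%")
```

Shrinking the cluster while holding the optical connection fixed is what flips the bottleneck from the network to the endpoints.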
Slide 23
OptIPuter Inspiration: Node of a 2009 PetaFLOPS Supercomputer
[Diagram: multiple 24-GFLOPS, 6-GHz VLIW/RISC cores, each with a 96 MB second-level cache, connected through a crossbar (64-byte-wide, 160 GB/s coherence paths) to 4 GB of highly interleaved DRAM at 640 GB/s and to a multi-lambda all-optical network (AON)]
Source: Steve Wallach, Supercomputing 2000 Keynote
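An illustrative aside, not on the slide: at 24 GFLOPS per core, the core count a full PetaFLOPS implies is easy to estimate.

```python
# Illustrative arithmetic (not from the slide): cores needed to reach
# one PetaFLOPS out of 24-GFLOPS cores.
PFLOPS_IN_GFLOPS = 1_000_000   # 1 PetaFLOPS = 1e6 GFLOPS
core_gflops = 24

cores_needed = PFLOPS_IN_GFLOPS / core_gflops
print(f"~{cores_needed:,.0f} cores")   # ~41,667 cores
```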
Slide 24
Global Architecture of a 2009 COTS PetaFLOPS System
[Diagram: 64 multi-die multi-processor boxes (128 die/box, 4 CPUs/die) connected through a central all-optical switch to I/O and the LAN/WAN; 10 meters of optical path adds about 50 nanoseconds of delay]
Source: Steve Wallach, Supercomputing 2000 Keynote
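Two more illustrative checks, under the assumption (read from the diagram) of 64 boxes with the slide-23 core specs; the latency figure uses the usual ~2e8 m/s speed of light in fiber:

```python
# Illustrative checks on the system diagram (assumptions: 64 boxes
# around the optical switch; 24-GFLOPS cores as on the previous slide).
boxes, die_per_box, cpu_per_die, core_gflops = 64, 128, 4, 24

total_gflops = boxes * die_per_box * cpu_per_die * core_gflops
print(f"{total_gflops / 1e6:.2f} PetaFLOPS")   # ~0.79 PFLOPS, i.e. of
# order a PetaFLOPS, consistent with the "COTS PetaFLOPS" label.

# Latency: light in fiber travels at roughly 2e8 m/s (~5 ns/m),
# so a 10 m optical path costs about 50 ns, as the diagram notes.
ns_per_meter = 1e9 / 2e8
print(f"10 m -> {10 * ns_per_meter:.0f} ns")   # 50 ns
```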
Slide 25
OptIPuter NSF Proposal Partnered with National Experts and Infrastructure
[Map: OptIPuter sites (UCSD/SDSU, UCI, USC, UIC, NU) linked by CENIC, Pacific Light Rail, the TeraGrid DTFnet, and StarLight in Chicago, reaching Vancouver, Seattle, Portland, San Francisco, Los Angeles, San Diego (SDSC), NCSA, PSC, Atlanta, and NYC, and internationally via CA*net4, SURFnet, CERN, AMPATH, and the Asia Pacific]
Source: Tom DeFanti and Maxine Brown, UIC
Slide 26
OptIPuter LambdaGrid Enabled by Chiaro Networking Router
www.calit2.net/news/2002/11-18-chiaro.html
[Diagram: a Chiaro Enstara router at the core, with switched lightpaths supporting cluster-disk, disk-disk, viz-disk, DB-cluster, and cluster-cluster flows among Medical Imaging and Microscopy; Chemistry, Engineering, Arts; the San Diego Supercomputer Center; and the Scripps Institution of Oceanography]
Image Source: Phil Papadopoulos, SDSC
Slide 27
The OptIPuter Experimental UCSD Campus Optical Network
[Half-mile-scale campus map: a Chiaro router node linking SIO (Earth Sciences), SDSC and the SDSC Annex, CRCA (Arts), Physical Sciences-Keck (Chemistry), the School of Medicine, the Jacobs School of Engineering, the Preuss School (high school), and Sixth College (undergraduate), with a collocation point connecting to CENIC and the production router; Phase I in Fall 2002, Phase II in 2003]
Source: Phil Papadopoulos, SDSC; Greg Hidley, Cal-(IT)²
Slide 28
Planned Chicago Metro OptIPuter Electronic Switching Laboratory
[Diagram: a 16-processor McKinley cluster at the University of Illinois at Chicago and a 16-processor Montecito/Chivano cluster at Northwestern, each joined by 10x1 GE + 1x10GE metro links to StarLight (16x1 GE and 16x10 GE switching); national GE/10GE links to Illinois, California, Wisconsin, Indiana, Abilene, FedNets, Washington, Pennsylvania, and more; international links to Canada, Holland, CERN, GTRN, AmPATH, Asia, and more]
Source: Tom DeFanti
Slide 29
Metro Optically Linked Visualization Walls with Industrial Partners Set Stage for Federal Grant
Driven by SensorNets Data:
- Real-Time Seismic
- Environmental Monitoring
- Distributed Collaboration
- Emergency Response
Linked UCSD and SDSU Control Rooms over 44 Miles of Cox Fiber (Dedication March 4, 2002)
Partners: Cox, Panoram, SAIC, SGI, IBM, TeraBurst Networks, SD Telecom Council
Slide 30
NTT Super High Definition Video (4Kx2K = 8 Megapixels) Over Internet2, StarLight in Chicago to USC in Los Angeles
SHD = 4x HDTV = 16x DVD
Applications: Astronomy, Mathematics, Entertainment
www.ntt.co.jp/news/news02e/0211/021113.html
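A rough check of those resolution ratios, using common frame sizes that are assumed here rather than given on the slide:

```python
# Rough pixel-count check of "SHD = 4x HDTV = 16x DVD", using
# assumed frame sizes (not specified on the slide).
shd  = 3840 * 2048   # NTT SHD "4Kx2K", ~8 megapixels
hdtv = 1920 * 1080   # ~2 megapixels
dvd  =  720 *  480   # ~0.35 megapixels

print(f"SHD/HDTV = {shd / hdtv:.1f}x")  # ~3.8x, i.e. roughly 4x
print(f"SHD/DVD  = {shd / dvd:.1f}x")   # ~22.8x; the slide's 16x treats
# DVD as ~0.5 MP, so these ratios are order-of-magnitude shorthand.
```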
Slide 31
OptIPanel: 5x3 Grid of 1280x1024-Pixel LCD Panels Driven by a 16-PC Cluster
Resolution = 6400x3072 Pixels, or ~3000x1500 Pixels in Autostereo
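The panel arithmetic checks out; a minimal sketch, assuming autostereo interleaving halves each dimension:

```python
# Checking the OptIPanel resolution arithmetic from the slide.
cols, rows = 5, 3
panel_w, panel_h = 1280, 1024

wall_w, wall_h = cols * panel_w, rows * panel_h
print(f"{wall_w}x{wall_h} = {wall_w * wall_h / 1e6:.1f} megapixels")
# -> 6400x3072 = 19.7 megapixels; autostereo interleaves left/right-eye
#    views, roughly halving each dimension to ~3200x1536 ("~3000x1500").
```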
Slide 32
The Continuum at EVL and TRECC: OptIPuter Amplified Work Environment
Components: Passive Stereo Display, AccessGrid, Digital White Board, Tiled Display
Source: Tom DeFanti, Electronic Visualization Lab, UIC
Slide 33
OptIPuter Transforms Individual Laboratory Visualization, Computation, & Analysis Facilities
The Preuss School UCSD OptIPuter Facility: Fast Polygon and Volume Rendering with Stereographics
3D Applications:
- Earth Science: GeoWall with the GeoFusion GeoMatrix Toolkit; Underground Earth Science (Rob Mellors and Eric Frost, SDSU)
- Neuroscience: SDSC Volume Explorer (Dave Nadeau, SDSC, BIRN)
- Anatomy: Visible Human Project (NLM, Brooks AFB), SDSC Volume Explorer