The Access Grid at Vislab
Chris Willing
Vislab, University of Sydney, Australia
Today
– Background
– Differences to “traditional” VC
– Vislab Implementation – ATP
– Vislab Implementation – Physics
– Virtual Venues
– Differences to ANL implementation
– Future Work
Background
– VISLAB – visualisation and high-performance computing labs at Physics and ATP
– Bernard Pailthorpe at SDSC in 2000
  – High resolution displays
  – AG node
– Start April 2001, for SC Global in November 2001
Hyogo Prefecture (Japan) Police Command Center – reference site
– Tiled rear-projection array of 6 Model 200 ILA projectors
SmartSpaces, Stanford
– Courtesy of Pat Hanrahan, CS, Stanford
PowerWall, Princeton
– 8x commodity projectors
– Courtesy of Kai Li, CS, Princeton
High Density Tiled Display, SDSC
– 3x3 array with common light source
Display, VisLab
– 3x1 array, 3840x1024 pixels, single light source
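For a sense of the geometry, here is a minimal, purely illustrative sketch of how a 3840x1024 desktop splits across a 3x1 projector array, assuming each tile runs at 1280x1024 (the “normal screen” size quoted later); it is not the actual display configuration code.

```python
# Illustrative only: split a 3840x1024 desktop across a 3x1 projector array,
# assuming each tile is 1280x1024.
WALL_W, WALL_H = 3840, 1024
COLS, ROWS = 3, 1
TILE_W, TILE_H = WALL_W // COLS, WALL_H // ROWS

for row in range(ROWS):
    for col in range(COLS):
        x, y = col * TILE_W, row * TILE_H
        print(f"projector ({col},{row}): {TILE_W}x{TILE_H} at offset +{x}+{y}")
```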
Application in archaeology: Angkor
Differences – 1. Traditional
– Each node sends image stream(s) to an MCU, which decides what is output to each node
– TV style
Differences – 2. Access Grid
– Each node sends its image streams to a multicast group address
– Each node sees all other sources
But if “each node sees all other sources”, the screen quickly overloads
– e.g. 10 other sites, each with 3 cameras => 30 video streams
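The mechanics behind “send to a group, see everyone” are plain IP multicast: every node transmits to the same group address, and every node that has joined the group receives all of it. The sketch below is a minimal illustration only; the group address and port are invented, and the real Access Grid tools (vic/rat) add RTP framing on top.

```python
import socket
import struct

# Invented example addresses -- real venues get theirs from the
# Virtual Venues server (see the Virtual Venues slide).
GROUP = "224.2.177.155"
PORT = 55524

def make_sender():
    """UDP socket that sends packets to the venue's multicast group."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 127)
    return s

def make_receiver():
    """UDP socket joined to the group: it receives every node's packets."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return s

if __name__ == "__main__":
    tx = make_sender()
    tx.sendto(b"camera-1 frame data ...", (GROUP, PORT))
```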
Normal screen
– First Access Grid session at USyd, 29 Aug 2001
Widescreen (5120x1024) vs. normal screen (1280x1024)
Summary of Differences
– limited vs. rich user experience
– complex (studio) vs. simple (PC)
– proprietary vs. open
– expensive vs. cheap
– but richness requires pixels
Technology diversion – 1. Unicast
– 3 streams to 7 sites = 21 video streams
Technology diversion – 2. Multicast
– 3 streams to 7 sites = 3 streams (mostly)
– Efficient, but who pays?
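A back-of-the-envelope way to see the stream counts (the “(mostly)” is because multicast still costs replication inside the network, just not at the sender):

```python
def unicast_streams(streams_per_site: int, receiving_sites: int) -> int:
    # Every stream is copied once for each receiving site.
    return streams_per_site * receiving_sites

def multicast_streams(streams_per_site: int, receiving_sites: int) -> int:
    # Every stream is sent once to the group, however many sites listen.
    return streams_per_site

print(unicast_streams(3, 7))    # 21
print(multicast_streams(3, 7))  # 3
```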
Vislab Implementation (ATP)
– Overview diagram: cameras and mics feed video and audio (via the Gentner) into the node; output to projectors, PA and a monitor
Access Grid Sydney
– Preparing for SC Global: test “cruises” + A/G-Sydney
ATP display
– Diagram: monitor display and 3840x1024 projected display, driven from the network
ATP video capture
– Diagram: table, screen and monitor cameras captured to the network; projectors for local display
ATP audio capture
– Diagram: 1x radio mic (Rx) and 2x PZM mics into the Gentner; audio to/from the network; monitor and PA output
Gentner – echo control
– Diagram: Site A <-> Site B
– Echo delay due to:
  – distance
  – application buffering
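A rough, purely illustrative calculation of why the echo canceller matters (all numbers below are assumptions, none were measured in the talk): intercontinental round-trip time plus capture/playout buffering at both ends easily pushes the returning echo past half a second.

```python
# All values are assumptions for illustration, not measurements.
network_rtt_ms = 300        # assumed Sydney <-> US round trip
app_buffer_ms = 2 * 150     # assumed capture + playout buffering, both ends
room_path_ms = 20           # assumed loudspeaker-to-mic path at the far site

echo_delay_ms = network_rtt_ms + app_buffer_ms + room_path_ms
print(f"your own voice returns roughly {echo_delay_ms} ms later")  # ~620 ms
```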
Vislab Implementation (Physics)
– Diagram: monitor plus 2560x1024 projected display, fed from the network
– 2x cameras, 1x video, Polycom Soundstation
Virtual Venues
– ANL runs a Virtual Venues server
– APAG server at
– Virtual rooms characterised by their facilities:
  – Video multicast group address
  – Audio multicast group address
  – MUD location (back-channel link for participants)
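A venue, then, is essentially a small record of addresses a node needs in order to join. The sketch below is hypothetical: the field names and all address values are invented, not the actual Virtual Venues schema.

```python
from dataclasses import dataclass

@dataclass
class Venue:
    # Every value used below is an invented example, not real venue data.
    name: str
    video_group: str    # multicast group address for the venue's video
    video_port: int
    audio_group: str    # multicast group address for the venue's audio
    audio_port: int
    mud_location: str   # back-channel text link for participants

test_room = Venue(
    name="Test Room",
    video_group="224.2.177.155", video_port=55524,
    audio_group="224.2.177.156", audio_port=16964,
    mud_location="mud.example.org:8888",
)
```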
Differences to ANL
– Linux only
  – stability
  – sync with vvd (not CORBA event handler)
  – potential for compute clustering (openMosix)
– Graphics cards (nVidia Quadro4)
  – use “well known” visualisation apps, e.g. Performer, OpenInventor, OpenDX
  – potential for video clustering (Chromium)
Video clustered AG
– Diagram: display machines 1, 2 and 3 driving the monitor and projectors; cameras, mics, Gentner, PA and video/audio capture as before
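One way to picture video clustering (a hypothetical sketch, not the Vislab or Chromium implementation): each display machine takes responsibility for a share of the incoming video streams, so no single machine has to decode all 30.

```python
# Hypothetical sketch of spreading incoming video streams across the
# display machines of a clustered node; host names are invented.
def assign_streams(stream_ids, display_hosts):
    """Round-robin each incoming stream onto one display host."""
    plan = {host: [] for host in display_hosts}
    for i, sid in enumerate(stream_ids):
        plan[display_hosts[i % len(display_hosts)]].append(sid)
    return plan

hosts = ["display1", "display2", "display3"]
streams = [f"site{s}-cam{c}" for s in range(1, 11) for c in range(1, 4)]
for host, assigned in assign_streams(streams, hosts).items():
    print(host, "decodes", len(assigned), "streams")   # 10 each
```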
Performer with AG
High Resolution with AG
Future
– Video clustering, stereo viewing
– Shared event stream
– Higher resolution (PAL, HDTV)
  – H.263, hardware MJPEG
– External machine capture
– Coexistence with H.323 (via VRVS)
Links