The Access Grid: Introduction and Case Studies
Jennifer Teig von Hoffman, Boston University (jtvh@bu.edu)
National Computational Science Alliance
OSHEAN Wave Conference, November 8, 2001

Objectives
Introduction to the Access Grid (AG)
AG use at Boston University and nationwide
–Special focus on events organized or joined by AG nodes in New England
Case studies
–MPI workshop
–SC Global
Looking forward

Introduction to the Access Grid

What is the Access Grid?
A group of interconnected nodes, each using a specific suite of hardware, software, and tools to facilitate group-to-group collaboration over the Internet
A project led by the Argonne National Laboratory Futures Group, with strong participation from the National Computational Science Alliance and others
–Boston University is an Alliance member
–Boston University is one of many organizations actively involved in packaging and documenting the Access Grid, as well as an active user

What is an AG Session Like?
Interacting with multiple sites, each with multiple people
–The technical limits are unknown, but beyond about 10 sites the human interactions become hard to manage
Room-based computing
–Large screens to display video and visual aids
–BU's screen is 88” x 162”, rear-projected
–Hands-free audio
–BU uses mostly tabletop microphones

Supercomputing 2000, Dallas, Texas

Meeting with Illinois Governor George H. Ryan, Urbana, Illinois

“Soft, Fluffy, and Virtual,” by Cindy Ludlam

Alliance Chautauqua 1999 at Boston University, Boston, Massachusetts

Virtual Venues: The Metaphor
Access Grid Lobby, opening onto:
–Jack Frost Room
–Windmer Room
–Big Horn Room
–Bridgeport Room
–Full Sail Room
–Lucky Labrador Room
–Secure Room
–Screening Room
–Test Room

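The venue metaphor above, a lobby opening onto named rooms, maps naturally onto a simple data structure. A minimal sketch in Python (the `Venue` class and its fields are illustrative assumptions, not the actual AG venues-server code; only the room names come from the slide):

```python
class Venue:
    """A virtual room that AG sites can gather in (illustrative only)."""
    def __init__(self, name):
        self.name = name
        self.occupant_sites = []   # sites currently "in" this room

    def enter(self, site):
        self.occupant_sites.append(site)

# The lobby is the entry point; every other venue hangs off it.
lobby = Venue("Access Grid Lobby")
rooms = {name: Venue(name) for name in [
    "Jack Frost Room", "Windmer Room", "Big Horn Room",
    "Bridgeport Room", "Full Sail Room", "Lucky Labrador Room",
    "Secure Room", "Screening Room", "Test Room",
]}

rooms["Test Room"].enter("Boston University")
print(len(rooms))                         # 9 named rooms off the lobby
print(rooms["Test Room"].occupant_sites)  # ['Boston University']
```

The point of the metaphor is exactly this indirection: sites meet in a named room rather than dialing each other directly.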
Hardware/Software Requirements
All components are open source, off-the-shelf, or both
An average AG node currently costs about $45,000
–This estimate does not include networking, an important cost consideration for many smaller institutions
Anybody with the right hardware, software, and network can get on the AG
–No fees, no need to join anything (though you do need to register for some services)

Networking Requirements
Multicast-enabled 100BASE-T connection to the AG node
100 Mbit/sec path from the node to the Internet
Contact mcast-support@accessgrid.org for more info

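“Multicast-enabled” here means IP multicast: senders transmit to a group address in the 224.0.0.0/4 range, and the network replicates each packet to every receiver that has joined the group, which is what lets many AG sites share audio and video streams efficiently. A minimal receiver-side sketch using only the Python standard library (the group address and port below are examples, not real AG venue assignments):

```python
import ipaddress
import socket
import struct

GROUP = "224.2.0.1"   # example group address, not a real AG venue assignment
PORT = 9999

# Multicast group addresses occupy the range 224.0.0.0/4.
assert ipaddress.IPv4Address(GROUP).is_multicast
assert not ipaddress.IPv4Address("128.197.0.1").is_multicast  # ordinary unicast

# A receiver binds a UDP socket and asks the kernel to join the group;
# a multicast-enabled network then delivers the group's packets to it.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
try:
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    joined = True
except OSError:
    joined = False  # no multicast-capable interface on this host
sock.close()
print("joined group" if joined else "no multicast route")
```

If multicast is not enabled end to end, joining succeeds locally but no traffic arrives, which is why the slide points at mcast-support@accessgrid.org rather than at node configuration.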
AGCF @ BU: Current Installation
Four commodity PCs
–Two running Windows
–Two running Red Hat Linux
Three projectors and a custom-built rear-projection screen
Audio mixers and echo-cancellation devices
Speakers, cameras, microphones
Connection to the Abilene network

How Many Access Grid Nodes?
All figures here are estimates; there is no perfect way to track the number of nodes
New England
–Boston University, Massachusetts
–Dartmouth College, New Hampshire
–University of Maine, Maine
–Coming soon: Worcester Polytechnic Institute, Massachusetts
Across the United States
–58 nodes in 28 states (including Alaska and Hawaii)

Access Grid Nodes Outside the US
14 nodes in 8 countries
–Australia
–Canada
–China
–Germany
–Italy
–Japan
–Korea
–United Kingdom
(Pictured: Sydney VisLab, Australia)

Access Grid Use at Boston University and Nationwide

AG-Based Events at BU
Five Alliance Chautauquas, a conference by and for computational scientists
–1999: University of New Mexico; University of Kentucky; Boston University
–2000: OSC; University of Kansas

AG-Based Events at BU (2)
Conferences, workshops, and seminars on computational science
–Topics including PETSc, MPI, and Globus
Meetings, both formal and informal, of organizations including:
–Boston University Teaching and Learning Technology Roundtable
–Institute for African-American E-Culture
–Many Alliance Working Groups and Teams
Distributed Rap Sessions, sponsored by the Coalition to Diversify Computing
NSF site reviews

AG-Based Events at BU (3)
Boston Cyberarts Festival, May 2001
–“Soft, Fluffy, and Virtual,” in collaboration with Cindy Ludlam
–“Tracer,” in collaboration with Deborah and Richard Cornell

Partial List of NE Participants @ BU
Dartmouth College
Harvard University
Massachusetts General Hospital
Massachusetts Institute of Technology
Museum of Science
Northeastern University
Tufts University
University of Maine
University of Massachusetts Medical Center
Worcester Polytechnic Institute
(and countless others at demos)

Interesting Uses of the AG (not BU)
Collaborative teaching of a graduate course
–University of Alaska Fairbanks
–University of Montana
–University of New Mexico
TOUCH (Telehealth Outreach for Unified Community Health), University of New Mexico
–http://hsc.unm.edu/telemedicine/projects_touch.html
Numerous lunchtime talks and seminars
Job interviews
Thesis defenses

Case Studies

Introduction to the Case Studies
Different types of events require different types of preparation
–Just like face-to-face events
Informal events over the AG require little if any preparation, assuming all participating sites have well-functioning AG nodes
Larger and/or more formal events can require substantial preparation

Case Study: MPI Workshop
Sponsored by the National Computational Science Alliance
Workshop on the use of message-passing libraries for parallel computing
Two-day workshop, approx. 6 hours/day
Reference: Report on the March 28–29, 2001, MPI Workshop over the Access Grid, Leslie Southern, Ohio Supercomputer Center, April 2001
–http://alliance.osc.edu/mpi/report.pdf

Overview of MPI Workshop
Lectures and, where available, hands-on lab sessions
–Lab sessions used a text chat tool when the labs were not in the same room as the AG nodes
–Some sites did not have appropriate lab facilities, so the workshop was designed with the lab sections optional
Informal style of interaction; participants were encouraged to break in with questions at any time

Sites Participating in MPI Workshop
Ohio Supercomputer Center (host)
Albuquerque High Performance Computing Center
Boston University
Dartmouth
University of Kansas
National Center for Supercomputing Applications
North Dakota State University
University of Kentucky
96 participants total

Preparations for MPI Workshop
The Alliance PACS Training Group decided to hold the workshop
–This was an existing group of collaborators, not a new collaboration formed for the purpose
Determined the structure of the course, with AG strengths and weaknesses in mind
Announced the course to the AG community, to solicit potential participating sites outside the PACS Training Group

Preparations for MPI Workshop (2)
Four “test cruises” held during the two months before the workshop
–Technical cruises to ensure that all AG nodes were operating well
–A rehearsal cruise for the instructor to get more familiar with the AG
Appropriate software and other files installed on lab machines at sites participating in lab sessions

Preparations for MPI Workshop (3)
Lab exercises developed by the host site and distributed electronically
Various logistics preparations at each site
–Catering, name tags, directions, handouts, guest accounts, etc.

Backup and Back-Channel Plans
Backup audio plans
–Conference call in place over the Plain Old Telephone System (POTS)
–Sites dialed their audio gear into the conference call, for a quick and easy switch to backup in case of a major network failure
Network bridge (Multi-Session Bridge)
Back-channel communications
–Separate conference call for node operators
–Text-based MOO (the standard, built-in AG back channel)

The Event Itself
Went very smoothly overall
Some sites fell back to the network bridge and/or POTS
–Switching over to POTS is non-disruptive, other than a sudden change in audio quality
No difficulties with video or display media (PowerPoint and desktop sharing)
–AG video is not full-motion, but seems to support interaction well

Evaluation of MPI Workshop
27% of attendees submitted evaluation forms (web-based)
Overall evaluation score: 4.64 out of 5
–The evaluation form included questions about course content, the instructor, use of training technologies, and local facilities
Responses to some questions were very site-specific
–Different rooms, different equipment, different staff, etc.

Evaluation of MPI Workshop (2)
At least 88% strongly agreed or agreed that:
–Materials were clearly visible
–The facility was conducive to learning
–Video was satisfactory
–The presenter was clearly visible
82% strongly agreed or agreed that the quality of the audio was satisfactory

Case Study: SC Global
November 12–16, 2001
The AG-enabled component of the Supercomputing 2001 conference in Denver, Colorado
One to six venues of concurrent content
–Three venues of conference program sessions
–A high-quality version of the webcast content
–An AG robot traversing the show floor
–A Showcase Node with innovative, visually appealing content on the show floor
Participating sites from around the world

Supercomputing 2001
The premier technical and industrial meeting for high-end networking, computing, and computational science
Sponsored by
–IEEE Computer Society
–Association for Computing Machinery / Special Interest Group on Computer Architecture
Conferences have been held annually since 1987
Expected attendance this year: 5,000

SC Global
The first truly global technical conference on the grid
Will use AG technology to link the core Supercomputing conference in Denver with dozens of constellation sites across the US and worldwide
At least 34 participating sites in at least 10 countries

SC Global Participating Sites (image by Robert Patterson, NCSA)

Some SC Global Challenges
Multiple concurrent venues of content will stress support systems
–Technical support
–Program/content support
–Plain old coordination and logistics
Extensive use of Voyager, a multimedia server for recording and playback of AG sessions
Crossing time zone, language, and culture boundaries

Preparing for SC Global
February: Proposals for participating sites
March: Call for proposals for “content”
–Collaboration with SC2001 content committees
–Content includes art and dance, in addition to topics more directly related to high-performance computing
July–November: Nanocruises
September: Production Institute
October: Megacruise Week
October–November: Development of Production Plans

Nanocruises (June–November)
Individual cruises led by staff from the many participating sites
–Cruise leaders are generally seasoned AG operators
Twice-weekly one-hour test sessions to ensure
–Nodes are fully operational
–Operators are fully trained

Production Institute (September 17–21)
23 participating sites
13 sessions of instruction for
–AG technical and production leaders
–Speakers
–Masters of Ceremony

Megacruise (October 8–12)
In theory, dry runs of all SC Global content, using Production Plans
–A Production Plan is essentially a detailed agenda, including all relevant technical info
In actuality, dry runs of much of the SC Global content, and a very good reminder for those who weren’t ready for their dry runs

Staffing Infrastructure for SC Global
Staffing at Denver core nodes
–Each of the four “core” nodes staffed with 5 or 6 people
–Roles will include a Production and/or Technical Lead, Floor Manager, and Master of Ceremonies for each room
Staffing at constellation sites
–Varies from 1 to 5 staff per node, depending on local circumstances
–Roles similar to those listed above

Evaluation
Two evaluation forms:
–One for participants at the Denver “core” nodes
–One for organizers of participation at remote nodes
Each form will of necessity approach questions from a different point of view
–Were resources not limited, we might have implemented additional evaluation forms as well

Moving Forward: “What’s Next?”

Some Works in Progress
Addressing the question of participating in AG sessions from desktop computers and videoconferencing systems
–Difficulties both in technical compatibility and in user interface
Additional tools for desktop sharing, scientific visualization, and other kinds of collaboration

More Works in Progress
Human factors studies
–Motorola Laboratories
–University of Illinois
Distributed venues servers
–Multiple virtual buildings, each containing multiple virtual rooms
–At present, at least three AG venues servers support many day-to-day activities
–Venues server software available for download

To Learn More
Access Grid Home Page: http://www.accessgrid.org/
Access Grid Documentation Project: http://www.accessgrid.org/agdp/
Access Grid Conference Facility at Boston University: http://scv.bu.edu/accessgrid/

... And an Invitation
Selected sessions from SC Global available at the Access Grid Conference Facility at Boston University
–http://scv.bu.edu/events/
–ariella@bu.edu
–617-353-0817
Sessions include the TOUCH telehealth project, VR-based art from BU, and AG “vision vs. reality”