
1 The Boulder Area Teragrid (BAT) “Step up to the Plate” – Marla Meehl, Peter O’Neil, Jim Van Dyke

2 Outline
- Motivations for connecting to the Teragrid
- The pyramid and hierarchies of networking
- CENIC (California) and I-Wire (Illinois)
- NSF ANIR’s hierarchy for funding
- Distributed Terascale Facility vision
- Pacific Light Rail/Teragrid access to DTF
- Applications and benefits
- Costs and potential funding partners

3 Motivations
- Make UCAR/NCAR, NOAA/NIST, and CU-Boulder (BAT) facilities (supercomputers, mass storage, visualization, models, datasets, and networking technologies) available to the Teragrid community
- Enable advanced scientific application services
- Investment in infrastructure to position the BAT for emerging federal program funding

4 Motivations
- Initiate the BAT collaboration in national and international “experimental” grid networking
- Accelerate exchange of technical expertise
- Natural extension of BRAN
- Prudent and scalable path for the long term
- Political clout of being “on” the Teragrid

5 PITAC Report Considerations
- President's Information Technology Advisory Committee (PITAC)
- NSF Centers no more than one generation behind ASCI
- Support for massive petabyte databases
- Near future: bandwidth across the country as fast as, if not faster than, bandwidth within centers
- 10 GHz processors by the end of the decade for petaflop computing
- LSN projects and funding driven by PITAC report(s)

6 NSF Review of NCAR/SCD
- “…advanced networking capabilities will be required to make this data available to the community that NCAR serves.”
- “NCAR must lead the way to this future.”
- Be a “player” on the national stage – not follow or lag behind
- “SCD will need to increase its investment in research – research carefully directed at the most critical problems faced by the computational atmospheric sciences community.”
- “The panel recognizes that the provision of computing, datasets, data analysis, and networking services to the university community is central to the mission of SCD.”
- “NCAR must continue to develop its networking capabilities, otherwise, it will not be able to serve its proper role as a data repository for the atmospheric sciences community.”

7 Our Challenge “The world has seemingly grown smaller through astonishing advances in telecommunications, but we have, at the same time, a vastly greater appreciation of the complexity and interrelatedness of the physical and human spheres that form the coupled Earth system. NSF now uses new terms such as ‘planetary metabolism’ and ‘planetary ecology’ to capture the need to think in a more integrated sense about humanity’s relationship with the natural world. These scientific challenges are indeed grand.” – NCAR Strategic Plan, 2001

8 SCD’s Challenge “New to SCD is a focus on using this [intellectual] leadership as a transformational engine for NCAR and its community through convergence of elements of the information technology revolution, such as collaborative environments and connection to NSF’s Teragrid of distributed computing and data services.” – NCAR Strategic Plan, 2001

9 CENIC Pyramid

10 Future Networks
- Operational High Performance (Production) Networks
- Experimental Infrastructure Networks
- Research Networks
Networking – ANIR/NSF

11 High Performance (Production) Networks
- Abilene, vBNS+, FedNets (ESnet, DREN, NREN)
- Essential tool for research and education
- Always available and dependable, 24/7
- High-performance international connectivity
- Exciting future: NSF support for focused activities, e.g., middleware, measurement initiatives, network simulation
Networking – ANIR/NSF

12 Experimental Networks – PLR/NLR, I-Wire, Teragrid
- High-performance trials of cutting-edge networks
- Based on advanced application needs unsupported by existing production network services
- Ultra high speed – one or more 10-40 Gig waves
- Robust enough to support application-dictated development of application software toolkits, middleware, computing, and networking
- Provide delivered experimental services on a persistent basis, yet encourage experimentation with innovative and novel concepts
- International connectivity
Networking – ANIR/NSF

13 Research Net – Point-to-Point Waves, DTF
- Experiment with disruptive technologies
- Design of experiments
- Implementation of experiments
- Evaluation of results
- Smaller-scale network prototypes that enable basic scientific and engineering network research and testing of component technologies, protocols, and network architectures
Networking – ANIR/NSF

14 “The Network is the Supercomputer”

15 Distributed/Extended Machine Rooms (MR2MR) DTF

16 Critical Mass Sites (map diagram, 12/5/01) – legend: Top 10 Research Universities; Next 15 Research Universities; Key Centers and Labs; International 10 Gig & DTF
- Increasingly, with broadband and even private waves, fiber is needed for end-to-end (e2e) experimental/developmental networks

17 Critical Mass Sites (draft map diagram, 12/4/01) – legend: Top 10 Research Universities; Next 15 Research Universities; Centers and Labs; International 10 Gig & Key Hubs
- Leverage regional connections
- Incent fatter/dedicated pipes
- Enable significant e2e
- Connect scientists/labs/devices
- Establish Tera/MetaPoP centers

18 Enabling New Class of Applications
- Data-intensive computing
- Collaboration technology
- Distance visualization
- Workflow management and collaborative problem-solving environments
- Management of large-scale, distributed, multi-institutional systems, e.g., Grid
- Sensornet
- Hierarchical data delivery

19 Applications
- Turbulence
- Big Data: 1000-year climate data
- Earth Systems Grid
- Data portals
- Atmospheric reanalysis
- Windows to the Universe
- NSF Cyber-infrastructure – eScience (NCAR is being asked to be a leader here)
- Doppler radar networks

20 What If DTF Fails?
- The data, network, and application-focused research and development could still make the Teragrid a success
- The BAT could be a big part of this success
- Custom (tailor-made) networks
- Efficient networks that are geographically placed
- Data repositories
- Agile optical transport networks
- Enhancing and scaling networks

21 Benefit to Non-Teragrid BAT Users
- Benefits all users if we can access big datasets on the NCAR Mass Storage System and other storage systems
- Pre-positioning to be highly competitive in the e-Science funding environment
- Access to broader data repositories
- Testbeds for refining standards and technology in a limited environment for application-dictated services on a broader basis (in cooperation with the private sector)
- Positions Colorado for state-level advanced networking efforts

22 Additional Benefits to UCAR Universities of Experimental Network “The pursuit of knowledge drives a researcher’s experimental design, which in turn, determines the scientific resources required, which then drives the information resources and services required. Or, that is how it should be, from an application’s point of view. A major complaint from application scientists is that historic funding mechanisms for FedNets create the opposite order, whereby networks define the limits of the applications. Furthermore, end-to-end requirements have not been addressed; the problem of routinely getting from science machines in the sites/campuses to the high-performance wide-area network is unsolved.” – NSF CISE Grand Challenges in e-Science Workshop Report, 2002

23 Experimental Networks to Incubate Paradigm Shift “Networks should be described as collections of application services rather than by their circuits, their theoretical bandwidth or their architectures, and experimental networks are the only likely means for incubating this paradigm shift… e-Science developers care only about services delivered at the application level – such as observed data transfer rates, video frame rates, reliable multicasting, and inter-organizational security and authentication capabilities. Delivery of application services requires a vertical integration effort – from the network infrastructure level all the way to the application layer, requiring a ‘paradigm shift’ in the way the Nation thinks about high-performance networks.” – NSF CISE Grand Challenges in e-Science Workshop Report, 2002

24 Cost Sharing and Funding Opportunities
- UCAR/NCAR, NOAA/NIST, and CU-Boulder cost share for experimental/research efforts
- NSF ANIR and ATM support for NCAR interconnectivity to the Teragrid
- Year 3 Teragrid funds ($37.5M)
- January solicitation for TeraPoPs as core nodes of “experimental” network infrastructure for optical application transfer
- March solicitation for experimental network research
- Potential DOE funding

25 Funding Criteria
- Setting priorities between areas of research ultimately requires tradeoffs between different goals, quality of life, and expanding the frontiers of human knowledge and understanding
- The allocation of funds to research is primarily a political process
- Promote tools, technologies, or facilities that can accelerate the pace of discovery in the geosciences
- Enable cyber-infrastructure for integrated and interdisciplinary studies
– CRA Testimony to Senate Advisory Committee

26 Budget Assumptions
- Two cost options
- Includes cost of spare equipment
- Boulder fiber interconnect costs undetermined
- No direct attachment to the DTF

27

28 Teragrid/Light Rail Cost Summary – Worst Case

Item                      | NCAR-Only One-Time | NCAR Recurring | Shared One-Time | Shared Recurring
Denver Node               | $1M (10 yr IRU)    | $200K          | $333K           | $67K
Juniper Router            | $500K (3 yr life)  | $50K           | $167K           | $17K
Light Rail Sub-Total      | $1.5M              | $250K          | $500K           | $84K
Boulder-Denver Dark Fiber | $500K (10 yr IRU)  | $75K           | $167K           | $25K
DWDM Equipment            | $1M (3 yr life)    | $100K          | $333K           | $33K
NCAR Juniper Router       | $1M (3 yr life)    | $100K          | $333K           | $33K
Boulder-Denver Sub-Total  | $2.5M              | $275K          | $833K           | $91K
TOTAL                     | $4M                | $525K          | $1.33M          | $175K
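The sub-totals and grand total in this table are straightforward column sums of the line items. A minimal Python sketch to re-derive them (the figures are copied from the slide; the data structures and summing logic are my own illustration, not part of the original):

```python
# Worst-case cost summary: line items copied from slide 28.
# Each tuple: (NCAR one-time, NCAR recurring, shared one-time, shared recurring), in dollars.
light_rail = {
    "Denver Node":    (1_000_000, 200_000, 333_000, 67_000),
    "Juniper Router": (  500_000,  50_000, 167_000, 17_000),
}
boulder_denver = {
    "Boulder-Denver Dark Fiber": (  500_000,  75_000, 167_000, 25_000),
    "DWDM Equipment":            (1_000_000, 100_000, 333_000, 33_000),
    "NCAR Juniper Router":       (1_000_000, 100_000, 333_000, 33_000),
}

def subtotal(items):
    # Column-wise sums across all line items in a segment.
    return tuple(sum(col) for col in zip(*items.values()))

lr = subtotal(light_rail)
bd = subtotal(boulder_denver)
total = tuple(a + b for a, b in zip(lr, bd))

print("Light Rail sub-total:    ", lr)     # (1_500_000, 250_000, 500_000, 84_000)
print("Boulder-Denver sub-total:", bd)     # (2_500_000, 275_000, 833_000, 91_000)
print("TOTAL:                   ", total)  # (4_000_000, 525_000, 1_333_000, 175_000)
```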

29

30 Teragrid/Light Rail Cost Summary – Current Realistic Estimate

Item                      | NCAR-Only One-Time | NCAR Recurring | Shared One-Time | Shared Recurring
Denver Node               | $1M (10 yr IRU)    | $200K          | $333K           | $67K
10 GigE Switch            | $200K (3 yr life)  | $20K           | $67K            | $7K
Light Rail Sub-Total      | $1.2M              | $220K          | $400K           | $74K
Boulder-Denver Dark Fiber | $150K (10 yr IRU)  | $30K           | $50K            | $10K
10 GigE Switch            | $800K (3 yr life)  | $80K           | $267K           | $27K
Boulder-Denver Sub-Total  | $950K              | $110K          | $317K           | $37K
TOTAL                     | $2.15M             | $330K          | $717K           | $110K
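Because some one-time costs run over a 10-year IRU and others over a 3-year equipment life, the two scenarios are easier to compare on an annualized basis. The sketch below is an illustrative assumption on my part (straight-line amortization of one-time costs over their stated terms, NCAR-only figures); it is not part of the original budget slides:

```python
# Rough annualized comparison of the worst-case (slide 28) and realistic (slide 30)
# scenarios. Assumes straight-line amortization of one-time costs over the stated
# 10-year IRU or 3-year equipment life -- an illustrative model, not from the slides.

def annualized(items):
    # items: (one_time_cost, term_years, recurring_per_year), all NCAR-only figures.
    return sum(one_time / years + recurring for one_time, years, recurring in items)

worst_case = [
    (1_000_000, 10, 200_000),   # Denver node (10 yr IRU)
    (  500_000,  3,  50_000),   # Juniper router (3 yr life)
    (  500_000, 10,  75_000),   # Boulder-Denver dark fiber (10 yr IRU)
    (1_000_000,  3, 100_000),   # DWDM equipment (3 yr life)
    (1_000_000,  3, 100_000),   # NCAR Juniper router (3 yr life)
]
realistic = [
    (1_000_000, 10, 200_000),   # Denver node (10 yr IRU)
    (  200_000,  3,  20_000),   # 10 GigE switch (3 yr life)
    (  150_000, 10,  30_000),   # Boulder-Denver dark fiber (10 yr IRU)
    (  800_000,  3,  80_000),   # 10 GigE switch (3 yr life)
]

print(f"Worst case:         ~${annualized(worst_case):,.0f}/year")
print(f"Realistic estimate: ~${annualized(realistic):,.0f}/year")
```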

31 Other Cost Considerations – Savings
- UCAR/NCAR/NOAA FRGP OC12 = $100,000/year
- CU-Boulder FRGP/4-campus OC12 = $100,000/year
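These two OC12 circuits total $200,000/year in savings, which can be set against the recurring costs in the current realistic estimate (slide 30). The offset framing below is my own illustration, not a calculation from the slides:

```python
# Illustrative offset: OC12 savings (slide 31) versus recurring costs in the
# realistic estimate (slide 30). The comparison framing is an assumption.
oc12_savings     = 100_000 + 100_000   # UCAR/NCAR/NOAA + CU-Boulder FRGP OC12s, $/year
ncar_recurring   = 330_000             # NCAR-only recurring, realistic estimate, $/year
shared_recurring = 110_000             # shared recurring, realistic estimate, $/year

print(f"Savings cover {oc12_savings / ncar_recurring:.0%} of NCAR-only recurring costs")
print(f"Savings cover {oc12_savings / shared_recurring:.0%} of shared recurring costs")
```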

32 Summary
- Facilitates/defines the BAT national presence
- Participation strongly encouraged by SDSC, NCSA, PSC, NSF, and Light Rail
- Critical to ensure long-term funding opportunities
- Valuable long-term investment; incremental cost of network upgrades minimized
- “We can’t afford not to participate”

