
1 The Great White SHARCNET

2 SHARCNET: Building an Environment to Foster Computational Science
Shared Hierarchical Academic Research Computing Network

3
• How do you study the first milliseconds of the universe?
• And then change the rules?

4
• How do you study materials that can’t be made in a lab (yet)?
• On the surface of the sun?
• At the centre of the earth?

5
• How do you study the effects of an amputated limb on blood flow through the heart?
• How do you repeat the experiment?
• Where do you get volunteers?

6 Increasingly, the answer to “how” in these questions is: by using a computer. In addition to experimental and theoretical science, we now have computational science!

7 So what does this have to do with Western and SHARCNET? And Great White?

8 SHARCNET was created to meet the needs of computational researchers in Southwestern Ontario. Western is the lead institution and the administrative home of SHARCNET.

9 Vision
To establish a world-leading, multi-university and college, interdisciplinary institute with an active academic–industry partnership, enabling forefront computational research in critical areas of science, engineering, and business.

10 Focus on Infrastructure and Support
• Support the development of computational approaches for research in science, engineering, business, and the social sciences
• Computational facilities
• People

11 Computational Focus
• Provide world-class computational facilities and support
• Explore new computational models
  – Build on the “Beowulf” model

12 “Beowulf”
• A 6th-century Scandinavian hero
• Yes, but not in this context: a collection of separate computers connected by standard communications
• What’s so interesting about this?

13 First Beowulf
• First explored in 1994 by two researchers at NASA
• Built out of “common” computers
  – 16 Intel ’486 processors
  – 10 Mb/s Ethernet
• Built to meet specific computational needs

14 Beowulf “Philosophy”
• Build out of “off the shelf” computational components
• Take advantage of the increased capabilities, reliability, and cost effectiveness of mass-market computers
• Take advantage of parallel computation
• “Price/Performance”: “cheap” supercomputing!

15 Growth of Beowulf
• Growth in the number and size of “Beowulf clusters”
• Continued development of mass-produced computational elements
• Continued development of communication technologies
• Development of new parallel programming techniques (see the sketch below)
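
The “parallel programming techniques” mentioned above crystallized around message-passing libraries, most prominently MPI. As an illustrative sketch (not code from the presentation), here is a minimal MPI program in C: each process learns its rank, the size of the job, and the node it runs on, which is the basic building block of Beowulf-style parallelism.

```c
/* Minimal MPI "hello" sketch of the Beowulf message-passing model.
 * Illustrative only. Compile with mpicc, launch with mpirun:
 *   mpicc hello.c -o hello && mpirun -np 4 ./hello */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size, len;
    char node[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* processes in the job */
    MPI_Get_processor_name(node, &len);     /* which cluster node we're on */

    printf("process %d of %d on node %s\n", rank, size, node);

    MPI_Finalize();
    return 0;
}
```

Launched with mpirun, one copy of the program starts on each allocated processor; all coordination happens through explicit messages, which is what lets “off the shelf” machines act together as one computer.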

16 SHARCNET
• Sought to exploit the “Beowulf” approach
• High-performance clusters: “Beowulf on steroids”
  – Powerful “off the shelf” computational elements
  – Advanced communications
• Geographical separation (local use)
• Connect clusters: emerging optical communications

17 Great White!
• Processors
  – 38 SMPs, each with 4 Alpha processors at 833 MHz (4p-SMP) and 4 GB of memory
  – A total of 152 processors
• Communications
  – 1 Gb/s Ethernet
  – 1.6 Gb/s Quadrics interconnect
• November 2001: #183 in the world
  – Fastest academic computer in Canada
  – 6th fastest academic computer in North America

18 Great White

19 SHARCNET
• Extend the “Beowulf” approach to clusters of high-performance clusters
• Connect clusters: “clusters of clusters”
  – Build on emerging optical communications
  – Initial configuration used optical equipment from the telecommunications industry
• Collectively, a supercomputer!

20 Clusters Across Universities
[Diagram: clusters at Guelph, McMaster, and UWO (108, 128, 48, and 152 processors) linked by optical communication]

21 Experimental Computational Environment
[Diagram: Greatwhite (152 processors) and Deeppurple (48 processors) linked by 8 Gb/s optical communication]

22 Technical Issues
• Resource management
  – How to intelligently allocate and monitor resources
  – Availability: failure rates multiply with the number of systems
  – Job migration and checkpointing (sketched below)
  – Performance
• User management
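
The availability point is why long-running jobs checkpoint: state is saved periodically so that a restarted job resumes rather than recomputes. A minimal application-level sketch in C follows; the state layout, file name, and interval are illustrative assumptions, not SHARCNET’s actual mechanism.

```c
/* Application-level checkpoint/restart sketch. The State struct,
 * file name, and checkpoint interval are illustrative assumptions. */
#include <stdio.h>

#define CKPT_FILE "state.ckpt"              /* hypothetical checkpoint file */

typedef struct { long step; double value; } State;

static int try_restore(State *s)            /* returns 1 if a checkpoint loaded */
{
    FILE *f = fopen(CKPT_FILE, "rb");
    if (!f) return 0;                       /* no checkpoint: fresh start */
    int ok = fread(s, sizeof *s, 1, f) == 1;
    fclose(f);
    return ok;
}

static void checkpoint(const State *s)
{
    FILE *f = fopen(CKPT_FILE ".tmp", "wb");
    if (!f) return;
    fwrite(s, sizeof *s, 1, f);
    fclose(f);
    rename(CKPT_FILE ".tmp", CKPT_FILE);    /* replace old checkpoint whole */
}

int main(void)
{
    State s = {0, 0.0};
    if (try_restore(&s))                    /* resume after a failure */
        printf("resuming at step %ld\n", s.step);

    for (; s.step < 1000000; s.step++) {
        s.value += 1.0 / (s.step + 1);      /* stand-in for real work */
        if (s.step % 100000 == 0)
            checkpoint(&s);                 /* periodic checkpoint */
    }
    printf("done: value = %f\n", s.value);
    return 0;
}
```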

23 Technical Issues
• Data management
  – Must get the data to the processors
  – Some data sets are too large to move
  – Some HPC centres are now focusing on “Data Grids” vs. “Computation Grids”
• Most SHARCNET programs use MPI, which can run over either TCP or the Quadrics transport layer (see the sketch below)
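
A common way to “get the data to the processors” in MPI is a collective operation such as MPI_Scatter, which splits an array owned by the root rank across all ranks. A minimal sketch (array sizes are illustrative); note that the code itself is transport-agnostic, since whether messages travel over TCP or the Quadrics layer is decided by the MPI implementation at launch time, not in the program.

```c
/* Distributing data to processors with MPI_Scatter: the root rank owns
 * the full array and each rank receives one chunk. Sizes are
 * illustrative; the same code runs unchanged over TCP or Quadrics. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define PER_PROC 4                          /* elements per process */

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double *full = NULL;
    if (rank == 0) {                        /* root holds the whole data set */
        full = malloc(size * PER_PROC * sizeof *full);
        for (int i = 0; i < size * PER_PROC; i++)
            full[i] = i;
    }

    double local[PER_PROC];
    MPI_Scatter(full, PER_PROC, MPI_DOUBLE, /* root's send buffer */
                local, PER_PROC, MPI_DOUBLE,/* each rank's chunk */
                0, MPI_COMM_WORLD);

    double sum = 0.0;                       /* each rank works on its chunk */
    for (int i = 0; i < PER_PROC; i++)
        sum += local[i];
    printf("rank %d: partial sum = %g\n", rank, sum);

    free(full);                             /* free(NULL) is a no-op */
    MPI_Finalize();
    return 0;
}
```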

24 SHARCNET: More than This!
• Not just computational facilities
• Focus on computational resources to enable innovative science
  – Support
  – Build a research community
• CFI-OIT: infrastructure
• ORDCF: people and programs

25 Objectives
• Provide state-of-the-art computational facilities
• Develop a network of HPC clusters
• Facilitate and enable world-class computational research
• Increase the pool of people skilled in HPC techniques and processes
• Evaluate and create a computational Grid as a means of providing supercomputing capabilities
• Achieve long-term self-sustainability
• Create major business opportunities in Ontario

26 Operating Principles
• Partnership among institutions
• Shared resources
• Equality of opportunity
• Services at no cost to researchers

27 Partners
Academic: University of Guelph, McMaster University, The University of Western Ontario, University of Windsor, Wilfrid Laurier University, Sheridan College, Fanshawe College
Industry: Hewlett Packard, Quadrics Supercomputing World, Platform Computing, Nortel Networks, Bell Canada

28 Support Programs: Part 1
• Chairs program: up to 14 new faculty
• Fellowships: approx. $1 million per year
  – Undergraduate summer jobs
  – Graduate scholarships
  – Postdoctoral fellowships

29 Support Programs: Part 2
• Technical staff
  – System administrators at sites
  – HPC consultants at sites
• Workshops
• Conferences

30 Results?
• Researchers from a variety of disciplines
  – Chemistry
  – Physics
  – Economics
  – Biology
• Beginning to “ramp up”

31 Chemistry
• Model chemical systems and processes at the atomic and electronic levels
  – New quantum chemical techniques
  – New computational methods
• For example: molecular dynamics simulation of hydrogen formation in a single-site olefin polymerization catalyst

32 Economics
• Research on the factors that influence people to retire
• Model incorporates both health and financial factors
  – Previous models looked at one or the other
  – Much more complex: difficult to estimate parameters

33 Materials
• Understand friction and lubrication at the molecular and atomic levels
• Friction between polymer-bearing surfaces
• Two polymer-bearing surfaces in sliding motion under good solvent conditions
• Green: upper wall; red: lower wall

34 Astrophysics
• Galaxy merger
• Approximately 100,000 particles (stars and dark matter)
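
For context, an illustrative sketch (not the researchers’ code): the kernel of a direct-summation N-body simulation like this galaxy merger is an O(N²) gravity step with a softening length to tame close encounters. At 100,000 particles, production codes typically use tree or particle-mesh methods instead; all constants below are illustrative.

```c
/* Naive O(N^2) direct-summation gravity kernel, the building block of
 * N-body simulations. Particle count, units, timestep, and softening
 * are illustrative. Compile: cc nbody.c -o nbody -lm */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define N   1000        /* particle count (illustrative) */
#define G   1.0         /* gravitational constant in code units */
#define EPS 0.01        /* softening length: tames close encounters */

typedef struct { double x, y, z, vx, vy, vz, m; } Particle;

/* One kick-drift step: accumulate accelerations, then move particles. */
static void step(Particle *p, double dt)
{
    for (int i = 0; i < N; i++) {
        double ax = 0.0, ay = 0.0, az = 0.0;
        for (int j = 0; j < N; j++) {
            if (j == i) continue;
            double dx = p[j].x - p[i].x;
            double dy = p[j].y - p[i].y;
            double dz = p[j].z - p[i].z;
            double r2 = dx*dx + dy*dy + dz*dz + EPS*EPS;
            double f  = G * p[j].m / (r2 * sqrt(r2));  /* G*m/r^3 */
            ax += f * dx; ay += f * dy; az += f * dz;
        }
        p[i].vx += ax * dt; p[i].vy += ay * dt; p[i].vz += az * dt;
    }
    for (int i = 0; i < N; i++) {
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

int main(void)
{
    static Particle p[N];                   /* velocities start at zero */
    for (int i = 0; i < N; i++) {           /* random initial positions */
        p[i].x = rand() / (double)RAND_MAX;
        p[i].y = rand() / (double)RAND_MAX;
        p[i].z = rand() / (double)RAND_MAX;
        p[i].m = 1.0 / N;                   /* equal-mass particles */
    }
    for (int t = 0; t < 10; t++)
        step(p, 0.001);
    printf("particle 0 now at (%g, %g, %g)\n", p[0].x, p[0].y, p[0].z);
    return 0;
}
```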

35 Astrophysics
• Forming giant planets from protoplanetary disks
• Shows the evolution of a disk over about 200 years

36 Materials: Granular Matter
• Understand the flow of granular matter
• Study the effectiveness of mixing
• Study the effect of different components on mixing

37 The Future?
• New members: Southwestern Ontario
• Support for new science
  – Greater computation
  – Storage facilities
• New areas
  – Bioinformatics

38 The Future?
• New partners (Waterloo, Brock, York, UOIT)
• Additional capacity
  – Storage: ½ petabyte across 4 sites (multistage performance)
  – Network: 10 Gb/s core (UWO, Waterloo, Guelph, Mac); 1 Gb/s to other SHARCNET sites; 10 Gb/s to Michnet, HPCVL (?)
  – Upgrades: large capability machines at Mac, UWO, and Guelph; large capacity machines at Waterloo; increased development sites

39 The Future?
• Additional capabilities
  – Visualization
• Total investment: ~$49M (plus an additional $7 million in cash from HP)
• 2004–2005
  – With the new capabilities, SHARCNET could be in the top 100 to 150 supercomputers.
  – It will be the fastest supercomputer of its kind, i.e., a distributed system whose nodes are clusters.

42 [Map: planned SHARCNET network topology across Southwestern Ontario, with sites at Windsor, Western, Waterloo, Guelph, UOIT, York, Fields, Brock, McMaster, Laurier, Sheridan, Fanshawe, Robarts, and Perimeter. Legend: 10 Gb/s and 1 Gb/s links, redeployed Alpha clusters, Itanium clusters, Xeon clusters, SMPs, Grid Lab, interconnect topology, tape and disc storage (EVA, MSA); scale ~100 km / 50 mi (0.1 ms)]

43 The Future: SHARCNET!

