Information Technology at Purdue
Using the SiCortex SC5832 for scientific computing
Presented by: Dr. Gerry McCartney, Vice President and CIO, ITaP
HPC User Forum, September 8-10, 2008
Scientific computing at Purdue
The Rosen Center for Advanced Computing is the discovery and research arm of central IT at Purdue. The center provides high-performance computing systems, and storage for measured and computed data, to faculty doing computationally intensive, cutting-edge science, engineering, and social science research. Top-quality user and system support lets researchers focus on research rather than on maintaining a computer system. The Rosen Center has 55 full-time staff.
Scientific computing at Purdue
- Steele community cluster: 812 8-core Dell 1950 systems
- Black cluster: Purdue's portion of the IEDC machine at Indiana University, with 256 IBM JS21 blades
- Seven other clusters, ranging from Radon, built from desktop PCs recycled from computing labs, to Pete, with 166 HP dual-processor, dual-core DL 40 systems
Scientific computing at Purdue
- SGI Altix 4700 system
- IBM Power5 system
- Storage infrastructure (home directories, scratch, and archival):
  - Home directories: 70.5 TB of Fibre Channel and SATA disk, served from a two-node BlueArc Titan 2100 cluster
  - Scratch: 210 TB of SATA disk, served by a two-node BlueArc 2500 cluster
  - Archival: DiskXtender UNIX/Linux (DXUL), with 1.5 TB of SCSI disk as a temporary cache in front of a 6,670-slot ADIC/Quantum Scalar 10K tape library; the full library holds 1.2 PB, of which 750 TB is allocated to the archive
Scientific computing at Purdue
Research computing capacity has grown from about 14 aggregate peak teraflops in 2006 to 100 teraflops in 2008.
Condor
Purdue's Condor pool of nearly 20,000 machines harvests otherwise wasted compute cycles from idle desktops and puts them to good use. These resources are made available for open scientific research through Purdue's TeraGrid site.
High power, low energy
The SiCortex SC5832 uses radical green technology to deliver high-performance computing at much lower power and cooling cost. The machine, installed in June 2008, is being used by researchers in mechanical, materials, and aeronautical and astronautical engineering.
SiCortex SC5832 users at Purdue
Alejandro Strachan, materials engineering
- Previous solution: various supercomputing resources at Purdue, which handled the problems but lost time to overhead as the problems scaled up
- Current work: tailored the Sandia National Labs-developed LAMMPS molecular dynamics simulation software for the SiCortex machine, then ran jobs using the full complement of 3,240 processors
- Findings: the code scaled close to perfectly to the full complement of processors; fast interprocessor communication appears to offset the speed deficit of the lower-power chips (see the scaling note below)
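A quick way to make "scaled close to perfectly" concrete is strong-scaling efficiency, which compares the measured speedup on p processors with the ideal p-fold speedup. This is the standard textbook definition, not a figure from Strachan's runs; T(1) and T(p) stand for hypothetical wall-clock times for the same fixed-size problem on 1 and p processors:

    speedup:    S(p) = T(1) / T(p)
    efficiency: E(p) = S(p) / p = T(1) / (p * T(p))

Near-perfect scaling means E(p) stays close to 1 all the way up to p = 3,240. A slow interconnect shows up as communication time growing relative to computation, dragging E(p) down, which is why fast interprocessor links can compensate for slower cores.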
SiCortex SC5832 users at Purdue
"It allows us to do simulations much larger than we could do with other machines here. … If you don't have very fast communications between the processors it ends up killing you. That made [using this machine] very attractive to us."
—Alejandro Strachan, materials engineering
SiCortex SC5832 users at Purdue
John Abraham, mechanical engineering
- Previous solution: resources of the National Center for Supercomputing Applications (NCSA)
- Current work: code for simulating flow and chemical reactions, run on 128, 256, and 1,024 processors of the SiCortex machine
- Findings: the code ported easily to the SiCortex machine with help from Rosen Center staff; it scaled well at 128 and 256 processors, but the existing programs did not scale as well on 1,024 processors and need refinement for that purpose (a sketch of how such a scaling check can be run follows below)
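A minimal sketch of a strong-scaling check of this kind, assuming MPI and a C toolchain; this is an illustrative harness with placeholder work and message sizes, not Abraham's flow-and-chemistry code. A fixed global problem is divided among the ranks, each rank times its share of the work plus one collective communication step, and the worst-case wall time is reported:

    /* scaling_probe.c -- toy strong-scaling probe.
     * Build and run at several sizes, e.g.:
     *   mpicc -O2 -std=c99 scaling_probe.c -o scaling_probe
     *   mpirun -np 128  ./scaling_probe
     *   mpirun -np 256  ./scaling_probe
     *   mpirun -np 1024 ./scaling_probe
     */
    #include <mpi.h>
    #include <stdio.h>

    #define TOTAL_WORK (1L << 26)   /* fixed global problem size (placeholder) */

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        int rank, nprocs;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        long my_work = TOTAL_WORK / nprocs;  /* per-rank share shrinks as nprocs grows */
        double local = 0.0;

        MPI_Barrier(MPI_COMM_WORLD);
        double t0 = MPI_Wtime();

        /* stand-in for each rank's computation */
        for (long i = 0; i < my_work; i++)
            local += 1.0 / (double)((long)rank * my_work + i + 1);

        /* stand-in for interprocessor communication */
        double global = 0.0;
        MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        double elapsed = MPI_Wtime() - t0, t_max = 0.0;
        MPI_Reduce(&elapsed, &t_max, 1, MPI_DOUBLE, MPI_MAX, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("np=%d  wall=%.4f s  (sum=%g)\n", nprocs, t_max, global);

        MPI_Finalize();
        return 0;
    }

Running the same binary at 128, 256, and 1,024 ranks and comparing the reported wall times gives the strong-scaling curve; once the communication cost stops shrinking along with the per-rank work, scaling flattens, which is the kind of behavior that would show up at 1,024 processors.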
SiCortex SC5832 users at Purdue
After scaling up to 1,024 processors: "I'd like to use four times that many. … SiCortex is something attractive for us and promising so far."
—John Abraham, mechanical engineering
SiCortex SC5832 users at Purdue
Greg Blaisdell, aeronautical and astronautical engineering
- Previous solution: Big Ben, a Cray XT3 system of more than 21 teraflops at the Pittsburgh Supercomputing Center
- Current work: reworked some of his research team's own Fortran 90 code to take advantage of as many SiCortex processors as possible, and compared two large eddy simulation codes on the SiCortex machine and Big Ben
- Findings: easy to get up and running, with a student running a job the same day he gained access to the system; SiCortex showed slightly better scaling from a minimum of 16 cores up to 128, but Big Ben tested better on a second code using up to 1,024 cores; a potential side benefit is that testing on the SiCortex machine might help refine codes in preparation for petascale computing
SiCortex SC5832 users at Purdue
"We wanted to take advantage of as many processors as possible. … My student was really pleased at how easy it was to get things up and running."
—Greg Blaisdell, aeronautical and astronautical engineering
SiCortex SC5832 users at Purdue
Steve Frankel, mechanical engineering
- Previous solution: the researcher's own and central Purdue clusters, plus the IBM Blue Gene supercomputer at Argonne National Lab, using 16,000 processing cores
- Current work: Frankel's own homogeneous isotropic turbulence simulation (HITS) code, run on 1,024 cores of the SiCortex system with dedicated space to run continually
- Finding: scaling performance was comparable to the Blue Gene
SiCortex SC5832 users at Purdue
"The number of points in a flow and number of time steps our work incorporates require high performance computing to avoid waiting days, weeks, months, or years for results. Our codes run best on a lot of cores. … Working with the national lab led us to try the SiCortex computer when it arrived at Purdue."
—Steve Frankel, mechanical engineering
- Get comparative data on power usage
- Gain experience in scaling different types of problems
- Find the SiCortex sweet spot, where absolute execution time matches or betters that of conventional machines (one way to frame this is sketched below)
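One way to frame the sweet-spot goal, offered as a sketch rather than anything stated in the talk: compare energy to solution, the product of average power draw and wall-clock time,

    E = P_avg * T_wall

A low-power machine wins on energy whenever P_sicortex * T_sicortex < P_conventional * T_conventional; the sweet spot above is the stronger condition that T_sicortex also matches or beats T_conventional, so a run costs less energy without costing any time.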
mccart@purdue.edu
The End