Introduction to Clusters, Rocks, and MPI Dany Guevara David Villegas Spring 2007 Florida International University
Acknowledgments This presentation is a compilation of slides from selected material presented at Rocks-A-Palooza II - May 2006, Singapore. Additionally, contents from the following sources were used: Wikipedia, the Globus Alliance, and Ian Foster's A Globus Primer.
What type of cluster are we going to build? a) Highly Available b) Visualization Cluster c) Computing Cluster
Live Installation of Rocks Boot the frontend node by inserting the Kernel CD.
Installation of Compute Nodes Log into Frontend node as root At the command line run: > insert-ethers
Installation of Compute Nodes Turn on the compute node, insert the CD, then reboot and make sure it boots off the CD.
Which CD is needed to boot? a) OS Roll – Disk 1 b) Kernel Roll c) Service Pack Roll
Okay, we have Rocks installed. Now what? Let's write, compile, and run a parallel program. But first, let's take a closer look at Rocks.
What Linux distribution is Rocks based on?
Parallel Code Now, let's focus on how to write a parallel application.
I'm a scientist interested in parallel computing I'm a scientist interested in parallel computing. I have large amounts of data, little time, and little programming experience. In the rare event of hardware failure, I'm willing to restart the application, since a restart isn't life-threatening. What communication layer should I use for my application? a) Sockets b) Message Passing Interface (MPI) c) Parallel Virtual Machines (PVM)
Let's log into gcb.fiu.edu and create and build an MPI hello.c program.