

1 U Oklahoma and the Open Science Grid
Henry Neeman, Horst Severini, Chris Franklin, Josh Alexander
University of Oklahoma
Condor Week 2008, University of Wisconsin, Wednesday April 30 2008

2 Outline
Context: OU Hardware
OU and the Open Science Grid
Oklahoma Cyberinfrastructure Initiative
OU NSF CI-TEAM Project

3 Context: OU Hardware

4 Pentium4 Xeon Cluster: topdawg.oscer.ou.edu
1,024 Pentium4 Xeon CPUs ("Irwindale" 3.2 GHz EM64T)
2,176 GB RAM (800 MHz FSB)
23,000 GB disk (NFS + Lustre)
Infiniband & Gigabit Ethernet
OS: Red Hat Enterprise Linux 4
Peak speed: 6,553.6 GFLOPs
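The quoted peak speed follows directly from the CPU count and clock rate, assuming the usual 2 double-precision floating-point operations per cycle that a Pentium4's SSE2 unit can sustain. A quick sanity check of the arithmetic:

```python
# Peak speed of topdawg = CPUs x clock (GHz) x FLOPs per cycle
cpus = 1024          # Pentium4 Xeon "Irwindale" CPUs
clock_ghz = 3.2      # clock rate in GHz
flops_per_cycle = 2  # SSE2: 2 double-precision FLOPs per cycle (assumption)

peak_gflops = cpus * clock_ghz * flops_per_cycle
print(peak_gflops)   # 6553.6, matching the slide
```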

5 Pentium4 Xeon Cluster: topdawg.oscer.ou.edu
Debuted 6/2005: #54 worldwide, #9 among US universities, #4 excluding big NSF centers
Currently: #289 worldwide, #29 among US universities, #20 excluding big 4 NSF centers
www.top500.org

6 Condor Pool @ OU
OU IT has deployed a large Condor pool (775 desktop PCs in dozens of labs around campus).
OU's Condor pool provides a huge amount of computing power – more than OSCER's big cluster. In terms of Condorized PCs, if OU were a state, we'd be the 12th largest state in the US; if OU were a country, we'd be the 10th largest country in the world (other than "unknown").
Also, we've seen empirically that lab PCs are available for Condor jobs about 80% of the time.
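For readers new to Condor, a minimal submit description for running an ordinary job in a pool like this might look as follows (the program name and file names here are hypothetical, purely for illustration):

```
universe   = vanilla
executable = my_analysis        # hypothetical program name
arguments  = input.dat
output     = job.out
error      = job.err
log        = job.log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue
```

Submitted with condor_submit, a job like this waits until a lab PC becomes idle, runs there, and is evicted (and rescheduled elsewhere) if the PC's owner returns – which is why the ~80% availability figure above still yields so much usable throughput.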

7 Proposed New Hardware
The following information is public.
RFP: issued March 19, closed April 3
Architecture described: quad core, dual socket, x86-64
~5 times as fast as topdawg (similar compute node count)
~4 times as much RAM (probably 1333 MHz FSB)
~4 times as much disk (~100 TB)
High performance interconnect (probably IB), plus GigE
Red Hat Enterprise Linux 5 (assumed)
OU Board of Regents meets May 8-9.

8 OU and the Open Science Grid

9 OU and the Open Science Grid
Currently, OU's relationship with the OSG primarily benefits High Energy Physics:
D0 project
ATLAS project
DOSAR project: a cross-disciplinary grid organization with members in other OSG VOs (including D0 and ATLAS)
We have 5 OSG resources:
OSCER's large cluster (topdawg) – general purpose
OSCER's Condor pool (currently D0 only) – general purpose
OUHEP's Tier2 cluster – dedicated to HEP projects
OUHEP's desktop cluster – dedicated to HEP projects
OUHEP's OSG Integration TestBed (ITB): 8 nodes, used to test new pre-production OSG releases. We recently installed OSG 0.9.0 and bestman-xrootd (a Storage Resource Manager) as part of the integration effort.

10 OU and D0
12/26/06 – 12/26/07: events processed and data produced
#1 Michigan State U: 33,677,505 events, 2.81 TB
#2 U Oklahoma: 16,516,500 events, 1.32 TB
#3 U Florida: 13,002,028 events, 1.07 TB
#4 UC San Diego: 10,270,250 events, 0.81 TB
#5 U Nebraska: 8,956,899 events, 0.71 TB
#6 Indiana U: 4,111,740 events, 0.35 TB
#7 U Wisconsin: 3,796,497 events, 0.30 TB
#8 Louisiana Tech U: 3,224,405 events, 0.25 TB
#9 Langston U (OK): 1,574,062 events, 0.11 TB

11 OU D0 Breakdown
OSCER's big cluster (topdawg): 8,020,250 events (6th in the US), 0.66 TB
OSCER Condor pool: 6,024,000 events (6th in the US), 0.49 TB
Dedicated OU HEP Tier3 cluster: 2,472,250 events (9th in the US), 0.16 TB
Notes:
Without OSCER's Condor pool, OU would be #4.
Without OSCER's cluster, OU would be #6.
Without OU HEP's dedicated Tier3 cluster, OU would still be #2.

12 OU and ATLAS
4/4/2007 – 4/27/2008: wallclock hours
#1 Boston U: 325,700
#2 U Chicago: 297,600
#3 Indiana U: 235,400
#4 Michigan State U: 170,000
#5 UT Arlington: 160,300
#6 U Oklahoma: 145,700
http://gratia-osg.fnal.gov:8880/gratia-reporting/
Note: A buggy version of gratia ran on OU's resources until 4/3/2008.

13 OU: First in the World
OU was the first institution in the world to simultaneously run ATLAS and D0 grid production jobs on a general-purpose, multi-user cluster. Most grid production jobs run on dedicated clusters reserved for one or the other of these projects, or on Condor pools.

14 OU's Collaborations
OU plays key roles in:
Oklahoma Center for High Energy Physics (OCHEP): a collaboration between OU, Oklahoma State U and Langston U (an HBCU), funded by a Dept of Energy EPSCoR grant
ATLAS Southwest Tier2: OU, Langston U, U Texas Arlington
DOSAR (Distributed Organization for Scientific and Academic Research): OU, Langston U, U Arizona, Iowa State, Kansas State, U Kansas, Louisiana Tech, Louisiana State, Rice, U Mississippi, U Texas Arlington, Universidade Estadual Paulista (Brazil), Cinvestav (Mexico)

15 OU Helps with Condor
OU has helped set up Windows/coLinux/Fedora Condor pools at:
Oklahoma State U
U Texas Arlington

16 Oklahoma Cyberinfrastructure Initiative

17 OK Cyberinfrastructure Initiative
Oklahoma is an EPSCoR state.
Oklahoma recently submitted an NSF EPSCoR Research Infrastructure Improvement (RII) proposal (up to $15M).
This year, for the first time, all NSF EPSCoR RII proposals MUST include a statewide Cyberinfrastructure plan.
Oklahoma's plan – the Oklahoma Cyberinfrastructure Initiative (OCII) – involves:
All academic institutions in the state are eligible to sign up for free use of OU's and Oklahoma State U's centrally-owned CI resources.
Other kinds of institutions (government, NGO, commercial) are eligible to use them, though not necessarily for free.
OCII includes building a Condor flock between OU (775 PCs) and OSU (~300 PCs). We've already helped OSU set up their Condor pool; they just need to roll out the deployment, and then we'll be able to use it for HEP/OSG.
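Flocking between two Condor pools is configured by naming the other pool's central manager in each side's Condor configuration. A minimal sketch of what the OU–OSU flock would need (the host names below are hypothetical placeholders, not the real machines):

```
# On OU's submit machines: let idle jobs flock to OSU's pool
FLOCK_TO = condor-cm.example.okstate.edu

# On OSU's central manager: accept flocked jobs from OU
FLOCK_FROM = condor-cm.example.ou.edu
```

With this in place, a job submitted at OU that finds no idle OU machine is automatically advertised to OSU's matchmaker, with no change to the user's submit file.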

18 OU’s NSF CI-TEAM Project

19 OU's NSF CI-TEAM Project
OU recently received a grant from the National Science Foundation's Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM) program.
Objectives:
Provide Condor resources to the national community
Teach users to use Condor, and sysadmins to deploy and administer it
Teach bioinformatics students to use BLAST over Condor

20 OU NSF CI-TEAM Project: Cyberinfrastructure Education for Bioinformatics and Beyond
Objectives:
teach students and faculty to use FREE Condor middleware, stealing computing time on idle PCs;
teach system administrators to deploy and maintain Condor on PCs;
teach bioinformatics students to use BLAST on Condor;
provide Condor Cyberinfrastructure to the national community (FREE).
OU will provide:
Condor pool of 775 desktop PCs (already part of the Open Science Grid);
Supercomputing in Plain English workshops via videoconferencing;
Cyberinfrastructure rounds (consulting) via videoconferencing;
drop-in CDs for installing full-featured Condor on a Windows PC (Cyberinfrastructure for FREE);
sysadmin consulting for installing and maintaining Condor on desktop PCs.
OU's team includes High School, Minority Serving, 2-year, 4-year, and masters-granting institutions; 18 of the 32 institutions are in 8 EPSCoR states (AR, DE, KS, ND, NE, NM, OK, WV).
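As a concrete sketch of what "BLAST on Condor" looks like in practice: each query batch becomes an ordinary Condor job that runs the BLAST search program against a database chunk shipped with the job. The database and file names below are hypothetical, and blastall is the legacy NCBI BLAST driver of that era:

```
universe   = vanilla
executable = blastall
arguments  = -p blastn -d nt_chunk -i queries.fasta -o hits.out
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
transfer_input_files    = queries.fasta, nt_chunk.nhr, nt_chunk.nin, nt_chunk.nsq
output = blast.stdout
error  = blast.stderr
log    = blast.log
queue
```

Splitting a large query set across many such jobs is what turns hundreds of idle lab PCs into a high-throughput sequence-search engine.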

21 OU NSF CI-TEAM Project: Participants
Participants at OU (29 faculty/staff in 16 depts):
Information Technology – OSCER: Neeman (PI)
College of Arts & Sciences – Botany & Microbiology: Conway, Wren; Chemistry & Biochemistry: Roe (Co-PI), Wheeler; Mathematics: White; Physics & Astronomy: Kao, Severini (Co-PI), Skubic, Strauss; Zoology: Ray
College of Earth & Energy – Sarkeys Energy Center: Chesnokov
College of Engineering – Aerospace & Mechanical Engr: Striz; Chemical, Biological & Materials Engr: Papavassiliou; Civil Engr & Environmental Science: Vieux; Computer Science: Dhall, Fagg, Hougen, Lakshmivarahan, McGovern, Radhakrishnan; Electrical & Computer Engr: Cruz, Todd, Yeary, Yu; Industrial Engr: Trafalis
OU Health Sciences Center, Oklahoma City – Biochemistry & Molecular Biology: Zlotnick; Radiological Sciences: Wu (Co-PI); Surgery: Gusev
Participants at other institutions (62 faculty/staff at 31 institutions in 18 states):
1. California State U Pomona (masters-granting, minority serving): Lee
2. Colorado State U: Kalkhan
3. Contra Costa College (CA, 2-year, minority serving): Murphy
4. Delaware State U (masters, EPSCoR): Lin, Mulik, Multnovic, Pokrajac, Rasamny
5. Earlham College (IN, bachelors): Peck
6. East Central U (OK, masters, EPSCoR): Crittell, Ferdinand, Myers, Walker, Weirick, Williams
7. Emporia State U (KS, masters-granting, EPSCoR): Ballester, Pheatt
8. Harvard U (MA): King
9. Kansas State U (EPSCoR): Andresen, Monaco
10. Langston U (OK, masters, minority serving, EPSCoR): Snow, Tadesse
11. Longwood U (VA, masters): Talaiver
12. Marshall U (WV, masters, EPSCoR): Richards
13. Navajo Technical College (NM, 2-year, tribal, EPSCoR): Ribble
14. Oklahoma Baptist U (bachelors, EPSCoR): Chen, Jett, Jordan
15. Oklahoma Medical Research Foundation (EPSCoR): Wren
16. Oklahoma School of Science & Mathematics (high school, EPSCoR): Samadzadeh
17. Purdue U (IN): Chaubey
18. Riverside Community College (CA, 2-year): Smith
19. St. Cloud State University (MN, masters): J. Herath, S. Herath, Guster
20. St. Gregory's U (OK, 4-year, EPSCoR): Meyer
21. Southwestern Oklahoma State U (masters, EPSCoR, tribal): Linder, Moseley, Pereira
22. Syracuse U (NY): Stanton
23. Texas A&M U-Corpus Christi (masters): Scherger
24. U Arkansas Fayetteville (EPSCoR): Apon
25. U Arkansas Little Rock (masters, EPSCoR): Hall, Jennings, Ramaswamy
26. U Central Oklahoma (masters-granting, EPSCoR): Lemley, Wilson
27. U Illinois Urbana-Champaign: Wang
28. U Kansas (EPSCoR): Bishop, Cheung, Harris, Ryan
29. U Nebraska-Lincoln (EPSCoR): Swanson
30. U North Dakota (EPSCoR): Bergstrom, Hoffman, Majidi, Moreno, Peterson, Simmons, Wiggen, Zhou
31. U Northern Iowa (masters-granting): Gray

22 Oklahoma Supercomputing Symposium 2008
2003 Keynote: Peter Freeman, NSF Computer & Information Science & Engineering Assistant Director
2004 Keynote: Sangtae Kim, NSF Shared Cyberinfrastructure Division Director
2005 Keynote: Walt Brooks, NASA Advanced Supercomputing Division Director
2006 Keynote: Dan Atkins, Head of NSF's Office of Cyberinfrastructure
2007 Keynote: Jay Boisseau, Director, Texas Advanced Computing Center, U Texas Austin
2008 Keynote: José Munoz, Deputy Office Director / Senior Scientific Advisor, Office of Cyberinfrastructure, National Science Foundation
FREE! Parallel Computing Workshop: Mon Oct 6 2008 @ OU
FREE! Symposium: Tue Oct 7 2008 @ OU
Over 225 registrations already: over 150 in the first day, over 200 in the first week, over 225 in the first month.
http://symposium2008.oscer.ou.edu/

23 To Learn More about OSCER
http://www.oscer.ou.edu/

24 Thanks for your attention! Questions?


