High Energy Physics at the OU Supercomputing Center for Education & Research
Henry Neeman, Director
OU Supercomputing Center for Education & Research
University of Oklahoma
D0SAR Workshop, Thursday September

People

Things

OSCER

What is OSCER?
- Multidisciplinary center
- Division of OU Information Technology
- Provides:
  - Supercomputing education
  - Supercomputing expertise
  - Supercomputing resources: hardware, storage, software
- For:
  - Undergrad students
  - Grad students
  - Staff
  - Faculty
  - Their collaborators (including off campus)

Who is OSCER? Academic Depts
- Aerospace & Mechanical Engr
- Anthropology
- Biochemistry & Molecular Biology
- Biological Survey
- Botany & Microbiology
- Chemical, Biological & Materials Engr
- Chemistry & Biochemistry
- Civil Engr & Environmental Science
- Computer Science
- Economics
- Electrical & Computer Engr
- Finance
- Health & Sport Sciences
- History of Science
- Industrial Engr
- Geography
- Geology & Geophysics
- Library & Information Studies
- Mathematics
- Meteorology
- Petroleum & Geological Engr
- Physics & Astronomy
- Radiological Sciences
- Surgery
- Zoology
More than 150 faculty & staff in 25 depts in the Colleges of Arts & Sciences, Business, Engineering, Geosciences and Medicine – with more to come!

Who is OSCER? Organizations
- Advanced Center for Genome Technology
- Center for Analysis & Prediction of Storms
- Center for Aircraft & Systems/Support Infrastructure
- Cooperative Institute for Mesoscale Meteorological Studies
- Center for Engineering Optimization
- Fears Structural Engineering Laboratory
- Geosciences Computing Network
- Great Plains Network
- Human Technology Interaction Center
- Institute of Exploration & Development Geosciences
- Instructional Development Program
- Laboratory for Robotic Intelligence and Machine Learning
- Langston University Mathematics Dept
- Microarray Core Facility
- National Severe Storms Laboratory
- NOAA Storm Prediction Center
- OU Office of Information Technology
- OU Office of the VP for Research
- Oklahoma Center for High Energy Physics
- Oklahoma Climatological Survey
- Oklahoma EPSCoR
- Oklahoma Medical Research Foundation
- Oklahoma School of Science & Math
- St. Gregory’s University Physics Dept
- Sarkeys Energy Center
- Sasaki Applied Meteorology Research Institute

Biggest Consumers
- Center for Analysis & Prediction of Storms: daily real time weather forecasting
- Oklahoma Center for High Energy Physics: simulation and data analysis of banging tiny particles together at unbelievably high speeds
- Advanced Center for Genome Technology: bioinformatics (e.g., Human Genome Project)

Who Are the Users?
Over 300 users so far:
- over 60 OU faculty
- over 60 OU staff
- over 125 students
- about 40 off campus users
… with more being added every month.
Comparison: the National Center for Supercomputing Applications (NCSA), after 20 years of history and hundreds of millions in expenditures, has about 2,100 users.*
* Unique usernames on cu.ncsa.uiuc.edu and tungsten.ncsa.uiuc.edu

Okla. Supercomputing Symposium
- 2003 Keynote: Peter Freeman, NSF Computer & Information Science & Engineering Assistant Director
- 2004 Keynote: Sangtae Kim, NSF Shared Cyberinfrastructure Division Director
- 2005 Keynote: Walt Brooks, NASA Advanced Supercomputing Division Director
- 2006 Keynote: Dan Atkins, Head of NSF’s Office of Cyberinfrastructure
Join us Wed Oct at OU – FREE!

OSCER Resources: An ORDER OF MAGNITUDE in one year!

OSCER Hardware
TOTAL: 1,484 GFLOPs*, 368 CPUs, 434 GB RAM
- Aspen Systems Pentium4 Xeon 32-bit Linux Cluster: 270 Pentium4 Xeon CPUs, 270 GB RAM, 1.08 TFLOPs
- Aspen Systems Itanium2 cluster: 66 Itanium2 CPUs, 132 GB RAM, 264 GFLOPs
- IBM Regatta p690 Symmetric Multiprocessor: 32 POWER4 CPUs, 32 GB RAM, 140 GFLOPs
- IBM FAStT500 FiberChannel-1 Disk Server
- Qualstar TLS Tape Library
* GFLOPs: billions of calculations per second

OSCER Hardware
TOTAL: 11,300 GFLOPs*, 1,838 CPUs, 3,058 GB RAM
- Dell Pentium4 Xeon 64-bit Linux Cluster: 1,024 Pentium4 Xeon CPUs, 2,240 GB RAM, 6,553 GFLOPs
- Aspen Systems Itanium2 cluster: 64 Itanium2 CPUs, 128 GB RAM, 256 GFLOPs
- NEW! Condor Pool: 750 student lab PCs, 4,500 GFLOPs
- NEW! National Lambda Rail (10 Gbps network)
- COMING! Small Opteron Cluster: 16 AMD Opteron CPUs/32 cores, 128 GB RAM, 2,500 GB disk
- COMING! New tape library
* GFLOPs: billions of calculations per second

Intel Xeon Linux Cluster: topdawg.oscer.ou.edu
- 1,024 Intel Xeon CPUs (3.2 GHz)
- 2,176 GB RAM
- 14,000 GB disk
- Infiniband & Gigabit Ethernet
- OS: Red Hat Enterprise Linux 4
- Peak speed: 6,553 GFLOPs*
* GFLOPs: billions of calculations per second
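As a sanity check (my arithmetic, not from the slide, and assuming 2 floating point operations per CPU per clock cycle): 1,024 CPUs × 3.2 GHz × 2 flops/cycle ≈ 6,553.6 GFLOPs, which matches the quoted peak speed.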

Intel Xeon Linux Cluster: topdawg.oscer.ou.edu
- Debuted at #54 worldwide, #9 among US universities, #4 excluding the big 3 NSF centers
- Currently #88 worldwide, #17 among US universities, #10 excluding the big 3 NSF centers

Itanium2 Cluster: schooner.oscer.ou.edu
- 64 Itanium2 1.0 GHz CPUs
- 128 GB RAM
- 5,774 GB disk
- OS: Red Hat Enterprise Linux 4
- Peak speed: 256 GFLOPs*
- Purchased with NSF Major Research Instrumentation grant
* GFLOPs: billions of calculations per second

Condor Pool
Condor is a software package that allows number crunching jobs to run on idle desktop PCs. OU IT is deploying a large Condor pool (750 desktop PCs; almost 200 deployed so far). When fully deployed, it’ll provide a huge amount of additional computing power – more than was available in all of OSCER a year earlier. And the cost is very, very low. Also, we’ve been seeing empirically that Condor gets about 90% of each PC’s time.
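For context, a Condor job is described by a plain-text submit file handed to condor_submit; the job and file names below are hypothetical, a minimal sketch rather than an OSCER-specific recipe:

  # Minimal Condor submit description file (hypothetical job and file names)
  universe   = vanilla        # ordinary serial executable, no special runtime
  executable = analyze        # hypothetical number-crunching program
  input      = events.dat     # hypothetical file fed to the job's stdin
  output     = analyze.out    # where the job's stdout lands
  error      = analyze.err    # where the job's stderr lands
  log        = analyze.log    # Condor's record of the job's lifecycle
  queue                       # submit one copy of the job

The job is submitted with condor_submit and monitored with condor_q; Condor matches it to an idle pool PC and preempts it when that PC’s owner returns.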

Coming! National Lambda Rail
The National Lambda Rail (NLR) is the next generation of high performance networking. From 1 Gbps to 10 Gbps in one year!
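For a rough sense of scale (my arithmetic, not the slide’s, ignoring protocol overhead): moving 1 TB of data is 8 × 10^12 bits ÷ 10^9 bits/s ≈ 8,000 s ≈ 2.2 hours at 1 Gbps, versus ≈ 800 s ≈ 13 minutes at 10 Gbps.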

OSCER

- HEP on OSCER’s Linux cluster
- HEP on OSCER’s Condor pool
- HEP on OSCER’s grants

HEP on OSCER’s Linux Cluster
- Topdawg has a special grid services node set aside explicitly for HEP activities.
- The grid node is available only to HEP users.
- Horst Severini has pseudo-root privileges on it.
- It has its own disk space that’s exclusively for HEP.
- HEP has its own queues, one for preemptable jobs and another for non-preemptable jobs.
- We’re expecting D0 to be up and running next week.

HEP on OSCER’s Condor Pool
- Samgrid is now running; final certification in a few days.
- Currently almost 200 PCs, expanding over the next few weeks; 750 PCs are planned, but the pool could become larger.

HEP on OSCER’s Grants
- NSF Major Research Instrumentation ($504K): PI Neeman, Co-PI Skubic (and others). Purchased the Itanium2 cluster.
- NSF Small Grant for Exploratory Research ($132K): PI Neeman, Co-PI Severini. Using Condor to make large resources available for national emergencies.
- NSF CI-TEAM ($250K): PI Neeman, Co-PI Severini (and others). Teaching Condor use and management across the US.

To Learn More About OSCER

Thanks for your attention! Questions?