North Dakota EPSCoR State Cyberinfrastructure Strategic Planning Workshop Henry Neeman, Director OU Supercomputing Center for Education & Research (OSCER)

Presentation transcript:

Institutional and Statewide Cyberinfrastructure: Dollars and Sense
Henry Neeman, Director, OU Supercomputing Center for Education & Research (OSCER), University of Oklahoma
North Dakota EPSCoR State Cyberinfrastructure Strategic Planning Workshop, Thursday, March

PEOPLE

THINGS

What is Cyberinfrastructure?

What is a Ship?
“… [W]hat a ship is … It's not just a keel and hull and a deck and sails. That's what a ship needs. But what a ship is... is freedom.” – “Pirates of the Caribbean”

What Cyberinfrastructure NEEDS
- High Performance Computing (supercomputing): simulation, optimization, data mining
- High Performance Networking: moving data fast
- High Throughput Computing: harnessing idle PCs to do number crunching
- Grid/Cloud/Utility Computing: linking geographically dispersed systems to tackle larger problems
- Scientific Visualization: turning a vast sea of data into pictures and movies that a person can understand
- Shared Resources: sensor networks, instruments, data collections

What Cyberinfrastructure IS
Information-intensive techniques and technologies that enable our best thinkers, inventors and implementers – academic, government, industry and non-profit – to discover and design new generations of knowledge, products, capabilities and opportunities in science, technology, engineering and mathematics.

What are HPC & HTC Used For?
- Simulation of physical phenomena – things that are too big, too small, too slow, too fast, too expensive or too dangerous to study in real life: weather forecasting, protein folding, energy management
- Optimization: picking the best mix of stuff
- Data mining: finding needles of information in a haystack of data: gene sequencing, detecting storms that might produce tornadoes
- Visualization: turning a vast sea of data into pictures that a scientist can understand
(Images: Moore, OK tornadic storm, courtesy Kelvin Droegemeier; courtesy Mordecai-Mark Mac Low; courtesy Greg Bryan & Mike Norman)
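As a deliberately tiny illustration of the "simulation of physical phenomena" bullet above, here is a toy finite-difference heat-diffusion model in Python. It is not from the talk; the grid size, time step and constants are arbitrary, but scaled up to billions of grid cells this is the kind of calculation supercomputers spend their time on.

```python
# Toy sketch (not from the talk): 1D heat diffusion via an explicit
# finite-difference scheme. Real HPC simulations do essentially this
# over billions of grid cells, in parallel, for days at a time.
import numpy as np

nx, nt = 100, 500                # grid points, time steps (arbitrary)
alpha, dx, dt = 0.01, 1.0, 1.0   # diffusivity, grid spacing, time step
u = np.zeros(nx)
u[nx // 2] = 100.0               # a hot spot in the middle of the domain

for _ in range(nt):
    # u_new[i] = u[i] + alpha*dt/dx^2 * (u[i+1] - 2*u[i] + u[i-1])
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

print("temperature at domain center:", u[nx // 2])
```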

Cyberinfrastructure in Oklahoma

OU: Dell Intel Xeon Linux Cluster (sooner.oscer.ou.edu)
- 1,076 Intel Xeon CPU chips / 4,304 cores
  - 529 nodes: dual socket/quad core Harpertown, 2.0 GHz, 16 GB each
  - 3 nodes: dual socket/quad core Harpertown, 2.66 GHz, 16 GB each
  - 3 nodes: dual socket/quad core Clovertown, 2.33 GHz, 16 GB each
  - 2 nodes: quad socket/quad core Tigerton, 2.4 GHz, 128 GB each
- 8,800 GB RAM
- ~100 TB globally accessible disk
- QLogic Infiniband
- Force10 Networks Gigabit Ethernet
- Red Hat Enterprise Linux 5
- Peak speed: 34.6 TFLOPs* (*TFLOPs: trillion calculations per second)
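A back-of-the-envelope check (mine, not an official OSCER calculation) shows how the quoted 34.6 TFLOPs follows from the node counts listed above, assuming 4 double-precision floating-point operations per core per clock cycle, which is typical for Xeons of that generation.

```python
# Theoretical peak = nodes x cores/node x clock (GHz) x FLOPs/cycle,
# summed over node types. The 4 FLOPs/cycle figure is an assumption
# typical for Harpertown/Clovertown/Tigerton-era Xeons.
FLOPS_PER_CYCLE = 4

node_types = [
    # (node count, cores per node, clock in GHz)
    (529, 8, 2.00),   # dual-socket quad-core Harpertown
    (3,   8, 2.66),   # dual-socket quad-core Harpertown
    (3,   8, 2.33),   # dual-socket quad-core Clovertown
    (2,  16, 2.40),   # quad-socket quad-core Tigerton
]

peak_gflops = sum(n * cores * ghz * FLOPS_PER_CYCLE
                  for n, cores, ghz in node_types)
print(f"theoretical peak: {peak_gflops / 1000:.1f} TFLOPs")  # ~34.6
```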

OU: Dell Intel Xeon Linux Cluster (sooner.oscer.ou.edu)
Debuted November 2008 at:
- #90 worldwide
- #47 in the US
- #14 among US academic
- #10 among US academic excluding TeraGrid

OU: Dell Intel Xeon Linux Cluster (sooner.oscer.ou.edu)
- Purchased mid-July 2008
- First friendly user: August 2008
- Full production: October 2008
- Christmas Day 2008: more than ~75% of nodes and ~66% of cores were in use.

OSU: Several Medium-sized Supercomputers
- Pistol Pete: Linux cluster, 64 compute nodes, 512 CPU cores, 1024 GB RAM, 25 TB disk, Infiniband, 5.4 TFLOPs, being deployed now
- Cimarron: Linux cluster, 14 compute nodes, 112 CPU cores, 224 GB RAM, 3.5 TB disk, GigE, 896 GFLOPs, deployed summer
- Spur: shared memory machine with 4 quad-core 2.4 GHz chips, 128 GB RAM, 1.5 TB disk, deployed spring
- Bullet: Linux cluster, 64 compute nodes, 128 CPU cores, 256 GB RAM, 5 TB disk, Infiniband, 820 GFLOPs, deployed fall 2006

Fastest Supercomputer vs. Moore (chart; GFLOPs: billions of calculations per second)

OU: Condor Pool
- Condor is a software technology that allows idle desktop PCs to be used for number crunching.
- OU IT has deployed a large Condor pool: 795 desktop PCs in IT student labs all over campus.
- It provides a huge amount of additional computing power – more than was available in all of OSCER in earlier years – with a peak compute speed in the TFLOPs range.
- And the cost is very, very low – almost literally free.
- Also, we've been seeing empirically that Condor gets about 80% of each PC's time.
- OU and OSU have worked together to get a comparable Condor pool set up at OSU.
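To make "number crunching on idle PCs" concrete, here is a minimal sketch (not OSCER's actual setup) of how a researcher hands work to a Condor pool: a submit description asking for 100 independent runs, handed to the standard condor_submit command. The executable and file names are hypothetical placeholders.

```python
# Minimal sketch: submit 100 independent jobs to an HTCondor pool of
# idle desktop PCs. "my_analysis" and the data files are hypothetical.
import subprocess

submit_description = """
universe    = vanilla
executable  = my_analysis
arguments   = input_$(Process).dat
output      = run_$(Process).out
error       = run_$(Process).err
log         = my_analysis.log
queue 100
"""

with open("my_analysis.sub", "w") as f:
    f.write(submit_description)

# Condor farms the 100 jobs out to whatever desktop machines are
# currently idle, and migrates them if a machine's owner comes back.
subprocess.run(["condor_submit", "my_analysis.sub"], check=True)
```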

OSU: Condor Pool
In collaboration with OU IT, OSU IT is deploying a large Condor pool (approx 300 desktop PCs in IT student labs all over campus). Once deployed, OU's and OSU's Condor pools will "flock" together, allowing jobs to migrate invisibly between institutions, but giving each institution's jobs priority on their own PCs.

Networking: National LambdaRail

Networking: Internet2

Why High Performance Networking?
- Ability to move large datasets rapidly
  - Natural hazards prediction & mitigation – OK: tornadoes
  - Large scale data-intensive projects – OK: high energy physics
  - 10 Gigabits per second = 100,000 Gigabytes per day
- Telepresence (including, potentially, telemedicine)
- A Seat At The Table: without high performance networking, we wouldn't be in the running for major national initiatives.
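A quick arithmetic check of the slide's rule of thumb that a saturated 10 Gb/s link moves on the order of 100,000 GB per day (the exact figure is about 108,000 GB):

```python
# 10 gigabits/s -> gigabytes/day: divide by 8 bits per byte,
# multiply by the number of seconds in a day.
gigabits_per_second = 10
seconds_per_day = 24 * 60 * 60               # 86,400
gigabytes_per_day = gigabits_per_second / 8 * seconds_per_day
print(f"{gigabytes_per_day:,.0f} GB/day")    # 108,000 GB/day, i.e. ~100,000
```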

Oklahoma Cyberinfrastructure Initiative

OK Cyberinfrastructure Initiative (OCII)
- All academic institutions in Oklahoma are eligible to sign up for free use of OU's and OSU's centrally-owned CI resources.
- Other kinds of institutions (government, NGO, commercial) are eligible to use them, though not necessarily for free.
- Everyone can participate in our CI education initiative (not just in OK) – for example, UND and Meridian Environmental Technology Inc.
- Oklahoma Supercomputing Symposium open to all (10th Symposium, Oct).

OCII History
- NSF EPSCoR Research Infrastructure Improvement (Track 1) Call for Proposals: "Cyberinfrastructure" appears 18 times.
- OK EPSCoR asked for a statewide CI plan.
- Meeting: OU & OSU CI leads, OU & OSU CIOs, OneNet director, OK EPSCoR director & associate director, science theme leaders – sketched out the OCII plan.
- OU helped OSU set up a Condor pool.
- Providing the "Supercomputing in Plain English" overview talk statewide (have visited 11 of 13 public universities, 7 of 12 private universities).

OCII Accomplishments
- Other NSF EPSCoR grants:
  - Track 2: Ecoinformatics (ecology) with Kansas: $6M ($3M to OK)
  - C2: Networking: $1.17M to OK – includes OK's only HBCU and Tribal Colleges
- NSF MRI grant: Oklahoma PetaStore ($793K to OU)
- OCII provides supercomputer accounts to many external users (outside of OU and OSU): 75+ users at 24 academic, government, industry and nonprofit organizations (and counting) – up from 34 users at 15 institutions since Dec.
- OCII is directly responsible for $20M in NSF grants to OK since our first meeting in 2006 (Track 1, Track 2, C2, MRI).

OCII Outcomes

External Funding Outcomes (OU)
- External research funding facilitated by OSCER (Fall 2001 – Fall 2009): $186M+ total, $100M+ to OU
- Funded projects: OU faculty and staff in 19 academic departments and 2 other campus organizations (research centers etc.)
- Comparison: OU Norman externally funded research expenditure, fiscal years July 2001 – June 2010: $611M
- Since being founded in fall of 2001, OSCER has enabled research projects comprising more than 1/7 of OU Norman's total externally funded research expenditure, with more than a 7-to-1 return on investment.

Publications Facilitated by OSCER
- 600+ publications facilitated by OSCER
- By year: 2010: 128+ papers; 2009: 105 papers; …; 3 papers in the earliest year
- These papers would have been impossible, or much more difficult, or would have taken much longer, without OSCER's direct, hands-on help.

Education Outcomes
- "Supercomputing in Plain English" participants at 167 academic, government, industry and nonprofit groups, including:
  - OU, OSU + 12 other OK academic institutions
  - 5 OK government agencies (state, federal, military)
  - 4 OK companies
  - 1 OK non-governmental agency
  - 36 other academic institutions in 18 other EPSCoR jurisdictions (including ND)
  - Research universities, masters-granting, bachelors-granting, community colleges, high school, middle school; minority-serving
- "Supercomputing in Plain English" overview talk: 20 OK academic institutions, several other OK organizations

Lessons Learned
- Cooperation between institutions is essential: among PhD-granting institutions; between PhD-granting and masters- and bachelors-granting; between Minority Serving Institutions and others; between academic and non-academic.
- A CI win for any institution in the state is a win for the whole state.
- Providing users across the state with access to centrally-owned resources at PhD-granting institutions costs little but returns a lot.
- Help the institutions that you think are your rivals; it turns out that they're your best advocates.
- CI has a huge return on investment.

Cyberinfrastructure and Students

Moore's Law
In 1965, Gordon Moore was head of research and development at Fairchild Semiconductor. He noticed that the number of transistors that could be squeezed onto a chip was doubling about every 18 months. It turns out that computer speed is roughly proportional to the number of transistors per unit area. Moore wrote a paper about this concept, which became known as "Moore's Law."

Moore's Law Observed (chart; GFLOPs: billions of calculations per second)

CI and Young North Dakotans
Historically, we've seen that whatever happens at the high end of computing today will be on your desktop in 10 to 15 years. So learning about high-end cyberinfrastructure now puts North Dakota undergrads ahead of the curve: they'll be more competitive now, and they'll know something important about the future.

The Future is Now
- Computing power doubles every 18 months.
- Undergrads are about 20 years old. Life expectancy in the US is about 80 years. So they have about 60 years left.
- How much more powerful will computers be in 60 years?
  - 15 years = 10 doublings = 1,024x ≈ a thousand times better
  - 30 years = 20 doublings ≈ a million times better
  - 45 years = 30 doublings ≈ a billion times better
  - 60 years = 40 doublings ≈ a trillion times better
- The problem with the future: "The future always comes too fast, and in the wrong order." (Alvin Toffler)
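The arithmetic behind those bullets is just compounding: one doubling every 18 months means Y years gives Y/1.5 doublings, a factor of 2^(Y/1.5).

```python
# Reproduce the slide's numbers: one doubling every 18 months (1.5 years).
for years in (15, 30, 45, 60):
    doublings = years / 1.5
    print(f"{years} years: {doublings:.0f} doublings -> "
          f"{2 ** doublings:,.0f}x faster")
# 15 years -> 1,024x; 30 -> ~1 million x; 45 -> ~1 billion x; 60 -> ~1 trillion x
```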

Thanks for your attention! Questions?