National Science Foundation CI-TEAM Proposal: BLAST on Condor. How Will This Help [InstAbbrev]?


National Science Foundation CI-TEAM Proposal: BLAST on Condor
How Will This Help [InstAbbrev]?
Your Name Here
Your Job Title Here
Your Department Here
Your Institution Name Here
Date of Presentation
Your Institution's Logo Here

NSF-Funded Condor at [InstAbbrev]

Outline
What is Condor?
What is BLAST?
The Condor/BLAST NSF CI-TEAM Project
Condor Practicalities
Summary

What is Condor?

Desktop PCs Are Idle Half the Day
Desktop PCs tend to be very active during the workday. But at night, during most of the year, they're idle. So we're only getting half their value.

Supercomputing at Night
We've got lots of desktop PCs that are idle during the evening and during intersessions. Wouldn't it be great to put them to work on something useful to [InstAbbrev]? For example, [PROJECT(S) AT YOUR INSTITUTION].

Supercomputing at Night Example
SETI@home – the Search for Extra-Terrestrial Intelligence – is looking for evidence of green bug-eyed monsters on other planets, by mining radio telescope data. SETI@home runs number-crunching software as a screensaver on idle PCs around the world. There are many similar projects:
Folding@home (protein folding)
climateprediction.net
Einstein@Home (Laser Interferometer Gravitational-wave Observatory)
LHC@home (Large Hadron Collider)
Rosetta@home (proteins for biomedical research)
Great Internet Mersenne Prime Search (mathematics)

Condor is Like SETI@home
Condor steals computing time on existing desktop PCs when they're idle. Condor runs in the background when no one is sitting at the desk. Condor allows us to get much more value out of the hardware that we've already purchased, because there's little or no idle time.

Condor is Better Than SETI@home
Condor is general purpose and can work for any "loosely coupled" application. Condor can do all of its I/O over the network, instead of using the desktop PC's local disk. Anyone who uses Condor is automatically doing Grid Computing, which the federal research funding agencies are really pushing.

How is Condor Helpful?
FREE under the proposed NSF project (if funded).
Repurpose idle time on existing desktop PCs – we've already paid for the hardware, and the rest of the costs will be covered by NSF (if funded).
Enable research that involves lots of computing.
Share multiple independent Condor pools among various institutions ("flocking") – so we'll get access to far more computing resources than we've actually paid for.
Quick & dirty Grid Computing resource, to get us accustomed to using and hosting Grids.

Flocking
Each institution can have its own, independently managed Condor pool. Independently owned and managed Condor pools can flock – behave as if they are one giant superpool. On local PCs here at [InstAbbrev], top priority will always go to our local users. But we can get access to other institutions' Condor pools.

BLAST

What is BLAST?
BLAST (Basic Local Alignment Search Tool) is the most popular bioinformatics software package. BLAST compares strings of genetic data (A, C, G, T) against databases of known genetic information. BLAST is used by everyone from undergraduates to top researchers.
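As a rough intuition for what "comparing genetic data against a database" means, here is a toy sketch of BLAST-style seed finding: locate short exact "words" shared by a query and a database sequence, which real BLAST then extends into scored local alignments. This is an illustration only, not BLAST's actual algorithm; the word size and sequences are made up.

```python
# Toy illustration of the "seed" step in BLAST-style searching: find
# short exact words shared between a query and a database sequence.
# (Real BLAST extends such seed hits into scored local alignments;
# this sketch stops at seed finding.)
def seed_matches(query, subject, word_size=4):
    # Map each word of the query to its (last seen) starting position.
    words = {query[i:i + word_size]: i
             for i in range(len(query) - word_size + 1)}
    hits = []
    for j in range(len(subject) - word_size + 1):
        word = subject[j:j + word_size]
        if word in words:
            # (query position, subject position, matched word)
            hits.append((words[word], j, word))
    return hits

print(seed_matches("ACGTACGT", "TTACGTAA"))
```

Each hit pins a stretch of the query to a stretch of the database sequence; the shared region "ACGT" shows up as several overlapping word matches.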

BLAST on Condor
The Condor group at the University of Wisconsin has already implemented BLAST on Condor. They are providing free help with deploying BLAST on the NSF CI-TEAM project's Condor pool. So, we'll have access to a huge BLAST resource – thousands of PCs.
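BLAST suits Condor because a large query set divides naturally into independent jobs: each PC can BLAST its own small batch of sequences. A minimal sketch of that splitting step, with made-up sequences (this is an illustration, not the Wisconsin group's actual implementation):

```python
# Each Condor job should BLAST a small, independent batch of query
# sequences. This splits a multi-sequence FASTA text into such batches.
def split_fasta(fasta_text, seqs_per_chunk):
    # Re-attach the ">" header marker that split() strips off.
    records = [">" + block.rstrip("\n") + "\n"
               for block in fasta_text.split(">") if block.strip()]
    return ["".join(records[i:i + seqs_per_chunk])
            for i in range(0, len(records), seqs_per_chunk)]

# 3 sequences, 2 per chunk -> 2 chunks (one Condor job per chunk)
chunks = split_fasta(">seq1\nACGT\n>seq2\nGGCC\n>seq3\nTTAA\n", 2)
print(len(chunks))  # prints 2
```

Each chunk would be written to its own file and handed to a separate Condor job; the per-job BLAST results are then concatenated afterward.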

How Will We Use BLAST?
[FILL THIS IN YOURSELF.]

How Will We Use Condor?
[FILL THIS IN YOURSELF.]

NSF CI-TEAM Project

NSF CI-TEAM Program
The NSF Cyberinfrastructure TEAM program is a brand new program. It is providing grants of up to $250,000 for up to 2 years. One of CI-TEAM's goals is to expand Cyberinfrastructure – for example, supercomputing – to institutions and people that traditionally haven't had much access.

Our NSF CI-TEAM Project
The University of Oklahoma (OU) is leading an NSF CI-TEAM project. The focus is on setting up Condor pools all over Oklahoma and the region. The kickstart application will be BLAST, but these Condor pools will be available for any appropriate application. Most of the money in OU's CI-TEAM proposal will go to institutions like ours, for software licenses and PCs to manage the Condor pools.

An Added Bonus
The OU Supercomputing Center for Education & Research (OSCER), which is leading this CI-TEAM project, has several large supercomputers. OSCER's policy is that anyone can have access to OSCER's supercomputers if they are on a project that has an OU faculty or staff member as the Principal or Co-Principal Investigator. The PI of this NSF CI-TEAM project will be at OU. So, as a bonus, everyone at [InstAbbrev] who is on the CI-TEAM project – and their students – will get FREE access to OSCER's supercomputers!

Condor Practicalities

What Will Be Needed?
We will need:
1 PC to manage Condor;
for each PC to be Condorized:
VMware software: $85 per PC (includes 2nd year of support);
Condor software: $0 per PC;
Linux software: $0 per PC.
But we won't have to pay any of this!

Cost to [InstAbbrev]: $0
Cost to NSF CI-TEAM project:
$85 per PC for software;
$1500 per institution for hardware: 1 PC to manage the pool of Condor-enabled PCs.
The NSF CI-TEAM grant will pay for all of this. So, we should commit to Condor ONLY if the NSF CI-TEAM proposal is funded.

Our Obligations: $0
[InstAbbrev] will be obliged to pay NOTHING:
If the NSF CI-TEAM proposal is fully funded, then there will be ample money available for our costs.
If the NSF CI-TEAM proposal is funded at a reduced budget level, then we'll reduce the number of Condor-enabled PCs proportionally.
If the NSF CI-TEAM proposal isn't funded, then the project simply won't happen.

Our Obligations: Labor
Our IT staff will install Condor and the related support software on our PCs and the freely provided management PC. OU IT staff will provide FREE help and advice. OU IT staff WON'T TOUCH our PCs, unless we specifically ask them to. Our labor cost is expected to be in the range of [number of] worker hours per year.

Security
To date, there have been no known incidents of a security breach attributable to Condor – despite the fact that Condor is currently running on over 50,000 PCs around the world. Condor can run entirely without administrator/root privileges. If there are firewall issues, Condor can work within them.

What Will We Do with Condor?
"Loosely coupled" problems: zillions of small jobs.
[LOCAL EXAMPLES HERE]
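In Condor, such a batch of small jobs is described in a single submit description file. A minimal sketch, assuming a hypothetical executable `my_analysis` and numbered input files:

```
# Minimal Condor submit description (job and file names are hypothetical).
# "queue 10" creates 10 independent jobs; $(Process) runs 0 through 9.
universe   = vanilla
executable = my_analysis
arguments  = input.$(Process).dat
output     = out.$(Process).txt
error      = err.$(Process).txt
log        = jobs.log
queue 10
```

Handing this file to `condor_submit` queues all 10 jobs at once, and Condor schedules each one onto whatever desktop PC happens to be idle.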

Summary

Summary
Get more value out of hardware already paid for.
Costs to [InstAbbrev]: $0.
Modest labor, with free help.
No security issues.
Includes free access to OU supercomputers.
Can do exciting projects. [Examples]