Miron Livny, Computer Sciences Department, University of Wisconsin-Madison
Condor: A Concept, A Tool and A Model

The Condor Project (Established ‘85)
Distributed systems research performed by a team that faces:
› software engineering challenges in a UNIX/Linux/NT environment,
› active interaction with users and collaborators,
› and the daily maintenance and support challenges of a real-life distributed production environment.
Funding: NSF, NASA, DoE, DoD, IBM, INTEL, Microsoft and the UW Graduate School.

National Grid Efforts
› National Technology Grid - NCSA Alliance (NSF-PACI)
› Information Power Grid (NASA)
› Particle Physics Data Grid (DoE)

Applications
› Optimization - UW, ANL, NW
› High Energy Physics - INFN, UNM, UW, Caltech
› Biological Sciences - UW, UMN
› Animation - UW, C.O.R.E.
› Software Engineering - Oracle
› JAVA - NASA

CS Collaborations
› Argonne National Lab (Globus) - Grid middleware
› Universitat Autonoma de Barcelona - Scheduling of Master-Worker Applications
› Clark Atlanta University - User Interfaces

Funding Distribution

Concept(s)

“… Since the early days of mankind the primary motivation for the establishment of communities has been the idea that by being part of an organized group the capabilities of an individual are improved. The great progress in the area of inter-computer communication led to the development of means by which stand-alone processing sub-systems can be integrated into multi-computer ‘communities’. …”
Miron Livny, “Study of Load Balancing Algorithms for Decentralized Distributed Processing Systems,” Ph.D. thesis, July 1983.

Every Community needs a Matchmaker!

Why? Because someone has to bring together members who have requests for goods and services with members who offer them.
› Both sides are looking for each other
› Both sides have constraints
› Both sides have preferences
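The matchmaking idea can be made concrete with a small illustration. The Python sketch below is not Condor's ClassAd machinery; it is a toy model (all names are made up for illustration) in which every request and every offer carries a constraint the other side must satisfy, plus a preference (a rank) used to choose among acceptable partners, and the matchmaker pairs each request with the acceptable offer it ranks highest.

```python
# Toy matchmaker: illustrates two-sided constraints and preferences.
# This is NOT Condor's ClassAd implementation, just a sketch of the idea.

requests = [  # members asking for a service (e.g. a job that needs a machine)
    {"name": "job1", "needs_memory": 256,
     "constraint": lambda offer: offer["memory"] >= 256,   # what I require
     "rank": lambda offer: offer["mips"]},                 # what I prefer
]

offers = [    # members offering a service (e.g. idle workstations)
    {"name": "ws-slow", "memory": 512, "mips": 100,
     "constraint": lambda req: req["needs_memory"] <= 512},
    {"name": "ws-fast", "memory": 1024, "mips": 400,
     "constraint": lambda req: req["needs_memory"] <= 1024},
]

def matchmake(requests, offers):
    matches, free = [], list(offers)
    for req in requests:
        # keep only offers acceptable to BOTH sides (mutual constraints)
        acceptable = [o for o in free
                      if req["constraint"](o) and o["constraint"](req)]
        if not acceptable:
            continue                        # request waits for a future offer
        best = max(acceptable, key=req["rank"])   # honour the requester's preference
        matches.append((req["name"], best["name"]))
        free.remove(best)                   # one offer serves one request
    return matches

print(matchmake(requests, offers))          # [('job1', 'ws-fast')]
```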

High Throughput Computing
For many experimental scientists, scientific progress and quality of research are strongly linked to computing throughput. In other words, they are less concerned with instantaneous computing power; what matters to them is the amount of computing they can harness over a month or a year. They measure computing power in units of scenarios per day, wind patterns per week, instruction sets per month, or crystal configurations per year.

Computing is a Commodity
Raw computing power is everywhere: on desktops, shelves, and racks. It is
› cheap,
› dynamic,
› distributively owned,
› heterogeneous and
› evolving.

Master-Worker (MW) computing is common and Naturally Parallel. It is by no means Embarrassingly Parallel. Doing it right is by no means trivial.
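To see why doing MW right is not trivial, consider the minimum a master must do: hand out work units, notice when a worker disappears mid-task, and put the lost work back in the queue. The sketch below is a deliberately simplified, single-process Python model with simulated worker failures; a real MW system (Condor's included) must additionally cope with slow workers, duplicate results, checkpointing, and workers joining or leaving at any moment.

```python
import random
from collections import deque

def worker_execute(task, fail_prob):
    """Pretend to run one work unit on a remote worker.
    Sometimes the worker vanishes (e.g. its owner reclaims the machine)."""
    if random.random() < fail_prob:
        return None            # worker lost before the result came back
    return task * task         # stand-in for the real computation

def run_master(tasks, fail_prob=0.3):
    pending = deque(tasks)     # work units not yet completed
    results = {}
    while pending:
        task = pending.popleft()
        result = worker_execute(task, fail_prob)
        if result is None:
            pending.append(task)   # the essential step: re-queue lost work
        else:
            results[task] = result
    return results

if __name__ == "__main__":
    random.seed(0)
    print(run_master(range(5)))    # all five results, despite failures
```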

A Tool

Our Answer to High Throughput MW Computing on commodity resources

The Condor System
A High Throughput Computing system that supports large dynamic MW applications on large collections of distributively owned resources. Developed, maintained and supported by the Condor Team at the University of Wisconsin-Madison since ‘86.
› Originally developed for UNIX workstations.
› Fully integrated NT version is available.
› Deployed world-wide by academia and industry.
› More than 1,300 CPUs at the U of Wisconsin.
› Available at
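As a concrete illustration of what submitting a batch of MW workers looks like, here is a short sketch using the Python bindings that ship with modern HTCondor releases. Those bindings post-date this talk, so the exact calls are an assumption about a current installation rather than part of the system described above, and "worker" is a hypothetical executable name.

```python
import htcondor  # Python bindings shipped with modern HTCondor releases

# Describe one worker job; $(Process) expands to 0..99, so every job
# gets its own argument and its own output/error files.
submit = htcondor.Submit({
    "executable": "worker",        # hypothetical MW worker program
    "arguments": "$(Process)",
    "output": "out.$(Process)",
    "error": "err.$(Process)",
    "log": "mw.log",
    "request_memory": "128MB",
})

schedd = htcondor.Schedd()                 # the local submit point
result = schedd.submit(submit, count=100)  # queue 100 independent workers
print("submitted cluster", result.cluster())
```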

Condor CPUs on the UW Campus

Some Numbers - CS Pool
Total since 6/98: 4,000,000 hours (~450 years)
“Real” users: 1,700,000 hours (~260 years)
› CS-Optimization: 610,000 hours
› CS-Architecture: 350,000 hours
› Physics: 245,000 hours
› Statistics: 80,000 hours
› Engine Research Center: 38,000 hours
› Math: 90,000 hours
› Civil Engineering: 27,000 hours
› Business: 970 hours
“External” users: 165,000 hours (~19 years)
› MIT: 76,000 hours
› Cornell: 38,000 hours
› UCSD: 38,000 hours
› CalTech: 18,000 hours

A Model for...

CS-Domain Collaborations
Multidisciplinary research that advances the state of the art in CS and a domain science.
› Based on mutual respect and understanding of objectives, needs, constraints and culture
› Leverages expertise, resources and funding
› Enables experimental Computer Science
› Enables speculative science

Campus Scientific Computing
Support the increasing demand from domain scientists for advanced computing, storage and networking services:
› Computing power
› State-of-the-art middleware and libraries
› Access to experts who understand the nature and dynamics of scientific computing
› Cycles for class/research projects

Software Distribution and Support
Making software developed in academia available to academic and commercial users.
› Legal and technical support for software distribution
› Infrastructure for “for-fee” support
› Blueprint for dealing with IP rights

Do not be picky, be agile!!!