Accelerating Campus Research with Connective Services for Cyberinfrastructure
Rob Gardner, Steve Tuecke


2 A typical campus: distributed departments, distributed researchers, distributed resources

3 Commodification, cloud technologies, and practices that achieve economies of scale are driving centralization of resources on campuses (condo clusters, etc.); college-level funding discourages “closet clusters”

4 Yet, for human and practical reasons, resources often remain distributed across a campus

5 And off campus too, e.g. national CI grids, HPC centers, commercial clouds, and science clouds

6 Bridged Campus Cyberinfrastructures

7 Needed: services to make distributed resources transparent to campus users, and cost effective for resource providers, wherever those resources are provisioned

8 CI Connect Vision: accelerate research on campus by providing connective services for local, cloud, and national cyberinfrastructure

9 CI Connect for research projects: OSG Connect is a service connecting users and project groups to the national-scale Open Science Grid. It is provisioned as a service using the CI Connect platform.
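To make this concrete, below is a minimal sketch of an HTCondor submit description a researcher might use from the OSG Connect login host; the script, file names, and project name are hypothetical, not taken from the slides:

    # analyze.sub -- hypothetical OSG Connect submit description
    universe   = vanilla
    executable = analyze.sh              # user analysis script (hypothetical)
    arguments  = input_$(Process).dat
    should_transfer_files   = YES        # use HTCondor file transfer; no shared filesystem
    when_to_transfer_output = ON_EXIT
    output = logs/job_$(Process).out
    error  = logs/job_$(Process).err
    log    = logs/job.log
    +ProjectName = "ConnectExample"      # OSG accounting project (hypothetical name)
    queue 100

The job is submitted with condor_submit analyze.sub and monitored with condor_q, exactly as on a local HTCondor pool.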

10 [Diagram: the osgconnect.net portal, with login and stash services, connecting the UChicago UC3 campus grid, the Open Science Grid, and Amazon EC2]

11 OSG Connect resources: the campus grid, opportunistic OSG cycles, and the Amazon cloud in one place, integrated with identity and research data management services. Jobs reach OSG via flocking to the GlideinWMS service, as sketched below.
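A minimal sketch of how that flocking is wired on the submit side, assuming hypothetical host names (the slides do not name the GlideinWMS frontend host):

    # condor_config fragment on the OSG Connect submit host (sketch)
    # Idle jobs may flock to the GlideinWMS pool, whose glideins
    # provision worker slots on OSG sites.
    FLOCK_TO = glideinwms.example.net
    # The receiving pool's central manager must reciprocate, e.g.:
    # FLOCK_FROM = login.osgconnect.net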

12 OSG Connect projects: enabling individual researchers

13 Knowledge Synthesis Studies enabled by d-HTC: M. Culbertson, UIUC, computing on 19 sites in OSG
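For runs like this, the per-site spread can be read back from the submit host's job history; a sketch using the glidein-advertised attribute that OSG Connect tutorials rely on (assumed present on matched jobs):

    # Count completed jobs per OSG site (sketch)
    condor_history -format '%s\n' MATCH_EXP_JOBGLIDEIN_ResourceName \
        | sort | uniq -c | sort -rn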

14 CI Connect Campuses
– An HTCondor-based job submission environment connecting local campus cluster resources, the OSG, bridged campuses, or optionally Amazon cloud resources
– Offered as a service (PaaS model)
– Duke University deployed as the first branded CI Connect campus
– Deployment in progress for the University of Chicago
– Discussions with the University of Michigan
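From the researcher's side, a campus instance behaves like any HTCondor submit host; a sketch of a session on the Duke instance named on the following slides (commands are standard HTCondor tools, file names hypothetical):

    # Log in to the campus CI Connect submit host
    ssh user@duke.ci-connect.net
    # See the slots currently visible: local cluster, bridged campus, OSG
    condor_status -total
    # Submit and monitor work exactly as on a departmental pool
    condor_submit analyze.sub
    condor_q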

15 A collaboration between the Scalable Computing Center at Duke University and the Computation Institute at UChicago

16 [Diagram: the duke.ci-connect.net portal, with login and stash services, linking the Duke Condor Grid, the UChicago UC3 Grid (campus bridge to a partner institution), and the Open Science Grid (connection to OSG); access is controlled by the Duke SCC]

17 [Diagram: a ci-connect.net portal, with login and stash services, linking UC3 and the RCC Midway and Beagle clusters] This will be set up for the UChicago Computation and Enrico Fermi Institutes (Tier 3 users will migrate to ATLAS Connect).

18 CI Connect Summary
– CI Connect offers a way to bring more CPU into a user's home environment without deploying additional IT
– Offers a means to share resources on and off campus with non-physics groups
– Integrates Globus for data management
– Instances deployed for OSG, the Duke Scalable Computing Center (Duke Grid), and ATLAS
– Discussions started with the University of Michigan
– Interest from XSEDE campus bridging
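As an illustration of the Globus data-management integration, a sketch of staging results out of stash with the Globus transfer CLI; the endpoint IDs, paths, and the use of the globus command itself are assumptions for illustration, not details from the slides:

    # Move results from the CI Connect stash to a campus endpoint (sketch)
    # STASH_ENDPOINT and CAMPUS_ENDPOINT hold hypothetical Globus endpoint IDs
    globus transfer --recursive \
        "$STASH_ENDPOINT:/stash/user/results" \
        "$CAMPUS_ENDPOINT:/data/user/results"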