Campus Grids & Campus Infrastructures Community
Rob Gardner, Computation Institute / University of Chicago
June 4, 2013 (revised & condensed for the June 26 Area Coordinators' meeting)

Campus area
– Will discuss all but BOSCO in the Campus area
– Not everything has been captured in JIRA yet, but we're working on it
– Obviously there are many dependencies throughout OSG++: User Support, Production, Operations, Accounting, Software, GlideinWMS, PandaWMS, Networking, and facilities teams

OSG CONNECTION SERVICES
A platform of services for campus-based researchers

Recasting Campus Grids as a Platform of Services
[Diagram: a task launch/share cycle: share software, share data]
– Accelerate engagement: a suite of services for campuses, connecting science to resources with increasing capability
– Data: local access, anywhere access, transfer services, advanced analytics
– Software: campus access, anywhere access
– Tasks: resource access across campus grid, cloud, and HPC

What are the elements?
– A graduated platform of services
  – Campus engagement and identity integration tools
  – Job management: BOSCO and its extensions, plus pure HTCondor
  – Distributed software access (OASIS, PALMS, Parrot)
  – Distributed data access (SRM, XRootD, HTTP, SkeletonKey)
  – Accounting and informatics services for cycle sharing (Gratia, CIVAIS)
– Campus Infrastructures Community
  – Forum, meetings, and context to drive adoption, gather feedback, and register impact
  – Tutorials, demonstrators, campus blueprints, engagements

Capabilities
– Foundational
  – Campus identity (→ federated, grid)
  – Job management over diverse resources
  – Ubiquitous software and data access
  – Monitoring and accounting services
– Practical
  – Application best practices on dHTC
  – Advanced workflow services
  – Advanced user interfaces

Simplify job submission to OSG
– Build off experience from OSG-XSEDE
– Avoid the burden of VO creation for new communities
– Get them going quickly, using a carrier VO and a pilot submission service (glideinWMS or PandaWMS)
– Leverage campus identities and new tools which accelerate uptake
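To make the submission model concrete, here is a sketch of the kind of ordinary HTCondor submit description a researcher would write on a login node; the executable, file names, and job count are illustrative assumptions for this example, not part of the deployed service:

```
# Hypothetical submit description; names and counts are illustrative.
universe   = vanilla
executable = analyze.sh
arguments  = input.$(Process).dat
output     = job.$(Cluster).$(Process).out
error      = job.$(Cluster).$(Process).err
log        = job.$(Cluster).log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue 10
```

Submitted with `condor_submit`, after which the pilot infrastructure (glideinWMS or PandaWMS) routes the jobs to OSG resources under the carrier VO.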

High-Level Block Diagram (Chander)
[Chander's diagram; layout not captured in this transcript]

[Diagram: users reach login nodes (login.osgconnect.net; login1.osgconnect.net, login2.osgconnect.net, … loginn.osgconnect.net) via ssh or gsi-ssh; BOSCO submission through a carrier VO connects to OSG operations and production services, and to the production facilities (VOs) contributing to OSG]

OSG Connect Web Service
– Leveraged existing implementation at UC3
– Leverages the CHTC portal and UC3 UBolt
– Working with OSG Security to get InCommon and the OSG Connect service working together
– Effort for this coming from ATLAS Tier 2
– Expect to have a first version together in time for the Duke workshop
– Fall-back plans with reduced capability

OSG Connect Submit Infrastructure
– Start small, but plan for a scale of 1000 users submitting through multiple VO carriers and pilot factories
– Start with existing OSG mechanisms
– Expressed interest in engaging the ATLAS Tier 3 community, providing access to "beyond pledge" resources at ATLAS sites
  – Will set up a separate VO frontend for this purpose

Distributed Software Access: Parrot and SkeletonKey
– Motivated by UC3 users needing an easy way to remotely access their software and data on clusters around campus
– Designed as an easy alternative to using Parrot manually
  – Also Chirp, for data access
– Provides an easy-to-use configuration file
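As an illustration of the configuration-file approach, a SkeletonKey-style job description might look like the sketch below. The section and key names, repository, and hosts are assumptions made for this example, not the tool's actual schema:

```ini
; Illustrative only: section/key names, repository, and hosts are assumed.
[Application]
script = run_analysis.sh              ; user code to run under Parrot

[CVMFS]
repository = software.uc3.example.org ; read-only software access via CVMFS/Parrot

[Chirp]
server = chirp.example.org            ; remote data access via Chirp
export = /data/myproject
```

The tool's role, per these slides, is to turn such a description into the Parrot/Chirp invocations the user would otherwise have to write by hand.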

Design Goals
– Provide an easily understood way for users to incorporate remote software/data access into their current workflows
– Allow users to expand their computations to incorporate opportunistic usage of other campus clusters
– Eventually, allow users to expand from campus grids to using OSG opportunistically

Current Work
– Have an initial "version one" implementation
– Working with three groups to incorporate SkeletonKey in their workflows and actively utilize the campus grid environment at the University of Chicago
– Incorporating user feedback into the current code and updating features based on user needs
– More input and feedback welcome from Parrot experts

[Diagram from the OSG AHM: access paths with SkeletonKey vs. directly]

PALMS project
– OASIS provides the infrastructure to host software in CVMFS, but users need more guidance (1) to install the software and (2) to access it from OSG resources
– Programs, Applications and Libraries Management and Setup (PALMS)
  – A system to install and manage software in OASIS
  – Simplifies the packaging and installation of different versions for different platforms
  – Helps users set up the correct and desired environment (applications and libraries)

PALMS software manager features
– Helps package an application and deploy it on OASIS (or into any CVMFS stratum)
– Allows installs, updates, and removals of applications and libraries
– Allows multiple versions for distinct platforms
– Allows multiple versions for the same platform
– Does not require root on the OASIS server
– Can manage and resolve dependencies and conflicts
– Helps adapt and install native packages

PALMS user features
– Helps select the correct version for the platform
– Provides a default version, but allows a choice
– Sets up the correct environment for the user shell
– Works automatically with different shells
– Adds no performance penalty compared to default OASIS
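The environment-setup step above can be sketched as follows. This is not PALMS code; the repository layout, application name, version, and platform string are hypothetical, and the sketch only illustrates what "select a version, then set up the shell environment" amounts to:

```python
# Hypothetical sketch of a PALMS-style "setup": pick an application version
# for the current platform and prepend its OASIS/CVMFS bin directory to PATH.
# All paths and names below are illustrative assumptions.
import os

OASIS_ROOT = "/cvmfs/oasis.opensciencegrid.org/osg/palms"  # assumed layout


def palms_setup(app, version, platform, environ):
    """Prepend the chosen app version's bin directory to PATH in `environ`."""
    bindir = os.path.join(OASIS_ROOT, app, version, platform, "bin")
    environ["PATH"] = bindir + os.pathsep + environ.get("PATH", "")
    return bindir


env = {"PATH": "/usr/bin:/bin"}
bindir = palms_setup("myapp", "1.2.0", "el6-x86_64", env)
print(env["PATH"].split(os.pathsep)[0])
```

Because the change is just an environment prefix, the same mechanism works from any shell, which is consistent with the "works automatically with different shells" feature above.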

PALMS activities
– Project planning
  – Presentations and white paper
– Software development, packaging, and documentation
– Deployment on OSG OASIS and on UC3
– Librarian (software manager) activities for the OSG VO

CIVAIS project
– There is a lot of information about the operation of a campus infrastructure or OSG
– Processed information is more valuable than raw data
– Data and information needs differ by role (researcher, PI, computer center director, funding program manager, network administrator, …)
– CIVAIS: Campus Infrastructures Visual Analytics and Informatics Services
  – An analytics service collecting information from a campus infrastructure
  – Provides clear, concise, and flexible views for users
  – And an open data platform (policy based) to stimulate derived metrics and 3rd-party apps for advanced analytics
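A toy sketch of the "processed information beats raw data" point: collapsing raw per-job accounting records into the per-project summary a PI or center director actually wants to see. The field names are illustrative, not Gratia's actual record schema:

```python
# Illustrative only: raw per-job records (Gratia-style, invented fields)
# reduced to a per-project wall-hours summary for a role-based view.
from collections import defaultdict

raw_records = [  # hypothetical per-job accounting records
    {"user": "alice", "project": "bio", "wall_hours": 4.0},
    {"user": "bob",   "project": "bio", "wall_hours": 6.0},
    {"user": "carol", "project": "hep", "wall_hours": 10.0},
]

summary = defaultdict(float)
for rec in raw_records:
    summary[rec["project"]] += rec["wall_hours"]

print(dict(summary))  # per-project totals, one line per stakeholder question
```

Different roles would apply different groupings to the same raw stream (by user for a PI, by VO for OSG, by resource for a center director), which is the motivation for role-tailored views.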

Use Case Example (1)
What do computing center executive/steering committees most want to know?
– How are resources being used?
– Are they serving investing stakeholders fairly, as well as the broader university community?
– Is capacity meeting demand?
– Which technologies (processing, storage, network, visualization) are most likely to yield the most benefit to the most users?
– How do we judge the effectiveness of resource usage for advancing the scientific goals of our stakeholding partners?

Use Case Example (2)
What does the OSG Executive Team most want to know?
– How effective is the campus program in engaging new users and research communities on campuses?
– Which disciplines, outside of high energy physics, have received benefit from OSG?
– What are the impacts of OSG services and technologies on accelerating the scientific mission of our stakeholding organizations, as well as the national community?

CIVAIS key features
– Design starting from the user experience
– Multiple roles determining access policies and interests
– Interactive, extensible web displays tailored to the role of the user
– Designed for a campus infrastructure: easy to install and deploy on a campus
– Hierarchical reporting for a wider community (e.g., the OSG CIC)
– Highly scalable: reporting for a single campus runs on a single machine, while bigger and more complex structures scale on a distributed architecture
– Pluggable and open interface
  – Accepting multiple inputs (Gratia, message queues, etc.)
  – Ability to add custom views to the display
  – Open data available via a RESTful API

CIVAIS architecture diagram

CIVAIS architecture highlights
– Highly scalable and reliable data warehouse
– Multiple data inputs, including the Gratia server and probes, Google documents, and web forms
– Message bus for flexible and reliable communication (double arrows in the diagram)
– RESTful API for controlled data access
– Multi-platform portal using HTML5 and vector graphics for viewing, browsing, and exporting data
– Standard plug-in definition for both data input and viewer extensions

CIVAIS activities
– Project planning
  – Presentations and white paper
– Project mock-ups and evaluation
– Software development, packaging, and documentation
– Deployment and testing on UC3

OSG CAMPUS INFRASTRUCTURES COMMUNITY
Sharing campus-centric dHTC expertise and best practices

CIC Year 1 Goals
– Development of a topical seminar series and forum highlighting concepts in the development and use of campus infrastructures (done; continue in Y2)
– Convening face-to-face meetings of the OSG CIC for both infrastructure providers and domain experts/leaders on campuses (done; continue in Y2)
– Development of a campus engagement program which programmatically develops ties between research domain experts, campus infrastructure providers, and the CIC (failed; addressed for Y2 below)
– Developing a program for CIC engagement with XSEDE (invited to meetings, but no program; Y2 strategy in the context of the above)

CIC Year 1 Milestones
– Define the appropriate metrics for telling the campus story in OSG. We have discussed these in terms of:
  – Making distributed high throughput computing easy, visible (awareness), and ubiquitous (failed; new capabilities in Y2)
  – Finding the appropriate metric for measuring "presence" on campuses (failed; will be addressed in Y2)
  – Capturing science success stories, indicating the multiplicative effects of using campus and distributed HTC resources (failed; to be addressed in the context of OSG Communications and OSG Connect in Y2)
  – Classification of infrastructures with a maturity model [12] (TBD; augment with a user-focused metric)
– Establish the CIC Topical Seminar series as a staple for community building and knowledge sharing (done; continue in Y2)
– Convene one face-to-face CIC meeting with a broad technical program compelling to campus infrastructure providers and users (done; transition from topical to campus engagement focus in Y2)
– Promote community through use of a CIC resource center (social contacts, topical seminar materials, pointers to tools and guides) (done; continue in Y2)

August Workshop at Duke

Workshop user metrics (target = 50):
– # users registered to OSG Connect
– # users successfully completing the quickstart
– # users using BOSCO to campus
– # users using OSG Connect direct to OSG
– # users using OSG Connect BOSCO to OSG
– # users with > 1000 jobs on OSG
Workshop campus researcher metrics (target = 5):
– # new registered campuses
– # new research projects
– # new applications
Workshop capability metrics (target = 5):
– # able to use OASIS
– # able to use distributed data
– TBD

OSG community activity with broad visibility
– With OSG Communications, plan to absorb into the opensciencegrid.org family in Y2
– Archive of expertise established at