Internet2 Meeting 2006
UABgrid: A campus-wide distributed computational infrastructure
University of Alabama at Birmingham
UABgrid Architecture Team: Jill Gemmill, Purushotham Bangalore, John-Paul Robinson

Acknowledgments
This work has been supported by:
- Office of the Vice President for Information Technology
- Department of Computer & Information Sciences, School of Natural Sciences and Mathematics
- Enabling Technology Laboratory, School of Engineering
- National Science Foundation
  - ANI "NMI Enabled Open Source Collaboration Tools for Virtual Organizations"
  - NSF ANI via SURA Subcontract "UAB Middleware Testbed Program: Integrated Directory Services, PKI, Video, and Parallel Computing"
  - NSF CNR "Computer and Information Sciences Grid Node Research Facility"
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

UAB Background
- 36-year-old urban medical research university covering 82 city blocks
- 13 schools (= 13 deans)
- 17,000 students; 16,000 employees
- Students are 26.3 percent African American and 60.2 percent female
- 70 research centers
- 20th in NIH funding (4th in the Southeast)
- $433 million in research funding; doubling every 10 years
- No history of centrally supported HPC or other research-oriented computing services (e.g., Statistics)
- The Alabama Supercomputer Authority

What's a Campus Grid?
Strategic view:
- Maximize use of the university's investment in computational resources
- Minimize administrative effort involved in campus-wide resource sharing
- Leverage investments in Identity Management, WebISO, Directories, and Network infrastructure
UABgrid is a federation of resource owners who happen to share a common identity provider.

UABgrid Partners
- Office of the VPIT: Sheila Sanders, VPIT; IT Academic Computing: David L. Shealy, Jill Gemmill, John-Paul Robinson
  - 128-node cluster; 64-node P3 cluster; desktop Condor pool; 6 terabytes IBP storage
- Department of Computer and Information Sciences: Tony Skjellum, CIS Chair; Puri Bangalore, Asst. Prof.
  - 256-processor & 64-processor clusters; Viz Wall; Parallel Storage System
- Engineering Enabling Technology Lab: Bharat Soni, Chair, Mechanical Engineering; Alan Shih, ETLab Director
  - 256-processor and 128-processor clusters; Viz Wall; High-Speed Storage Systems

Current UABgrid Applications
- Bioinformatics: BLAST, gene sequence analysis, structural biology, micro-array data analysis, visualization
- PDE: automotive & industrial, surface simulations, optimization
- Grid and middleware research: scheduling, load balancing, granular authorization

UABgrid Architecture Today: Phase I (Gigabit)

UABgrid Phase II: Additional Grid Nodes (10 GigE)

Factors Supporting Resource Sharing
- The Provost and VP for Research are being inundated with competing school requests to purchase clusters
- Deans who have acquired clusters find themselves losing classroom space to equipment racks and facing large power and AC bills
- Clusters, large databases, schedulers, etc. require expensive expertise

Grid User Management
- Grid identity comes from the enterprise authentication system ("BlazerID")
- WebISO is leveraged to provide the digital certificate, private key, and proxy certificates behind the scenes
- Grid portal and per-system user accounts are provisioned automatically, saving much administrative effort (Phase I: grid-mapfiles; Phase II: LDAP-stored POSIX accounts + GridShib)
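The Phase I approach above amounts to deriving a Globus grid-mapfile entry (certificate DN mapped to a local account) from each campus identity. A minimal sketch of that mapping, assuming an illustrative DN template and account names (not UABgrid's actual values):

```python
# Hypothetical sketch: generating grid-mapfile entries from campus BlazerIDs.
# The CA base DN and user names are illustrative assumptions.

def gridmap_entry(blazerid: str, ca_base_dn: str = "/O=UABgrid/OU=People") -> str:
    """Map one BlazerID to a grid-mapfile line: "certificate DN" local-account."""
    dn = f"{ca_base_dn}/CN={blazerid}"
    return f'"{dn}" {blazerid}'

def build_gridmap(blazerids) -> str:
    """Build the full grid-mapfile text for a list of campus identities."""
    return "\n".join(gridmap_entry(b) for b in blazerids)

if __name__ == "__main__":
    print(build_gridmap(["jdoe", "asmith"]))
```

Provisioning then reduces to regenerating this file from the enterprise directory, which is the administrative saving the slide describes; Phase II replaces the flat file with LDAP-stored POSIX accounts plus GridShib.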


Grids for Mere Mortals
For jobs run repeatedly where only the database or query varies, it is worthwhile to build a user-friendly interface and to optimize use of resources.
Example: BLAST (National Library of Medicine gene sequence matching software)

Improving the Interface: GridBLAST
- Access using BlazerID and password
- Queries and results easily uploaded & downloaded
- Web UI can be hosted on any server
- Web UI can be written in any development language
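Because the web UI is language- and host-agnostic, a client only needs to package an authenticated user, a database choice, and a FASTA query into a job request. A minimal sketch of building such a request, assuming hypothetical field names (the slides do not specify the GridBLAST request format):

```python
# Hypothetical sketch of a GridBLAST job request. Field names ("user", "db",
# "query") are illustrative assumptions, not the actual GridBLAST protocol.
import json

def make_blast_request(blazerid: str, fasta_query: str, database: str = "nr") -> dict:
    """Validate a FASTA query and package it with the user's campus identity."""
    if not fasta_query.startswith(">"):
        raise ValueError("query must be FASTA-formatted")
    return {
        "user": blazerid,   # authenticated upstream via BlazerID/WebISO
        "db": database,
        "query": fasta_query,
    }

if __name__ == "__main__":
    print(json.dumps(make_blast_request("jdoe", ">seq1\nACGTACGT")))
```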

Improving Performance: G-BLAST
- A native Grid Service interface for BLAST
- G-BLAST provides automatic BLAST algorithm selection based on the number of queries, length of queries, size of the database used, and machines available
- BLAST algorithms employed: multi-threaded BLAST, database-splitting BLAST (e.g., mpiBLAST), query-splitting BLAST
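The selection logic above can be sketched as a simple heuristic over the four inputs the slide names. The thresholds below are illustrative assumptions, not G-BLAST's published decision rules:

```python
# Hypothetical heuristic mirroring G-BLAST's automatic algorithm selection.
# Threshold values (1000 MB, 10,000 bp) are illustrative assumptions.

def select_blast_variant(n_queries: int, avg_query_len: int,
                         db_size_mb: int, n_machines: int) -> str:
    if n_machines <= 1:
        # Only one node available: parallelize with threads on that node.
        return "multithreaded-blast"
    if db_size_mb > 1000:
        # Large database: split it across nodes (e.g., mpiBLAST).
        return "database-splitting-blast"
    if n_queries >= n_machines and avg_query_len < 10_000:
        # Many short queries: distribute whole queries across nodes.
        return "query-splitting-blast"
    return "multithreaded-blast"
```

A batch of 100 short queries against a small database on an 8-node cluster would thus route to query-splitting BLAST, while a single query against a multi-gigabyte database would route to database-splitting BLAST.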

G-BLAST Architecture
[Diagram: users reach the Grid Service Interface through a client program or web interface; an Invoker dispatches queries to BLAST grid services (BLAST1 … BLASTn), consulting the Scheduler, Grid Information Service (GIS), and Application Information Service (AIS); results and notifications flow back to the user.]

G-BLAST Scheduler Architecture
[Diagram: submitted jobs receive job IDs (JIDs) and enter an Analyzer backed by a BLAST benchmark database and the AIS; the Resource Broker combines the analysis with resource information from GIIS/GRIS, and a Job Submission Agent dispatches each job to a resource.]

UABgrid Funding and Management Today
- All equipment has been purchased with various grant funds
- ETLab has been designated a campus resource; ETLab has contracted for 50% of one IT-provided Unix administrator to manage its clusters
- Academic Computing has 2.3 employees and provides other support in addition to HPC
- Computer Science / NS&M resources are available to other campus computational scientists
- Computer Science has 1 administrator for all CIS systems
- Each research department hires its own programmer(s)
- Developing sustainable funding model(s) is challenging

Federated Grids
Exploring cross-domain resource sharing scenarios:
- Federated identity: experiences in SURAgrid
- Federated attributes: myVocs and GridShib

SURAgrid
[Diagram: the SURAgrid Bridge CA links member CAs — UABgrid CA (BlazerID and password login), LSU CA (Kerberos login, Louisiana State University), and UVA CA (digital certificate login, University of Virginia) — each fronting its own grid portal and resources, alongside the SURAgrid Portal at the Texas Advanced Computing Center.]

Use of Shibboleth in Grids
- Provides attribute-based access control (not just identity). Example: faculty may be assigned higher priority in job queues than students
- For VOs, the most important attribute is "member of VO ABC", and VO memberships typically cross domains
- myVocs offers easy self-management for VOs and expects a web browser as the primary access to resources
- Combined with GridShib, myVocs enables VO membership-based access to grid resources: a Virtual Organization Service Center
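The two authorization decisions described above — queue priority from an affiliation attribute, and access from VO membership — can be sketched as simple lookups over released Shibboleth attributes. Attribute names follow common eduPerson usage, and the priority values are illustrative assumptions:

```python
# Hypothetical sketch of attribute-based access control over Shibboleth
# attributes. eduPersonAffiliation and isMemberOf follow common eduPerson
# usage; the priority numbers are illustrative assumptions.

def queue_priority(attributes: dict) -> int:
    """Map an affiliation attribute to a job-queue priority (higher = sooner)."""
    affiliation = attributes.get("eduPersonAffiliation", "")
    return {"faculty": 10, "staff": 5, "student": 1}.get(affiliation, 0)

def authorized_for_vo(attributes: dict, vo: str) -> bool:
    """Grant resource access based on VO membership rather than identity."""
    return vo in attributes.get("isMemberOf", [])

if __name__ == "__main__":
    faculty = {"eduPersonAffiliation": "faculty", "isMemberOf": ["vo-abc"]}
    student = {"eduPersonAffiliation": "student"}
    print(queue_priority(faculty), queue_priority(student))
    print(authorized_for_vo(faculty, "vo-abc"))
```

The key point the slide makes is that neither decision consults who the user is, only what attributes their home institution (or VO, via myVocs attribute aggregation) asserts about them.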

Inside myVocs: Attribute Aggregation


Q & A
Jill Gemmill
Further Information: