Application of ND CRC to be a member of the OSG Council
Jarek Nabrzyski, CRC Director
Center for Research Computing (CRC), University of Notre Dame, Indiana

About me
– Poznan Supercomputing and Networking Center, Poland
– Co-founder of eGrid (European Grid Forum)
– Co-founder of GGF (Global Grid Forum)
– Executive Director, 2009–today
– First incoming director of the CRC
– Faculty in the CSE department
Research
– Resource management, workflow scheduling, cloud computing
Teaching
– Cloud Computing

Why am I here?
– Notre Dame is becoming increasingly involved in national and international large-scale scientific collaborations:
  – CMS
  – Data preservation
  – Malaria and other infectious diseases
  – Adaptation to climate change, and more
– Need for national collaborations from the application layer to the infrastructure layer
– The CRC has always viewed OSG as one of the most important national production DCEs

Why am I here? (2)
– Strong belief that together we can do much more
– High value of OSG's goals
– Be a good citizen:
  – Contribute spare resources to a national production infrastructure
  – Contribute to the national cyberinfrastructure vision
– This is a long-term commitment!

CRC Mission
The University of Notre Dame's Center for Research Computing (CRC) engages in computational science, fosters multidisciplinary research, and provides advanced computational tools and services. The CRC works to facilitate discoveries across science, engineering, arts, humanities, social sciences, business, and other disciplines.

CRC Vision
To become an internationally recognized multidisciplinary research computing center, based upon our reputation for facilitating and accelerating discovery through effective and novel applications of cyberinfrastructure.

CRC Goals
– Research: To help Notre Dame be among the world's leaders in conducting multidisciplinary research through the application of cyberinfrastructure.
– Infrastructure: To provide reliable advanced computational architectures, software solutions, and multidisciplinary collaborative lab spaces.
– Service and Education: To develop a customer-service strategy, improving support for current CRC customers while attracting new ones.
– Economic Development: To facilitate technology transfer and accelerate innovation.

Organization Chart
(Org chart figure; labels indicate groups of 8 staff members and 25 staff members.)

CRC in Numbers
– ~$1.5M in re-charge projects per year
– PI/co-PIs on grants: $50M total value, $12.9M annual research expenditures
– 65 publications co-authored by CRC computational scientists over the last three years
– Users: 350 faculty, 700 grad students
– 100+ CI projects of various sizes supported over the last two years
– 20,000 computing cores managed by the CRC
– 4x more computational resources (since I joined ND)
– 5x more users (since I joined ND)

User Growth
(Chart: number of active accounts, by request.)
– More computationally based faculty
– Better outreach
– More capable facilities
– Migration of computational faculty

Equipment and Facilities
ND CRC Data Center:
– Located at Union Station
– sqft machine room
– 650 sqft office
– 4 offices
– 1 hotel station
– >1,600 servers
– 20,000 cores

ND CRC Application Groups
– Molecular dynamics groups, chemical engineering and chemistry
– Civil engineering (storm surge, wind and tall buildings, hurricane center)
– AME (Aerospace and Mechanical Engineering: flow problems, gas turbines)
– Biology (genomics, infectious diseases, ecology, climate change)
– Social sciences
Biology and social sciences are growing fast!

ND Collaboration Examples
VecNet and MTC projects:
– Gates Foundation malaria projects
– Malaria transmission and intervention: data, models, and simulations
– International collaboration involving the UK, Greece, Australia, Mexico, and Switzerland
CyberEye: hurricane preparedness center
ND CMS and physics groups:
– Support for the CMS infrastructure
– Data preservation for HEP (DASPOS, NSF grant)
– QuarkNet (NSF) program: research and infrastructure with Stanford
– 200,000 computers around the world

NDCMS / EARTH / /pscratch

NDCMS (backend server)
– Functionality: Condor central manager, Condor submit host; users do not log in
– NFS-served locations: DAS array (/store), Condor software
– Name node for Hadoop
– Software: RHEL Server 5.8, OSG CE/SE, Condor
– Hardware: Dell PowerEdge R815, 4x8-core CPUs, 64GB RAM

EARTH (user interactive host)
– Functionality: Condor submit host, OSG CE/SE, CMS software stack (access to CMSSW/CRAB/gLite/PhEDEx via panFS)
– Software: RHEL Server 5.8, OSG CE/SE, Condor
– Hardware: Dell PowerEdge R815, 4x8-core CPUs, 128GB RAM

/store (direct-attached storage, r22bd8810)
– Capacity: 80TB (RAID)
– Connections: FC direct to NDCMS; EARTH and worker nodes via NFS

Panasas storage
– Capacity: 220TB (RAID)
– CMSSW/CRAB/gLite/PhEDEx resides here; CMS software is accessed from NDCMS/EARTH/worker nodes via the panFS protocol

Networking
– Enterprise-level switch: BlackDiamond 8810 (Panasas – 7x10Gb FC; EARTH – 10Gb FC; NDCMS – 10Gb FC; stack switches – 1Gb TP)
– Stack switches: Extreme Summit X460
– Internet via a Cisco router

Worker nodes
– 72 hosts (Condor work nodes): HP ProLiant DL165 G6, 2x6-core CPUs, 12GB RAM, RHEL Server
– 15x2TB local HDDs
– Servers have 3x2TB disks, organized in the Hadoop cluster with NDCMS as the name node
– Replication factor
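The slide above describes EARTH as a Condor submit host, with NDCMS as the central manager and 72 Condor worker nodes behind it. As a minimal, illustrative sketch (not taken from the slides) of how a user job might be queued from such a submit host, the snippet below uses the current HTCondor Python bindings; the script name, arguments, and resource requests are hypothetical.

    # Illustrative only: queue a small batch of jobs on the local Condor schedd
    # (e.g., from the EARTH submit host). Script name and resources are made up.
    import htcondor

    job = htcondor.Submit({
        "executable": "analyze_dataset.sh",        # hypothetical user script
        "arguments": "$(Process)",
        "output": "job.$(Cluster).$(Process).out",
        "error": "job.$(Cluster).$(Process).err",
        "log": "job.$(Cluster).log",
        "request_cpus": "1",
        "request_memory": "2048",                  # MB
    })

    schedd = htcondor.Schedd()                     # local schedd on the submit host
    result = schedd.submit(job, count=10)          # one cluster of 10 jobs
    print("Submitted cluster", result.cluster())

An equivalent classic submit file passed to condor_submit would describe the same batch; the Python bindings are shown only to keep the example self-contained.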

Summary
– CRC is at the forefront of Notre Dame's expanding research efforts
– Growing demand for CRC infrastructure and services, both CI and HPC
– Great opportunities are still out there:
  – Reach out to the remaining ND departments
  – National cyberinfrastructure: capitalize on existing collaborations and build new ones
  – International collaboration

Questions?
I welcome your questions and engagement as you decide on this application.