Center for Research Computing at Notre Dame
Jarek Nabrzyski, Director
naber@nd.edu

What is Research Computing at Notre Dame?
HTC and HPC (traditional)
– 25,000 cores
– used by 1,700 users
– supported by 10 HPC engineers and user support staff
Cyberinfrastructure (CI) development
– 10-20 CI projects each year with 30-40 collaborators
– supported by 20 research programmers, 10 computational scientists, a few HPC engineers, and a group of graduate students and undergraduate interns

Who are our collaborators at ND?
Faculty from 87% of ND departments, centers, and institutes
HPC/HTC use is stable; CI development shows dramatic growth (the accompanying figure shows collaborations with faculty)

HPC Group Charter – Core Services
1. High Performance Computing (HPC)
2. High Throughput Computing (HTC)
3. Networking: Individual Clusters, CRC Data Center, CRC to Campus, National/International
4. User Education and Training
5. Storage (High Performance, Persistent User Space, Backup)
6. Software Application Installation/Operation
7. Access/Interface to Multi-Institutional Research Grids: XSEDE and OSG
8. Cloud/Virtualization Services
9. System Design/Specification/Acquisition/Installation
10. Security/Monitoring
* Dedicated system administration is also provided as a recharge service

Policies
Resource Allocation Policy
– Free (fair share; sketched below) and Fee (custom/dedicated)
Faculty Cluster Partnership Program
Storage Policy
– Over 3 petabytes housed at the CRC: AFS, NFS, Panasas, Hadoop, Chirp
– 4 TB of persistent data per group (no cost)
Software Policy
Virtual Machine Services Policy
CRC policies are available at: https://crc.nd.edu/index.php/services/policies

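The slides only name the fair-share allocation; they do not say which scheduler or priority formula the CRC actually uses. As a rough, hypothetical illustration, here is a minimal Python sketch of one common fair-share formulation (similar in spirit to Slurm's classic fair-share factor), in which a group that has recently consumed more than its assigned share gets a lower priority for new jobs:

def fairshare_factor(share: float, usage: float) -> float:
    # share: fraction of the machine allocated to the group (0-1)
    # usage: fraction of the machine the group recently consumed (0-1)
    # Returns a priority factor in (0, 1]; hypothetical formula, not CRC configuration.
    if share <= 0:
        return 0.0
    return 2.0 ** (-usage / share)

# Two hypothetical groups, each entitled to 10% of the cluster:
print(fairshare_factor(0.10, 0.02))  # light user -> ~0.87, scheduled sooner
print(fairshare_factor(0.10, 0.25))  # heavy user -> ~0.18, scheduled later
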
Challenges
HPC is only one of many components
– A majority of workloads lean toward HTC (see the sketch after this list)
– Submissions range from 10,000 single-core jobs to 2,000 HPC jobs
Software Expertise
– Matlab, Mathematica, and R dominate the user survey
– Custom and third-party software development, profiling, and parallelization expertise is required
Heterogeneity
– CPU/GPU, filesystems (data storage), software
Data Storage, Transit, and Archive
– Users expect Dropbox simplicity and Google Drive pricing
Setting User Expectations and Balancing Support Requests
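
To make the HTC pattern above concrete: a high-throughput workload is many independent single-core tasks rather than one tightly coupled parallel job. The following minimal Python sketch uses an invented task function and task count purely for illustration; it is not CRC code.

from concurrent.futures import ProcessPoolExecutor
import random

def one_task(seed: int) -> float:
    # Stand-in for a single independent, single-core task
    # (e.g., one parameter setting in a sweep); hypothetical workload.
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(10_000))

if __name__ == "__main__":
    # The tasks do not communicate, so a scheduler can scatter them
    # across whatever cores are free - the HTC pattern, as opposed to
    # a single tightly coupled multi-core HPC job.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(one_task, range(10_000)))
    print(f"completed {len(results)} independent tasks")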