Metrics for HPC
Tom Furlani, Director, Center for Computational Research, SUNY Buffalo
September 30, 2010

Center for Computational Research
- More than 10 years of experience delivering HPC in an academic setting
- Mission: Enabling and facilitating research within the university
- Staff: 15 FTE
- Provide: cycles, software engineering, scientific computing/modeling, medical informatics, visualization
- Computational cycles delivered in 2009: 360,000 jobs run (1,000 per day); 720,000 CPU days delivered
- $9M infrastructure upgrades in 2010 (6,000 cores, 800 TB storage):
  - NSF MRI - Netezza data-intensive HPC cluster
  - NSF CDI - GPU cluster
  - NYSERDA - Green IT cluster
  - NIH S10 - HPC cluster
- Portal/tool development:
  - WebMO (Chemistry)
  - iNquiry (Bioinformatics)
  - UBMoD (Metrics on Demand)
- NYSTAR HPC 2 Consortium:
  - UB, RPI, Stony Brook, Brookhaven, NYSERNet
  - Bringing HPC to NYS industry

Outline
- Center for Computational Research
- Technology Audit Service
- Vision
- The Team
- Progress to Date
  - Application Kernels
  - XDMoD Portal

Why Bother?
- If you don’t measure it, you can’t improve it
- Constant budgetary pressure
- HPC is expensive
- Competing for limited resources with many other worthy programs
- Can’t simply state the obvious: that HPC capability is crucial for a research university

The Good News
- Advanced computing is critical to research
  - Simulation-based engineering and science
  - Data-driven science
- HPC centers have a good story to tell
- Transparency is good
- Much of the info is collected already
  - Log files from queuing systems (see the parsing sketch below)
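Since queue-system accounting logs already record who ran which jobs and for how long, turning them into delivered-hours metrics is mostly a parsing exercise. The script below is a minimal sketch that assumes a Torque/PBS-style accounting file in which end-of-job ("E") records carry space-separated key=value attributes such as user and resources_used.walltime; field names and layouts differ across schedulers and versions, so treat it as illustrative rather than a drop-in tool.

```python
#!/usr/bin/env python
# Minimal sketch: total wall-clock hours per user from a Torque/PBS-style
# accounting log. Assumes semicolon-delimited records where field 2 is the
# record type ("E" = job end) and field 4 holds space-separated key=value
# attributes such as user=... and resources_used.walltime=HH:MM:SS.
# Field names vary by scheduler, so adapt before trusting the numbers.

import sys
from collections import defaultdict

def hms_to_hours(hms):
    """Convert an HH:MM:SS string to hours as a float."""
    h, m, s = (int(x) for x in hms.split(":"))
    return h + m / 60.0 + s / 3600.0

def completed_jobs(path):
    """Yield (user, wall_hours) for each end-of-job record in the log."""
    with open(path) as log:
        for line in log:
            parts = line.strip().split(";")
            if len(parts) < 4 or parts[1] != "E":
                continue  # skip queue/start/delete records
            attrs = dict(kv.split("=", 1) for kv in parts[3].split() if "=" in kv)
            walltime = attrs.get("resources_used.walltime")
            if walltime:
                yield attrs.get("user", "unknown"), hms_to_hours(walltime)

if __name__ == "__main__":
    totals = defaultdict(float)
    for user, hours in completed_jobs(sys.argv[1]):
        totals[user] += hours
    for user, hours in sorted(totals.items(), key=lambda item: -item[1]):
        print("%-12s %10.1f wall-clock hours" % (user, hours))
```

Multiplying each job's wall time by the cores it was allocated (from the corresponding Resource_List attributes) would give core hours, which is usually the figure reported upward.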

Which Metrics (ROI)
- Resource Utilization
  - Provide instantaneous and historical information on resource utilization
  - Improve system performance for users
  - UB Metrics on Demand (UBMoD)
- Grant Database
  - Track grant proposals submitted and funded by researchers utilizing HPC resources
  - Track funds budgeted to the center from grants
(A back-of-the-envelope cost-per-CPU-hour calculation is sketched below.)
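As a concrete ROI-style metric, the utilization numbers can be folded into a cost per CPU hour. The snippet below reuses the 720,000 CPU days delivered in 2009 from the earlier slide; the annual operating cost is a purely hypothetical placeholder, not a CCR figure.

```python
# Back-of-the-envelope ROI metric: cost per CPU hour delivered.
# The 720,000 CPU days come from the 2009 figures earlier in this deck;
# the $2.0M annual operating cost is a made-up placeholder, not a CCR number.

CPU_DAYS_DELIVERED = 720000          # 2009 utilization, from the slide above
ANNUAL_OPERATING_COST = 2.0e6        # hypothetical: staff + power + amortized hardware

cpu_hours = CPU_DAYS_DELIVERED * 24
cost_per_cpu_hour = ANNUAL_OPERATING_COST / cpu_hours

print("CPU hours delivered: {:,}".format(cpu_hours))
print("Cost per CPU hour:   ${:.3f}".format(cost_per_cpu_hour))
```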

UB Metrics on Demand Portal
- UBMoD: Web-based interface for on-demand metrics
  - CPU cycles delivered, storage, queue statistics, etc.
- Customized interface by role (Provost, Dean, Chairs, Faculty); a minimal roll-up of this kind is sketched below
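A metrics portal like UBMoD essentially serves the same job records rolled up at different levels for different audiences. The sketch below shows one way such a roll-up might look; the record fields, department names, and role mapping are illustrative assumptions, not UBMoD's actual data model.

```python
# Sketch of role-based roll-ups over job records, in the spirit of a
# metrics-on-demand portal. All records and field choices are hypothetical.

from collections import defaultdict

# Hypothetical job records: (username, department, decanal_unit, cpu_hours)
jobs = [
    ("alice", "Chemistry", "Arts & Sciences", 1200.0),
    ("bob",   "Chemistry", "Arts & Sciences",  300.0),
    ("carol", "Physics",   "Arts & Sciences",  950.0),
    ("dave",  "Mech Eng",  "Engineering",     2100.0),
]

def rollup(records, key_index):
    """Sum CPU hours grouped by the chosen field."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key_index]] += rec[3]
    return dict(totals)

# Provost/Dean view: totals by decanal unit; Chair view: totals by department;
# Faculty view: totals by individual user.
print("By decanal unit:", rollup(jobs, 2))
print("By department:  ", rollup(jobs, 1))
print("By user:        ", rollup(jobs, 0))
```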

CCR Grant Database
- Input data on grant proposals
- Search the data to generate reports (a hypothetical schema and report sketch follows below)
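One lightweight way to prototype such a grant database is a small relational store plus a couple of summary queries. The sketch below uses Python's standard sqlite3 module with an invented schema; the table layout, column names, and sample rows are hypothetical and not CCR's actual design.

```python
# Hypothetical sketch of a grant-tracking database and a simple report.
# Schema, column names, and rows are invented for illustration only.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE grants (
        pi           TEXT,   -- principal investigator (matches an HPC username)
        agency       TEXT,   -- e.g. NSF, NIH, NYSERDA
        title        TEXT,
        requested    REAL,   -- dollars requested
        awarded      REAL,   -- dollars awarded (NULL if pending/declined)
        center_share REAL    -- dollars budgeted to the center
    );
""")

# Illustrative entries only.
conn.executemany(
    "INSERT INTO grants VALUES (?, ?, ?, ?, ?, ?)",
    [
        ("alice", "NSF", "Reactive MD simulations",  450000, 420000, 30000),
        ("bob",   "NIH", "Genome assembly pipeline", 900000, None,       0),
        ("carol", "NSF", "GPU-accelerated CFD",      300000, 300000, 25000),
    ],
)

# Report: proposals submitted vs. funded, and funds budgeted to the center.
for row in conn.execute("""
        SELECT agency,
               COUNT(*) AS submitted,
               SUM(CASE WHEN awarded IS NOT NULL THEN 1 ELSE 0 END) AS funded,
               SUM(center_share) AS center_funds
        FROM grants GROUP BY agency"""):
    print(row)
```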

Perspective