An Overview of CSNY, the Cyberinstitute of the State of New York at Buffalo. Russ Miller, CSNY; Computer Science & Engineering, SUNY-Buffalo; Hauptman-Woodward Medical Research Institute. Supported by NSF, NYS, Dell, HP.

NSF, NYS, Dell, HP. An Overview of CSNY, the Cyberinstitute of the State of New York at Buffalo. Russ Miller, CSNY; Computer Sci & Eng, SUNY-Buffalo; Hauptman-Woodward Medical Res Inst

University at Buffalo, The State University of New York
CSNY: Cyberinstitute of the State of New York

Cyberinfrastructure: Digital, Data-Driven Society; Knowledge-Based Economy
CI, HPC, & CSE are Critical to the 21st Century:
 Discovery
 Economic Development
 EOT (Education, Outreach, & Training)
Requires Development of Software, Algorithms, Portals, Interfaces
Seamless, Ubiquitous, Secure, Interwoven, Dynamic:
 Compute Systems, Storage, Instruments, Sensors
 Computational Methodologies (Algorithms)
 Networking
 HCI

Organization of CSNY
 HPC (CCR): Computing, Data, Visualization, Networking
 CSE: MultiScale Sciences, Engineering, Life Sciences, Media
 CI: Scheduling, Monitoring, Virtual Reality
 Enabling: Programmers, GUI Design, Integration
[Network map, courtesy of NYSERNet: NYSERNet DWDM R&E network PoPs at Buffalo, Rochester, Syracuse, and Albany, connected by OC-12, GigE, and 10 GigE links to the Abilene MAN LAN at 32 AoA, with peerings to Abilene, NLR, ESnet, HEAnet, and CA*net.]

HPC: Overview of CCR's Resources
Dell Linux Cluster (10 TF peak)
 1600 Xeon EM64T Processors (3.2 GHz)
 2 TB RAM; 65 TB Disk
 Myrinet / Force10
 30 TB EMC SAN
Dell Linux Cluster (3 TF peak)
 600 P4 Processors (2.4 GHz)
 600 GB RAM; 40 TB Disk; Myrinet
SGI Altix3700 (0.4 TF peak)
 64 Processors (1.3 GHz Itanium2)
 256 GB RAM
 2.5 TB Disk
BioACE: Bioinformatics System
 Sun V880 (3), Sun 6800
 Sun 280R (2)
 Intel PIIIs
 Sun 3960: 7 TB Disk Storage
EMC SAN
 35 TB Disk
 190 TB Tape
Staff
 11 Technical Staff
 3 Administrative Staff
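The quoted "peak" figures are consistent with processors × clock × 2 FLOPs per cycle, the usual convention for these Xeon/P4-era parts. A quick sanity check (the 2 FLOPs/cycle figure is an assumption about the hardware of that generation):

```python
def peak_tflops(processors, ghz, flops_per_cycle=2):
    """Theoretical peak: processors x clock (GHz) x FLOPs/cycle, in TFLOPS."""
    return processors * ghz * flops_per_cycle / 1000.0

print(peak_tflops(1600, 3.2))  # Dell EM64T cluster: ~10.24, quoted as "10 TF peak"
print(peak_tflops(600, 2.4))   # Dell P4 cluster: ~2.88, quoted as ~3 TF peak
```

The same arithmetic reproduces the 2.9 TF figure quoted for the P4 cluster later in the deck.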

Computational Science & SUNY-Buffalo
 Life Sciences
 MultiScale Analysis
 Environmental Modeling
 Multimedia
Grid-Enabling Application Templates:
 Structural Biology (SnB)
 Groundwater Modeling (Ostrich, POMGL, Split)
 Earthquake Engineering (EADR)
 Computational Chemistry (Q-Chem)
 Geographic Information Systems & Biohazards (Titan)

CSNY Cyberinfrastructure
Integrated Data Grid
 Automated data-file migration based on user profiling
Lightweight Grid Monitor (Dashboard)
Predictive Scheduler
 Define quality-of-service estimates of job completion by better estimating job runtimes through user profiling
Dynamic Resource Allocation
 Develop automated procedures for dynamic computational resource allocation
High-Performance Grid-Enabled Data Repositories
 Develop automated procedures for dynamic data-repository creation and deletion
Virtual Reality
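The Predictive Scheduler idea can be sketched in a few lines: keep a per-user history of requested versus actual runtimes, then scale a new request by that user's observed ratio. This is an illustrative sketch with hypothetical names (`PredictiveScheduler`, `estimate`), not the actual ACDC-Grid implementation:

```python
from collections import defaultdict

class PredictiveScheduler:
    """Per-user runtime profiling (hypothetical sketch, not the actual
    ACDC-Grid scheduler): estimate a job's runtime from the user's
    history of requested vs. actual runtimes."""

    def __init__(self):
        self.history = defaultdict(list)  # user -> [(requested, actual), ...]

    def record(self, user, requested, actual):
        self.history[user].append((requested, actual))

    def estimate(self, user, requested):
        runs = self.history[user]
        if not runs:
            return requested  # no profile yet: trust the user's request
        # mean ratio of actual to requested runtime for this user
        ratio = sum(actual / req for req, actual in runs) / len(runs)
        return requested * ratio

sched = PredictiveScheduler()
sched.record("alice", requested=60, actual=30)   # alice over-requests 2x
sched.record("alice", requested=100, actual=50)
print(sched.estimate("alice", 120))  # -> 60.0
```

A scheduler can use such estimates to tighten quality-of-service promises (and backfill more aggressively) without trusting user-supplied walltimes.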

ACDC-Grid Monitoring: The ACDC-Grid Dashboard

Heterogeneous Back-End Interactive Collaboratory: the user starts up with a default image of the structure.

ACDC-Grid Administration

CSNY Enabling Staff
Director
Programmers
 Interface with Computational Scientists and Disciplinary End Users
 Grid Integration
 GUI Development
 Implement CI Advances
Students
 Undergraduates, Graduates, Post-Docs

CSNY Projects
Western New York Grid (*): Grass-Roots NYS Grid
 SUNY-Buffalo *
 Niagara University *
 Canisius College
 SUNY-Geneseo *
 SUNY-Binghamton
 Columbia
 Hauptman-Woodward Inst. *
 Roswell Park Cancer Institute
Dashboard
Predictive Scheduler
Participation
 OSG
 OSG-ITB
 TeraGrid
 CaBIG
GRASE VO: Grid Resources for Advanced Science and Engineering Virtual Organization (non-physics research)
 Structural Biology
 Groundwater Modeling
 Earthquake Engineering
 Computational Chemistry
 GIS/BioHazards


Grid Computing
[Figure: logos of grid projects, including DISCOM, SinRG, APGrid, IPG, …]

Grid Computing Overview
Coordinate computing resources, people, and instruments in a dynamic, geographically distributed, multi-institutional environment
Treat computing resources like commodities
 Compute cycles, data storage, instruments
 Human communication environments
No central control; no a priori trust
[Figure: imaging instruments, large-scale databases, data acquisition, analysis, advanced visualization, and computational resources, with the LHC as an example.]

ACDC Data Grid Overview (Grid-Available Data Repositories)
 Joplin: Compute Cluster; 300 dual-processor 2.4 GHz Intel Xeon, RedHat Linux, TB-scale scratch space
 Nash: Compute Cluster; 75 dual-processor 1 GHz Pentium III, RedHat Linux, TB-scale scratch space
 Crosby: Compute Cluster; SGI Origin, IP35 processors, IRIX, 360 GB scratch space
 Mama: Compute Cluster; 9 dual-processor 1 GHz Pentium III, RedHat Linux, GB-scale scratch space
 Young: Compute Cluster; 16 dual Sun Blades, 47 Sun Ultra5, Solaris, GB-scale scratch space
 ACDC: Grid Portal; 4-processor Dell, Intel Xeon, RedHat Linux, GB-scale scratch space
 Network Attached Storage: 1.2 TB
 Storage Area Network: 75 TB
 CSE Multi-Store: 40 TB
 Per-resource storage annotations: 182 GB, 100 GB, 56 GB, 100 GB, 70 GB, 136 GB
Note: Network connections are 100 Mbps unless otherwise noted.

ACDC-Grid Cyberinfrastructure
Integrated Data Grid
 Automated data-file migration based on user profiling
Lightweight Grid Monitor (Dashboard)
Predictive Scheduler
 Define quality-of-service estimates of job completion by better estimating job runtimes through user profiling
Dynamic Resource Allocation
 Develop automated procedures for dynamic computational resource allocation
High-Performance Grid-Enabled Data Repositories
 Develop automated procedures for dynamic data-repository creation and deletion
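The "automated data-file migration" bullet can be sketched as a recency policy: files a user has not touched recently migrate from fast compute-side disk toward archival storage. The policy, threshold, and function name `select_for_migration` are illustrative assumptions, not the actual ACDC Data Grid code:

```python
import time

def select_for_migration(files, now=None, cold_after_days=30):
    """Pick files whose last access is older than the threshold; these
    would migrate from compute-side disk toward tape/archive storage.
    (Hypothetical policy sketch, not the actual ACDC Data Grid code.)"""
    now = time.time() if now is None else now
    cutoff = now - cold_after_days * 86400
    return [name for name, last_access in files.items() if last_access < cutoff]

now = 1_000_000_000
usage = {"snb_trials.dat": now - 40 * 86400,   # cold: migrate to archive
         "current_run.log": now - 2 * 86400}   # hot: keep on fast disk
print(select_for_migration(usage, now=now))    # -> ['snb_trials.dat']
```

A real policy would also weigh file size, replica count, and per-user quotas, but the profiling-driven shape is the same.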

ACDC-Grid Collaborations
 High-Performance Networking Infrastructure
 WNY Grid Initiative
 Grid3+ Collaboration
 iVDGL Member (only external member)
 Open Science Grid Member
 Organizational Committee
 Blueprint Committee
 Security Working Group
 Data Working Group
 GRASE VO
 Grid-Lite: Campus Grid (HP Labs Collaboration)
 Innovative Laboratory Prototype (Dell Collaboration)

Grid-Enabling Application Templates (GATs)
Structural Biology
 SnB and BnP for Molecular Structure Determination/Phasing
Groundwater Modeling
 Ostrich: Optimization and Parameter Estimation Tool
 POMGL: Princeton Ocean Model Great Lakes for Hydrodynamic Circulation
 Split: Modeling Groundwater Flow with the Analytic Element Method
Earthquake Engineering
 EADR: Evolutionary Aseismic Design and Retrofit; Passive Energy-Dissipation Systems for Designing Earthquake-Resilient Structures
Computational Chemistry
 Q-Chem: Quantum Chemistry Package
Geographic Information Systems & BioHazards
 Titan: Computational Modeling of Hazardous Geophysical Mass Flows
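A GAT essentially wraps a domain code in a grid-wide parameter sweep; for SnB that means farming out many independent trial structures. A minimal sketch of the splitting step, with hypothetical resource names and a proportional-to-processors policy (an assumption for illustration, not the documented GAT logic):

```python
def partition_trials(n_trials, resources):
    """Divide n_trials independent SnB trials across grid resources in
    proportion to their processor counts (hypothetical GAT-style split)."""
    names = sorted(resources)
    total = sum(resources.values())
    plan, assigned = {}, 0
    for name in names[:-1]:
        share = n_trials * resources[name] // total
        plan[name] = share
        assigned += share
    plan[names[-1]] = n_trials - assigned  # remainder goes to the last resource
    return plan

plan = partition_trials(1000, {"joplin": 600, "nash": 150, "mama": 18})
print(plan)  # every trial assigned exactly once: counts sum to 1000
```

Because the trials are embarrassingly parallel, each slice can be submitted as an independent grid job and the results merged afterward.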

Grid Services and Applications (adapted from Ian Foster and Carl Kesselman)
 Applications: Shake-and-Bake, Oracle, MySQL, Apache; ACDC-Grid computational and data resources
 High-Level Services and Tools: globusrun, MPI, MPI-IO, NWS; C, C++, Fortran, PHP
 Core Services (Globus Toolkit): Metacomputing Directory Service, GRAM, Globus Security Interface, GASS
 Local Services: LSF, Condor, PBS, Maui Scheduler, Stork; MPI, TCP, UDP; Solaris, Irix, WinNT, RedHat Linux
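The layering above is the classic Globus Toolkit 2 picture: a portal builds an RSL job description, hands it to globusrun, and GRAM maps it onto the local scheduler (PBS, LSF, Condor). A sketch of that hand-off, assuming the GT2-era `globusrun -o -r <contact> <rsl>` invocation and minimal RSL syntax (treat the exact flags and contact string as illustrative):

```python
import subprocess

def build_rsl(executable, args, count=1):
    """Build a minimal RSL job description (GT2-style syntax)."""
    return "&(executable={})(arguments={})(count={})".format(
        executable, " ".join(args), count)

def submit_grid_job(contact, executable, args, count=1):
    """Hand the RSL to globusrun: -r names the GRAM gatekeeper contact,
    -o streams the remote job's stdout back (GT2-era CLI; illustrative)."""
    cmd = ["globusrun", "-o", "-r", contact, build_rsl(executable, args, count)]
    return subprocess.run(cmd, capture_output=True, text=True)

print(build_rsl("/bin/date", [], count=4))
# -> &(executable=/bin/date)(arguments=)(count=4)
# e.g. submit_grid_job("host.ccr.buffalo.edu/jobmanager-pbs", "/bin/date", [])
```

The point of the hourglass design is that the portal never speaks PBS, LSF, or Condor directly; GRAM performs that translation on the resource side.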

Startup Screen for ACDC-Grid Job Submission

Instructions and Description for Running a Job on ACDC-Grid

Software Package Selection

Full Structure / Substructure Template Selection

Default Parameters Based on Template

Default Parameters (cont’d)

Generating Reflections (Drear)

Invariant Generation

SnB Setup

SnB Setup (cont’d)

SnB Review (Grid job ID: 447)

Graphical Representation of Intermediate Job Status

Histogram of Completed Trial Structures
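SnB ranks each completed trial by a figure of merit (the minimal function, Rmin); in the histogram of completed trials, a well-separated low-value cluster marks probable solutions. A sketch of the binning, with hypothetical Rmin values:

```python
def histogram(values, n_bins=10):
    """Equal-width histogram of trial figures of merit; in SnB a
    separated low-Rmin cluster indicates likely solutions (sketch)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0   # guard against identical values
    counts = [0] * n_bins
    for v in values:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    return counts

rmins = [0.30, 0.31, 0.29, 0.55, 0.56, 0.54, 0.57, 0.55]  # hypothetical Rmin values
print(histogram(rmins, n_bins=5))  # -> [3, 0, 0, 0, 5]: three likely solutions
```

A bimodal shape like this is the visual cue the portal's histogram view gives the user that some trials have phased successfully.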

Status of Jobs

Molecule scaled, rotated, and labeled.

Acknowledgments
Mark Green, Cathy Ruby, Amin Ghadersohi, Naimesh Shah, Steve Gallo, Jason Rappleye, Jon Bednasz, Sam Guercio, Martins Innus, Cynthia Cornelius
Funding: NSF, NIH, NYS, NIMA, NTA, Oishei, Wendt, DOE

Overview of Computational Science at SUNY-Buffalo
Dell Linux Cluster (10 TF peak)
 1600 Xeon EM64T Processors (3.2 GHz)
 2 TB RAM; 65 TB Disk
 Myrinet / Force10
 30 TB EMC SAN
Dell Linux Cluster (2.9 TF peak)
 600 P4 Processors (2.4 GHz)
 600 GB RAM; 40 TB Disk; Myrinet
Dell Linux Cluster (6 TF peak)
 4036 Processors (PIII 1.2 GHz)
 2 TB RAM; 160 TB Disk; 16 TB SAN
IBM BladeCenter Cluster (3 TF peak)
 532 P4 Processors (2.8 GHz)
 5 TB SAN
SGI Intel Linux Cluster (0.1 TF peak)
 150 PIII Processors (1 GHz)
 Myrinet
SGI Altix3700 (0.4 TF peak)
 64 Processors (1.3 GHz Itanium2)
 256 GB RAM
 2.5 TB Disk
Apex Bioinformatics System
 Sun V880 (3), Sun 6800
 Sun 280R (2)
 Intel PIIIs
 Sun 3960: 7 TB Disk Storage
HP/Compaq SAN
 75 TB Disk; 190 TB Tape
 64 Alpha Processors (400 MHz)
 32 GB RAM; 400 GB Disk