
BnP on the Grid
Russ Miller 1,2,3, Mark Green 1,2, Charles M. Weeks 3
1 Center for Computational Research, SUNY-Buffalo
2 Computer Science & Engineering, SUNY-Buffalo
3 Hauptman-Woodward Medical Research Institute
University at Buffalo, The State University of New York
NSF, NIH, DOE, NYS

University at Buffalo, The State University of New York
CCR: Center for Computational Research
Grid Computing: DISCOM, SinRG, APGrid, IPG, …

Grid Computing Overview
Coordinate computing resources, people, and instruments in a dynamic, geographically distributed, multi-institutional environment
Treat computing resources like commodities
 Compute cycles, data storage, instruments
 Human communication environments
No central control; no trust
Imaging instruments, computational resources, large-scale databases, data acquisition, analysis, advanced visualization
Thanks to Mark Ellisman

Factors Enabling the Grid
Internet is infrastructure
 Increased network bandwidth and advanced services
Advances in storage capacity
 A terabyte costs less than $5,000
Internet-aware instruments
Increased availability of compute resources
 Clusters, supercomputers, storage, visualization devices
Advances in application concepts
 Computational science: simulation and modeling
 Collaborative environments for large and varied teams
Grids today
 Moving towards production; focus on middleware

SnB on Grids
ACDC-Grid (Western New York)
 CIT (UB), CCR (UB), CSE (UB), Dental (UB), HWI
 Linux, Windows, IRIX, AIX, Solaris
 Pentium, Itanium, Power, MIPS
Grid3+ (International): GriPhyN, PPDG
 29 sites: ANL, SMU, BNL, BU, CalTech-Grid3, CalTech-PG, FIU, HU, IU, JHU, KNU, OU-HEP, OU-OSCER, PDSF, PSU, Rice, UB, UCSD, UCSD-Prod, UIC, UFL-Grid3, UFL-PG, UMICH, UNM, FNAL, UTA, UWMad, UWMil, Vanderbilt
 Projects: GriPhyN, PPDG, iVDGL, LIGO
 VOs: iVDGL, LIGO, SDSS, USATLAS, USCMS, and BTeV
 Linux/Pentium, VDT, Globus, ACDC Monitoring, MonALISA, Ganglia, Condor, PBS, LSF, FBS, PyGlobus, Perl, Pacman
IBM NE BioGrid (Northeast USA)
 MIT, Harvard, MGH
 Regatta, Pentium, Linux

NSF Extensible TeraGrid Facility
[Figure: TeraGrid sites linked by a 40 Gb/s extensible backplane network through Chicago and LA hubs, with 30 Gb/s links to each site. Figure courtesy of Rob Pennington, NCSA.]
 NCSA (compute intensive): 10 TF IA-64 with large-memory nodes; 230 TB disk storage; GPFS and data mining
 SDSC (data intensive): 4 TF IA-64; DB2 and Oracle servers; 500 TB disk storage; 6 PB tape storage; 1.1 TF Power4
 PSC (compute intensive): 6 TF EV68; 71 TB storage; 0.3 TF EV7 shared-memory; 150 TB storage server
 ANL (visualization): 1.25 TF IA viz nodes; 20 TB storage
 Caltech (data collection analysis): 0.4 TF IA-64; IA32 Datawulf; 80 TB storage

Major CCR Resources (12 TF & 290 TB)
Apex Bioinformatics System
 Sun V880 (3), Sun 6800
 Sun 280R (2)
 Intel PIIIs
 Sun 3960: 7 TB disk storage
HP/Compaq SAN
 75 TB disk; 190 TB tape
 64 Alpha processors (400 MHz)
 32 GB RAM; 400 GB disk
IBM RS/6000 SP: 78 processors
Sun Cluster: 80 processors
SGI Intel Linux Cluster
 150 PIII processors (1 GHz)
 Myrinet
Dell Linux Cluster: #22 → #25 → #38 → #95
 600 P4 processors (2.4 GHz)
 600 GB RAM; 40 TB disk; Myrinet
Dell Linux Cluster: #187 → #368 → off
 4036 processors (PIII 1.2 GHz)
 2 TB RAM; 160 TB disk; 16 TB SAN
IBM BladeCenter Cluster: #106
 532 P4 processors (2.8 GHz)
 5 TB SAN
SGI Origin3700 (Altix)
 64 processors (1.3 GHz Itanium2)
 256 GB RAM
 2.5 TB disk
SGI Origin3800
 64 processors (400 MHz)
 32 GB RAM; 400 GB disk

Advanced Computational Data Center (ACDC): Grid Overview
[Diagram: ACDC-Grid resources. Network connections are 100 Mbps unless otherwise noted; external link is a T1 connection.]
 Joplin: compute cluster; 300 dual-processor 2.4 GHz Intel Xeon; RedHat Linux; … TB scratch space
 Nash: compute cluster; 75 dual-processor 1 GHz Pentium III; RedHat Linux; … TB scratch space
 Crosby: compute cluster; SGI Origin, IP35 processors, IRIX; 360 GB scratch space
 Mama: compute cluster; 9 dual-processor 1 GHz Pentium III; RedHat Linux; … GB scratch space
 Young: compute cluster; 16 dual Sun Blades and 47 Sun Ultra5s; Solaris; … GB scratch space
 Fogerty: Condor flock master; 1 dual-processor 250 MHz IP30; IRIX 6.5
 ACDC: Grid Portal; 4-processor Dell; … GHz Intel Xeon; RedHat Linux; … GB scratch space
 School of Dental Medicine: 9 single-processor Dell P4 desktops
 Hauptman-Woodward Institute: 13 various SGI IRIX processors
 Computer Science & Engineering: 25 single-processor Sun Ultra5s
 CCR: 19 IRIX, RedHat, and WINNT processors; RedHat, IRIX, Solaris, WINNT, etc.; expanding

BCOEB Medical/Dental Network Connections

ACDC Data Grid Overview (Grid-Available Data Repositories)
[Diagram: grid-available storage attached to the ACDC-Grid compute resources (Joplin, Nash, Crosby, Mama, Young, and the Grid Portal). Network connections are 100 Mbps unless otherwise noted.]
 Per-resource storage: 182 GB, 136 GB, 100 GB, 100 GB, 70 GB, and 56 GB
 Network Attached Storage: 1.2 TB
 Storage Area Network: 75 TB
 CSE Multi-Store: 40 TB
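The data grid moves files between these repositories automatically, based on profiles of how users access them. A minimal sketch of such a tiering decision; the function name, thresholds, and tier labels are hypothetical illustrations, not the ACDC-Grid implementation:

```python
# Hypothetical sketch of profile-driven file migration in a data grid:
# frequently accessed files stay on fast network-attached storage, while
# long-idle files are migrated to the large SAN/archive tier.
import time

FAST_TIER, ARCHIVE_TIER = "nas", "san"   # illustrative tier labels

def pick_tier(last_access_ts, accesses_per_week, now=None,
              hot_accesses=5, cold_days=30):
    """Return the storage tier a file should live on, from its access profile."""
    now = time.time() if now is None else now
    idle_days = (now - last_access_ts) / 86400
    if accesses_per_week >= hot_accesses:
        return FAST_TIER          # hot: keep on fast storage
    if idle_days > cold_days:
        return ARCHIVE_TIER       # cold: migrate to the archive tier
    return FAST_TIER              # warm: leave in place

now = 1_000_000_000
print(pick_tier(now - 86400, accesses_per_week=10, now=now))
```

The thresholds would in practice be tuned per user or per group from the collected access profiles rather than fixed constants.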

ACDC-Grid Cyberinfrastructure
Predictive Scheduler
 Define quality-of-service estimates of job completion by better estimating job runtimes through user profiling
Data Grid
 Automated data-file migration based on user profiling
High-Performance Grid-Enabled Data Repositories
 Develop automated procedures for dynamic data-repository creation and deletion
Dynamic Resource Allocation
 Develop automated procedures for dynamic computational-resource allocation
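The predictive-scheduler idea above, estimating a job's runtime from the submitting user's history to back a completion-time bound, can be sketched roughly as follows; the data layout and the linear CPU-scaling assumption are illustrative, not the ACDC-Grid code:

```python
# Hypothetical sketch: predict a job's runtime from the submitting user's
# execution history, to support a quality-of-service completion estimate.
from statistics import mean, stdev

def estimate_runtime(history, requested_cpus):
    """history: past (cpus, wall_seconds) records for this user's application.
    Returns (estimate, pad) in seconds; pad widens the completion-time bound."""
    # Scale each past runtime to the requested CPU count (assumes ~linear scaling).
    scaled = [secs * cpus / requested_cpus for cpus, secs in history]
    est = mean(scaled)
    # Pad by one sample standard deviation so the bound is conservative.
    pad = stdev(scaled) if len(scaled) > 1 else 0.5 * est
    return est, pad

history = [(8, 3600), (16, 1900), (8, 3400)]  # (cpus, wall seconds)
est, pad = estimate_runtime(history, requested_cpus=16)
print(round(est), round(pad))
```

A scheduler can then promise completion within `est + pad` of dispatch and refine the profile as each job of that user finishes.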

ACDC-Grid: browser view of “miller” group files published by user “rappleye”

ACDC-Grid Administration

Molecular Structure Determination via Shake-and-Bake
SnB software by UB/HWI
 One of the “Top Algorithms of the Century”
Worldwide utilization
Critical step in
 Rational drug design
 Structural biology
 Systems biology
Vancomycin
 “Antibiotic of last resort”
Current efforts
 Grid
 Collaboratory
 Intelligent learning


Screenshot: General Information screen for TriVanco

Screenshot: Normalize reflections and generate invariants

Screenshot: Define SnB parameters (10,000 trials; 404 cycles; triplets)

Screenshot: Run SnB job on the grid

Screenshot: Histogram of final Rmin values after 200 trials completed

Screenshot: Histogram of final Rmin values after 2471 trials completed

Screenshot: Cycle-by-cycle Rmin trace of the best trial

Screenshot: Histogram of final Rmin values after 9250 trials completed
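The histograms in these screenshots reflect SnB's multi-trial protocol: each trial refines a random starting point for a fixed number of cycles, and the final values of the minimal function Rmin separate into a low (solved) peak and a high (unsolved) peak. A toy sketch of that bookkeeping, with `run_one_trial` merely simulating the outcome distribution rather than performing real SnB refinement:

```python
# Toy sketch of the multi-trial bookkeeping behind the Rmin histograms:
# run many independent trials, bin each trial's final Rmin, and look for
# the low-Rmin peak that marks solved trials. The trial itself is simulated.
import random

def run_one_trial(rng):
    # Stand-in for an SnB trial: ~10% of random starts "solve" (low Rmin),
    # the rest stall at a characteristically higher value.
    return rng.gauss(0.25, 0.02) if rng.random() < 0.1 else rng.gauss(0.55, 0.03)

def rmin_histogram(n_trials, bin_width=0.05, seed=1):
    rng = random.Random(seed)
    bins = {}
    for _ in range(n_trials):
        rmin = run_one_trial(rng)
        edge = round(rmin // bin_width * bin_width, 2)  # lower bin edge
        bins[edge] = bins.get(edge, 0) + 1
    return dict(sorted(bins.items()))

for edge, count in rmin_histogram(200).items():
    print(f"{edge:.2f} {'#' * count}")
```

With real data the gap between the two peaks is what lets a user stop the grid run early once enough solved trials have accumulated.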

Live Demo: Time to Gamble
Demonstration using the ACDC-Grid in Buffalo

User starts up: default image of the structure.

Molecule scaled, rotated, and labeled.

User Uses the Mouse to Select Carbon Atoms

Remove Carbon Atoms (and Links)

User Adds Bond Between Atoms

Scale Radius of Atoms

Continue Scaling Atoms

Middleware
Grid (Computational and Data)
 Globus Toolkit, with a direct upgrade to WSRF
 Condor
 Network Weather Service 2.6
 Apache2 HTTP Server
 PHP
 MySQL 3.23
 phpMyAdmin
Collaboratory
 OpenGL (LibDMS, DevIL, GLUT)
 Windows, IRIX, Mac OS X, Linux
 CAVE, Desktop
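Condor, listed above, manages the batch jobs; a batch such as a set of SnB trial runs is described by a submit file. A minimal hypothetical example, with the executable name and arguments purely illustrative:

```
# Hypothetical Condor submit description for a batch of SnB-style trial jobs.
universe   = vanilla
executable = snb_trial
arguments  = -cycles 404 -seed $(Process)
output     = trial_$(Process).out
error      = trial_$(Process).err
log        = snb.log
queue 25
```

Submitted with `condor_submit`, this queues 25 jobs, each receiving a distinct `$(Process)` index that here seeds an independent trial batch.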

ACDC-Grid Collaborations
High-Performance Networking Infrastructure
WNY Grid Initiative
Grid3+ Collaboration
iVDGL Member
 Only external member
Open Science Grid Member
 Organizational Committee
 Blueprint Committee
 Security Working Group
 Data Working Group
Grid-Based Visualization
 SGI collaboration
Grid-Lite: Campus Grid
 HP Labs collaboration
Innovative Laboratory Prototype
 Dell collaboration

Acknowledgments
Amin Ghadersohi, Naimesh Shah, Stephen Potter, Cathy Ruby, Steve Gallo, Jason Rappleye, Martins Innus, Jon Bednasz, Sam Guercio, Dori Macchioni
