Northwest Indiana Computational Grid
Preston Smith
Rosen Center for Advanced Computing
Purdue University, West Lafayette


2 Northwest Indiana Computational Grid
What Is the NWIC Grid?
Collaboration of multiple academic institutions in NW Indiana:
– Purdue University, West Lafayette
– Purdue University, Calumet
– University of Notre Dame
In cooperation with DOE and Argonne National Laboratory
To enhance our joint research capabilities, driven by high-end cyberinfrastructure:
– Cycles, bandwidth, storage, visualization
– Train users to effectively use middleware that makes the grid transparent
– Applications research that demonstrates the grid's potential
– Cyberinfrastructure education
With significant initial funding through DOE

3 Northwest Indiana Computational Grid
[Map of partner sites: Purdue West Lafayette, Purdue Calumet, Argonne, Notre Dame]

4 Northwest Indiana Computational Grid
Goal of Effort
Enhance our collective impact on science and engineering by:
– Promoting grid computing in our research, teaching, and outreach
This involves:
– Fostering new collaborations and new research projects
– Building shared infrastructure: hardware, middleware, access privileges
– Developing new paradigms for high-end grid computing

5 Northwest Indiana Computational Grid
Some Key Efforts
Parallel 3D Potts model package
– Morphogenesis model for cells by lattice distribution
– Energy model to explain morphogenesis
Star cluster evolution
– How do binary stars impact cluster evolution?
– How is the evolutionary trajectory impacted by the mass of the cluster?
9/11 World Trade Center North Tower impact
– Replicate exterior damage to calibrate model parameters
– Determine plausible scenarios for interior structural damage
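The Potts model mentioned above assigns each lattice site a discrete spin (here, a cell type) and sums an interaction energy over nearest-neighbor pairs. The following is a minimal illustrative sketch of that energy calculation on a 2D lattice; it is not the NWICG 3D parallel package, and the coupling constant `J` and lattice layout are assumptions for illustration only.

```python
def potts_energy(lattice, J=1.0):
    """Total energy of a 2D Potts configuration with periodic boundaries.

    E = -J * (number of nearest-neighbor pairs with equal spins).
    Each site contributes its right and down neighbors, so every
    pair is counted once on lattices larger than 2x2.
    """
    n = len(lattice)
    energy = 0.0
    for i in range(n):
        for j in range(n):
            s = lattice[i][j]
            # right neighbor (wraps around at the edge)
            if s == lattice[i][(j + 1) % n]:
                energy -= J
            # down neighbor (wraps around at the edge)
            if s == lattice[(i + 1) % n][j]:
                energy -= J
    return energy
```

A morphogenesis simulation would then evolve the lattice with Monte Carlo moves that accept spin flips based on the resulting energy change, which is why fast parallel evaluation of this sum matters at 3D scale.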

6 Northwest Indiana Computational Grid
Open Science Grid
NWICG is a partner grid of the OSG consortium
Demonstrated connectivity between partners using OSG:
– Used OSG to run 9/11 LS-DYNA simulations at Notre Dame
– Demonstrated a debugging prototype that diagnosed previously unknown problems in tens of thousands of jobs on TeraGrid
Developing a wide range of software to use OSG:
– Support for HEP communities at Notre Dame and Purdue Calumet
– Global-scale system for securely distributing complex software for high-energy physics codes
– Tactical storage systems to interoperate with the EGEE European grid, for international-scale bioinformatics dataset retrieval
– "Identity boxing" techniques for space allocation facilities to improve the robustness of the grid

7 Northwest Indiana Computational Grid
Application Drivers
– Energy systems modeling
– Nuclear stewardship modeling
– Environmental modeling
– New material properties via molecular modeling
– Modeling of dispersal of radioactive materials
– Computational nanoscale device modeling
– Computational astrophysics
– Design optimization for manufacture
– Computational ecology
– Multiscale modeling in biological systems
– Petascale data set processing from high-energy physics
– Distributed environmental sensor networks and processing
– On-line data repositories: biometrics, genome, multimedia
– 3D stereo visualization of fluid dynamics in a blast furnace
– Remote observatory
– Large-scale biometric databases
– Mosquito genomic databases

8 Northwest Indiana Computational Grid
Infrastructure
Purdue West Lafayette
– SGI Altix supercomputer (128 cores, 0.5 TB RAM, 33 TB disk)
– Campus grid (Condor) available to researchers at all NWICG campuses, and to all communities in the OSG
Purdue Calumet
– Visualization facility
– Condor pool, flocked with pools in West Lafayette
Notre Dame
– Sun Opteron cluster (144 CPUs, 36 TB of NetApp storage)
– Condor pool, flocked with pools in West Lafayette
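From a user's perspective, the flocked Condor pools above mean a job submitted on one campus can overflow transparently to idle machines at the partner campuses. A minimal sketch of a vanilla-universe submit description file follows; the executable name and file names are hypothetical, and actual flocking depends on the pool administrators' `FLOCK_TO`/`FLOCK_FROM` configuration, not on anything in the submit file itself.

```
# Hypothetical Condor submit description file (illustrative only).
universe   = vanilla
executable = analyze.sh
arguments  = input_$(Process).dat
output     = job_$(Process).out
error      = job_$(Process).err
log        = jobs.log

# Transfer files so jobs can run on pools without a shared filesystem.
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT

queue 100
```

Submitting with `condor_submit` queues 100 jobs; when the local pool is busy, the flocking configuration lets them run on the matched partner pools instead.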

9 Northwest Indiana Computational Grid
Key Challenges
Hardware infrastructure is the "easy" part
– But it does need coordination
Build and ensure availability of top-notch human technical support
– Both system administration and application development
– Accessible to all member campuses
– Build knowledge by working with colleagues in OSG
Develop management infrastructure to support multi-institution proposals
Most of all: keep an eye on leveraging resources to enhance real science and engineering

10 Northwest Indiana Computational Grid
Next Steps
Complete the physical infrastructure
– Deploying the Altix at Purdue; finalizing WAN connections to research networks and I-Light at Notre Dame and Calumet
Optimize collaborations at the:
– Administrative level
– Middleware level
– Research level
– Resource level
Support efforts that develop compelling grid paradigms

11 Northwest Indiana Computational Grid
More Information
Information about:
– Mission
– Resources
– Projects
– Workshops
– Latest news