PolarGrid
Geoffrey Fox (PI), Indiana University
– Associate Dean for Graduate Studies and Research, School of Informatics and Computing, Indiana University – Bloomington
– Director, Digital Science Center, Pervasive Technology Institute
Linda Hayden (co-PI), Elizabeth City State University (ECSU)

What is Cyberinfrastructure?
– Cyberinfrastructure is infrastructure that supports distributed research and learning (e-Science, e-Research, e-Education, or "digital science" at Indiana University)
– Links data, people, and computers
– Exploits Internet technology (Web 2.0 and clouds), adding management, security, supercomputers, etc. via Grid technology
– It has two aspects: parallel, with low latency (microseconds) between nodes, and distributed, with higher latency (milliseconds) between nodes
– The parallel aspect is needed for high performance on individual large simulations and data analyses; the problem must be decomposed
– The distributed aspect integrates already distinct components (e.g., data sources)
– Integrates with TeraGrid (and Open Science Grid)
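
A rough latency model makes the parallel/distributed split above concrete. The sketch below is a back-of-the-envelope estimate with illustrative numbers (not PolarGrid benchmarks): the same decomposed workload scales well when nodes exchange messages at microsecond latency, but loses most of its speedup at millisecond (wide-area) latency.

```python
# Back-of-the-envelope model of why inter-node latency decides whether a
# problem can be decomposed in parallel or only loosely distributed.
# All numbers are illustrative assumptions.

def parallel_time(work_s, nodes, steps, latency_s):
    """Ideal compute time plus one message exchange per step."""
    return work_s / nodes + steps * latency_s

WORK = 1000.0    # seconds of total computation
NODES = 64       # cluster size
STEPS = 10_000   # communication rounds (e.g., solver iterations)

for label, latency in [("parallel (1 us)", 1e-6), ("distributed (10 ms)", 1e-2)]:
    t = parallel_time(WORK, NODES, STEPS, latency)
    print(f"{label:20s}: {t:8.1f} s  (speedup {WORK / t:5.1f}x)")

# Expected output:
# parallel (1 us)     :     15.6 s  (speedup  64.0x)
# distributed (10 ms) :    115.6 s  (speedup   8.6x)
```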

Support CReSIS with Cyberinfrastructure
– Cyberinfrastructure is distributed hardware and software supporting collaborative science
– Base and field camps for Arctic and Antarctic expeditions
– Training and education resources at ECSU
– Collaboration technology at ECSU
– "Lower 48" system at Indiana University and ECSU supporting offline data analysis and large-scale simulations; installed and currently being tested (total ~20 TF)
– CReSIS analysis is suitable for clouds

Indiana University Cyberinfrastructure Experience
– The Indiana University PTI team is a partnership between a research group (Community Grids Laboratory, led by Fox) and University IT Research Technologies (UITS-RT, led by Stewart)
– This partnership gives us robust systems support, from expeditions to lower-48 systems, using leading-edge technologies; PolarGrid would not have succeeded without it
– IU runs the Internet2/NLR Network Operations Center
– IU is a member of TeraGrid and Open Science Grid
– IU leads FutureGrid (Fox, PI), an NSF facility supporting tests of new systems and application software
– IU has provided cyberinfrastructure for LEAD (tornado forecasting), QuakeSim (earthquakes), and Air Force sensor grids, areas that partly overlap with CReSIS requirements

PolarGrid goes to Greenland (photo slide)

NEEM 2008 Base Station (photo slide)

PolarGrid Greenland 2008
Base System (Ilulissat, airborne radar):
– 8U, 64-core cluster with a 48 TB external fibre-channel array
– Laptops (one-off processing and image manipulation)
– 2 TB MyBook tertiary storage
– Total data acquisition 12 TB (plus 2 backup copies)
– Satellite transceiver available if needed, but the wired network at the airport was used to send data back to IU
Base System (NEEM, surface radar, remote deployment):
– 2U, 8-core system using hot-swappable internal hard drives for data backup
– 4.5 TB total data acquisition (plus 2 backup copies)
– Satellite transceiver used to send data back to IU
– Laptops (one-off processing and image manipulation)
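
Keeping a 12 TB acquisition consistent with its two backup copies is essentially a checksum audit. The sketch below is a minimal, hypothetical version of such an audit; the paths, directory layout, and tooling are illustrative assumptions, not the actual PolarGrid field procedure.

```python
# Hypothetical sketch: verify field data against its backup copies by
# comparing SHA-256 checksums. Paths are illustrative only.
import hashlib
from pathlib import Path

def sha256(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 without loading it into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def audit(primary: Path, backups: list[Path]) -> list[str]:
    """Return relative paths that are missing or corrupt in any backup."""
    bad = []
    for f in primary.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(primary)
        want = sha256(f)
        for b in backups:
            copy = b / rel
            if not copy.exists() or sha256(copy) != want:
                bad.append(f"{b.name}/{rel}")
    return bad

if __name__ == "__main__":
    problems = audit(Path("/data/acquisition"),
                     [Path("/backup1"), Path("/backup2")])
    print("\n".join(problems) or "all copies verified")
```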

PolarGrid Summary
Supported several expeditions starting July 2008:
– Ilulissat: airborne radar
– NEEM: ground-based radar, remote deployment
– Thwaites: ground-based radar
– Punta Arenas/Byrd Camp: airborne radar
– Thule/Kangerlussuaq: airborne radar
IU-funded system administrator support in the field:
– 1 admin: Greenland NEEM, 2008
– 1 admin: Greenland, March 2009
– 1 admin: Antarctica 2009/2010 (Nov 2009 – Feb 2010)
– 1 admin: Greenland Thule, March 2010
– 1 admin: Greenland Kangerlussuaq-Thule, April 2010

PolarGrid Summary (continued)
– Expedition cyberinfrastructure was simplified after initial experience, since power and mobility matter more in the field than the ability to do sophisticated analysis
– Smaller system footprints and simpler data management have driven the cost per system down
– Complex storage environments are not practical in a mobile data-processing environment
– Pre-processing data in the field allowed validation of data acquisition during the collection phases
– Offline analysis is partially done on the PolarGrid system at Indiana University
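
A field "quick look" of the kind described above can be as simple as flagging files that are missing, empty, or far outside the expected size before a collection day ends. The sketch below is hypothetical: the file extension, directory layout, and size thresholds are assumptions for illustration, not the PolarGrid validation pipeline.

```python
# Hypothetical quick-look validation of a day's radar acquisition:
# flag files that are missing, empty, or far outside the expected size.
from pathlib import Path

EXPECTED_MIN, EXPECTED_MAX = 50e6, 2e9   # assumed bytes per record file

def quick_look(day_dir: Path) -> None:
    files = sorted(day_dir.glob("*.bin"))  # assumed naming convention
    if not files:
        print(f"WARNING: no data files in {day_dir}")
        return
    for f in files:
        size = f.stat().st_size
        if size == 0:
            print(f"EMPTY   {f.name}")
        elif not (EXPECTED_MIN <= size <= EXPECTED_MAX):
            print(f"SUSPECT {f.name} ({size / 1e6:.1f} MB)")
    print(f"{len(files)} files checked in {day_dir}")

quick_look(Path("/data/acquisition/2008-07-15"))  # illustrative path
```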

ECSU and PolarGrid
(Photo: Assistant Professor Eric Akers and graduate student Je'aime Powell of ECSU travel to Greenland, 2008)
– Initially a 64-core cluster, allowing near-real-time analysis of radar data by the polar field teams
– A system a factor of 10 larger is being tested at IU and will be installed at ECSU
– An educational videoconferencing Grid to support educational activities
– A PolarGrid laboratory for students
– ECSU supports PolarGrid cyberinfrastructure in the field

Possible Future CReSIS Contributions
– Base and field camps for Arctic and Antarctic expeditions
– Initial data analysis to monitor experimental equipment
– Training and education resources
– Computer labs; cyberlearning/collaboration
– Full offline analysis of data on "lower 48" systems exploiting PolarGrid, Indiana University (archival and dynamic storage), and TeraGrid
– Data management, metadata support, and long-term data repositories
– Parallel (multicore/cluster) versions of simulation and data-analysis codes
– Portals for ease of use