Life and Health Sciences Summary Report

“Bench to Bedside” coverage

Participants with a very broad spectrum of expertise bridging all scales, from molecule to population.

Life Sciences:
– computational chemistry, computational biology
– molecular, cellular, and systems biology applications

Health Sciences:
– biomedical engineering, clinical sciences
– multi-modality, multi-source data integration and analytics for personalized medicine and population health applications

Charge Question 1

What technical breakthroughs in science and engineering research can be enabled by exascale platforms and are attractive targets for Japan–US collaboration over the next 10 years?

Synergies in Life Sciences:
– strong scaling for MD simulations
– biological self-assembly
– bridging length and time scales for cellular simulations (cell community)
– whole primate brain simulation
– data analytics for event detection, feature selection, and sub-state discovery

Charge Question 2

What is the representative suite of applications in your research area, available today, which should form the basis of your co-design communication with computer architects?

Representative codes across length and time scales:
– Atomistic: AMBER, GROMACS, CHARMM
– Coarse-grained: Martini, Brownian dynamics
– Cellular: PySB, BioNetGen, …
– Tissue/Organ: NEURON, NEST, GENESIS (neuroscience simulation)

– Identify core data analytic routines that can exploit parallel architectures for execution, e.g. wrappers that can automatically parallelize code (a minimal sketch follows below).
– Data generation and analysis must happen in tandem, within one workflow, and involve interactive visualization and feedback; the latter is critical to the clinical sciences as well.
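As an illustration of such a wrapper, here is a minimal sketch that runs an existing per-item analysis routine over many inputs in parallel. It assumes Python's standard multiprocessing module; the routine name and the data are hypothetical placeholders, not code from any of the applications above.

# Minimal sketch (illustrative, not from the slides): parallelize an existing
# serial analysis routine across processes with a generic wrapper.
from multiprocessing import Pool

def analyze_frame(frame):
    # Hypothetical per-frame analytic kernel, e.g. a simple feature extraction.
    return sum(frame) / len(frame)

def parallel_map(routine, items, workers=4):
    # Wrapper that fans a serial routine out over a pool of worker processes.
    with Pool(processes=workers) as pool:
        return pool.map(routine, items)

if __name__ == "__main__":
    frames = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]
    print(parallel_map(analyze_frame, frames))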

Charge Question 3

How can the application research community, represented by a topical breakout at this workshop, constructively engage the vendor community in co-design?
a) How should the various aspects of the application and architecture be optimized for effective utilization of exascale compute and data resources?
b) Consider all aspects of an exascale application: formulation and basic algorithms, programming models and environments, data analysis and management, hardware characteristics.

– Biological simulations are a small proportion of the market.
– Can the existing architectures support the “Brain” initiatives?
– Vendors should be involved in discussions of these designs.

Charge Question 4

How can you best manage the “conversations” with computer designers/architects around co-design such that (1) they are practical for computer design, and (2) the results are correctly interpreted within both communities?
a) What are the useful performance benchmarks from the perspective of your domain?
Benchmarks are already available within the molecular simulation community, much less so for the clinical and healthcare domains, and there are no standard benchmarks for neuroscience-related applications.
b) Are mini-apps an appropriate and/or feasible approach to capture your needs for communication to the computer designers?
Examples include FFTW, fast-multipole methods, and grid computations (a toy kernel in this spirit is sketched below).
c) Are there examples of important full applications that are an essential basis for communication with computer designers?
d) Can these be simplified into skeleton apps or mini-apps to simplify and streamline the co-design conversation?
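To make the mini-app idea concrete, here is a toy timing kernel in the spirit of the FFT example named above. It is an assumption for illustration only: numpy stands in for a production FFT library such as FFTW, and the problem size is arbitrary.

# Illustrative mini-app sketch: time repeated forward/inverse FFTs on a
# synthetic complex signal and report the mean round-trip cost.
import time
import numpy as np

def fft_kernel(n=2**20, repeats=10):
    data = np.random.rand(n) + 1j * np.random.rand(n)
    start = time.perf_counter()
    for _ in range(repeats):
        data = np.fft.ifft(np.fft.fft(data))
    return (time.perf_counter() - start) / repeats

if __name__ == "__main__":
    print(f"mean round-trip FFT time: {fft_kernel():.4f} s")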

Charge Question 5

Describe the most important programming models and environments in use today within your community and characterize these as sustainable or unsustainable.
a) Do you have appropriate methods and models to expose application parallelism in a high-performance, portable manner?
MPI and OpenMP for simulations; in neuroscience, a data-parallel model of computation (see the sketch below).
b) Are best practices in software engineering often or seldom applied?
c) Going forward, what are the critically important programming languages?
Programming languages: C/C++, Python, Java.
d) On which libraries and/or domain-specific languages (DSLs) is your research community dependent?
e) Are new libraries or DSLs needed in your research domain?
Support for natural language processing is required for the healthcare/clinical sciences.
f) Are these aspects of your programming environment sustainable, or are new models needed to ensure their availability into the future?
Problems include persistence of data and load balancing with heterogeneous datasets. Data transfer costs are modest in neuroscience applications but higher for life sciences data.
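As an illustration of the MPI-style data parallelism mentioned in (a), here is a minimal sketch assuming the mpi4py Python binding; the global problem size and the random data are hypothetical placeholders.

# Minimal data-parallel sketch with mpi4py: each rank processes its slice of
# the data, and the partial sums are combined with an allreduce.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_total = 1_000_000                      # hypothetical global problem size
local = np.random.rand(n_total // size)  # this rank's share of the data
local_sum = local.sum()

global_sum = comm.allreduce(local_sum, op=MPI.SUM)
if rank == 0:
    print("global mean:", global_sum / (size * (n_total // size)))

Launched, for example, as "mpirun -n 4 python mean_reduce.py"; the script name is arbitrary.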

Charge Question 6

Does your community have mature workflow tools that are implemented within leadership computing environments to assist with program composition, execution, analysis, and archival of results? If not, what are your needs and is there opportunity for value added?
NO.
a) For example, do you need support for real-time, interactive workflows to enable integration with real-time data flows?
YES, we need such tools.

Breakout Charge Questions, continued

What are the new programming models, environments, and tools that need to be developed to achieve our science goals with sustainable application software?
WE ARE NOT THERE YET.

Charge Question 8

Is there a history, a track record, in your research community of co-design for HPC systems installed in the past, and has any co-design study been done for these systems to document the effectiveness of co-design?

Life Sciences:
– ANTON, a supercomputer for molecular dynamics simulations, installed at the Pittsburgh Supercomputing Center
– AMBER simulation implementation on GPUs
– neuromorphic chips in computational neuroscience

Healthcare/Clinical Sciences:
– ongoing debate over whether HPC is needed for applications in this domain

Actionable Items

1. Benchmark the different tools against different applications in terms of scalability, efficiency, and performance (a small sketch of the bookkeeping involved follows below).
2. Develop in situ analysis and visualization of simulation output to guide running simulations.
3. Continue the discussion on systems neuroscience.
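For item 1, the following sketch shows the kind of scalability bookkeeping involved: given wall-clock times measured at several core counts, it reports speedup and parallel efficiency relative to the smallest run. The timing numbers are illustrative only, not measurements from any of the codes discussed above.

# Hypothetical scaling report: speedup = T(base)/T(p); efficiency is speedup
# scaled by the ratio of core counts. Timings come from external measurements.
def scaling_report(timings):
    base_cores = min(timings)
    base_time = timings[base_cores]
    for cores in sorted(timings):
        speedup = base_time / timings[cores]
        efficiency = speedup * base_cores / cores
        print(f"{cores:6d} cores  speedup {speedup:6.2f}  efficiency {efficiency:5.1%}")

if __name__ == "__main__":
    scaling_report({64: 1200.0, 128: 640.0, 256: 360.0, 512: 220.0})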