CoSeC: Computational Science Centre for Research Communities

Presentation transcript:

CoSeC: Computational Science Centre for Research Communities
Barbara Montanari, CoSeC Director & Acting Head of Applications Division, Scientific Computing Department, STFC

Partnerships in Research Software
CoSeC supports the advancement of research by:
- developing software in multiple disciplines
- providing a hub for exchanging knowledge through training and outreach
- nurturing strong collaborations among researchers

CoSeC: a New Partnership of Research Councils

One Centre, Two Sites: Daresbury Laboratory and Rutherford Appleton Laboratory

Community Support
CCPs: CCP4, CCP5, CCP9, CCPi, CCP-NC, CCP-mag, CCP Plasma, CCP-EM, CCP PET-MR, CCPQ, CCPBioSim
HEC consortia: HEC BioSim, HEC UKCP, HEC MCC, HEC Plasma, HEC UKCOMES
Plus: Software Outlook

What CoSeC People Do
- Train & support users
- Develop theories & methods
- Provide science domain expertise
- Develop software
- Software performance, optimization & porting
- Data post-processing
- Maintain, license & distribute software
- Workflow & visualization
- Validate & consolidate methods
- Develop staff
- Build community

1973: Dawn of the CCPs
“Primary aim is to bring together scientists to:
- Provide for the rapid interchange of information on theory, algorithms, and computer codes
- Collect, maintain and develop relevant items of software
- Encourage basic research by providing facilities for rapid computer implementation of new methods and techniques
- Assess and advise on associated computational needs
- Disseminate information among scientists”
Computer Physics Communications 114 (1998) xii-xvii

Origin of UK Software Infrastructure
“To assist the CCPs, the Science Research Council will provide support as follows:
- Support from staff from the Research Council’s Laboratories
- Short-term appointments of Senior Visiting Fellows
- Longer-term Research Assistantships
- Funding for networking events”
Computer Physics Communications 114 (1998) xii-xvii

The Past
[Diagram: the simulation workflow (input preparation → job submission → modelling & simulation → output data analysis, visualisation & processing) as it served Biology, Engineering, Biochemistry, Medical Science, and the Physical Sciences]

The Present
[Diagram: the same workflow stages (input preparation → job submission → modelling & simulation → output data analysis, visualisation & processing) across Biology, Engineering, Biochemistry, Medical Science, and the Physical Sciences]
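The four stages themselves are common to both eras. Below is a minimal sketch of that pipeline in Python; every function, script, and path name (prepare_input, run_model.sh, run001, and so on) is hypothetical and for illustration only, and the job-submission step assumes a SLURM-like batch system is available.

```python
# Minimal sketch of the generic simulation workflow shown on the
# Past/Present slides. All names are hypothetical illustrations;
# they do not correspond to any CoSeC code.

import subprocess
from pathlib import Path


def prepare_input(params: dict, workdir: Path) -> Path:
    """Input preparation: write a plain-text input deck from model parameters."""
    deck = workdir / "model.in"
    deck.write_text("\n".join(f"{k} = {v}" for k, v in params.items()))
    return deck


def submit_job(deck: Path) -> int:
    """Job submission: 'sbatch --wait' assumes a SLURM-like scheduler."""
    result = subprocess.run(
        ["sbatch", "--wait", "run_model.sh", str(deck)],
        capture_output=True, text=True, check=True,
    )
    return result.returncode


def analyse_output(workdir: Path) -> dict:
    """Output analysis: here just count output records as a stand-in."""
    out = workdir / "model.out"
    lines = out.read_text().splitlines() if out.exists() else []
    return {"records": len(lines)}


if __name__ == "__main__":
    work = Path("run001")
    work.mkdir(exist_ok=True)
    deck = prepare_input({"temperature": 300, "steps": 10000}, work)
    submit_job(deck)             # modelling & simulation runs on the cluster
    print(analyse_output(work))  # output analysis & processing
```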

Code Usage and User Training Metrics
[Chart: blue = totals; red = totals normalised by CoSeC staff headcount]
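The red series is simply the blue series normalised by staff numbers. A tiny sketch making that definition concrete; all figures below are invented for illustration:

```python
# Illustrative only: invented numbers showing how the red series
# (per-staff metric) is derived from the blue series (totals).
years = [2015, 2016, 2017]
total_users = [900, 1100, 1300]   # blue: totals (hypothetical values)
staff = [30, 32, 33]              # staff headcount (hypothetical values)

per_staff = [u / s for u, s in zip(total_users, staff)]  # red: total/headcount
for y, u, p in zip(years, total_users, per_staff):
    print(f"{y}: total={u}, per-staff={p:.1f}")
```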

A Question of Balance
Across the ecosystem that is the UK software infrastructure, we need to balance the time and resource spent on:
- shorter-term “low-hanging fruit”
- longer-term, larger-scale software development efforts
- nourishing and maintaining expertise

STFC: Home of the Large Experimental Facilities
- ISIS Neutron Source
- Diamond Light Source
- Central Laser Facility
Common goal with RSEs: support research

Materials Workbench (K. Dymkowski et al.)

Hartree Centre and CoSeC
Located at Daresbury Laboratory, in partnership with IBM, Atos, and others. CoSeC expertise and codes are used in a number of computational science projects for industry.

Scientific Computing Department
[Organisation chart: the department comprises CoSeC, Computational Science & Engineering, the Data & Systems division, computational mathematics, software engineering, and visualization]

Scientific Computing Department: Tier-1 Centre for CERN

Scientific Computing Department: JASMIN
JASMIN (Joint Analysis System Meeting Infrastructure Needs) is a world-leading, unique hybrid of:
- 16 PB of high-performance storage (~250 GByte/s I/O)
- high-performance computing (~4,000 cores)
- non-blocking networking (>3 Tbit/s) and optical private network WANs
- cloud hosting capabilities
It is a “super data cluster”, not a supercomputer: the emphasis is on data movement and analysis. It is funded by NERC for all of NERC’s sciences and hosted at STFC RAL by the Research Infrastructure group of SCD.
Holdings include satellite data, weather data, and climate simulations from the biggest supercomputers (ARCHER, the Met Office’s Monsoon, DKRZ), plus other environmental research, e.g. genetic data from bugs in the environment (environmental genomics). JASMIN holds more than 60% of the data used by the latest IPCC report on climate change; the largest single dataset is 600 TB.
Storage: 16 PB usable (20 PB raw), roughly 3,200,000 DVDs (a ~6 km tower of DVDs, or more than 36,000 years of MP3), held in the two largest Panasas realms in the world (109 and 125 shelves), making this the largest single-site Panasas customer in the world (251 shelves). A further 900 TB usable (1.44 PB raw) of NetApp iSCSI/NFS supports virtualisation, with Dell EqualLogic PS6210XS for high-IOPS, low-latency iSCSI.
Compute: 4,000 CPU cores in 39 racks, split dynamically between the batch cluster and cloud/virtualisation (VMware vCloud Director and vCenter/vSphere).
Network: >3 Tbit/s of bandwidth (~3,500 DVDs per minute) and I/O capability of ~250 GByte/s, arguably in the top 10 in the world for I/O performance. The network infrastructure is “hyper-converged”: 10 GbE, low-latency MPI (~10 µs), and iSCSI share the same fabric, with no separate SAN or InfiniBand.
JASMIN was a finalist for the BCS UK Industry Awards “Big Data Project of the Year” in 2012 and 2014, and is managed with 2 FTE (recruiting for a third team member).
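A back-of-envelope check of the slide’s DVD comparisons, assuming 4.7 GB per single-layer DVD and 1.2 mm per stacked disc (my assumptions, not from the slide); the results land in the same ballpark as the quoted figures, with the exact numbers depending on the DVD capacity and spacing assumed:

```python
# Back-of-envelope check of the JASMIN slide's comparisons.
# Assumptions (not from the slide): 4.7e9 bytes per single-layer DVD,
# 1.2 mm per disc when stacked.
PB = 1e15
GB = 1e9

storage = 16 * PB                     # 16 PB usable
dvd_capacity = 4.7 * GB
dvd_thickness_m = 1.2e-3

n_dvds = storage / dvd_capacity
tower_km = n_dvds * dvd_thickness_m / 1000
print(f"{n_dvds:,.0f} DVDs, {tower_km:.1f} km tower")
# -> ~3.4 million DVDs and a ~4 km tower: same order of magnitude as
#    the slide's ~3,200,000 DVDs and ~6 km figures.

bandwidth = 3e12 / 8                  # 3 Tbit/s in bytes per second
dvds_per_minute = bandwidth * 60 / dvd_capacity
print(f"~{dvds_per_minute:,.0f} DVDs/minute at the full 3 Tbit/s")
# The slide's ~3,500 DVDs/minute implies a somewhat lower effective rate.
```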

Scientific Computing Department: Data Division
- Integrated data management pipelines for data handling, from data acquisition to storage
- A catalogue of experimental data
- Metadata as middleware: automated metadata capture, providing access to the user
- Integrated into analysis frameworks
- In daily production use at the large experimental facilities
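A toy sketch of the metadata-as-middleware idea: a record is captured automatically as data is acquired, then located later from an analysis framework. The schema, instrument name, proposal number, and in-memory store are invented for illustration; the facilities’ production systems use a full experimental-data catalogue, not this toy.

```python
# Toy illustration of "metadata as middleware": automated capture at
# acquisition time, then lookup from an analysis framework. The schema
# and in-memory store are invented; production systems at the large
# facilities use a real experimental-data catalogue.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DatasetRecord:
    instrument: str
    proposal: str
    file_path: str
    created: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


class Catalogue:
    def __init__(self):
        self._records: list[DatasetRecord] = []

    def capture(self, record: DatasetRecord) -> None:
        """Called by the acquisition pipeline as each file lands on disk."""
        self._records.append(record)

    def find(self, proposal: str) -> list[DatasetRecord]:
        """Called by analysis frameworks to locate a proposal's data."""
        return [r for r in self._records if r.proposal == proposal]


if __name__ == "__main__":
    cat = Catalogue()
    # Instrument and proposal identifiers below are hypothetical examples.
    cat.capture(DatasetRecord("ISIS-LET", "RB123456", "/archive/rb123456/run1.nxs"))
    for rec in cat.find("RB123456"):
        print(rec.instrument, rec.file_path)
```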

CoSeC People & RSEs
Research Councils job portal: www.topcareer.jobs
We share all your values: support, courage, community, diversity, …
We share many of your issues: bus factor, …
Some significant differences: type of contract, length of service

CoSeC People and RSEs
Common goal: support research.
Let’s continue to support each other and work together. Let’s lobby together at all levels for recognition of the importance of research software and the people who write it.

Theme for the event: Joining Up the UK e-Infrastructure

Many thanks to:
- All the super-talented CoSeC people
- Our communities and steering bodies
- The RSE Chairs for allowing us to host the CoSeC launch here
- The CoSeC launch team: Cathy Jones, Damian Jones, Marion O’Sullivan, Dominik Jochym, Andy Collins