CEMS: The Facility for Climate and Environmental Monitoring from Space
Victoria Bennett, ISIC/CEDA/NCEO, RAL Space

CEMS – what is it?
A joint academic-industrial facility for climate and environmental data services, based in ISIC, Harwell, alongside the Visualisation Centre, EO Hub and SRU.

CEMS – what is it?
CEMS will provide:
- Access to large-volume climate and EO datasets, alongside processing capability
- Commercial and scientific applications and services, hosted alongside key datasets
- Data quality, integrity and visualisation tools, alongside advice and consultancy
Initial partners:

CEMS Overview

CEMS Infrastructure
Layered architecture:
- High-performance storage (1.7 PB)
  - Panasas parallel filesystem: resilience, scalability and fast performance, eliminating I/O bottlenecks
- Processing hardware (20 nodes, each with 12 cores), managed through a cloud-based environment
- Virtualisation layer based on VMware
  - Allows computing resources to be used by third parties via a cloud
  - Platform to host applications and services for academic and commercial user communities
Dual site:
- Deployed across neighbouring academic and commercial sites on the Harwell campus

Data
Initial CEMS funding has enabled the creation of new state-of-the-art infrastructure to host data, and services to process it:
- EO datasets are being transferred from the UK's NEODC (NERC Earth Observation Data Centre) to the CEMS infrastructure (a huge undertaking); new datasets will follow
- CEMS also integrates with other UK academic activities:
  - JASMIN: access to CMIP5 for intercomparison with model data
  - STFC Scientific Computing: SCARF processing cluster and tape back-up
[Slide diagram: Harwell Site, Oxfordshire – data sharing and integration of services between the academic and commercial sites. CEMS Commercial (Electron Building, ISIC): 10 compute nodes, 0.7 PB storage. CEMS Academic (R89 Building, RAL): 10 compute nodes, 1.1 PB storage, joined to the commercial site by a 10 Gbit ISIC–STFC link. Alongside: JASMIN (20 compute nodes, 3.5 PB storage; 1100 blades and fast storage connected into a low-latency network), the STFC SCARF HPC cluster (2000 CPUs) and the ATLAS Tape Store (3.5 PB).]

CEMS Concept – multi-level approach

Exploiting Cloud Computing for CEMS
CEMS provides a flexible resource for processing and storage of EO data, using a cloud computing model.
Different groups can be allocated portions of the storage, network and compute infrastructure:
- tailored to their needs
- without the need for upfront capital for hardware purchase
We have built links with the user community, including NCEO and the ESA CCI programme, and the first users are accessing the system.
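
As an illustration of this allocation model, here is a minimal, hypothetical sketch (not the CEMS API) of granting groups shares of a fixed storage and compute pool. The pool sizes echo the infrastructure slide (1.7 PB, 20 nodes of 12 cores); the group names and request figures are invented.

```python
# Hypothetical sketch of cloud-style resource allocation; not CEMS code.
from dataclasses import dataclass

@dataclass
class Allocation:
    group: str          # e.g. an NCEO or ESA CCI project team
    storage_tb: float   # share of the shared high-performance storage
    vcpus: int          # virtual cores carved out of the compute nodes

POOL_STORAGE_TB = 1700.0   # ~1.7 PB total, per the infrastructure slide
POOL_VCPUS = 20 * 12       # 20 nodes x 12 cores

def allocate(requests: list[Allocation]) -> list[Allocation]:
    """Grant each request in turn while the shared pool has capacity."""
    storage_left, vcpus_left = POOL_STORAGE_TB, POOL_VCPUS
    granted = []
    for req in requests:
        if req.storage_tb <= storage_left and req.vcpus <= vcpus_left:
            storage_left -= req.storage_tb
            vcpus_left -= req.vcpus
            granted.append(req)
    return granted

# Example: one group asks for 200 TB of storage and 48 virtual cores.
print(allocate([Allocation("nceo-sst", 200.0, 48)]))
```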

Core System
- The core system provides the underlying functionality for CEMS to operate
- It builds on top of the hardware and cloud virtualisation layers
- Key use cases were prioritised:
  - User management and support services
  - Data discovery and access (see the catalogue-search sketch after this slide)
  - Virtual machine access
- This is a baseline upon which to build
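
The slides do not specify the discovery interface, so as one plausible illustration, the sketch below searches a metadata catalogue through OGC CSW using OWSLib. The endpoint URL and search term are hypothetical placeholders.

```python
# Hypothetical catalogue search illustrating "data discovery and access".
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Hypothetical CSW endpoint; the real CEMS discovery service may differ.
csw = CatalogueServiceWeb("http://catalogue.example.org/csw")

# Free-text search across all metadata fields.
query = PropertyIsLike("csw:AnyText", "%sea surface temperature%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```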

Applications and Services
Demonstrator applications were developed in parallel with the deployment of the infrastructure.
- These proto-CEMS apps illustrate and showcase the possibilities for use of CEMS:
  - Interactive (web-based) applications
  - EO data processing
- Each aimed to demonstrate one or more of the following:
  - Complex or demanding processing of large datasets
  - Bringing multiple datasets together
  - Visualisation
[Images: ISIC video wall showing model data (Met Office Hadley Centre HadGEM2 model, near-surface air temperature) and NASA Blue Marble on interactive spinning globes; the OceanDIVA demonstrator, developed by the Reading e-Science Centre with data and scientific advice provided by the University of Edinburgh]

Applications and Services: OceanDIVA
- Quick-look tool for comparing model/assimilation output with in-situ data
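
OceanDIVA's own code is not shown in the slides; the sketch below illustrates the underlying idea, sampling a gridded model or assimilation field at in-situ observation locations and inspecting the differences, using xarray. The file name, variable name and observation values are hypothetical.

```python
# Hypothetical model-vs-in-situ comparison in the spirit of OceanDIVA.
import numpy as np
import xarray as xr

model = xr.open_dataset("model_sst.nc")      # hypothetical gridded field
obs_lat = np.array([10.5, -3.2, 47.0])       # in-situ profile positions
obs_lon = np.array([142.0, 30.1, -15.5])
obs_sst = np.array([301.2, 299.8, 285.4])    # observed values (K)

# Interpolate the model field to each observation location (pointwise,
# by giving both coordinate arrays the same "obs" dimension).
model_at_obs = model["sst"].interp(
    lat=xr.DataArray(obs_lat, dims="obs"),
    lon=xr.DataArray(obs_lon, dims="obs"),
)

residuals = model_at_obs.values - obs_sst
print("mean model-minus-obs difference:", residuals.mean())
```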

CEMS Evolution
Timeline:
- Aug 2011: user requirements analysis
- Nov 2011: £3m funding from UKSA
- Dec 2011: hardware purchase
- Mar 2012: hardware delivery + demonstration software
- Aug 2012: core software services + virtualisation
- Sept 2012+: first cloud users, growing user community
CEMS components: the CEMS vision (identifying the goals, establishing initial collaboration); technical implementation (hardware, data integrity, demonstrators); core system (virtualisation and cloud); applications, support services and consultancy.
Engagement: establishment of the CEMS Board and Integrated Project Team, with groups for e.g. outreach and business modelling; service providers build apps to host on CEMS; first hosted processing for research groups and industry; an eco-system of applications and support services for end-user communities.

Who will use CEMS, and how?
- Bids/Proposals – CEMS usage in projects (ESA, EC, UK)
- Private Use – CEMS supporting internal work, e.g. data, compute or core services for product development
- Application Hosting – infrastructure for hosting and deployment, as well as a shop window to reach a large customer base
- Academic Usage – scientific community processing and storage of long time series of data
- Community Data Hosting – we are looking to build up the data offering
- Capability Development – tools or core services (e.g. to manipulate, inter-compare or visualise datasets) developed and supplied by users/customers

Current Status and Future Developments
- CEMS is open for business: Contact
- Future developments of the system:
  1. Storage and hardware: expand to support the user community
  2. Cloud: collaboration with external partners – GSCB, European Grid Infrastructure, open-source clouds
  3. Core system: Earth System Grid Federation – a federated infrastructure for climate data, also suitable for EO
  4. Apps and services:
     - Hosted processing – OGC WPS (see the sketch after this slide)
     - Hosted interactive development environments – IPython Notebook
[Images: individual user exploitation of the cloud; virtualisation; IPython Notebook interactive environment]
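
To make the hosted-processing item concrete, here is a minimal sketch of invoking a server-side process through OGC WPS with OWSLib, so the computation runs next to the data instead of after a bulk download. The endpoint URL, process identifier and input names are hypothetical, not actual CEMS services.

```python
# Hypothetical WPS call illustrating hosted processing; not a CEMS endpoint.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService("http://wps.example.org/wps")  # hypothetical

# Ask the server to subset a co-hosted dataset rather than download it all.
execution = wps.execute(
    "subset_dataset",                    # hypothetical process identifier
    inputs=[("dataset", "sst_cci_v1"),   # hypothetical input names/values
            ("bbox", "-10,40,5,60")],
)
monitorExecution(execution)              # poll until the job completes
print(execution.status)
```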

Questions

Data Integrity
A technical study (led by Telespazio VEGA) was carried out to develop and define the concept of data integrity.
- Reviewed the requirements of users and data providers
- Results showed agreement across the government and commercial sectors about priority needs:
  - Metadata completeness (accessibility, interoperability)
  - Expert advice (including on data policy)
- Made recommendations for data integrity services:
  - Helpdesk and consultancy support
  - Certification framework for data integrity information
  - Mechanisms for providing tools and services for data validation and assessment, and for comparative validation studies between co-hosted datasets
Data integrity is taken to mean a measure of confidence in the data, arrived at by characterising and monitoring quality at specific points along the production chain.
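
As one concrete ingredient of monitoring quality along the production chain, the sketch below records file checksums at a named stage so corruption or silent changes can be detected later. It is an illustrative example only; the file paths are hypothetical, and the actual data integrity services are those recommended by the study above.

```python
# Hypothetical checksum manifest for one point in the production chain.
import hashlib
import json
from pathlib import Path

def checksum(path: Path, algo: str = "sha256") -> str:
    """Hash a file in 1 MB chunks so large EO products fit in memory."""
    h = hashlib.new(algo)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def record_integrity(files: list[Path], stage: str, manifest: Path) -> None:
    """Append this stage's checksums to a JSON-lines manifest."""
    entries = {str(p): checksum(p) for p in files}
    with manifest.open("a") as out:
        out.write(json.dumps({"stage": stage, "checksums": entries}) + "\n")

# Example (hypothetical file names):
# record_integrity([Path("sst_l3_20120801.nc")], "post-transfer",
#                  Path("integrity_manifest.jsonl"))
```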