
1 CEMS: The Facility for Climate and Environmental Monitoring from Space Victoria Bennett, ISIC/CEDA/NCEO RAL Space

2 CEMS – what is it?
A joint academic-industrial facility for climate and environmental data services
Based in ISIC, Harwell, alongside the Visualisation Centre, EO Hub and SRU

3 CEMS – what is it?
CEMS will provide:
- Access to large-volume climate and EO datasets, alongside processing capability
- Commercial and scientific applications and services, hosted alongside key datasets
- Data quality, integrity and visualisation tools, alongside advice and consultancy
Initial partners: [partner logos shown on slide]

4 CEMS Overview

5 CEMS Infrastructure
Layered architecture:
- High-performance storage (1.7 PB)
  - Panasas parallel filesystem: resilience, scalability and fast performance, eliminating I/O bottlenecks
- Processing hardware (20 nodes, each with 12 cores)
- Managed through a cloud-based environment
  - Virtualisation layer based on VMware
  - Allows computing resources to be used by third parties via a cloud
  - Platform to host applications and services for academic and commercial user communities
Dual site:
- Deployed across neighbouring academic and commercial sites on the Harwell campus

6 Data
[Slide diagram: CEMS infrastructure on the Harwell site, Oxfordshire. CEMS Commercial (Electron Building, ISIC: 10 compute nodes, 0.7 PB storage) and CEMS Academic (R89 Building, RAL: 10 compute nodes, 1.1 PB storage) are connected by a 10 Gbit ISIC-STFC link, 20 compute nodes in total, alongside JASMIN (3.5 PB storage), the STFC SCARF HPC cluster (2000 CPUs), 1100 STFC blades with fast storage connected into a low-latency network, and the ATLAS Tape Store (3.5 PB). Caption: data sharing and integration of services between academic and commercial sites.]
- Initial CEMS funding has enabled the creation of new state-of-the-art infrastructure to host data, and services to process it
- EO datasets are being transferred from the UK's NEODC (NERC EO Data Centre) to the CEMS infrastructure (a huge undertaking); new datasets will follow
- CEMS also integrates with other UK academic activities:
  - JASMIN: access to CMIP5 for intercomparison with model data
  - STFC Scientific Computing: SCARF processing cluster and tape back-up
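Moving an archive of this scale is largely a question of bulk copying plus verification. Below is a minimal sketch of post-transfer checking, assuming a hypothetical manifest of (path, checksum) pairs; the slide does not describe the actual transfer tooling.

    # Sketch only, not the actual NEODC transfer tooling: verify copied
    # files against a manifest of expected SHA-256 checksums.
    import hashlib
    from pathlib import Path

    def sha256(path, chunk=1 << 20):
        """Stream a file through SHA-256 without loading it into memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def verify(manifest):
        """manifest: iterable of (path, expected_hexdigest) pairs.
        Returns the paths that are missing or whose checksum differs."""
        return [p for p, expected in manifest
                if not Path(p).exists() or sha256(p) != expected]

    # Usage (hypothetical entries):
    # bad = verify([("/data/atsr/file1.nc", "ab12cd34...")])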

7 CEMS Concept – multi-level approach

8 Exploiting Cloud Computing for CEMS
CEMS provides a flexible resource for processing and storage of EO data, using a cloud computing model. Different groups can be allocated portions of the storage, network and compute infrastructure:
- tailored to their needs
- without the need for upfront capital for hardware purchase
We have built links with the user community, including NCEO and the ESA CCI programme, and the first users are accessing the system.
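To make the allocation model concrete, here is a minimal sketch, not the CEMS API, of per-group quotas carved out of the shared pool; the group names and field names are hypothetical, while the pool sizes come from slide 5 (1.7 PB storage, 20 nodes of 12 cores).

    # Hypothetical sketch of per-group cloud allocations.
    from dataclasses import dataclass

    @dataclass
    class Allocation:
        group: str          # e.g. an NCEO or ESA CCI project team
        storage_tb: float   # share of the parallel-filesystem storage
        vcpus: int          # virtual CPUs carved from the compute nodes
        vms: int            # number of virtual machines provisioned

    POOL_TB, POOL_VCPUS = 1700.0, 240   # 1.7 PB storage, 20 nodes x 12 cores

    def remaining(allocations):
        """Return spare storage (TB) and vCPUs after current allocations."""
        used_tb = sum(a.storage_tb for a in allocations)
        used_cpu = sum(a.vcpus for a in allocations)
        return POOL_TB - used_tb, POOL_VCPUS - used_cpu

    allocs = [Allocation("cci-sst", 200.0, 48, 4),
              Allocation("nceo-land", 120.0, 24, 2)]
    print(remaining(allocs))   # -> (1380.0, 168)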

9 Core System
- The core system provides the underlying functionality for CEMS to operate
- It builds on top of the hardware and cloud virtualisation layers
- Key use cases were prioritised:
  - User management and support services
  - Data discovery and access
  - Virtual machine access
- This is a baseline upon which to build
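As an illustration of the "data discovery and access" use case, here is a sketch of a free-text plus time-range catalogue query. The endpoint URL and parameter names are entirely hypothetical: the slide does not specify the CEMS discovery interface.

    # Illustrative data-discovery query; endpoint and parameters hypothetical.
    import requests

    def discover(free_text, start, end,
                 endpoint="https://catalogue.example.org/search"):
        """Search a catalogue for datasets matching a term and time range."""
        resp = requests.get(endpoint, params={
            "q": free_text,          # e.g. "sea surface temperature"
            "startDate": start,      # ISO 8601, e.g. "2010-01-01"
            "endDate": end,
        }, timeout=30)
        resp.raise_for_status()
        return resp.json()           # assume the service returns JSON records

    # records = discover("sea surface temperature", "2010-01-01", "2010-12-31")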

10 Applications and Services
Demonstrator applications were developed in parallel with the deployment of the infrastructure.
- These proto-CEMS apps illustrate and showcase the possibilities for use of CEMS:
  - Interactive (web-based) applications
  - EO data processing
- Each aimed to demonstrate one or more of the following:
  - Complex or demanding processing of large datasets
  - Bringing multiple datasets together
  - Visualisation
[Slide images: ISIC video wall showing model data (Met Office Hadley Centre HadGEM2 model, near-surface air temperature) and NASA Blue Marble on interactive spinning globes. The OceanDIVA demonstrator was developed by the Reading e-Science Centre, with data and scientific advice provided by the University of Edinburgh.]
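A minimal sketch of the kind of visualisation shown on the video wall: plotting near-surface air temperature from a model output file. The filename is hypothetical; "tas" is the standard CMIP variable name for near-surface air temperature.

    # Plot a near-surface air temperature field from a NetCDF file.
    from netCDF4 import Dataset
    import matplotlib.pyplot as plt

    nc = Dataset("hadgem2_tas_example.nc")    # hypothetical local file
    tas = nc.variables["tas"][0, :, :]        # first time step (time, lat, lon)
    lon = nc.variables["lon"][:]
    lat = nc.variables["lat"][:]

    plt.pcolormesh(lon, lat, tas, shading="auto")
    plt.colorbar(label="Near-surface air temperature (K)")
    plt.xlabel("Longitude"); plt.ylabel("Latitude")
    plt.title("HadGEM2 near-surface air temperature")
    plt.show()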

11 Applications and Services: OceanDIVA
- Quick-look tool for comparing model/assimilation output with in-situ data
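This is not the OceanDIVA code (which was developed by the Reading e-Science Centre); it is just a sketch of the underlying idea, colocating in-situ observations with the nearest model grid cell and examining the differences. All data below are synthetic.

    # Compare in-situ point observations with a gridded model field.
    import numpy as np

    def colocate(model_field, grid_lon, grid_lat, obs_lon, obs_lat, obs_val):
        """Return obs-minus-model differences at the nearest grid points."""
        i = np.abs(grid_lat[:, None] - obs_lat[None, :]).argmin(axis=0)
        j = np.abs(grid_lon[:, None] - obs_lon[None, :]).argmin(axis=0)
        return obs_val - model_field[i, j]

    # Synthetic example: a smooth "model" field and three "in-situ" points.
    lon = np.linspace(-180, 180, 360)
    lat = np.linspace(-90, 90, 180)
    field = 15 + 10 * np.cos(np.deg2rad(lat))[:, None] * np.ones(lon.size)
    diffs = colocate(field, lon, lat,
                     np.array([0.2, 45.3, -120.0]),   # obs longitudes
                     np.array([10.1, -33.8, 55.5]),   # obs latitudes
                     np.array([24.6, 23.0, 20.9]))    # obs values, deg C
    print(diffs)   # obs - model at the nearest grid cells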

12 CEMS Evolution
Timeline:
- Aug 2011: user requirements analysis (CEMS vision: identifying the goals, establishing initial collaboration)
- Nov 2011: £3m funding from UKSA
- Dec 2011: hardware purchase
- Mar 2012: hardware delivery + demonstration software (technical implementation: hardware, data integrity, demonstrators)
- Aug 2012: core software services + virtualisation (core system: virtualisation and cloud)
- Sept 2012 onwards: first cloud users and a growing user community (applications, support services, consultancy)
Engagement: establishment of the CEMS Board and Integrated Project Team, with groups for e.g. outreach and business modelling; service providers building apps to host on CEMS; first hosted processing for research groups and industry; an eco-system of applications and support services for end-user communities

13 Who will use CEMS, and how?
- Bids/Proposals: CEMS usage in projects (ESA, EC, UK)
- Private use: use of CEMS to support internal work, e.g. data, compute or core services to support product development
- Application hosting: infrastructure for hosting and deployment, as well as a shop window to reach a large customer base
- Academic usage: scientific community processing and storage of long time series of data
- Community data hosting: we are looking to build up the data offering
- Capability development: tools or core services (e.g. to manipulate, inter-compare or visualise datasets) developed and supplied by users/customers

14 Current Status and Future Developments
- CEMS is open for business:
  - http://isic-space.com/cems/
  - Contact: cemsinfo@isic-space.com
- Future developments of the system:
  1. Storage and hardware: expand to support the user community
  2. Cloud: collaboration with external partners (GSCB, European Grid Infrastructure, open-source clouds)
  3. Core system: Earth System Grid Federation, a federated infrastructure for climate data that is also suitable for EO
  4. Apps and services:
     - Hosted processing via OGC WPS (a sketch follows below)
     - Hosted interactive development environments, e.g. the IPython Notebook
[Slide images: individual user exploitation of cloud virtualisation; IPython Notebook interactive environment]
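As referenced in point 4 above, hosted processing could be exposed through OGC WPS. The sketch below uses OWSLib, a real Python client for OGC services; the service URL, process identifier and input names are hypothetical.

    # Calling a hosted process via OGC WPS using OWSLib.
    from owslib.wps import WebProcessingService

    wps = WebProcessingService("https://cems.example.org/wps")  # hypothetical
    wps.getcapabilities()
    for proc in wps.processes:
        print(proc.identifier, "-", proc.title)

    # Run a (hypothetical) process with named inputs and poll until done.
    execution = wps.execute("subset_dataset",
                            inputs=[("dataset", "esacci_sst"),
                                    ("bbox", "-10,50,5,60")])
    while not execution.isComplete():
        execution.checkStatus(sleepSecs=10)
    print(execution.status)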

15 Questions

16 Data Integrity
A technical study (led by Telespazio VEGA) was carried out to develop and define the concept of data integrity:
- Reviewed requirements of users and data providers
- Results showed agreement across government and commercial sectors about the priority needs:
  - Metadata completeness (accessibility, interoperability)
  - Expert advice (including on data policy)
- Made recommendations for data integrity services:
  - Helpdesk and consultancy support
  - Certification framework for data integrity information
  - Mechanisms for providing tools and services for data validation and assessment, and comparative validation studies between co-hosted datasets
Data integrity is taken to mean a measure of confidence in the data, arrived at by characterising and monitoring quality at specific points along the production chain.
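As a small illustration of the "metadata completeness" priority identified by the study, here is a sketch that scores a metadata record against a required-field list. The field names are hypothetical; the study itself does not prescribe a schema.

    # Score metadata completeness against a (hypothetical) required-field list.
    REQUIRED = ["title", "creator", "temporal_extent", "spatial_extent",
                "processing_level", "licence", "lineage"]

    def completeness(record):
        """Fraction of required fields present and non-empty, plus the gaps."""
        present = [f for f in REQUIRED if record.get(f)]
        missing = [f for f in REQUIRED if f not in present]
        return len(present) / len(REQUIRED), missing

    score, missing = completeness({"title": "ATSR SST", "creator": "RAL",
                                   "licence": "open"})
    print(f"{score:.0%} complete; missing: {missing}")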

