Slide 1: JASMIN Success Stories
NERC Scientific Computing Forum, 29 June 2017
Dr Matt Pritchard, Centre for Environmental Data Analysis, STFC / RAL Space
Slide 2: Outline
- What is JASMIN?
- What services are provided to users, and how do they access them?
- Current facts / figures
- Success stories: service provider stories and user stories
- Challenges
- Next steps
Slide 3: Logical View
[Diagram] JASMIN compute and storage (LOTUS + private cloud + tape store + Data Transfer Zone), supported by an internal helpdesk, underpins two service layers:
- CEDA Archive Services: data centres (IPCC DDC, ESGF, etc.), curation, database systems, user management, external helpdesk
- Analysis Environment: compute cloud offering PaaS (JAP + science VMs + user management) and IaaS; Group Workspaces on fast disk with Elastic Tape; NERC managed analysis compute and NERC cloud analysis compute; external helpdesk
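To make the compute path in this view concrete, below is a minimal sketch of submitting an analysis job to the LOTUS batch cluster from a science VM, assuming the LSF scheduler that LOTUS ran at the time; the queue name, wall-time limit and script path are illustrative, not prescriptive.

```python
# Minimal sketch: submit an analysis script to the LOTUS batch scheduler
# from a JASMIN sci VM. Assumes the LSF scheduler in use circa 2017;
# the queue name "short-serial" and the script path are illustrative.
import subprocess

def submit_lotus_job(script_path: str, queue: str = "short-serial") -> str:
    """Submit a job via LSF's bsub and return the scheduler's response."""
    result = subprocess.run(
        ["bsub",                  # LSF submission command
         "-q", queue,             # target queue (illustrative name)
         "-n", "1",               # one slot
         "-W", "00:30",           # 30-minute wall-clock limit
         "-o", "job.%J.out",      # stdout file; %J expands to the job ID
         "python", script_path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # e.g. "Job <12345> is submitted ..."

if __name__ == "__main__":
    print(submit_lotus_job("analyse_gws_data.py"))
```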
Slide 4: Functional View
[Diagram] Key functional elements of CEDA / JASMIN:
- Long-term archive storage: the CEDA Archive, backed by archive tape
- Short-term project storage: Group Workspaces (gws1, gws2, gws3, ...), backed by Elastic Tape
- Interactive compute: login, bastion, sci and xfer servers
- Batch compute: LOTUS
- Cloud: NERC managed cloud analysis compute; managed and unmanaged cloud tenancies (IaaS); CEDA services
- Data Transfer Zone: xfer and gridftp servers
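As an illustration of why co-locating the archive with project storage matters in this view, here is a minimal sketch of an analysis step that reads from the read-only CEDA archive and writes a derived product to a Group Workspace, assuming xarray is available in the analysis environment; both paths and the variable name are illustrative.

```python
# Minimal sketch: read a dataset from the co-located CEDA archive and
# write a derived product to a Group Workspace. The paths and the
# variable name "tas" are illustrative assumptions, not real holdings.
import xarray as xr

ARCHIVE_FILE = "/badc/some_dataset/data/tas_monthly.nc"      # illustrative
GWS_OUT = "/group_workspaces/jasmin2/myproject/tas_clim.nc"  # illustrative

ds = xr.open_dataset(ARCHIVE_FILE)        # archive is mounted read-only
climatology = ds["tas"].mean(dim="time")  # simple time-mean as an example
climatology.to_netcdf(GWS_OUT)            # write results to project space
```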
Slide 5: Network View
[Diagram] Key network zones of CEDA / JASMIN:
- Behind the JASMIN head router: archive, Group Workspaces, managed compute (sci, LOTUS), managed cloud tenancies, unmanaged cloud, and CEDA services
- Data Transfer Zone (behind its own router): login, bastion, xfer[23] and gridftp/globus servers, with perfSONAR monitoring and optical private networks
- External connectivity: RAL core switch and site access routers to the JANET PoP; the RAL/STFC firewall and other STFC department routers handle other traffic; external peers include the Met Office, ESGF data nodes (gridftp), Catapult, k9.leeds, dtn02.rdf, and archive FTP
Slide 6: Current facts & figures
- 1,425 JASMIN user accounts
- 150 Group Workspaces (ranging from 0.1 TB to 700 TB)
- 10 PB Group Workspace capacity (>8.5 PB used)
- 5 PB archive capacity (>4.5 PB used!)
- ~5,000 compute cores (LOTUS, virtualisation, cloud)
- Well over 1,000 virtual machines
- >20 cloud tenancies
Slide 7: JASMIN Evolution

| Phase | Cost   | Storage                                        | Compute    | Network                                      | Other                                             |
|-------|--------|------------------------------------------------|------------|----------------------------------------------|---------------------------------------------------|
| 1     | £5 M   | 5 PB Panasas; ? PB tape; iSCSI arrays          | 700 cores  | Initial core network: Gnodal                 | Virtualisation; prototype cloud; light paths; JAP |
| 1.5   | £0.7 M | 0.4 PB Panasas; tape drives, media; ET service | ? cores    | Gnodal switch upgrade                        | Expansion of VM estate                            |
| 2     | £5.4 M | 7 PB Panasas; 6 PB tape; 0.9 PB NetApp         | 3000 cores | Major core network redesign & implementation | CIS software; cloud management s/w                |
| 3     | £2 M   | 2 PB Panasas; tape drives                      | 800 cores  | (ditto)                                      |                                                   |
| 3.5   | £1.2 M | 1.2 PB Panasas; 1.2 PB object store; 5 PB tape | 1000 cores | misc                                         | Support / license renewals                        |
| 4     | TBC    |                                                |            |                                              |                                                   |
Slide 8: JASMIN data growth
Slide 9: JASMIN Accounts Portal
Slide 10: JASMIN Cloud Portal
Slide 11: Data Transfer Zone
Implements the "Science DMZ" concept:
- a secure, friction-free path for science data
- leaves the corporate firewall better able to handle "business" traffic
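To give a feel for the user side of the DTZ's gridftp/globus service, here is a hedged sketch using the Globus Python SDK; the endpoint UUIDs, paths and access token are illustrative placeholders, and a real transfer would first require a Globus OAuth2 flow.

```python
# Minimal sketch: submit a Globus transfer into a JASMIN Group Workspace
# via the Data Transfer Zone. Endpoint UUIDs, paths and the access token
# are illustrative placeholders, not real identifiers.
import globus_sdk

TOKEN = "..."                    # obtain via a Globus OAuth2 flow
SRC_ENDPOINT = "UUID-OF-SOURCE"  # e.g. a university data transfer node
DST_ENDPOINT = "UUID-OF-JASMIN"  # the JASMIN-side Globus endpoint

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(TOKEN)
)
tdata = globus_sdk.TransferData(
    tc, SRC_ENDPOINT, DST_ENDPOINT, label="example transfer"
)
tdata.add_item(
    "/data/sentinel1/image.zip",                      # source path
    "/group_workspaces/jasmin2/myproject/image.zip",  # destination path
)
task = tc.submit_transfer(tdata)
print("Task ID:", task["task_id"])
```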
Slide 12: JASMIN User Stories

Slide 13: JASMIN User Stories - COMET: seismic hazard monitoring with Sentinel-1 InSAR
"We've had fantastic support from the team who have helped us to build a suitable system. The level of support provided has really helped us to achieve our goals and I don't think a bespoke solution like this would be available anywhere else. Due to the enormous volumes of data we're dealing with (each image is around 8GB when zipped), the collocation of the archive with the JASMIN system is essential."
Emma Hatton, University of Leeds, JASMIN Conference, June 2017
Slides 14-19: JASMIN User Stories (image-only slides)
Slide 20: Challenges
- Scale
- Variety
- User expertise
- Effort
- Capital-heavy model
- Evolution: of workflows, of technology, of user expectations, of ...
Slide 21: JASMIN Next steps
JASMIN Phase 3.5 (2016/17):
- 1000 cores added to LOTUS (now in place)
- Object store proof of concept
- Limited new fast disk integration
- 12/13 July
JASMIN Phase 4 (2017/18):
- Project now underway
- The "limiting" case in the JASMIN Science Case
Challenges:
- Storage: JASMIN Phase 1 storage (5 PB) reaches end of life in March 2018; next procurement will be a mixture of fast/parallel disk AND object store
- Compute: migration to OpenStack (cloud management infrastructure)
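Since Phase 4 pairs fast/parallel disk with an object store, the sketch below shows what S3-style access to such a store could look like, assuming an S3-compatible interface; the endpoint URL, credentials and bucket name are illustrative placeholders, not details of the actual JASMIN deployment.

```python
# Minimal sketch: write and read an object on an S3-compatible object
# store of the kind Phase 4 introduces alongside parallel disk.
# Endpoint URL, credentials and bucket name are illustrative placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.ac.uk",  # assumed endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.create_bucket(Bucket="myproject-results")
s3.upload_file("tas_clim.nc", "myproject-results", "tas_clim.nc")

# Objects are then addressed by key rather than by POSIX path:
obj = s3.get_object(Bucket="myproject-results", Key="tas_clim.nc")
print(obj["ContentLength"], "bytes")
```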
Slide 22: Further information
- JASMIN
- Centre for Environmental Data Analysis (CEDA)
- CEDA & JASMIN help documentation
- STFC Scientific Computing Department
- JASMIN paper: Lawrence, B.N., V.L. Bennett, J. Churchill, M. Juckes, P. Kershaw, S. Pascoe, S. Pepler, M. Pritchard and A. Stephens, "Storing and manipulating environmental big data with JASMIN", Proceedings of IEEE Big Data 2013, pp. 68-75, doi:10.1109/BigData.2013.6691556