Open Source Activity Showcase: Computational Storage, SNIA Swordfish™
SNIA-at-a-Glance
Standards Development
Areas of Focus
Our areas of focus are well aligned with both existing and emerging technologies, and with the needs of legacy and emerging IT infrastructures. Each of the nine category areas has technical work associated with it, focused on standards and specifications (see slide 5). The SNIA website home page has links to each of these areas so that you can dive into the specifics of SNIA activities. SNIA also has the critical mass of vendors, and the appropriate structure, IP policies, and experience, to add new categories, new technologies, and new projects as the need arises.
2019 Technical Work Group Activity
For 2018, we have an extensive technical agenda and a committed roadmap of new standards. The SNIA website is the launching place to view current work, the groups supporting it, and draft and published standards and specifications.
Technical Working Group Member Use Cases
Storage Sessions, Friday in Room 212AB
SNIA TWG Intro: 8:20 – 8:40
Industry Panel: 4:00 – 4:55
32 Participating Companies, 100 Individual Members
NVMe FP-CSP RAID Offload
An NVMe-based Computational Storage Processor (CSP) advertises a fixed-purpose accelerator capable of RAID parity generation. The operating system detects the presence of the NVMe FP-CSP, and the device-mapper uses it to offload parity calculations for RAID. This can be combined with p2pdma to further offload IO.
[Diagram: CPU and PCIe subsystem with DRAM, an NVMe CSP exposing a CMB, NVMe SSDs, and md-raid.]
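A minimal sketch of the work being offloaded, assuming RAID-5-style XOR parity. The `csp.offload_parity` call is hypothetical, standing in for md-raid/device-mapper submitting the calculation to the NVMe FP-CSP (the real path runs in kernel C, not Python):

```python
# Sketch: RAID-5-style parity generation that an NVMe FP-CSP could offload.
# The CSP call below is hypothetical; in practice md-raid/device-mapper would
# submit this work to the accelerator over NVMe, optionally via p2pdma/CMB.

from functools import reduce

def xor_parity(stripes: list[bytes]) -> bytes:
    """Compute parity as the byte-wise XOR of equal-length data stripes."""
    assert len({len(s) for s in stripes}) == 1, "stripes must be equal length"
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*stripes))

def generate_parity(stripes: list[bytes], csp=None) -> bytes:
    """Offload parity to the CSP when present, else fall back to the CPU."""
    if csp is not None:
        return csp.offload_parity(stripes)   # hypothetical accelerator call
    return xor_parity(stripes)               # CPU fallback

if __name__ == "__main__":
    data = [bytes([1, 2, 3, 4]), bytes([5, 6, 7, 8]), bytes([9, 10, 11, 12])]
    print(generate_parity(data).hex())       # parity stripe: 0d0e0f00
```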
Efficient Utilization of Compute
Systems requiring similar sets of computational units can take advantage of large-scale, dynamically configurable CSAs built on a FabreX™ network:
A large-scale CSA shared by multiple hosts
Higher compute utilization achieved by pooling CSPs for hosts
Higher storage utilization achieved by pooling CSDs for CSPs
CSDs' and CSPs' ability to do p2p data exchange
A large, dynamically configurable variety of fixed-purpose CSDs available to each host (see the pooling sketch after this list)
[Diagram: large-scale, dynamically configurable CSA over a FabreX™ fabric with DRAM, pooled CSPs, and pooled CSDs.]
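An illustrative sketch of the pooling idea: hosts borrow CSPs and CSDs from a shared, dynamically configurable pool instead of owning fixed devices. The class, device names, and allocation policy are assumptions for illustration, not any fabric vendor's API:

```python
# Illustrative sketch of pooled CSP/CSD assignment over a composable fabric.
# Device names and the allocation policy are assumptions, not a real API.

from collections import defaultdict

class ComposablePool:
    """Shared pool of CSPs/CSDs that hosts can borrow and return on demand."""

    def __init__(self, csps, csds):
        self.free = {"csp": list(csps), "csd": list(csds)}
        self.assigned = defaultdict(list)   # host -> devices currently attached

    def attach(self, host, kind, count=1):
        """Attach up to `count` free devices of the given kind to a host."""
        taken = [self.free[kind].pop() for _ in range(min(count, len(self.free[kind])))]
        self.assigned[host].extend(taken)
        return taken

    def release(self, host):
        """Return all of a host's devices to the pool for the next consumer."""
        for dev in self.assigned.pop(host, []):
            self.free["csp" if dev.startswith("csp") else "csd"].append(dev)

pool = ComposablePool(csps=["csp0", "csp1"], csds=[f"csd{i}" for i in range(8)])
print(pool.attach("hostA", "csp", 1), pool.attach("hostA", "csd", 3))
pool.release("hostA")   # devices go back to the pool, raising overall utilization
```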
AI Inference at the Storage
Generate a metadata database (e.g., tags) over a large set of unstructured data locally, with an integrated AI inference engine. The operation may be:
Triggered by a host processor
Done offline as a background task (in batches); a sketch of this batch pass follows below
The metadata database may then be used by upper-layer Big Data analytics software for further processing. This works both on direct-attached storage and on remote storage accessed over the network. Examples: video search, ad insertion, voice call analysis, images, text scan, etc.
[Diagram: Computational Storage Array with CSDs, connected to a CPU over the network.]
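A minimal sketch of the offline batch pass, assuming the tagging itself happens on the drive. The `run_inference` function is a stub standing in for the CSD's integrated inference engine, and the paths are placeholders, so the example stays self-contained and runnable:

```python
# Sketch of a background (batch) metadata-tagging pass over unstructured data.
# run_inference() is a stand-in for the CSD's integrated AI engine.

import json
from pathlib import Path

def run_inference(path: Path) -> list[str]:
    """Placeholder for the on-drive inference engine; returns tags for a file."""
    return ["video"] if path.suffix == ".mp4" else ["document"]

def build_metadata_db(data_dir: str, db_path: str) -> None:
    """Walk the data set and record tags for later analytics software."""
    db = {str(p): run_inference(p) for p in Path(data_dir).rglob("*") if p.is_file()}
    Path(db_path).write_text(json.dumps(db, indent=2))

if __name__ == "__main__":
    build_metadata_db(".", "metadata.json")   # placeholder paths
```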
Video Transcoder Processor
A Computational Storage Processor (CSP) designed for scalable, high-efficiency video encoding/transcoding:
Integrated with FFmpeg (see the example after this list)
Linearly scalable encoding capability; offloads the CPU from heavy encoding computation
Saves 90% of power consumption
Deterministic low latency
PCIe interface
[Diagram: Codensity™ video transcoders; NVMe FP-CSP inside a U.2 module, without SSD functions; Qty = 1..N]
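A hedged sketch of handing a transcode job to FFmpeg. The hardware encoder name "codensity_h264" is an assumption standing in for whatever the vendor's FFmpeg plugin actually exposes; "libx264" is FFmpeg's stock software encoder, used as the fallback:

```python
# Sketch of driving a transcode through FFmpeg. "codensity_h264" is a
# hypothetical hardware encoder name; "libx264" is the software fallback.

import shutil
import subprocess

def transcode(src: str, dst: str, use_csp: bool = True) -> None:
    encoder = "codensity_h264" if use_csp else "libx264"   # HW name is an assumption
    cmd = ["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-b:v", "4M", dst]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    if shutil.which("ffmpeg"):
        transcode("input.mp4", "output.mp4", use_csp=False)  # software path for the demo
```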
Real-Time AI Genomics Improvement
The Basic Local Alignment Search Tool (BLAST):
Compute-in-storage removes the CPU's bandwidth bottleneck to the data
DNA and protein alignment
Database management
Up to 100% more performance at no cost in CPU or memory
Resolves the IO bottleneck between CPU and storage
An example of the standard BLAST invocation follows below.
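A short example of issuing a standard BLAST nucleotide search from the host. The query, database, and output names are placeholders; with compute-in-storage the same command line applies, with the alignment engine sitting next to the data rather than behind the IO path:

```python
# Standard blastn invocation (NCBI BLAST+). File and database names are placeholders.

import subprocess

def run_blastn(query_fasta: str, database: str, out_path: str) -> None:
    subprocess.run(
        ["blastn", "-query", query_fasta, "-db", database,
         "-out", out_path, "-outfmt", "6"],   # -outfmt 6 = tabular output
        check=True,
    )

if __name__ == "__main__":
    run_blastn("query.fa", "reference_db", "hits.tsv")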
Hadoop: Job Throughput Improvement
Baseline (compute & storage I/O bound): 100% job throughput
Compute Offload Only: 116% job throughput (14% ↓ vs. baseline)
Flash Temp Only: 131% job throughput (23% ↓ vs. baseline)
Compute Offload AND Flash Temp: 160% job throughput (37% ↓ vs. baseline)
Datanode config: dual E5-2640v3, 128 GB DRAM, 12 × 6 TB SAS HDD; one per server, 9 total; Hadoop 3.1 w/ EC (6+3)
All benchmark configurations use HDD as main storage; 24 mappers/reducers per datanode × 9 = 216 total
Better performance on CSS reported; lower mapper/reducer counts are possible
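The throughput gains and the percentage reductions are two views of the same measurement; a quick check, assuming job throughput scales as the reciprocal of per-job runtime:

```python
# Relationship between reported job throughput and runtime reduction,
# assuming throughput is the reciprocal of per-job runtime.
for throughput in (1.16, 1.31, 1.60):
    reduction = 1 - 1 / throughput
    print(f"{throughput:.0%} throughput -> {reduction:.0%} lower runtime vs. baseline")
# Prints 14%, 24%, 38%, consistent (to rounding) with the ~14/23/37% figures above.
```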