Liquid State Machine with Memristive Microwires

Liquid State Machine with Memristive Microwires Jack Kendall – Rain Neuromorphics

Motivation: Why hasn't neuromorphic computing succeeded at the same level as deep learning? Two main reasons: a lack of good neuromorphic algorithms, and the poor scalability of fully connected layers.

Sparsity is critical for scalable neuromorphic hardware. The brain primarily uses two forms of sparsity: small-world and scale-free connectivity. These forms of sparsity minimize wiring cost while maximizing network efficiency, and they allow the wiring density to scale at the same rate as the neuron density.
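The wiring-cost argument can be illustrated with a toy Watts-Strogatz construction (this sketch is not from the talk; the network size, degree, and rewiring probability are arbitrary illustrative choices). A ring lattice keeps every wire short, and rewiring a small fraction of edges adds the long-range shortcuts that give small-world efficiency at only a modest extra wiring cost:

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice of n nodes, each linked to its k nearest neighbors,
    with every edge rewired to a random target with probability p."""
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add((i, (i + j) % n))
    result = set()
    for (u, v) in edges:
        if rng.random() < p:  # replace a local edge with a long-range shortcut
            w = rng.randrange(n)
            while w == u or (u, w) in result or (w, u) in result:
                w = rng.randrange(n)
            result.add((u, w))
        else:
            result.add((u, v))
    return result

def mean_wiring_length(edges, n):
    """Average ring distance spanned by an edge (a proxy for wiring cost)."""
    dist = lambda u, v: min((u - v) % n, (v - u) % n)
    return sum(dist(u, v) for u, v in edges) / len(edges)

lattice = watts_strogatz(100, 4, 0.0)      # purely local wiring
small_world = watts_strogatz(100, 4, 0.1)  # 10% long-range shortcuts
```

With these parameters the pure lattice has a mean edge length of exactly 1.5 ring steps; rewiring 10% of edges raises the mean wiring length only moderately, while the shortcuts sharply reduce typical path lengths (the well-known small-world effect).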

Natural sparsity can be achieved with random memristive nanowires. The wires mimic the 1-D structures of biological neurons: axons and dendrites. A memristive coating forms artificial synapses at the interface with silicon neurons.

Memristive Nanowire Neural Network (MN3): Excellent scaling characteristics, with ~1 billion trainable parameters per cm² [1]. Intrinsic small-world connectivity [1]. Supports both reservoir computing [2] and deep learning [3] algorithms.
[1] R. Pantone, J. Kendall, and J. Nino, "Memristive Nanowires Exhibit Small-World Connectivity," Neural Networks (in press).
[2] L. Suarez, J. Kendall, and J. Nino, "Evaluation of the Computational Capabilities of a Memristive Random Network (MN3) Under the Context of Reservoir Computing," Neural Networks (in press).
[3] J. Kendall and J. Nino, "Deep Learning in Bipartite Memristive Networks," U.S. Patent Application No. 15/985,212 (filed May 21, 2018).

MN3 Proof of Concept: 100-electrode ball-grid array. 22 μm diameter tungsten (W) wires. Wires oxidized at 400 °C to form a memristive tungsten oxide (WOx) coating.

MN3 Hardware Details (figure panels: single W-WOx wire; MN3 with Ag electrodes; W/WOx/Ag I-V curve). A thin layer of tungsten oxide is thermally grown on the 22 μm diameter wires. The wires are randomly deposited on an electrode array and connected to it using silver paste. The resulting I-V curves were measured for the tungsten wires with an Ag top electrode.
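The pinched-hysteresis I-V behavior characteristic of memristive junctions such as W/WOx/Ag can be sketched with a generic linear-drift memristor model (an illustrative toy, not the actual WOx device physics; every parameter value here is made up):

```python
import math

def simulate_memristor(g_on=1e-3, g_off=1e-5, mobility=5.0,
                       v_amp=1.0, freq=1.0, steps=2000):
    """Linear-drift toy memristor under a sinusoidal drive. The internal
    state w in [0, 1] mixes an ON and an OFF conductance and drifts with
    the applied voltage, producing a pinched hysteresis I-V loop."""
    dt = 1.0 / (freq * steps)  # simulate one full drive period
    w, volts, currents = 0.5, [], []
    for n in range(steps):
        v = v_amp * math.sin(2 * math.pi * freq * n * dt)
        g = w * g_on + (1 - w) * g_off   # state-dependent conductance
        volts.append(v)
        currents.append(g * v)           # i = g(w) * v: zero whenever v is zero
        w = min(1.0, max(0.0, w + mobility * v * dt))  # clipped state drift
    return volts, currents

volts, currents = simulate_memristor()
```

Plotting `currents` against `volts` would show the loop passing through the origin (current vanishes whenever the voltage does) while the up- and down-sweeps follow different branches, because the conductance state lags the drive.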

Analog Spiking Neurons: Axon hillock model (Mead, 1989). Built and tested with discrete components. Coupled to the MN3 memristor network through single-transistor synapses.
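Behaviorally, Mead's axon hillock circuit acts as an integrate-and-fire neuron: a capacitor integrates the input current, and a positive-feedback threshold crossing produces a spike followed by a reset. A minimal software sketch of that behavior (parameter values are illustrative, not the board's component values):

```python
def axon_hillock_spikes(input_current, c_mem=1.0, v_thresh=1.0,
                        v_reset=0.0, dt=0.01):
    """Behavioral integrate-and-fire sketch of the axon hillock circuit:
    input current charges a membrane capacitor; crossing threshold emits
    a spike and resets the membrane voltage."""
    v, spikes, trace = v_reset, [], []
    for t, i_in in enumerate(input_current):
        v += (i_in / c_mem) * dt   # capacitor integrates the input current
        if v >= v_thresh:          # threshold crossing: spike, then reset
            spikes.append(t)
            v = v_reset
        trace.append(v)
    return spikes, trace

# A constant input charges the membrane by 0.0625 V per step here,
# so the neuron fires regularly every 16 steps.
spikes, trace = axon_hillock_spikes([6.25] * 64)
```

Stronger input current charges the capacitor faster, so the spike rate encodes the input magnitude, which is the rate-coding behavior the analog circuit provides.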

MN3-Based Liquid State Machine: Custom-designed PCB with 40 axon-hillock spiking neurons. 10 input neurons with variable tuning curves. Teensy 3.6 microcontroller and Altera FPGA for I/O and spike detection.

Liquid State Machine System Overview
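Conceptually, a liquid state machine drives a fixed random recurrent network (the "liquid") with an input stream and trains only a readout on the resulting high-dimensional states. A minimal sketch of the liquid dynamics and their fading-memory property (a generic echo-state-style model, not the board's actual dynamics; sizes and scales are arbitrary):

```python
import math, random

def make_reservoir(n, density=0.2, scale=0.9, seed=1):
    """Fixed random recurrent weights, normalized so the update map is
    contractive (which gives the liquid its fading memory)."""
    rng = random.Random(seed)
    w = [[rng.uniform(-1, 1) if rng.random() < density else 0.0
          for _ in range(n)] for _ in range(n)]
    norm = max(sum(abs(x) for x in row) for row in w) or 1.0
    return [[scale * x / norm for x in row] for row in w]

def step(w, w_in, state, u):
    """One liquid update: tanh of recurrent drive plus input drive."""
    n = len(state)
    return [math.tanh(sum(w[i][j] * state[j] for j in range(n)) + w_in[i] * u)
            for i in range(n)]

def run(w, w_in, state, inputs):
    for u in inputs:
        state = step(w, w_in, state, u)
    return state

n = 30
rng = random.Random(2)
w = make_reservoir(n)
w_in = [rng.uniform(-0.5, 0.5) for _ in range(n)]
inputs = [math.sin(0.3 * t) for t in range(200)]
a = run(w, w_in, [0.0] * n, inputs)
b = run(w, w_in, [rng.uniform(-1, 1) for _ in range(n)], inputs)
# fading memory: a and b started from different states but were driven by
# the same input, so they end up (numerically) identical
```

Because the recurrent weights are scaled to be contractive, two liquids started from different states converge onto the same trajectory under a shared input. That is what makes the liquid's state an input-determined feature vector on which a simple trained readout can operate.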

Liquid State Machine Results (figure panels: Mackey-Glass predictions, with actual data as the green curve and the prediction overlaid as the blue curve; spoken-digits spike raster). Achieves 94% accuracy on an isolated subset of the TIDIGITS spoken digits dataset.
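Mackey-Glass prediction is a standard chaotic time-series benchmark for reservoir computing. The series comes from a delay differential equation and can be generated by simple Euler integration (the parameters below are the commonly used benchmark values; the exact dataset used in the talk may differ):

```python
def mackey_glass(length, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    """Euler integration of the Mackey-Glass delay differential equation
        dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t),
    a standard chaotic benchmark series for reservoir computing."""
    history = [x0] * (int(tau / dt) + 1)  # constant pre-history
    series = []
    for _ in range(length):
        x, x_tau = history[-1], history[0]  # current value and delayed value
        x_next = x + dt * (beta * x_tau / (1 + x_tau ** n) - gamma * x)
        history.append(x_next)
        history.pop(0)
        series.append(x_next)
    return series

series = mackey_glass(500)
```

The prediction task is then to map the liquid's state at time t to the series value some steps ahead, with only the linear readout trained.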

Next Steps and Future Work: Successfully trained deep MN3 models via a novel backpropagation algorithm. Sparse vector-matrix multiply (VMM) AI accelerator. First 5,000-neuron VLSI chip planned for 2019 (TSMC 180 nm).
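A sparse VMM accelerator exploits the fact that most weights in a sparse network are zero, so only the nonzeros need to be stored and multiplied. A generic compressed-sparse-row (CSR) software illustration of that saving (not the accelerator's actual dataflow):

```python
def csr_from_dense(dense):
    """Compress a dense matrix to CSR form (values, column indices, row
    pointers), storing only the nonzero weights."""
    values, cols, row_ptr = [], [], [0]
    for row in dense:
        for j, x in enumerate(row):
            if x != 0.0:
                values.append(x)
                cols.append(j)
        row_ptr.append(len(values))
    return values, cols, row_ptr

def sparse_mv(values, cols, row_ptr, vec):
    """Multiply the CSR matrix by a vector, touching only stored nonzeros."""
    return [sum(values[k] * vec[cols[k]]
                for k in range(row_ptr[r], row_ptr[r + 1]))
            for r in range(len(row_ptr) - 1)]

weights = [[0.0, 2.0, 0.0],
           [1.0, 0.0, 0.0],
           [0.0, 0.0, 3.0]]
y = sparse_mv(*csr_from_dense(weights), [1.0, 2.0, 3.0])  # -> [4.0, 1.0, 9.0]
```

For a small-world network, whose connection count grows roughly linearly rather than quadratically with neuron count, this kind of sparse multiply is what keeps the compute and storage budget proportional to the number of actual synapses.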