DLP-Driven, Learning, Optical Neural Networks


DLP-Driven, Learning, Optical Neural Networks
Emmett Redd & A. Steven Younger, Associate Professor, Missouri State University
EmmettRedd@MissouriState.edu

Neural Network Applications
- Stock prediction: currency, bonds, S&P 500, natural gas
- Business: direct mail, credit scoring, appraisal, summoning juries
- Medical: breast cancer and heart attack diagnosis, ER test ordering
- Sports: horse and dog racing
- Science: solar flares, protein sequencing, mosquito ID, weather
- Manufacturing: welding quality, plastics and concrete testing
- Pattern recognition: speech, article classification, chemical drawings
No optical applications yet; we are starting with Boolean functions. (Most examples from www.calsci.com/Applications.html)

Optical Computing and Neural Networks
Optical parallel processing gives speed:
- Lenslet's Enlight 256: 8 giga multiply-accumulate operations per second
- On the order of 10^11 connections per second possible with holographic attenuators
Why neural networks:
- Parallel versus serial
- Learn versus program
- Solutions beyond programming
- Deal with ambiguous inputs
- Solve non-linear problems
- Thinking versus constrained results

Optical Neural Networks
- Sources are modulated light beams (pulse or amplitude modulation)
- Synaptic multiplications are performed by attenuation of light passing through an optical medium (geometric or holographic)
- Target neurons sum signals from many source neurons
- Squashing by operational amplifiers or nonlinear optics
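The signal chain above amounts to a weighted sum followed by a squashing function. A minimal numerical sketch of one target neuron, assuming a sigmoid squashing function and illustrative values (not measured hardware behavior):

```python
import numpy as np

def squash(z):
    """Sigmoid squashing function (op-amp / nonlinear-optics analogue)."""
    return 1.0 / (1.0 + np.exp(-z))

def optical_neuron(sources, attenuations):
    """One target neuron: light from each source passes through the
    synaptic medium (attenuation = weight), is summed at the detector,
    and the sum is squashed to form the neuron output."""
    return squash(np.dot(attenuations, sources))

# Example: three modulated source beams and their signed attenuations
sources = np.array([0.2, 0.9, 0.5])    # light intensities in [0, +1]
weights = np.array([0.8, -0.3, 0.5])   # attenuations in [-1, +1]
y = optical_neuron(sources, weights)
```

The dot product models the parallel attenuate-and-sum step that the optics perform in a single pass.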

Standard Neural Net Learning
A training (learning) algorithm adjusts the weights, usually iteratively.
[Diagram: inputs x1 … xN feed summation units producing outputs y1 … yM; the learning algorithm receives the target output (T) and other information.]
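The iterative weight adjustment can be sketched with a simple delta-rule loop. The specific rule, learning rate, and the Boolean AND task here are illustrative assumptions, not the algorithm used in the optical hardware:

```python
import numpy as np

def train_delta_rule(X, T, epochs=2000, eta=0.5):
    """Iteratively adjust weights so sigmoid(w . x) approaches the
    target t for each training pair (illustrative delta rule)."""
    rng = np.random.default_rng(0)
    w = rng.uniform(-0.5, 0.5, X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, T):
            y = 1.0 / (1.0 + np.exp(-np.dot(w, x)))
            w += eta * (t - y) * y * (1.0 - y) * x  # gradient step
    return w

# Learn Boolean AND; the third input column is a constant bias term
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([0, 0, 0, 1], dtype=float)
w = train_delta_rule(X, T)
```

After training, thresholding the network output at 0.5 reproduces the AND truth table.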

A Fixed-Weight Learning Neural Network (FWL-NN) is equivalent to a standard neural network plus a learning algorithm.

Optical Recurrent Neural Network
[Diagram: a single layer of an optical recurrent neural network, with labeled components: signal source (layer input), micromirror array, presynaptic optics, synaptic medium (35 mm slide), postsynaptic optics, target-neuron summation, squashing functions, recurrent connections, and layer output.]
Only four synapses are shown; actual networks will have a large number of synapses. A multi-layer network has several consecutive layers.

Optical Fixed-Weight Learning Synapse
[Diagram: a Tranapse and a Planapse, with signals x(t), x(t-1), y(t-1), W(t-1), T(t-1), and a summation Σ.]

Definitions
- Fixed-Weight Learning Neural Network (FWL-NN): a recurrent network that learns without changing synaptic weights
- Potency: a weight signal
- Tranapse: a Potency-modulated synapse
- Planapse: supplies the Potency error signal
- Recurron: a recurrent neuron
- Recurral Network: a network of Recurrons
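To make the vocabulary concrete, here is a toy sketch in which the Potency circulates as a recurrent signal rather than a stored weight. The delta-rule-style Planapse and all numeric values are assumptions for illustration; the real circuitry realizes this behavior with fixed weights:

```python
def tranapse(potency, x):
    """A Potency-modulated synapse: the 'weight' is itself a signal."""
    return potency * x

def planapse(target, output, x, eta=0.3):
    """Supplies the Potency error signal (illustrative delta-rule form)."""
    return eta * (target - output) * x

# The Potency circulates as a recurrent signal; no stored weight changes.
potency = 0.0
for _ in range(50):
    x, target = 1.0, 0.8             # one repeated training pair
    output = tranapse(potency, x)    # linear recurron for simplicity
    potency += planapse(target, output, x)
```

Over the iterations the circulating Potency converges toward the value (here 0.8) that a conventional trained weight would hold.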

Page Representation of a Recurron
[Diagram: a Recurron built from a Tranapse and a Planapse, with signals U(t-1), A(t), B(t), O(t-1), a Bias input, and a summation Σ.]

Optical Neural Network Constraints
- Finite-range unipolar signals [0, +1]
- Finite-range bipolar attenuation [-1, +1]; excitatory and inhibitory contributions handled separately
- Limited-resolution signals
- Limited-resolution synaptic weights
- Alignment and calibration issues

Proposed Design
- Digital micromirror device (DMD) as the signal source
- 35 mm slide as the synaptic medium
- CCD camera for detection
- Synaptic weights positionally encoded (digital attenuation)
- Planapse and Tranapse trained using standard backpropagation
- Allows flexibility for evaluation
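One way to realize positionally encoded digital attenuation is to let a weight determine how many micromirrors in its assigned block are switched on, so the transmitted light is proportional to the weight. The 16x16 block size is an assumption for illustration:

```python
def mirrors_on(weight, block_size=16):
    """Positionally encoded 'digital attenuation': a weight in [0, 1]
    becomes the count of micromirrors switched ON in its block, so the
    reflected light is proportional to the weight."""
    total = block_size * block_size
    return round(weight * total)

n = mirrors_on(0.5)   # half of a 16x16 mirror block reflects light
```

The weight resolution is then set by the block size: a 16x16 block gives 257 distinct attenuation levels.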

Generation, Acquisition, and Processing Demonstration
Left images (in time order):
- Gray-scale source-neuron signal
- Equivalent pulse-modulated source-neuron signal
- Gray-scale source-neuron signal (reprise)
- Blank during target summation and squashing
Middle image:
- Fixed weights on the attenuator slide (static)
Right images (in time order):
- Blank during the first source gray-scale frame
- Accumulating (integrating pulses) target-neuron input
- Gray-scale during summation and squashing
Phases and layers:
- Synapses connect two layers (N and N+1)
- Only source neurons (layer N) illuminate
- Only target neurons (layer N+1) accumulate
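The pulse-modulated source signal and the accumulating target input can be mimicked numerically: a gray level becomes a binary pulse train across DMD frames, and summing (integrating) the pulses recovers it. The thresholded-accumulation scheme and the frame count are illustrative assumptions:

```python
def to_pulses(intensity, n_frames=8):
    """Convert a gray-scale source value in [0, 1] into a binary pulse
    train across frames (simple thresholded accumulation)."""
    pulses, acc = [], 0.0
    for _ in range(n_frames):
        acc += intensity
        if acc >= 1.0:
            pulses.append(1)
            acc -= 1.0
        else:
            pulses.append(0)
    return pulses

def integrate(pulses, n_frames=8):
    """Target-side accumulation: summing the pulses recovers the level."""
    return sum(pulses) / n_frames

p = to_pulses(0.75)   # six of eight frames ON
g = integrate(p)      # recovers 0.75
```

This mirrors the demonstration: the source alternates between gray-scale and equivalent pulse-modulated frames, while the target side integrates the pulses before squashing.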

Hard and Soft Optical Alignment
Hardware alignment:
- Best for gross alignment
- Precision stages required for precise alignment
- Hard to correct for distortions in the optics
Software alignment:
- Fine image displacements
- Image rotation
- Integrating regions of arbitrary shape and size
- Diffraction effects filtered and ignored
- Corrects for optical-path changes

Photonic Circuit Elements
Near term:
- Optical fiber
- Fiber amplifiers
- Splitters
Medium term:
- Cylindrical lenses before and after the synaptic media (1-D systems, sparse media)
- Non-linear crystals for squashing
Far term:
- Integrated emitters, attenuators, detectors/limiters
- Holographic attenuators (destructive interference)
- Stackable

DLP-Driven, Learning, Optical Neural Networks
Emmett Redd and A. Steven Younger, Associate Professor, Missouri State University
EmmettRedd@MissouriState.edu
This material is based upon work supported by the National Science Foundation under Grant No. 0446660.