
1 DLP®-Driven, Optical Neural Network Results and Future Design. Emmett Redd, Professor, Missouri State University

2 Neural Network Applications
Stock Prediction: Currency, Bonds, S&P 500, Natural Gas
Business: Direct Mail, Credit Scoring, Appraisal, Jury Summoning
Medical: Breast Cancer, Heart Attack Diagnosis, ER Test Ordering
Sports: Horse and Dog Racing
Science: Solar Flares, Protein Sequencing, Mosquito ID, Weather
Manufacturing: Welding Quality, Plastics or Concrete Testing
Pattern Recognition: Speech, Article Classification, Chemical Drawings
No optical applications yet; we are starting with Boolean functions.
(Most examples from www.calsci.com/Applications.html)

3 Optical Computing & Neural Networks
Optical parallel processing gives speed:
– Lenslet's EnLight 256: 8,000 giga multiply-accumulate operations per second
– On the order of 10¹¹ connections per second possible with holographic attenuators
Neural networks:
– Parallel versus serial
– Learn versus program
– Solutions beyond programming
– Deal with ambiguous inputs
– Solve non-linear problems
– Thinking versus constrained results

4 Optical Neural Networks
Sources are modulated light beams (pulse or amplitude)
Synaptic multiplications are due to attenuation of light passing through an optical medium (30 fs), geometric or holographic
Target neurons sum signals from many source neurons
Squashing by operational amplifiers or nonlinear optics
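As a reference point for the slides that follow, here is a minimal numerical sketch of what one such optical layer computes (attenuate, sum, squash); the signal values, weights, and squashing gain are illustrative assumptions, not measurements from the hardware.

```python
import numpy as np

def optical_layer(x, W, gain=4.0):
    """Illustrative model of one optical layer: attenuation (elementwise
    weighting), summation at each target neuron, then a squashing function.
    x: unipolar source signals in [0, 1]; W: attenuation weights."""
    sums = W @ x                                 # each target neuron sums its attenuated inputs
    return 1.0 / (1.0 + np.exp(-gain * sums))    # logistic squashing

# Example with made-up numbers: two target neurons fed by three source beams.
x = np.array([0.2, 0.8, 1.0])
W = np.array([[0.5, -0.3, 0.1],
              [0.0,  0.9, -0.6]])
print(optical_layer(x, W))
```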

5 Standard Neural Net Learning
We use a training or learning algorithm to adjust the weights, usually in an iterative manner.
[Figure: a network with inputs x_1 ... x_N and outputs y_1 ... y_M; the learning algorithm receives the target output (T) and other information and adjusts the weights.]
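For contrast with the fixed-weight learning networks introduced next, a minimal sketch of such an iterative weight adjustment (a generic delta-rule step); the learning rate and logistic output are assumptions, not details from the slides.

```python
import numpy as np

def learning_step(W, x, target, rate=0.1):
    """One iteration of a conventional learning algorithm: compare the
    network output y with the target output T, then adjust the weights W."""
    y = 1.0 / (1.0 + np.exp(-(W @ x)))                   # forward pass, logistic outputs
    error = target - y                                   # information given to the learning algorithm
    W = W + rate * np.outer(error * y * (1.0 - y), x)    # gradient-style weight update
    return W, y
```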

6 FWL-NN is equivalent to a standard Neural Network + Learning Algorithm
[Figure: block diagram equating a FWL-NN with a neural network plus its learning algorithm.]

7 Optical Recurrent Neural Network
[Figure: a single layer of an optical recurrent neural network, showing the signal source (layer input), micromirror array, presynaptic optics, synaptic medium (35 mm slide), postsynaptic optics, target neuron summation, squashing functions, layer output, and recurrent connections. Only four synapses are shown; actual networks will have a large number of synapses. A multi-layer network has several consecutive layers.]

8 Definitions
Fixed-Weight Learning Neural Network (FWL-NN) – A recurrent network that learns without changing synaptic weights
Potency – A weight signal
Tranapse – A Potency-modulated synapse
Planapse – Supplies the Potency error signal
Zenapse – A non-participatory synapse
Recurron – A recurrent neuron
Recurral Network – A network of Recurrons

9 Optical Fixed-Weight Learning Synapse
[Figure: an optical fixed-weight learning synapse. The Tranapse multiplies the input x(t) by the Potency W(t-1); the Planapse receives the delayed input x(t-1), the target T(t-1), and the previous output y(t-1) and supplies the Potency error signal; both feed the summation Σ.]
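The slides do not spell out the Planapse arithmetic, so the following is only a plausible signal-level sketch of slide 9, assuming the Potency is corrected by a delta-rule-like term built from the delayed input x(t-1), previous output y(t-1), and target T(t-1), while the Tranapse multiplies the current input x(t) by the Potency; the fixed synaptic weights of the hardware never change.

```python
def fwl_synapse_step(W_prev, x_now, x_prev, y_prev, T_prev, rate=0.5):
    """Hypothetical fixed-weight learning synapse, signal level only.
    W_prev is the Potency signal W(t-1); the learning rate is a placeholder."""
    potency_error = rate * (T_prev - y_prev) * x_prev   # Planapse: assumed error term
    W_now = W_prev + potency_error                      # Potency carried forward as a signal
    contribution = W_now * x_now                        # Tranapse: Potency-modulated synapse
    return W_now, contribution
```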

10 Page Representation of a Recurron
[Figure: page layout of a Recurron, with Tranapse, Planapse, and Tranapse regions and signals U(t-1), A(t), B(t), Bias, and O(t-1) feeding a summation Σ.]

11 Optical Neural Network Constraints
Finite-range unipolar signals [0, +1]
Finite-range bipolar attenuation [-1, +1]
Excitatory and inhibitory contributions handled separately
Limited-resolution signals
Limited-resolution synaptic weights
Alignment and calibration issues
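A small simulation sketch of how the first three constraints might be handled: split each bipolar weight into separate excitatory and inhibitory unipolar channels and quantize the weights to a limited resolution. The bit depth is a placeholder, not the system's actual resolution.

```python
import numpy as np

def split_and_quantize(W, weight_bits=6):
    """Split bipolar weights in [-1, +1] into unipolar excitatory and
    inhibitory channels in [0, 1], quantized to weight_bits of resolution."""
    levels = 2 ** weight_bits - 1
    W_exc = np.round(np.clip(W, 0.0, 1.0) * levels) / levels    # excitatory part
    W_inh = np.round(np.clip(-W, 0.0, 1.0) * levels) / levels   # inhibitory part
    return W_exc, W_inh

def target_sum(W_exc, W_inh, x):
    """Target-neuron summation with the two channels handled separately."""
    return W_exc @ x - W_inh @ x
```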

12 Optical System: DMD or DLP®

13 Optical Recurrent Neural Network
[Figure: the same single-layer optical recurrent neural network diagram as slide 7.]

14 Design Details
Digital Micromirror Device
35 mm slide synaptic media
CCD camera
Synaptic weights: positionally encoded (digital attenuation); allows flexibility for evaluation
Networks: Recurrent AND, Unsigned Multiply, FWL Recurron
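A sketch of the positionally encoded, digital attenuation idea: each synaptic weight is realized as the fraction of micromirrors switched on within that synapse's region of the DMD. The region size and mirror placement are arbitrary illustrations, not the actual layout.

```python
import numpy as np

def mirror_pattern(weight, rows=16, cols=16):
    """Digital attenuation: switch on a fraction of the micromirrors in a
    synapse's region proportional to the unipolar weight in [0, 1]."""
    n_on = int(round(weight * rows * cols))
    pattern = np.zeros(rows * cols, dtype=bool)
    pattern[:n_on] = True          # which mirrors are on; placement is arbitrary here
    return pattern.reshape(rows, cols)

# The realized weight is the on-fraction, e.g. mirror_pattern(10/16).mean() == 10/16.
```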

15 DMD/DLP®: A Versatile Tool
Alignment and distortion correction:
– Align DMD/DLP® to CCD (PEGS)
– Align synaptic media to CCD (HOLES)
– Calculate DMD/DLP® to synaptic media alignment (putting PEGS in HOLES)
– Correct projected DMD/DLP® images
Nonlinearities

16 Stretch, Squash, and Rotate
Distortion is modeled with bivariate polynomial terms:
None (constant): 1
Linear: x, y
Quadratic: x², xy, y²
Cubic: x³, x²y, xy², y³
etc.
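A sketch of how these terms can be used: build a design matrix of polynomial terms up to the chosen order and least-squares fit the mapping between matched point sets. The cubic cutoff and the fitting routine are assumptions for illustration, not the exact procedure used.

```python
import numpy as np

def poly_terms(x, y, order=3):
    """Columns 1, x, y, x², xy, y², x³, x²y, xy², y³, ... up to the given order."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cols = [x ** i * y ** (d - i) for d in range(order + 1) for i in range(d, -1, -1)]
    return np.column_stack(cols)

def fit_distortion(src, dst, order=3):
    """Least-squares polynomial distortion mapping src (N x 2) onto dst (N x 2)."""
    T = poly_terms(src[:, 0], src[:, 1], order)
    coeffs, *_ = np.linalg.lstsq(T, dst, rcond=None)
    return coeffs                  # apply with poly_terms(px, py, order) @ coeffs
```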

17 Where We Are and Where We Want to Be
[Figure: two images: the DMD dots image (pegs) and the slide dots image (holes), with the measured point set C and the desired point set C'.]

18 CCD Image of Known DLP® Positions
[Figure: DMD dots image (pegs).]
Points are found automatically by projecting 42 individual pegs.
C = H·D, so H = C·D^P (D^P denotes the pseudoinverse of D)
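A sketch of this pseudoinverse solution: stack the matched peg coordinates as homogeneous columns and solve C = H·D for H. Treating H as a single 3×3 homogeneous transform is an assumption; a distortion-corrected version could use the polynomial terms sketched after slide 16.

```python
import numpy as np

def fit_transform(D, C):
    """Least-squares solution of C = H . D via the pseudoinverse, H = C . D^P.
    D, C: (N, 2) arrays of matched DMD and CCD coordinates."""
    D_h = np.column_stack([D, np.ones(len(D))]).T   # 3 x N homogeneous DMD points
    C_h = np.column_stack([C, np.ones(len(C))]).T   # 3 x N homogeneous CCD points
    return C_h @ np.linalg.pinv(D_h)                # 3 x 3 transform H
```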

19 CCD Image of Holes in Slide Film
[Figure: slide dots image (holes).]
Manually click on an interference pattern to zoom in.

20 Mark the Center

21 Options for Automatic Alignment
Make holes larger so diffraction is reduced:
– A single large Peg might illuminate only one Hole at a time
– Maximum intensity would mark the Hole location
Fit an ellipse to the 1st diffraction minimum: Ax² + Bxy + Cy² + Dx + Ey + 1 = 0
– Find its center: x_c = (BE - 2CD)/(4AC - B²), y_c = (BD - 2AE)/(4AC - B²)
– The center marks the Hole location
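A sketch of this option: least-squares fit the five conic coefficients to points sampled on the first diffraction minimum, then apply the center formulas above to locate the Hole. The sampling of the minimum itself is assumed to be done elsewhere.

```python
import numpy as np

def ellipse_center(x, y):
    """Fit Ax² + Bxy + Cy² + Dx + Ey + 1 = 0 to the sampled points and
    return the ellipse center, which marks the Hole location."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    M = np.column_stack([x * x, x * y, y * y, x, y])      # design matrix
    coeffs, *_ = np.linalg.lstsq(M, -np.ones_like(x), rcond=None)
    A, B, C, D, E = coeffs
    denom = 4.0 * A * C - B * B
    xc = (B * E - 2.0 * C * D) / denom
    yc = (B * D - 2.0 * A * E) / denom
    return xc, yc
```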

22 Ellipse Fitting

23 Eighty-Four Clicks Later
C' = M·C, so M = C'·C^P (C^P denotes the pseudoinverse of C)

24 DLP® Projected Regions of Interest
D' = H^-1·M^-1·H·D and C' = H·D'
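A sketch of this composition, reusing 3×3 homogeneous transforms H and M fitted as in the earlier sketches (their exact form is an assumption):

```python
import numpy as np

def predistort(D, H, M):
    """Pre-distort DMD coordinates per slide 24: D' = H^-1 . M^-1 . H . D,
    so the corrected projection lands at C' = H . D'.  D is (N, 2)."""
    D_h = np.column_stack([D, np.ones(len(D))]).T        # 3 x N homogeneous points
    Dp_h = np.linalg.inv(H) @ np.linalg.inv(M) @ H @ D_h
    Dp_h = Dp_h / Dp_h[2]                                # renormalize homogeneous scale
    return Dp_h[:2].T
```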

25 Nonlinearities
[Figure: measured light signals vs. weights for the FWL Recurron.]
Opaque slide regions aren't fully opaque: roughly 6% of the light leaks through.
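One way the ~6% leakage could be compensated, assuming a simple linear transmission model (not given on the slide) in which a nominal weight w is realized as leak + (1 - leak)·w:

```python
import numpy as np

LEAK = 0.06   # roughly 6% of the light passes even through "opaque" slide regions

def compensate_leakage(desired, leak=LEAK):
    """Nominal weight to write so that leak + (1 - leak) * nominal matches the
    desired weight; values below the leakage floor clip to zero."""
    return np.clip((np.asarray(desired, dtype=float) - leak) / (1.0 - leak), 0.0, 1.0)
```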

26 Neural Networks and Results
Recurrent AND
Unsigned Multiply
Fixed-Weight Learning Recurron

27 Recurrent AND
[Figure: the Recurrent AND network computes I_N ∧ I_(N-1), the AND of the current input and the previous input supplied by a 1-cycle delay.]

28 Recurrent AND Neural Network
[Figure: network diagram with a Bias (1), source neurons (buffers) for I_N and I_(N-1), and terminal neurons. One summation Σ uses weights 10/16, 10/16, and -14/16 (bias) followed by logsig(16·Σ) to form I_N ∧ I_(N-1); the other uses weights 2/2 and -1/2 (bias) followed by linsig(2·Σ) to implement the delay. Signals are 0/1.]
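A minimal simulation sketch of slide 28's network, assuming logsig is the standard logistic function and linsig a linear function clipped to [0, 1]; neither definition is given on the slides.

```python
import numpy as np

def logsig(s):
    return 1.0 / (1.0 + np.exp(-s))          # assumed logistic squashing

def linsig(s):
    return np.clip(s, 0.0, 1.0)              # assumed clipped-linear squashing

def recurrent_and(inputs):
    """Simulate the Recurrent AND network: output is I_N AND I_(N-1);
    a buffer neuron holds the previous input for one cycle."""
    prev = 0.0
    outputs = []
    for i in inputs:
        and_sum = (10/16) * i + (10/16) * prev - 14/16   # AND neuron summation
        outputs.append(float(logsig(16 * and_sum)))      # steep squashing gives ~0 or ~1
        delay_sum = (2/2) * i - 1/2                      # buffer neuron summation
        prev = float(linsig(2 * delay_sum))              # becomes I_(N-1) next cycle
    return outputs

# Example: recurrent_and([1, 1, 0, 1, 1]) -> approximately [0, 1, 0, 0, 1]
```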

29 Synaptic Weight Slide
[Figure: the synaptic weight slide for the Recurrent AND network; weights shown: 0.5, 10/16, -1/2, -14/16, 10/16, 2/2.]

30 Recurrent AND Demo (MATLAB)

31 Pulse Image (Regions of Interest)

32 Output Swings Larger than Input

33 Synaptic Weight Slide

34 Unsigned Multiply Results
[Figure: expected output (blue), obtained output (black), and squared error (red).]
Accuracy is about 4 bits.

35 Page Representation of a Recurron
[Figure: the same Recurron page layout as slide 10.]

36 FWL Recurron Synaptic Weight Slide

37 Optical Fixed-Weight Learning Synapse
[Figure: the same fixed-weight learning synapse diagram as slide 9.]

38 Optical Recurrent Neural Network
[Figure: the same single-layer optical recurrent neural network diagram as slide 7.]

39 Future: Integrated Photonics
Photonic (analog)
i. Concept
– α. Neuron
– β. Weights
– γ. Synapses
(Images: Photonics Spectra and Luxtera)

40 Continued
ii. Needs
– α. Laser
– β. Amplifier (detectors and control)
– γ. Splitters
– δ. Waveguides on two layers
– ε. Attenuators
– ζ. Combiners
– η. Constructive interference
– θ. Destructive interference
– ι. Phase
(Images: Photonics Spectra and Luxtera)

41 Interest Generated? We wish to implement optical neural networks in silicon, including fixed-weight learning. To do so, we need collaborators. Much research remains, but an earlier start means an earlier finish. Please contact me if interested.

42 DLP®-Driven, Optical Neural Network Results and Future Design. Emmett Redd & A. Steven Younger, Missouri State University. EmmettRedd@MissouriState.edu

43 Source Pulse

