IT 691 Final Presentation, Pace University
Created by: Robert M. Gust, Mark Lee, Samir Hessami
2 Project Description
Research the Numenta Platform for Intelligent Computing (NuPIC), which attempts to harness brain-like function to solve problems.
Brainchild of Jeff Hawkins, inventor of the Palm Pilot.
3 Artificial Intelligence
Two schools of thought:
1. Understand how the human brain works and build models that behave in the same way (model the whole brain as a single entity).
2. Examine the interconnections of neurons in the brain and produce similar activity by leveraging its structure.
*Picton, P. (2000). Neural Networks. Palgrave.
4 Artificial Intelligence
Based around logic, usually in the form of a set of rules:
- Expert systems (medical diagnosis, game playing)
- Fuzzy logic
Mathematical models attempt to leverage the organization of neurons (nodes are arranged in various configurations to emulate brain function):
- Neural networks (pattern recognition)
5 Artificial Intelligence
What is a neural network?
"Neural Networks are an attempt to create machines that work in a similar way to the human brain by building these machines using components that behave like biological neurons."
*Picton, P. (2000). Neural Networks. Palgrave.
6 Examples of Neural Networks
The perceptron is a type of artificial neural network invented in 1957 at the Cornell Aeronautical Laboratory by Frank Rosenblatt.
The perceptron is a single-layer neural network whose weights and biases can be trained to produce a correct target *vector when presented with the corresponding input vector.
The training technique used is called the perceptron learning rule. Perceptrons are especially suited for simple problems in pattern classification.
*Vector: an ordered list of numbers (here, the pattern of input or output values)
*Source: http://en.wikipedia.org/wiki/
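The perceptron learning rule described above can be sketched in a few lines. This is an illustrative implementation, not code from the presentation: a step-activation neuron whose weights and bias are nudged by the error on each example, here learning logical AND (a linearly separable classification task).

```python
import numpy as np

def train_perceptron(inputs, targets, epochs=20, lr=0.1):
    """Train a single-layer perceptron with the perceptron learning rule."""
    weights = np.zeros(inputs.shape[1])
    bias = 0.0
    for _ in range(epochs):
        for x, t in zip(inputs, targets):
            # Step activation: fire (1) if the weighted sum exceeds 0
            y = 1 if np.dot(weights, x) + bias > 0 else 0
            # Perceptron learning rule: adjust weights by the error (t - y)
            weights += lr * (t - y) * x
            bias += lr * (t - y)
    return weights, bias

# Learn logical AND, a simple linearly separable pattern classification
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
T = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, T)
print([1 if np.dot(w, x) + b > 0 else 0 for x in X])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the rule is guaranteed to converge; for a problem like XOR it never will, which is what motivates the multi-layer networks on the next slide.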
7 Examples of Neural Networks
The best-known example of a neural network training algorithm is backpropagation.
*Backpropagation is a supervised learning technique used for training artificial neural networks. The term is an abbreviation for "backwards propagation of errors," which requires that the transfer function used by the artificial neurons (or "nodes") be differentiable.
*Source: http://en.wikipedia.org/wiki/
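A minimal sketch of backpropagation, illustrating the point above: the sigmoid transfer function is differentiable, so the output error can be propagated backwards through the layers. The network size, learning rate, and the XOR task are assumptions chosen for a small self-contained example, not details from the presentation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR is not linearly separable, so a single perceptron fails;
# a two-layer network trained by backpropagation can learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output

lr, losses = 0.5, []
for _ in range(5000):
    # Forward pass through the differentiable sigmoid transfer function
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    losses.append(float(np.mean((Y - T) ** 2)))
    # Backward pass: propagate the error toward the input,
    # using the sigmoid derivative y * (1 - y)
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH
    b1 -= lr * dH.sum(axis=0)

print(f"error fell from {losses[0]:.3f} to {losses[-1]:.3f}")
```

The derivative y * (1 - y) in the backward pass is exactly why the transfer function must be differentiable: a step function, as in the perceptron, has no usable gradient.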
8 Hierarchical Temporal Memory (HTM)
Hierarchical: subdivide the problem so it may be addressed in a hierarchy.
Temporal: include time in the pattern recognition problem.
Memory: spatial (stored images used for comparative purposes in pattern recognition).
9 Numenta HTM Image Recognition
The image is fed into the network.
The static image is moved within the field of view during both the learning phase and the identifying phase.
10 Numenta HTM Network Level 1
Level 1: 64 nodes.
Level 1 is a direct mapping of the input image; node A's receptive field is 4 x 4.
This is unsupervised learning.
Examining a single node from Level 1.
11 HTM Node in Inference Mode
The node is attempting to name the given input. In this case the pattern is identified as the binary code 0100, which is passed up to Level 2.
12 Single HTM Node's Set of Static Images
Step 1: Spatial: form sets of images using pixel-wise similarity. This creates a finite set of images for temporal analysis.
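The spatial step above can be sketched as a simple quantizer. This is an illustrative stand-in for the HTM spatial pooler, not Numenta's algorithm: incoming binary patterns join an existing group when they are within a few differing pixels of that group's stored image, otherwise they start a new group, yielding the finite set the temporal step needs.

```python
import numpy as np

def spatial_pool(patterns, max_dist=2):
    """Greedily group patterns by pixel-wise (Hamming) similarity.

    A pattern joins the first group whose stored representative differs
    in at most max_dist pixels; otherwise it founds a new group.
    """
    reps = []    # one stored representative image per group
    labels = []  # group index assigned to each input pattern
    for p in patterns:
        for i, r in enumerate(reps):
            if np.sum(p != r) <= max_dist:  # pixel-wise distance
                labels.append(i)
                break
        else:
            reps.append(p)
            labels.append(len(reps) - 1)
    return reps, labels

# Two noisy variants of a vertical bar and one horizontal bar (4x4, flattened)
v1 = np.array([0,1,0,0, 0,1,0,0, 0,1,0,0, 0,1,0,0])
v2 = v1.copy(); v2[0] = 1                  # one-pixel noise
h  = np.array([0,0,0,0, 1,1,1,1, 0,0,0,0, 0,0,0,0])
reps, labels = spatial_pool([v1, v2, h])
print(len(reps), labels)  # 2 [0, 0, 1]
```

The noisy vertical bar lands in the same group as the clean one, while the horizontal bar, far away pixel-wise, starts a second group: a finite set of stored images ready for temporal analysis.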
13 Single HTM Node's Set of Sequences
Step 2: Temporal: form sets of images using their temporal proximity to one another (is pattern a frequently followed by pattern b?).
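The temporal step can likewise be sketched by counting transitions. Again a simplified illustration rather than Numenta's learning algorithm: patterns that frequently follow one another in the input stream are merged into one temporal group, even if they look nothing alike pixel-wise.

```python
from collections import Counter

# A stream of quantized pattern indices from the spatial step,
# e.g. a bar sweeping repeatedly across the node's receptive field
sequence = [0, 1, 2, 0, 1, 2, 0, 1, 2, 3, 4, 3, 4, 3, 4]

# Count how often pattern b directly follows pattern a
transitions = Counter(zip(sequence, sequence[1:]))

# Keep only frequent transitions ("is a frequently followed by b?")
threshold = 2
links = {(a, b) for (a, b), n in transitions.items() if n >= threshold}

# Merge linked patterns into temporal groups (a naive merge that is
# sufficient here; a robust version would use union-find)
groups = []
for a, b in sorted(links):
    for g in groups:
        if a in g or b in g:
            g.update({a, b})
            break
    else:
        groups.append({a, b})
print(groups)  # [{0, 1, 2}, {3, 4}]
```

Patterns 0, 1, 2 end up in one group purely because they follow each other in time, which is how an HTM node can pool together images that pixel-wise similarity alone would keep apart.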
14 HTM Node in Inference Mode
Spatial: removes noise.
Temporal: capable of pooling together patterns that are very dissimilar from a pixel-wise perspective; must have a finite set of points to use.
15 Numenta HTM Network Levels 1 and 2
Level 2: 16 nodes.
Level 2 receives its input from the output of 4 Level 1 nodes; C and D's receptive fields are 8 x 8.
16 Numenta HTM Network Structure
Level 1: 64 nodes. Level 1 is a direct mapping of the input image; A's receptive field is 4 x 4.
Level 2: 16 nodes. Level 2 receives its input from the output of 4 Level 1 nodes; C and D's receptive fields are 8 x 8.
Level 3: the invariant form (label / name).
*George, D. & Jaros, B. (2007). The HTM Learning Algorithms. Numenta.
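The topology above implies how the image is carved up between nodes. The slides do not state the input size; a 32 x 32 image is assumed here, inferred from 64 Level 1 nodes with 4 x 4 receptive fields tiling the input.

```python
import numpy as np

def split_into_fields(image, field):
    """Carve an image into non-overlapping receptive fields, one per node."""
    h, w = image.shape
    fh, fw = field
    return [image[r:r + fh, c:c + fw]
            for r in range(0, h, fh)
            for c in range(0, w, fw)]

image = np.zeros((32, 32))                 # assumed input size (see above)
level1 = split_into_fields(image, (4, 4))  # 64 Level 1 nodes, 4x4 fields
level2 = split_into_fields(image, (8, 8))  # 16 Level 2 nodes, 8x8 fields
print(len(level1), len(level2))  # 64 16
```

Each Level 2 node's 8 x 8 field covers exactly the 2 x 2 block of Level 1 fields beneath it, and a single Level 3 node sees the whole image through Level 2, producing the invariant label.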
17 Prediction
What is prediction? A statistically based assumption.
Neuroanatomists have known for a long time that the brain is saturated with feedback connections. For example, in the circuit between the neocortex and a lower structure called the thalamus, connections going backward (toward the input) exceed the connections going forward by almost a factor of ten. That is, for every fiber feeding information forward into the neocortex, there are ten fibers feeding information back toward the senses.
*Hawkins, J. (2004). On Intelligence, p. 25.
18 Further Research
Not just for image recognition.
Unsupervised learning opens up the possibility of determining causality for novel problems.
Hawkins: weather pattern example.