Neural Networks in Computer Science
CS/PY 231 Lab Presentation #1
January 14, 2005
Mount Union College
Some Computing History
- First all-electronic computing device: ENIAC (1946), Eckert & Mauchly (Penn)
  - application: computing ballistic firing tables for the US Army
  - special-purpose device: hard-wired, like the computer in your car's engine
    - to change the problem, the device had to be rewired
    - all memory was dedicated to data storage
Innovation: The Stored-Program Concept
- concept described in von Neumann's 1945 report on the EDVAC
- first practical stored-program computer: EDSAC (1949), Wilkes (Cambridge)
- reprogramming became a logical (not physical) task
- memory stores both software and data
- much easier to program; consequently, commercial use of computers exploded
von Neumann Architecture
- one powerful, fast processor
  - a device capable of performing calculations and comparisons
- a large memory that the processor may access
- serial processing model: the processor performs one operation at a time, in series
von Neumann Architecture
[Diagram: the processor exchanges program instructions and data with a single memory.]
Traits of von Neumann Machines
- much better than humans at performing calculations and logical step-by-step procedures
- since computers were so much better at these kinds of tasks, it was assumed that they would outperform humans in all mental tasks
  - as soon as we could determine how to program the machines for those tasks
Not So Fast, My Friend…
- the von Neumann architecture turns out to have serious limitations when confronted with certain problems
- von Neumann bottleneck: even with a fast processor, doing one thing at a time is too slow when facing a huge task
- idea: have several processors work on the same problem
Example: Sorting a List of Values
- data: 45, 67, 12, 87, 19, 55, 32, 29
- method: bubble sort
- single processor: 27 comparisons, 16 swaps
- two processors (each sorting one half): 12 comparisons, 5 swaps, 8 merge steps
- complications: coordination of processors, communication, limited number of processors, etc.
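The counts above can be reproduced with a short sketch (Python; assuming the common early-exit variant of bubble sort, which is what makes the numbers come out as stated):

```python
def bubble_sort(values):
    """Bubble sort with early exit; returns (sorted list, comparisons, swaps)."""
    data = list(values)
    comparisons = swaps = 0
    n = len(data)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):      # each pass bubbles the largest
            comparisons += 1            # remaining value to the right
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swaps += 1
                swapped = True
        if not swapped:                 # no swaps: list is sorted, stop early
            break
    return data, comparisons, swaps

data = [45, 67, 12, 87, 19, 55, 32, 29]
print(bubble_sort(data))    # 27 comparisons, 16 swaps

# Two "processors": each sorts one half, then the halves are merged
# (merging two sorted 4-element lists takes 8 output steps).
left, left_c, left_s = bubble_sort(data[:4])
right, right_c, right_s = bubble_sort(data[4:])
print(left_c + right_c, left_s + right_s)    # 12 comparisons, 5 swaps
```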
A Paradox
- Researchers discovered that machines based on the von Neumann architecture are:
  - good at problems that are difficult for humans (e.g., calculations), and
  - bad at problems that humans find easy (e.g., pattern recognition)
- This seems to be a limitation of the machine's architecture
A Novel Idea
- Why not build computers that are structured the way brains are?
- Problem: brain structure is fabulously complex and consists of massive numbers of processors (neurons)
- We can start with small artificial brains and add complexity as we progress
Processing Speeds
- Modern processors are millions of times faster than real neurons
- However, the brain contains roughly a hundred billion neurons, with trillions of connections, each of which can process and store signals
- Working in parallel, these neurons give the brain far more raw processing power than a single processor
Essential for Pattern Recognition
- massive parallelism
- many tasks are performed by the brain at the same instant (breathing control, digestion, visual processing, thinking, etc.)
- problem: how do we program such a complex combination of processing devices?
- solution: look to nature!
Neural Networks
- the term for a computing device whose design is based on brain structure
- idea: connect a huge number of simple processors into a large interconnected network
- organized in layers: input, middle (hidden), and output
- How does this network know what answers to produce for a given input?
Traditional Computer Programming
- we must develop a step-by-step sequence of operations that, when carried out, will produce the answers to the problem being considered
- someone must write such a program, and this is a difficult task
- writing software for parallel architectures is MUCH harder still!
Training Neural Networks
- A neural network may be trained to produce a desired output when exposed to a given input pattern
- it turns out that networks are GREAT at solving pattern-recognition tasks
- they may be trained to recognize any kind of pattern (visual, audio, representational)
  - as long as the input is encoded in a form that the network can receive
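One concrete way such training can work (a sketch of the classic perceptron learning rule, which the slides do not spell out; the OR training data and the learning rate below are illustrative, not from the slides):

```python
def train_perceptron(samples, rate=0.1, epochs=100):
    """Classic perceptron learning rule: after each wrong answer, nudge the
    weights and threshold toward the desired output.
    samples is a list of (inputs, target) pairs with targets 0 or 1."""
    n = len(samples[0][0])
    weights = [0.0] * n
    threshold = 0.0
    for _ in range(epochs):
        errors = 0
        for inputs, target in samples:
            total = sum(i * w for i, w in zip(inputs, weights))
            output = 1 if total >= threshold else 0
            error = target - output          # -1, 0, or +1
            if error:
                errors += 1
                weights = [w + rate * error * i
                           for w, i in zip(weights, inputs)]
                threshold -= rate * error    # threshold moves opposite the error
        if errors == 0:                      # every pattern answered correctly
            break
    return weights, threshold

# Hypothetical training set: the logical-OR pattern.
OR = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, t = train_perceptron(OR)
```

After training, the learned weights and threshold reproduce the OR pattern on all four inputs; no one ever wrote down the answer rule explicitly, which is the point of the slide above.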
Modeling Nature
- much of a brain cell's complexity exists to support the living operation of the cell
- we can devise a simplified model of the neuron that focuses on the cell's processing function
- we will then use these units to construct artificial neural networks (brains)
The Perceptron
- a simple model of a real neuron
- contains the essential features:
  - a number of input connections (values received from other neurons)
  - a weight associated with each input connection
  - a threshold value
  - an output value (the answer)
The Perceptron
[Diagram: inputs I1, I2, I3, …, In, each scaled by its weight w1, w2, w3, …, wn, feed into a threshold unit whose result is the output.]
The Perceptron's Calculation
- at any instant, the output generated by the perceptron is calculated as follows:
  - first, each input signal is multiplied by the weight associated with its connection
  - all of these products are added together
  - if the sum equals or exceeds the threshold, the perceptron "fires"
The Perceptron's Calculation
- if the sum is less than the threshold value, the perceptron does not produce a signal
- we use an output value of 1 for firing and 0 for not firing
- this is a simplification of what actual brain cells do
  - but this simple concept is the basis for neural network computing!
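The calculation just described fits in a few lines (a sketch in Python; the inputs, weights, and threshold in the example calls are made-up numbers):

```python
def perceptron(inputs, weights, threshold):
    """Fire (output 1) if the weighted sum of the inputs equals or
    exceeds the threshold; otherwise produce no signal (output 0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Hypothetical three-input perceptron with weights 0.5, -1.0, 2.0:
print(perceptron([1, 1, 1], [0.5, -1.0, 2.0], 1.0))   # sum 1.5 >= 1.0: fires, 1
print(perceptron([1, 1, 0], [0.5, -1.0, 2.0], 1.0))   # sum -0.5 < 1.0: silent, 0
```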
Simplest Perceptron
- limited to two input connections
- can be used to classify input patterns into two classes (accepted or rejected)
- Perceptrons can be used to solve a wide variety of decision problems
  - but they are too simple for many other problems
  - for those, we'll have to combine perceptrons into a hierarchy or network
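As a sketch (all weights and thresholds below are made-up numbers): a single two-input perceptron can accept exactly the patterns where both inputs are on (logical AND), while the classic XOR pattern cannot be computed by any single two-input perceptron and needs a small hierarchy of them:

```python
def perceptron(inputs, weights, threshold):
    """Output 1 if the weighted input sum reaches the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def classify_and(a, b):
    # a single unit suffices: accepted (1) only when both inputs are on
    return perceptron([a, b], [1.0, 1.0], 1.5)

def classify_xor(a, b):
    # XOR needs two layers: OR and AND units first, then "OR but not AND"
    h_or  = perceptron([a, b], [1.0, 1.0], 0.5)
    h_and = perceptron([a, b], [1.0, 1.0], 1.5)
    return perceptron([h_or, h_and], [1.0, -1.0], 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, classify_and(a, b), classify_xor(a, b))
```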