Artificial Neural Networks


What are they? Inspired by the human brain. The human brain has about 86 billion neurons and uses about 20% of your body's energy. These neurons are connected by roughly 100 trillion to 1 quadrillion synapses!

What are they? A brief history:
First proposed by Warren McCulloch and Walter Pitts (1943).
Early networks started off as unsupervised learning tools, suffered from long computing times, and could not compute XOR, so they were abandoned in favor of other algorithms.
Werbos's (1975) backpropagation algorithm incorporated supervision and solved XOR, but networks were still too slow compared with other algorithms, e.g., Support Vector Machines.
Around 2010, backpropagation was accelerated by GPUs and shown to be more efficient and cost-effective.

GPUs GPUs handle parallel operations much better than CPUs (thousands of concurrent threads), even though each individual GPU core is slower than a CPU core. The matrix multiplication steps in ANNs can be run in parallel, resulting in considerable time and cost savings. The best CPUs offer about 50 GB/s of memory bandwidth, while the best GPUs offer about 750 GB/s.

Applications http://news.mit.edu/2017/artificial-intelligence-suggests-recipes-based-on-food-photos-0720 From a photo of food, the system infers the ingredients and then suggests recipes.

Applications https://www.nasa.gov/press-release/artificial-intelligence-nasa-data-used-to-discover-eighth-planet-circling-distant-star Recorded dips in starlight are compared against known planet signatures, and new planets are found.

Idea behind them
Obtain some structured data (always a good idea).
Use some subset of that data as the training set.
Feed each training example through the network.
Calculate the error for each training example.
Update the weights of each neuron to minimize the error using gradient descent (back propagation).
Feed in the data again until you reach the desired error or run out of trials: if you reached the desired error or ran out of trials, stop; else, back-propagate and repeat. A runnable sketch of this loop follows.
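As a toy illustration of that loop, here is a minimal runnable sketch in plain Python/NumPy. The one-weight "network", the data, and the learning rate are assumptions for illustration only, not anything from the slides:

import numpy as np

# Toy sketch of the training loop above: a one-weight "network"
# learning y = 2x by gradient descent on squared error.
X = np.array([1.0, 2.0, 3.0, 4.0])   # structured training data (assumed)
Y = 2.0 * X                          # targets
w = np.random.randn()                # start from a random weight
target_error, max_trials = 1e-4, 1000
for trial in range(max_trials):
    pred = w * X                             # feed each example through
    error = np.mean((pred - Y) ** 2)         # calculate the error
    if error <= target_error:                # desired error reached: stop
        break
    grad = np.mean(2.0 * (pred - Y) * X)     # gradient of the error w.r.t. w
    w -= 0.01 * grad                         # gradient-descent update
print(w)  # converges toward 2.0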

An example

Forward Propagation
Assign random weights to the synapses.
Feed in the training data.
Calculate the hidden-layer neurons from the inputs and the weights using an activation function.
Calculate the output from the hidden-layer neurons and the output weights.
Calculate the error relative to the expected output.
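In NumPy, one forward pass through a small network might look like this sketch. The 2-3-1 layer sizes and the sigmoid activation are assumptions, since the slide's diagram isn't preserved in the transcript:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(2, 3))        # 1. random weights on the synapses
W_output = rng.normal(size=(3, 1))

x = np.array([[0.0, 1.0]])                # 2. one training example
hidden = sigmoid(x @ W_hidden)            # 3. hidden neurons via the activation function
output = sigmoid(hidden @ W_output)       # 4. output from the hidden neurons
expected = np.array([[1.0]])
error = 0.5 * np.sum((expected - output) ** 2)   # 5. error vs. what is expected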

Activation Function
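The figure on this slide isn't preserved in the transcript, but a classic choice of activation function (and the one assumed in the sketches here) is the sigmoid, σ(x) = 1 / (1 + e^(−x)), which squashes any input into the range (0, 1). The Keras example at the end instead uses ReLU for the hidden layer and a sigmoid on the output.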

Forward propagation

Back propagation We need to adjust the weights to minimize the error.

Back propagation

Back Propagation
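Putting the forward and backward passes together, here is a self-contained NumPy sketch of back propagation on XOR. The hidden-layer size, learning rate, and iteration count are assumptions, and an unlucky random initialization may need a re-run:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
lr = 0.5

for _ in range(10000):
    # forward propagation
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)
    # back propagation: chain rule, using sigmoid'(z) = s * (1 - s)
    d_out = (y_hat - Y) * y_hat * (1 - y_hat)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent weight updates
    W2 -= lr * (h.T @ d_out);  b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_hid);  b1 -= lr * d_hid.sum(axis=0, keepdims=True)

print(y_hat.round().ravel())  # typically [0. 1. 1. 0.] after training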

Live Example! https://teachablemachine.withgoogle.com/

Strike the pose https://storage.googleapis.com/tfjs-models/demos/posenet/camera.html

Can a machine play music? https://magenta.tensorflow.org/demos/performance_rnn/index.html#2|2,0,1,0,1,1,0,1,0,1,0,1|1,1,1,1,1,1,1,1,1,1,1,1|1,1,1,1,1,1,1,1,1,1,1,1|false

TensorFlow & Keras

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# the four different states of the XOR gate
training_data = np.array([[0,0],[0,1],[1,0],[1,1]], "float32")

# the four expected results in the same order
target_data = np.array([[0],[1],[1],[0]], "float32")

# use the Sequential API rather than the functional one, since this is a plain feedforward stack
model = Sequential()

# Dense is a fully connected layer; input_dim=2 matches the two XOR inputs
model.add(Dense(16, input_dim=2, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# this is the building of the net
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['binary_accuracy'])

# now optimize
model.fit(training_data, target_data, epochs=500, verbose=2)

print(model.predict(training_data).round())
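If the run converges (500 epochs is usually, though not always, enough for XOR), the rounded predictions printed at the end match target_data, i.e. [0, 1, 1, 0]. Mean squared error with the Adam optimizer is a simple, serviceable choice for this toy problem.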

Derivative of the sigmoid
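The image on this slide isn't in the transcript, but the identity it illustrates is the standard one: for σ(x) = 1 / (1 + e^(−x)), the derivative is σ'(x) = σ(x) (1 − σ(x)). This is why the back-propagation sketch above multiplies by h * (1 − h) and y_hat * (1 − y_hat): the derivative can be computed from the activation value alone, with no extra exponentials.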