Presentation transcript:

 The most intelligent device – the “Human Brain”.  The machine that revolutionized the whole world – the “computer”.  The inefficiencies of conventional computers have led to the evolution of the “Artificial Neural Network”.

 A neural network is designed as an interconnected system of processing elements, each with a limited number of inputs and outputs. Rather than being programmed, these systems learn to recognize patterns.

 The brain is divided into two parts – left and right.  The left part handles rules, concepts and calculations.  It follows “rule-based learning”, and so is similar to an “Expert System”.  The right part handles pictures, images and control.  It follows “experience-based learning”, and so is similar to a “Neural Network”.

 Size  Number of neurons  Processing

 Humans’ capabilities in real-time visual perception, speech understanding and sensory tasks are what we try to implement in machines.

 ANNs are classified into two types:  Recurrent  Non-recurrent

 Conventional computer – a single processor sequentially dictates every piece of the action.  ANN – a very large number of processing elements, each of which individually deals with a piece of a big problem.

 ANN: trained by learning from examples – Conventional computer: programmed with instructions
 ANN: memory and processing elements are collocated – Conventional: memory and processing elements are separate
 ANN: self-organizing during learning – Conventional: software dependent
 ANN: stored knowledge is adaptable – Conventional: knowledge is stored at fixed memory addresses
 ANN: processing is anarchic – Conventional: processing is autocratic
 ANN: speed in milliseconds – Conventional: speed in nanoseconds

 Inputs are multiplied by their connection weights, summed, and passed through a scaling (activation) function, which decides the neuron’s output.
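A minimal sketch of this summing-and-scaling step (the function name, weights and bias values below are illustrative assumptions, not taken from the slides):

    import math

    def neuron_output(inputs, weights, bias):
        # Weighted sum of the inputs, passed through a sigmoid scaling function
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        return 1.0 / (1.0 + math.exp(-total))   # squashes the sum into (0, 1)

    # Example: one neuron with three inputs
    print(neuron_output([0.5, 0.2, 0.9], [0.4, -0.6, 0.3], bias=0.1))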

 If a neuron receives an input from another neuron, and both neurons are highly active at the same time, the weight between the two neurons should be strengthened (Hebb’s rule).
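A sketch of this weight-strengthening idea (the function name and learning-rate value are assumptions):

    def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.1):
        # When both neurons are active together, the connection grows stronger
        return weight + learning_rate * pre_activity * post_activity

    w = 0.2
    w = hebbian_update(w, pre_activity=0.9, post_activity=0.8)   # both active, so w increases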

 If the desired output and the input are both highly active, or both inactive, increment the connection weight by the learning rate; otherwise, decrement the weight by the learning rate.
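A sketch of this rule, treating “active” as a boolean for simplicity (the function name and learning rate are assumptions):

    def update_weight(weight, input_active, desired_output_active, learning_rate=0.1):
        # Increment when input and desired output agree, decrement otherwise
        if input_active == desired_output_active:
            return weight + learning_rate
        return weight - learning_rate

    print(update_weight(0.5, input_active=True, desired_output_active=True))    # -> 0.6
    print(update_weight(0.5, input_active=True, desired_output_active=False))   # -> 0.4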

 Neurons are organized in the form of layers.  This is the simplest form, because the network is of the feed-forward (acyclic) type.

 When one or more hidden layers are present, their computation nodes are called hidden neurons or hidden units.  If the size of the input layer is very large, the hidden layers extract higher-order statistics, which is valuable.
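A minimal sketch of a feed-forward pass through one hidden layer (the layer sizes, weights and names here are illustrative assumptions):

    import math

    def layer(inputs, weights, biases):
        # One fully connected layer: weighted sums followed by a sigmoid activation
        return [1.0 / (1.0 + math.exp(-(sum(x * w for x, w in zip(inputs, row)) + b)))
                for row, b in zip(weights, biases)]

    # Two inputs -> a hidden layer of three units -> one output unit
    hidden = layer([0.5, 0.8], [[0.1, 0.4], [-0.3, 0.2], [0.6, -0.1]], [0.0, 0.1, -0.2])
    output = layer(hidden, [[0.3, -0.5, 0.8]], [0.05])
    print(output)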

 A “teacher” trains the network by presenting the environment in the form of pre-calculated input-output examples.  The ANN observes the input and compares its output with the predefined (desired) output.  The difference is calculated and is referred to as the error signal.
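A sketch of how that error signal can be computed (the function name and example values are assumptions):

    def error_signal(desired_outputs, actual_outputs):
        # The error signal is the element-wise difference between desired and actual output
        return [d - a for d, a in zip(desired_outputs, actual_outputs)]

    print(error_signal([1.0, 0.0], [0.75, 0.25]))   # -> [0.25, -0.25]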

 Learning involves exploring the environment, because the correct response to each input is not given in advance.  The system receives an input from the environment and produces an output response.

 This is self-organizing learning, because there is no external teacher.  The network tunes itself to the statistical regularities of the input data.

 We can recognize a previously encountered person by voice or smell, because during training we learned to assign such inputs to a particular class.  The decision space is divided into regions, and each region is associated with a class.
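As one hedged illustration of “regions associated with classes” (this shows only the partitioning idea, not the network itself; the labels and points are made up), a nearest-centroid rule assigns each point to the class whose prototype is closest:

    def classify(point, class_centroids):
        # Each centroid's "territory" is one region of the decision space
        def dist2(p, q):
            return sum((a - b) ** 2 for a, b in zip(p, q))
        return min(class_centroids, key=lambda label: dist2(point, class_centroids[label]))

    centroids = {"class A": (0.0, 0.0), "class B": (5.0, 5.0)}
    print(classify((1.0, 1.5), centroids))   # -> "class A"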

 A controller maintains a critical part of the system.  Learning is relevant to control and can be treated as a supervised task, because “after all, the human brain is a computer”.

 The information generated by the environment varies with time.  The network must keep adapting to these variations of the environment and never stop learning.  This is called continuous learning, or learning on the fly.

 We want as many training examples as possible, so that the input-output mapping computed by the network is correct.  With too many passes over the same input-output examples, however, the network ends up memorizing the training data.  It then fits the data it has seen but does not capture the true underlying function.  An over-trained network loses the ability to generalize between similar inputs and outputs.
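One common guard against such over-training, sketched here under the assumption of caller-supplied train_step and validation_error callables (the slides do not specify this mechanism), is to stop when the error on held-out data stops improving:

    def train_with_early_stopping(train_step, validation_error, max_epochs=1000, patience=5):
        # Stop training when the validation error stops improving, to avoid
        # memorizing the training data
        best_error, epochs_without_improvement = float("inf"), 0
        for epoch in range(max_epochs):
            train_step()                    # one pass over the training data
            error = validation_error()      # error on data held out from training
            if error < best_error:
                best_error, epochs_without_improvement = error, 0
            else:
                epochs_without_improvement += 1
                if epochs_without_improvement >= patience:
                    break                   # further training would only memorize
        return best_error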

 Several ANN models are available to choose from for a particular problem.  They are very fast.  They increase accuracy, resulting in cost savings.  They can represent any function; therefore they are called “universal approximators”.  ANNs are able to learn from representative examples by back-propagation of error.
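A hedged, minimal illustration of learning by back-propagating the error – here the degenerate single-neuron case, trained on a made-up OR task (none of the values come from the slides):

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    # Learn the OR function with one sigmoid neuron: the output error is propagated
    # back to adjust the weights (the simplest special case of backpropagation)
    samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w, b, lr = [0.0, 0.0], 0.0, 0.5

    for _ in range(10000):
        for (x1, x2), target in samples:
            out = sigmoid(w[0] * x1 + w[1] * x2 + b)
            delta = (target - out) * out * (1 - out)   # error times sigmoid derivative
            w[0] += lr * delta * x1
            w[1] += lr * delta * x2
            b += lr * delta

    # Should print [0, 1, 1, 1] once training has converged
    print([round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in samples])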

 LOW LEARNING SPEED: hard problems require large and complex networks, which are slow to train.  FORGETFULNESS: the network can forget old data when it is trained on new data.  IMPRECISION: ANNs do not provide precise numerical answers.  BLACK-BOX APPROACH: we cannot see how the training data are transformed inside the network.  LIMITED FLEXIBILITY: an implementation is usually tied to the one system it was built for.

 TIME SERIES PREDICTION:  1. Forecasting: short-term evolution.  2. Modelling: long-term features.  3. Characterization: defining fundamental properties.  SPEECH GENERATION: a network was trained to pronounce written English text.  SPEECH RECOGNITION: speech is converted into written text, e.g. in combination with hidden Markov models over a symbol set.  AUTONOMOUS VEHICLE NAVIGATION: vision-based road following and robot guidance.

 HANDWRITING RECOGNITION: the hidden layers reduce the number of free parameters and improve recognition of the written characters.  IN THE ROBOTICS FIELD: AI devices that behave much like humans.

Finally, I want to say that in 200 or 300 years neural networks may be so developed that they can find the errors even of human beings, rectify those errors, and make human beings more intelligent.