Neural Networks & Cases


1 Neural Networks & Cases
By Jinhwa Kim

2 Neural Computing: The Basics
Neural computing is a problem-solving methodology that attempts to mimic how the human brain functions
Artificial Neural Networks (ANN)
Machine Learning

3 Neural Computing
Computing technology that mimics certain processing capabilities of the human brain
Knowledge representations based on:
Massive parallel processing
Fast retrieval of large amounts of information
The ability to recognize patterns based on historical cases
Neural Computing = Artificial Neural Networks (ANNs)
The purpose of an ANN is to simulate the thought processes of the human brain
Inspired by studies of the human brain and the nervous system

4 The Biology Analogy
Neurons: brain cells
Nucleus (at the center)
Dendrites provide inputs
Axons send outputs
Synapses increase or decrease connection strength and cause excitation or inhibition of subsequent neurons
Figure 15.1

5 Artificial Neural Networks (ANN)
A model that emulates a biological neural network
Software simulations of massively parallel processes involving processing elements interconnected in a network architecture
Originally proposed as a model of the human brain's activities
The human brain is much more complex

6 Artificial Neural Networks (ANN)
Three Interconnected Artificial Neurons
Biological      Artificial
Soma            Node
Dendrites       Input
Axon            Output
Synapse         Weight
Slow speed      Fast speed
Many neurons    Few neurons
(Billions)      (Dozens)

7 ANN Fundamentals Components and Structure
“A network is composed of a number of processing elements organized in different ways to form the network structure”
Processing Elements (PEs) – Neurons
Network: a collection of neurons (PEs) grouped in layers
Structure of the Network: topologies / architectures – different ways to interconnect PEs
Figure 15.3

8 ANN Fundamentals Figure 15.4

9 ANN Fundamentals
Processing Information by the Network
Inputs
Outputs
Weights
Summation Function
Figure 15.5

10 ANN Fundamentals
Transformation (Transfer) Function
Computes the activation level of the neuron
Based on this, the neuron may or may not produce an output
Most common: the sigmoid (logistic activation) function
AIS 15.3
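The summation function and sigmoid transfer function described above can be sketched in a few lines of Python (a minimal illustration; the function and variable names are my own, not from the slides):

```python
import math

def neuron_output(inputs, weights):
    """Weighted summation followed by a sigmoid (logistic) transfer function."""
    # Summation function: S = sum of w_i * x_i over all inputs
    s = sum(w * x for w, x in zip(weights, inputs))
    # Sigmoid transfer function squashes S into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-s))
```

With a weighted sum of zero the sigmoid returns exactly 0.5; larger sums push the activation toward 1, smaller ones toward 0.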

11 Learning in ANN
Compute outputs
Compare outputs with desired targets
Adjust the weights and repeat the process
Figure 15.6

12 Neural Network Application Development
Preliminary steps:
Requirement determination
Feasibility study
Top management champion
ANN Application Development Process:
1. Collect Data
2. Separate into Training and Test Sets
3. Define a Network Structure
4. Select a Learning Algorithm
5. Set Parameters and Values, Initialize Weights
6. Transform Data to Network Inputs
7. Start Training, and Determine and Revise Weights
8. Stop and Test
9. Implementation: Use the Network with New Cases

13 Data Collection and Preparations
Collect data and separate it into:
Training set (60%)
Testing set (40%)
Make sure that all sets represent the population: true random sampling
Use training and cross-validation cases to adjust the weights
Use test cases to validate the trained network
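The 60/40 split with true random sampling can be sketched as follows (an illustrative helper; the function name and fixed seed are my own assumptions, not from the slides):

```python
import random

def split_data(cases, train_fraction=0.6, seed=42):
    """Randomly split cases into a training set and a testing set."""
    shuffled = cases[:]                        # copy so the original is untouched
    random.Random(seed).shuffle(shuffled)      # true random sampling
    cut = int(len(shuffled) * train_fraction)  # 60% by default
    return shuffled[:cut], shuffled[cut:]      # (training set, testing set)
```

Every case lands in exactly one of the two sets, so the split covers the whole population sample.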

14 Neural Network Architecture
There are several ANN architectures Figure 15.9

15 Neural Network Architecture
Feedforward neural network
Multilayer perceptron – two, three, sometimes four or five layers

16 How a Network Learns
A step function evaluates the summation of input values
Calculate outputs
Measure the error (delta) between outputs and desired values
Update weights, reinforcing correct results
At any step in the process for a neuron j:
delta = Zj − Yj
where Zj and Yj are the desired and actual outputs, respectively

17 How a Network Learns
Updated weights are:
Wi (final) = Wi (initial) + alpha × delta × Xi
where alpha is the learning rate parameter
Weights are initially random
The learning rate parameter, alpha, is set low
Delta is used to derive the final weights, which then become the initial weights in the next iteration (row)
Threshold value parameter: sets Y to 1 in the next row if the weighted sum of inputs is greater than 0.5; otherwise, to 0
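The update rule and the 0.5 threshold above can be sketched as one learning step (a minimal illustration; the function names and the alpha value are my own assumptions):

```python
def predict(weights, inputs, threshold=0.5):
    """Y = 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) > threshold else 0

def update_weights(weights, inputs, desired, alpha=0.2):
    """One step of the rule Wi(final) = Wi(initial) + alpha * delta * Xi,
    with delta = Z - Y (desired minus actual output)."""
    delta = desired - predict(weights, inputs)
    return [w + alpha * delta * x for w, x in zip(weights, inputs)]
```

For example, initial weights [0.1, 0.3] on inputs [1, 1] give a weighted sum of 0.4, so Y = 0; if the desired output Z is 1, delta = 1 and one update raises the weights to [0.3, 0.5], after which the same inputs produce Y = 1.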

18 How a Network Learns

19 Backpropagation
Backpropagation (back-error propagation)
The most widely used learning algorithm
Relatively easy to implement
Requires training data for conditioning the network before using it to process other data
The network includes one or more hidden layers
The network is considered a feedforward approach

20 Backpropagation
Steps:
Initialize the weights
Read the input vector
Generate the output
Compute the error: Error = Out − Desired
Change the weights
Drawbacks:
A large network can take a very long time to train
May not converge
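The steps above can be sketched as one backpropagation cycle for a toy 2-input, 2-hidden, 1-output network (a simplified illustration under my own assumptions about the weight layout and learning rate; it omits bias terms and other details the slides do not give):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(W1, W2, x, target, lr=0.5):
    """One forward pass and one backward (error-propagation) pass.
    W1: hidden-layer weight rows; W2: output-layer weights."""
    # Forward: input -> hidden -> output
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    y = sigmoid(sum(w * hi for w, hi in zip(W2, h)))
    # Backward: compute deltas from the output error (Out - Desired)
    d_out = (y - target) * y * (1 - y)                       # output-layer delta
    d_hid = [d_out * W2[j] * h[j] * (1 - h[j]) for j in range(len(h))]
    # Change the weights by gradient descent
    W2 = [W2[j] - lr * d_out * h[j] for j in range(len(h))]
    W1 = [[W1[j][i] - lr * d_hid[j] * x[i] for i in range(len(x))]
          for j in range(len(W1))]
    return W1, W2, (y - target) ** 2
```

Repeating this cycle on the training data drives the squared error down, although, as noted above, large networks may train slowly or fail to converge.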

21 Testing
Test the network after training
Examine network performance: measure the network's classification ability
Black-box testing: do the inputs produce the appropriate outputs?
Not necessarily 100% accurate, but may be better than human decision makers
The test plan should include:
Routine cases
Potentially problematic situations
May have to retrain
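Measuring classification ability in black-box fashion amounts to comparing the network's outputs against the desired outputs over the test set (a hypothetical helper of my own, not from the slides):

```python
def classification_accuracy(network, test_cases):
    """Black-box test: fraction of test cases where the network's output
    matches the desired output. `network` is any callable inputs -> output."""
    correct = sum(1 for inputs, desired in test_cases
                  if network(inputs) == desired)
    return correct / len(test_cases)
```

The network is treated purely as a function from inputs to outputs; nothing about its internal weights is examined.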

22 ANN Development Tools
NeuroSolutions
Statistica Neural Network Toolkit
Braincel (Excel add-in)
NeuralWorks
BrainMaker
PathFinder
Trajan Neural Network Simulator
NeuroShell Easy
SPSS Neural Connection
NeuroWare

23 Benefits of ANN
Pattern recognition, learning, classification, generalization and abstraction, and interpretation of incomplete and noisy inputs
Character, speech, and visual recognition
Can provide some human problem-solving characteristics
Can tackle new kinds of problems
Robust
Fast
Flexible and easy to maintain
Powerful hybrid systems

24 Limitations of ANN
Lack of explanation capabilities
Limitations and expense of hardware technology restrict most applications to software simulations
Training time can be excessive and tedious
Usually requires large amounts of training and test data

25 ANN Demonstration
www.roselladb.com
NeuroSolutions by NeuroDimension, Inc.
DMWizard by Knowledge Based Systems, Inc.; funded by the US Army

26 Business ANN Applications
Accounting
Identify tax fraud
Enhance auditing by finding irregularities
Finance
Signature and bank note verification
Foreign exchange rate forecasting
Bankruptcy prediction
Customer credit scoring
Credit card approval and fraud detection*
Stock and commodity selection and trading
Forecasting economic turning points
Pricing initial public offerings*
Loan approvals

27 Business ANN Applications
Human Resources
Predicting employees' performance and behavior
Determining personnel resource requirements
Management
Corporate merger prediction
Country risk rating
Marketing
Consumer spending pattern classification
Sales forecasts
Targeted marketing, …
Operations
Vehicle routing
Production/job scheduling, …

28 Bankruptcy Prediction with ANN
Based on a paper by Rick Wilson and Ramesh Sharda, published in Decision Support Systems, 1994
ANN Architecture:
Three-layer (input–hidden–output) MLP
Backpropagation (supervised) learning network
Training data:
Small set of well-known financial ratios
Data available on bankruptcy outcomes
Moody's Industrial Manual (between 1975 and 1982)

29 Bankruptcy Prediction with ANN
Application Design Specifics
Five input nodes:
X1: Working capital/total assets
X2: Retained earnings/total assets
X3: Earnings before interest and taxes/total assets
X4: Market value of equity/total debt
X5: Sales/total assets
Single output node: final classification for each firm
Bankruptcy or nonbankruptcy
Development tool: NeuroShell
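A forward pass through this five-input, single-output design can be sketched as follows. This is only an illustrative skeleton: the slides do not give the trained weights, the hidden-layer size, or the output encoding, so the weight layout, the 0.5 cutoff, and the direction of the score are all my own assumptions, not the actual NeuroShell model.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def classify_firm(ratios, hidden_weights, output_weights, cutoff=0.5):
    """Three-layer feedforward pass: five financial ratios in,
    one bankrupt/nonbankrupt classification out.
    ratios = [X1..X5]: working capital/TA, retained earnings/TA,
    EBIT/TA, market value of equity/total debt, sales/TA."""
    hidden = [sigmoid(sum(w * r for w, r in zip(row, ratios)))
              for row in hidden_weights]
    score = sigmoid(sum(w * h for w, h in zip(output_weights, hidden)))
    # Assumed convention: a low score means the bankrupt class
    return "bankrupt" if score < cutoff else "nonbankrupt"
```

In the study the weights would come from backpropagation training on the 74-firm training set; here they are simply parameters.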

30 Bankruptcy Prediction with ANN

31 Bankruptcy Prediction with ANN
Total data set: 129 firms
Training set: 74 firms; 38 bankrupt, 36 not
Ratios computed and stored in input files for:
The neural network
A conventional discriminant analysis program
Parameters:
Number of PEs
Learning rate and momentum
Testing done two ways:
Test data set: 27 bankrupt firms, 28 nonbankrupt firms
Comparison with discriminant analysis

32 Bankruptcy Prediction with ANN
Results
The neural network correctly predicted:
81.5 percent of bankrupt cases
82.1 percent of nonbankrupt cases
ANN did better: it predicted 22 of the 27 bankrupt cases, while discriminant analysis predicted only 16 correctly
Error analysis:
Five bankrupt firms were misclassified by both methods
Similar for nonbankrupt firms
Accuracy of about 80 percent is usually acceptable for this problem domain
