1
Introduction to Business Analytics
Chapter 6: Neural Networks for Data Mining
Matthew J. Liberatore
Thomas Coghlan
Fall 2008
2
Learning Objectives
Understand the concept and different types of artificial neural networks (ANN)
Learn the advantages and limitations of ANN
Understand how backpropagation neural networks learn
Understand the complete process of using neural networks
Appreciate the wide variety of applications of neural networks
3
Using Neural Networks to Predict Beer Flavors with Chemical Analysis
Why is beer flavor important to the profitability of Coors?
What is the objective of the neural network used at Coors?
Why were the results of the Coors neural network initially poor, and what was done to improve the results?
What benefits might Coors derive if this project is successful?
What modifications would you suggest to improve the results of beer flavor prediction?
4
Business Applications of Artificial Neural Networks (ANN)
Many applications across all areas of business:
targeting customers (CRM)
bank loan approval
hiring decisions
stock purchase decisions
trading electricity
fraud prevention
predicting bankruptcy
time series forecasting
5
What are Artificial Neural Networks?
Artificial neural networks (ANN) are biologically inspired computer models that attempt to operate like the human brain. These networks can “learn” from data and recognize patterns.
6
Basic Concepts of Neural Networks
Biological and artificial neural networks
Neuron: a cell (processing element) of a biological or artificial neural network
Nucleus: the central processing portion of a neuron
Dendrite: the part of a biological neuron that provides inputs to the cell
7
Basic Concepts of Neural Networks
Biological and artificial neural networks
Axon: an outgoing connection (i.e., terminal) from a biological neuron
Synapse: the connection (where the weights reside) between processing elements in a neural network
10
Relationship Between Biological and Artificial Neural Networks
Soma – Node
Dendrites – Input
Axon – Output
Synapse – Weight
ANNs typically have far fewer neurons than the human brain
11
Basic Concepts of Neural Networks
Backpropagation: the best-known learning algorithm in neural computing. Learning is done by comparing computed outputs to desired outputs of historical cases.
Network structure (three layers):
Input
Intermediate (hidden) layer
Output
13
Basic Concepts of Neural Networks
Network information processing:
Inputs
Outputs
Connection weights
Summation (combination) function
Transformation (activation) function
14
Basic Concepts of Neural Networks
Loan Processing Example
Input nodes – income level, age, home ownership
Output nodes – approval/disapproval of the loan
15
Basic Concepts of Neural Networks
Connection weights
The weight associated with each link in a neural network model; it expresses the relative strength of the data transferred between layers in the network
Weights are estimated by the neural network's learning algorithm
16
Basic Concepts of Neural Networks
Transformation (activation) function: maps the value of the summation (combination) function onto a narrower range (e.g., 0 to 1 or -1 to 1) to determine whether or not an output is produced (the neuron fires)
The transformation occurs before the output reaches the next layer in the network
Sigmoid (logistic activation) function: an S-shaped transfer function in the range of zero to one, exp(x)/(1 + exp(x))
Threshold value: sometimes used instead of a transformation function; a hurdle value for the output of a neuron to trigger the next level of neurons. If an output value is smaller than the threshold value, it is not passed to the next level of neurons
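As an illustration (not part of the original slides), here is a minimal Python sketch of a single artificial neuron: a summation function followed by a logistic activation, with an optional threshold. The inputs, weights, and bias are made-up values.

```python
import numpy as np

def neuron_output(inputs, weights, bias, threshold=None):
    # Summation (combination) function: weighted sum of the inputs plus a bias term
    c = bias + np.dot(weights, inputs)
    # Logistic (sigmoid) activation: maps c onto the range (0, 1)
    activation = np.exp(c) / (1.0 + np.exp(c))
    if threshold is None:
        return activation
    # Threshold alternative: the neuron "fires" only if the activation clears the hurdle
    return 1.0 if activation >= threshold else 0.0

# Made-up inputs and weights for a hypothetical loan applicant (income, age, home ownership)
x = np.array([0.8, 0.4, 1.0])
w = np.array([0.5, -0.2, 0.3])
print(neuron_output(x, w, bias=-0.1))                 # activation between 0 and 1
print(neuron_output(x, w, bias=-0.1, threshold=0.5))  # 1.0 (fires) or 0.0 (does not)
```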
19
Learning in ANN
Learning algorithm: the training procedure used by an artificial neural network
Supervised learning: a method of training artificial neural networks in which sample cases are shown to the network as input and the weights are adjusted to minimize the error in its outputs
21
Learning in ANN: How a network learns
Backpropagation: the best-known supervised learning algorithm in neural computing. Learning is done by comparing computed outputs to desired outputs of historical cases
22
Learning in ANN: How a network learns
Procedure for a learning algorithm:
1. Initialize weights with random values and set other parameters
2. Read in the input vector and the desired output
3. Compute the actual output via the calculations, working forward through the layers
4. Compute the error
5. Change the weights by working backward from the output layer through the hidden layers
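A compact Python/NumPy sketch of this procedure for a network with one hidden layer. The data, layer sizes, and learning rate are illustrative assumptions, not values from the slides, and the weight updates follow the gradient of a logistic-output error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up training data: 4 observations, 3 inputs, binary desired output
X = rng.random((4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

n_hidden, lr = 5, 0.5
W1 = rng.normal(scale=0.1, size=(3, n_hidden))  # step 1: random initial weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):
    # steps 2-3: forward pass through the hidden layer and the output layer
    H = sigmoid(X @ W1 + b1)
    out = sigmoid(H @ W2 + b2)

    # step 4: error between computed and desired outputs
    err = out - y

    # step 5: work backward, adjusting weights layer by layer
    d_out = err                           # gradient at the output node (logistic output, cross-entropy error)
    d_hid = (d_out @ W2.T) * H * (1 - H)  # gradient propagated back to the hidden layer
    W2 -= lr * H.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0)

print(out.round(3))  # computed outputs after training
```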
24
Error calculation and weights
At each hidden node and target node, compute:
Linear combination function: C = w0 + w1x1 + … + wnxn
Logistic activation function: L = exp(C)/(1 + exp(C))
At the target node, compute the Bernoulli error function: sum the errors over all observations, where the error is -2 ln(L) if there is a response, or -2 ln(1 – L) if there is no response
In the first iteration, random weights are used
In subsequent iterations, the weights are changed by a small amount so that the error is reduced
The process continues until the error cannot be reduced further
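A short Python sketch of this error bookkeeping. The observations and weights are invented for illustration; only the formulas (linear combination, logistic activation, Bernoulli error) follow the slide.

```python
import numpy as np

def bernoulli_error(weights, bias, X, y):
    # Linear combination at the node: C = w0 + w1*x1 + ... + wn*xn
    C = bias + X @ weights
    # Logistic activation: L = exp(C) / (1 + exp(C))
    L = np.exp(C) / (1.0 + np.exp(C))
    # Sum over all observations: -2*ln(L) if there is a response, -2*ln(1 - L) if not
    return np.sum(np.where(y == 1, -2 * np.log(L), -2 * np.log(1 - L)))

# Made-up data: 3 observations with 2 inputs each, and a trial set of weights
X = np.array([[0.2, 1.0], [0.9, 0.1], [0.5, 0.5]])
y = np.array([1, 0, 1])
print(bernoulli_error(np.array([0.8, -0.4]), 0.1, X, y))
```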
25
Developing Neural Network–Based Systems
Data collection and preparation
The data used for training and testing must include all the attributes that are useful for solving the problem
Recall the bankruptcy prediction problem we modeled using logistic regression; the same data can be used to train a neural network:
working capital/total assets (WC/TA)
retained earnings/total assets (RE/TA)
earnings before interest and taxes/total assets (EBIT/TA)
market value of equity/total debt (MVE/TD)
sales/total assets (S/TA)
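A minimal data-preparation sketch in Python/pandas. The file name, column names, and target coding are assumptions for illustration; the slides do not specify the dataset layout.

```python
import pandas as pd

# Hypothetical file and column names for the bankruptcy data
df = pd.read_csv("bankrupt.csv")
features = ["WC_TA", "RE_TA", "EBIT_TA", "MVE_TD", "S_TA"]  # the five financial ratios
X = df[features]        # inputs used for training and testing
y = df["bankrupt"]      # target: 1 = went bankrupt, 0 = did not
```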
26
Developing Neural Network–Based Systems
Selection of network structure
Determination of:
Input nodes
Output nodes
Number of hidden layers
Number of hidden nodes
For the bankruptcy problem (and all of our examples) we have one hidden layer
The bankruptcy problem has ten nodes in the hidden layer; one might experiment with the number of nodes
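The slides use SAS Enterprise Miner; as a rough analogue only, the same structure choices (one hidden layer, ten hidden nodes, sigmoid activation) can be expressed in scikit-learn as sketched below. This is not the SAS configuration itself.

```python
from sklearn.neural_network import MLPClassifier

# One hidden layer with ten nodes and a logistic (sigmoid) activation,
# mirroring the structure described for the bankruptcy problem
net = MLPClassifier(hidden_layer_sizes=(10,), activation="logistic", random_state=1)
```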
28
Developing Neural Network–Based Systems
Learning algorithm selection
Identify a set of connection weights that best cover the training data and have the best predictive accuracy
Network training
An iterative process that starts from a random set of weights and gradually improves the fit of the network model to the known data set
The iteration continues until the error sum converges to below a preset acceptable level
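Continuing the earlier sketch, training in scikit-learn follows the same pattern: random initial weights, then iteration until the error improvement falls below a preset tolerance (or an iteration limit is reached). The specific parameter values are illustrative assumptions.

```python
from sklearn.neural_network import MLPClassifier

# random_state fixes the random initial weights; tol is the preset acceptable
# improvement level, and max_iter caps the number of training iterations
net = MLPClassifier(hidden_layer_sizes=(10,), activation="logistic",
                    random_state=1, max_iter=2000, tol=1e-6)
net.fit(X, y)            # X, y prepared as in the data-preparation sketch
print(net.loss_)         # final training error (log-loss) after convergence
```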
29
Developing Neural Network–Based Systems
Testing
Black-box testing: comparing test results to actual results
The test plan should include routine cases as well as potentially problematic situations
If the testing reveals large deviations, the training set must be reexamined, and the training process may have to be repeated
Might compare ANN results with other methods such as logistic regression
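A testing sketch in scikit-learn, again as an analogue rather than the SAS procedure: hold out part of the data, compare the network's test-set predictions to the actual outcomes, and compare its accuracy against logistic regression. The split fraction and parameters are illustrative.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hold out 30% of the data as a test set (X, y as in the data-preparation sketch)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

net = MLPClassifier(hidden_layer_sizes=(10,), activation="logistic",
                    random_state=1, max_iter=2000).fit(X_train, y_train)
logit = LogisticRegression(max_iter=1000).fit(X_train, y_train)

for name, model in [("neural network", net), ("logistic regression", logit)]:
    pred = model.predict(X_test)
    print(name, accuracy_score(y_test, pred))
    print(confusion_matrix(y_test, pred))  # large deviations suggest reexamining the training set
```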
30
Developing Neural Network–Based Systems
Implementation of an ANN
Implementation often requires interfaces with other computer-based information systems and user training
Ongoing monitoring and feedback to the developers are recommended for system improvements and long-term success
It is important to gain the confidence of users and management early in the deployment to ensure that the system is accepted and used properly
32
Neural Networks In SAS Enterprise Miner 5.3
In your bankrupt project, create a new diagram called bankrupt_neural
Drag the bankrupt data node onto your diagram
From the Model tab, drag the Neural Network node onto the diagram
Connect the data node to the Neural Network node
33
Highlight the Neural Network node
In the Property Panel window, set the model selection criterion to Average Error
34
In the Property Panel window, click on the square to the right of network and change the defaults for the Target Layer Combination, Activation, and Error functions as indicated. Note that we are using the default of 3 hidden units (nodes).
35
The results show an excellent fit, with the cumulative lift equal to the best cumulative lift, no misclassifications, and an average error of nearly zero.
36
In the Property Panel, click on the box to the right of Exported Data to see the individual predictions and probabilities. The logistic activation function at the target level provides the probabilities, like those obtained from logistic regression.
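For comparison only, a hypothetical scikit-learn analogue of inspecting these exported predictions: the logistic output yields a probability for each firm, just as logistic regression does. This continues the testing sketch above and assumes `net` and `X_test` from that sketch.

```python
# Assumes `net` and `X_test` from the testing sketch above
probs = net.predict_proba(X_test)[:, 1]   # predicted probability of bankruptcy per firm
preds = net.predict(X_test)               # 0/1 predictions derived from those probabilities
print(list(zip(preds[:5], probs[:5].round(3))))
```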
37
Similar to what we did with logistic regression, add the bankruptscore data node and the Score node to the diagram as shown.
38
After running the score node, the output shows that 6 firms are predicted to go bankrupt (vs. 4 under logistic regression)
39
For details about the individual predictions, highlight the Score node and, on the left-hand panel, click on the square to the right of Exported Data. Then, in the box that appears, click on the row whose Port entry is Score and click Explore.
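As a rough analogue of the Score node (not the SAS Enterprise Miner mechanism itself), scoring new firms in scikit-learn looks like the sketch below. The file name and column names are hypothetical, and `net` is the trained network from the earlier sketches.

```python
import pandas as pd

# Hypothetical scoring data, analogous to the bankruptscore node
new_firms = pd.read_csv("bankruptscore.csv")[["WC_TA", "RE_TA", "EBIT_TA", "MVE_TD", "S_TA"]]

scores = net.predict_proba(new_firms)[:, 1]   # predicted probability of bankruptcy
predicted = (scores >= 0.5).astype(int)       # 0/1 predictions
print(predicted.sum(), "firms predicted to go bankrupt")
```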