Decision Support Systems and Intelligent Systems, Seventh Edition
Turban, Aronson, and Liang
© 2005 Prentice Hall
Chapter 12: Advanced Intelligent Systems
Learning Objectives
Understand second-generation intelligent systems.
Learn the basic concepts and applications of case-based systems.
Understand the uses of artificial neural networks.
Examine the advantages and disadvantages of artificial neural networks.
Learn about genetic algorithms.
Examine the theories and applications of fuzzy logic.
Vignette: Household Financial's Vision Speeds Loan Approvals with Neural Networks
Loan product regulation varies in each state
Develop an object-oriented loan approval system
–Neural network-based
–Fed risk measures, interest-rate variables, and customer data
–Estimates creditworthiness and potential for fraud via pattern recognition
–Integrates all loan approval phases
–Uses an intelligent underwriting engine
–Reduced training time and administrative overhead
–Decreased managed-basis efficiency ratio
–Upgradeable to a web-based architecture
Machine Learning
Acquisition of knowledge through historical examples
Implicitly induces expert knowledge from history
Different from the way that humans learn
Implications of system success and failure are unclear
Manipulates symbols instead of numbers
Methods
Supervised learning
–Induce knowledge from cases with known outcomes
–New cases used to modify existing theories
–Statistical methods
–Rule induction
–Case-based reasoning and inference
–Neural computing
–Genetic algorithms (survival of the fittest)
Unsupervised learning
–Determine knowledge from data with unknown outcomes
–Clustering data into similar groups
–Neural computing
–Genetic algorithms (survival of the fittest)
Case-Based Reasoning
Inductive: a case base is used for decision making
Effective when rule-based reasoning is not
Case: the primary knowledge element
Case types
–Ossified cases
–Paradigmatic cases
–Stories
Process
Features assigned as characteristic indexes
–Indexing rules identify input features
Indexes used to retrieve similar cases from memory
–Episodic case memories
–Similarity metrics applied
Old solution adjusted to fit the new case
–Modification rules
Solution tested
–If successful, assigned a value and stored
–If it fails: explain, repair, and test again
–Alter the plan to fit the situation
–Rules for permissible alterations
(See the retrieval sketch below.)
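A minimal sketch of the retrieval step, assuming cases are indexed by numeric features and scored with a weighted similarity metric; the feature names, weights, and stored cases below are hypothetical illustrations, not taken from the text.

```python
# Hypothetical case-based retrieval: score stored cases against a new case
# using a weighted similarity metric, then return the closest match.

CASE_BASE = [
    {"features": {"income": 55000, "debt_ratio": 0.30, "late_payments": 1}, "solution": "approve"},
    {"features": {"income": 28000, "debt_ratio": 0.65, "late_payments": 4}, "solution": "reject"},
]

WEIGHTS = {"income": 0.5, "debt_ratio": 0.3, "late_payments": 0.2}       # indexing weights (assumed)
SCALES  = {"income": 100000.0, "debt_ratio": 1.0, "late_payments": 10.0}  # normalization ranges (assumed)

def similarity(new_features, stored_features):
    """Weighted similarity in [0, 1]: 1 means identical on all indexed features."""
    score = 0.0
    for name, weight in WEIGHTS.items():
        diff = abs(new_features[name] - stored_features[name]) / SCALES[name]
        score += weight * (1.0 - min(diff, 1.0))
    return score

def retrieve(new_features):
    """Return the stored case most similar to the new one; adaptation follows."""
    return max(CASE_BASE, key=lambda case: similarity(new_features, case["features"]))

best = retrieve({"income": 51000, "debt_ratio": 0.35, "late_payments": 0})
print(best["solution"])   # -> "approve" for this example
```

The retrieved solution would then be adjusted by modification rules and tested, as the process above describes.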
Case-Based Reasoning Success Factors
Specific business objectives
Knowledge should directly support end users
Appropriate design
Updatable
Measurable metrics
Acceptable ROI
User accessible
Expandable across the enterprise
Human Brain
50 to 150 billion neurons in the brain
Neurons grouped into networks
–Axons send outputs to other cells
–Received by dendrites, across synapses
Neural Networks
Attempt to mimic brain functions
–Analogy, not an accurate model
Artificial neurons connected in a network
–Organized by topologies
–Structure: three or more layers
–Input, intermediate (one or more hidden layers), output
–Receives modifiable signals
Processing
Processing elements are neurons
–Allows for parallel processing
Each input is a single attribute
Connection weight
–Adjustable mathematical value of an input
Summation function
–Weighted sum of the input elements (internal stimulation)
Transfer function
–Relation between internal activation and output
–Sigmoid (S-shaped) transfer function
–Threshold value
Outputs are the problem solution
(A numeric sketch of one neuron follows this list.)
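A minimal sketch of one artificial neuron putting these pieces together: a weighted summation of the inputs, a sigmoid transfer function, and a threshold on the output. The input values, weights, and threshold are illustrative assumptions, not values from the text.

```python
import math

# One artificial neuron: summation function, sigmoid transfer, threshold.
inputs  = [0.8, 0.2, 0.5]        # one value per input attribute (assumed)
weights = [0.4, -0.7, 0.9]       # adjustable connection weights (assumed)
threshold = 0.5                  # output "fires" only above this value (assumed)

# Summation function: weighted sum of inputs (internal stimulation)
activation = sum(x * w for x, w in zip(inputs, weights))

# Transfer function: sigmoid maps internal activation into (0, 1)
output = 1.0 / (1.0 + math.exp(-activation))

# Threshold value decides the neuron's final binary response
fires = output >= threshold
print(f"activation={activation:.3f}, output={output:.3f}, fires={fires}")
```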
Architecture
Feedforward-backpropagation
–Neurons link output in one layer to input in the next
–No feedback connections
Associative memory system
–Correlates input data with stored information
–May work with incomplete inputs
–Detects similarities
Recurrent structure
–Activations pass through the network multiple times to produce output
Network Learning
Learning algorithms
Supervised
–Connection weights derived from known cases
–Pattern recognition combined with weight changes
–Back error propagation: easy implementation, multiple hidden layers
–Adjust learning rate and momentum
–Known patterns compared to outputs, allowing weight adjustment
–Established error tolerance
Unsupervised
–Only stimuli shown to the network
–Humans assign meanings and determine usefulness
–Adaptive resonance theory
–Kohonen self-organizing feature maps
(A weight-update sketch follows this list.)
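A sketch of the output-layer weight update used in back error propagation, assuming a single sigmoid neuron trained on one known pattern; the learning rate, momentum, training data, and error tolerance are illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training pattern with a known (target) outcome: supervised learning
inputs, target = [0.9, 0.1], 1.0

weights = [0.2, -0.3]
bias = 0.0
learning_rate, momentum = 0.5, 0.8     # assumed values
prev_deltas = [0.0, 0.0]
error_tolerance = 0.01                 # assumed preset error tolerance

for epoch in range(1000):
    # Forward pass: weighted sum plus bias, then sigmoid transfer function
    output = sigmoid(sum(x * w for x, w in zip(inputs, weights)) + bias)

    # Compare the known pattern to the output; stop within the error tolerance
    error = target - output
    if abs(error) < error_tolerance:
        break

    # Local gradient for a sigmoid output neuron
    delta = error * output * (1.0 - output)

    # Adjust each weight: learning-rate step plus a momentum term
    for i, x in enumerate(inputs):
        change = learning_rate * delta * x + momentum * prev_deltas[i]
        weights[i] += change
        prev_deltas[i] = change
    bias += learning_rate * delta

print(f"stopped after {epoch + 1} epochs, output={output:.3f}")
```

In a full multilayer network the same kind of update is propagated backward from the output layer through each hidden layer.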
Development of Systems
Collect data
–The more, the better
Separate data into a training set used to adjust weights
Divide out test sets for network validation
Select network topology
–Determine input, output, and hidden nodes, and the number of hidden layers
Select learning algorithm and initial connection weights
Iterative training until the network achieves a preset error level
Black-box testing to verify that inputs produce appropriate outputs
–Contains routine and problematic cases
Implementation
–Integration with other systems
–User training
–Monitoring and feedback
(See the data-split sketch after this list.)
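A small sketch of the data-preparation step: shuffle the collected cases, then separate them into a training set (used to adjust weights) and a test set (held out for validation). The 80/20 split and the synthetic records are assumptions made for illustration.

```python
import random

# Synthetic collected cases, each with inputs and a known outcome (assumed data)
collected = [{"case_id": i,
              "inputs": [random.random(), random.random()],
              "known_outcome": random.choice([0, 1])} for i in range(100)]

random.shuffle(collected)
split = int(0.8 * len(collected))               # assumed 80/20 split
training_set, test_set = collected[:split], collected[split:]

print(len(training_set), "training cases,", len(test_set), "test cases")
```

Iterative training then adjusts weights against `training_set` until the preset error level is reached, while `test_set` is reserved for the validation and black-box testing steps.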
Genetic Algorithms
Computer programs that apply the processes of evolution
–Evaluate the viability of candidate solutions
Self-organized and adaptable
Fitness function
–Measures how well the objective is obtained
Iterative process
–Candidate solutions combine to produce new generations
–Reproduction, crossover, mutation
Genetic Algorithms
Establish the problem and its parameters
–Number of initial solutions, number of offspring, number of parents and offspring kept each generation, mutation level, probability distribution of the crossover point
Generate an initial set of solutions
Compute fitness functions
Total all fitness functions
Compare each solution's fitness function to the total
Apply crossover
Apply random mutation
Repeat until a good-enough solution is found or improvement stops
(A minimal GA sketch follows this list.)
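A minimal genetic-algorithm sketch following the steps above. The problem (maximize the number of 1-bits in a 20-bit string), the population size, mutation level, and generation limit are illustrative assumptions, not parameters from the text.

```python
import random

BITS, POP_SIZE, MUTATION_RATE, GENERATIONS = 20, 30, 0.02, 100   # assumed parameters

def fitness(solution):
    """Fitness function: how well the objective is obtained (count of 1-bits)."""
    return sum(solution)

def select(population):
    """Fitness-proportional selection: compare each fitness to the total."""
    total = sum(fitness(s) for s in population)
    pick = random.uniform(0, total)
    running = 0.0
    for s in population:
        running += fitness(s)
        if running >= pick:
            return s
    return population[-1]

def crossover(parent_a, parent_b):
    """Single-point crossover at a random position."""
    point = random.randint(1, BITS - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(solution):
    """Random mutation: flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in solution]

# Generate the initial set of candidate solutions
population = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    best = max(population, key=fitness)
    if fitness(best) == BITS:          # good-enough solution reached
        break
    # Produce the next generation through reproduction, crossover, and mutation
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

print("best fitness:", fitness(max(population, key=fitness)))
```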
Fuzzy Logic
Mathematical theory of fuzzy sets
Handles imprecise thinking
Describes human perception
Continuous logic
–Not 100% true or false, black or white
Fuzzy neural networks
–Fuzzification: fuzzy logic applied to inputs and outputs to create the model
–Defuzzification: model converted back to the original input and output scales
–Output becomes input for another intelligent system
(A membership-function sketch follows this list.)
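A small sketch of fuzzification with a triangular membership function: a crisp value belongs to a fuzzy set to a degree between 0 and 1, rather than being simply true or false. The "warm" temperature ranges below are illustrative assumptions.

```python
def triangular(x, left, peak, right):
    """Degree of membership in a triangular fuzzy set, between 0 and 1."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def warm(temperature_c):
    # Assumed fuzzy set: "warm" ramps up from 15°C, peaks at 22°C, fades by 30°C
    return triangular(temperature_c, 15.0, 22.0, 30.0)

for t in (10, 18, 22, 27, 32):
    print(f"{t}°C is warm to degree {warm(t):.2f}")
```

Defuzzification would perform the reverse mapping, converting fuzzy model output back onto the original crisp scale.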