Ch. Eick: Evolutionary Machine Learning

Learning Paradigms and General Aspects of Learning
- Different Forms of Learning: the learning agent receives feedback with respect to its actions (e.g. from a teacher)
  - Supervised Learning: feedback is received with respect to all possible actions of the agent
  - Reinforcement Learning: feedback is only received with respect to the action the agent actually took
  - Unsupervised Learning: learning when there is no hint at all about the correct action
- Inductive Learning is a form of supervised learning that centers on learning a function from sets of training examples. Popular inductive learning techniques include decision trees, neural networks, nearest neighbor approaches, discriminant analysis, and regression.
- The performance of an inductive learning system is usually evaluated using n-fold cross-validation.
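
To make the evaluation step concrete, here is a minimal sketch of n-fold cross-validation; the toy 1-nearest-neighbor learner and all function names are illustrative assumptions, not part of the original slides.

```python
# Minimal sketch of n-fold cross-validation (names and the 1-NN learner
# are illustrative assumptions, not taken from the slides).
import numpy as np

def n_fold_cross_validation(X, y, train_and_test, n=10, seed=0):
    """Split the examples into n folds; train on n-1 folds, test on the
    held-out fold, and return the average accuracy over the n runs."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(len(X))
    folds = np.array_split(indices, n)
    accuracies = []
    for i in range(n):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(n) if j != i])
        accuracies.append(train_and_test(X[train_idx], y[train_idx],
                                         X[test_idx], y[test_idx]))
    return float(np.mean(accuracies))

def one_nearest_neighbor(X_train, y_train, X_test, y_test):
    """Toy 1-nearest-neighbor learner, used only to make the sketch runnable."""
    preds = [y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))] for x in X_test]
    return float(np.mean(np.array(preds) == y_test))

# Usage: accuracy = n_fold_cross_validation(X, y, one_nearest_neighbor, n=10)
```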

Classifier Systems
- According to Goldberg [113], a classifier system is "a machine learning system that learns syntactically simple string rules to guide its performance in an arbitrary environment".
- A classifier system consists of three main components:
  - Rule and message system
  - Apportionment of credit system
  - Genetic algorithm (for evolving classifiers)
- First implemented in a system called CS1 by Holland and Reitman (1978).
- Classifier rules are condition:action strings with # as a don't-care symbol, e.g. ##00:0001 (see the matching sketch below).
- The fitness of a classifier is defined by its surrounding environment, which pays payoff to classifiers and extracts fees from them.
- Classifier systems employ a Michigan approach (populations consist of single rules) in the context of an externally defined fitness function.
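
As an illustration of the rule and message system, the following sketch shows how a condition containing the don't-care symbol # matches a binary message; the specific rule strings are made-up examples, not taken from the slides.

```python
# Minimal sketch of ternary classifier rules (condition:action strings);
# the rule strings below are illustrative, not taken from the slides.
def matches(condition, message):
    """A condition matches a message if every position is equal or '#' (don't care)."""
    return len(condition) == len(message) and all(
        c == '#' or c == m for c, m in zip(condition, message))

rules = {"##00": "0001", "00##": "1100"}   # condition -> action (example values)
message = "0100"
matched_actions = [action for cond, action in rules.items() if matches(cond, message)]
print(matched_actions)   # ['0001'] -- only ##00 matches the message 0100
```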

Bucket Brigade Algorithm
- Developed by Holland for the apportionment of credit; it relies on the model of a service economy consisting of two main components: an auction and a clearing house.
- The environment as well as the classifiers post messages.
- Each classifier maintains a bank account that measures its strength. Classifiers that match a posted message make a bid proportional to their strength. Usually the highest-bidding classifier is selected to post its message (other, more parallel schemes are also used).
- The auction permits appropriate classifiers to post their messages. Once a classifier is selected for activation, it must clear its payment through the clearing house, paying its bid to other classifiers or to the environment for the matching messages rendered. A matched and activated classifier sends its bid to those classifiers responsible for sending the messages that matched the bidding classifier's conditions. The bid money is distributed in some manner among those classifiers.
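
A minimal sketch of the auction step described above, assuming a single bid coefficient c_bid and simple winner-take-all selection; the class layout and the constant value are illustrative assumptions, not Holland's original formulation (the clearing-house payment itself is sketched after the next slide).

```python
# Minimal sketch of the auction: matching classifiers bid proportionally to
# their strength, and the highest bidder wins the right to post its message.
# C_BID and the Classifier layout are illustrative assumptions.
C_BID = 0.1

class Classifier:
    def __init__(self, condition, action, strength):
        self.condition, self.action, self.strength = condition, action, strength
    def matches(self, message):
        return all(c == '#' or c == m for c, m in zip(self.condition, message))
    def bid(self):
        return C_BID * self.strength   # B_i = c_bid * S_i

def auction(classifiers, message):
    """Return the highest-bidding classifier among those matching the message."""
    bidders = [cl for cl in classifiers if cl.matches(message)]
    return max(bidders, key=lambda cl: cl.bid()) if bidders else None
```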

Bucket Brigade (continued)
- Rules that cooperate with a classifier are rewarded by receiving the classifier's bid: the last classifier in a chain receives the environmental reward, and all other classifiers receive the reward from their predecessor.
- A classifier's strength may be subject to taxation. The idea behind taxation is to punish inactive classifiers: T_i(t) = c_tax * S_i(t)
- The strength of a classifier is updated using the following equation (a sketch of this update follows below): S_i(t+1) = S_i(t) - P_i(t) - T_i(t) + R_i(t)
- A classifier bids proportionally to its strength: B_i = c_bid * S_i
- Genetic algorithms are used to evolve classifiers. A classifier's strength defines its fitness; fitter classifiers reproduce with higher probability (e.g. roulette-wheel selection might be employed), and binary-string mutation and crossover operators are used to generate new classifiers. Newly generated classifiers replace weak, low-strength classifiers (other schemes, such as crowding, could also be employed).
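
The strength update can be written as one bookkeeping step per classifier. The coefficient values below are arbitrary illustrative choices, and this sketch taxes every classifier on every step, which is only one of several possible taxation schemes.

```python
# Minimal sketch of S_i(t+1) = S_i(t) - P_i(t) - T_i(t) + R_i(t);
# C_BID and C_TAX are illustrative assumptions.
C_BID, C_TAX = 0.1, 0.01

def update_strength(strength, active, reward):
    """One bucket-brigade bookkeeping step for a single classifier.
    payment: what it pays this step (its bid, if it was activated);
    tax:     deduction that, over time, punishes classifiers that never earn;
    reward:  payoff received from the environment or from successor classifiers."""
    payment = C_BID * strength if active else 0.0   # P_i(t) = B_i = c_bid * S_i(t)
    tax = C_TAX * strength                          # T_i(t) = c_tax * S_i(t)
    return strength - payment - tax + reward        # S_i(t+1)
```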

Pittsburgh-style Systems
- Populations consist of rule-sets, not of individual rules.
- No bucket brigade algorithm is necessary.
- Mechanisms to evaluate individual rules are usually missing.
- Michigan-style systems are geared towards applications with dynamically changing requirements ("models of adaptation"); Pittsburgh-style systems rely on more static environments and assume a fixed fitness function for complete rule-sets, which is not necessary in the Michigan approach.
- Pittsburgh-approach systems usually have to cope with variable-length chromosomes (see the sketch below).
- Popular Pittsburgh-style systems include:
  - Smith's LS-1 system (learns symbolic rule-sets)
  - Janikow's GIL system (learns symbolic rules; employs operators of Michalski's inductive learning theory as its genetic operators)
  - Giordana & Saitta's REGAL (learns symbolic concept descriptions)
  - DELVAUX (learns (numerical) Bayesian rule-sets)
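
A rough sketch of a Pittsburgh-style individual, assuming a simple condition:action string encoding rather than the symbolic representations used by the systems listed above: one chromosome is a whole variable-length rule set, and fitness is assigned to the rule set as a unit. All names and the fitness measure are illustrative assumptions.

```python
# Minimal sketch of a Pittsburgh-style individual: a variable-length rule set
# evaluated as a whole. The encoding and fitness function are illustrative.
import random

def random_rule(length=4):
    condition = ''.join(random.choice('01#') for _ in range(length))
    action = random.choice('01')
    return (condition, action)

def random_individual(max_rules=8):
    """A variable-length chromosome: a list of 1..max_rules rules."""
    return [random_rule() for _ in range(random.randint(1, max_rules))]

def fitness(individual, examples):
    """Fraction of training examples classified correctly by the first matching rule."""
    def classify(message):
        for condition, action in individual:
            if all(c == '#' or c == m for c, m in zip(condition, message)):
                return action
        return None
    return sum(classify(msg) == label for msg, label in examples) / len(examples)

# Usage: examples = [("0100", "1"), ("1111", "0")]
#        score = fitness(random_individual(), examples)
```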

New Trends in Learning Classifier Systems (LCS)
- Holland-style LCS work is very similar to work in reinforcement learning, especially evolutionary reinforcement learning and an approach called "Q-learning". Newer papers claim that the bucket brigade and Q-learning are basically the same thing, and that LCS can benefit from recent advances in the area of Q-learning.
- Wilson's accuracy-based XCS has received significant attention in the literature (to be covered later).
- Holland stresses the adaptive component of "his invention" in his newer work.
- Recently, many Pittsburgh-style systems have been designed that learn rule-based systems using evolutionary computing and are quite different from Holland's data-driven message-passing systems, such as:
  - Systems that learn Bayesian rules or Bayesian belief networks
  - Systems that learn fuzzy rules
  - Systems that learn first-order logic rules
  - Systems that learn PROLOG-style programs
- Work somewhat similar to classifier systems has become quite popular in the field of agent-based systems that have to learn how to communicate and collaborate in a distributed environment.

Important Parameters for XCS
XCS learns and maintains the following parameters for all of its classifiers during the course of its operation:
- p is the expected payoff; it has a strong influence (combined with the rule's fitness value) on whether a matching classifier's action is selected for execution.
- ε is the error made in predicting the payoff.
- F (called fitness) denotes a classifier's "normalized accuracy"; accuracy is the inverse of the degree of error made by a classifier. F combined with as determines which classifiers are chosen to be deleted from the population; F combined with p determines which actions of competing classifiers are selected for execution.
- as denotes the average size of the action sets this classifier has belonged to; the smaller as/F is, the less likely it becomes that this classifier is deleted.
- exp (experience) counts how often the classifier has belonged to the action set; it has some influence on the prediction of the other parameters: if exp is low, default parameters are used when predicting the other parameters (especially ε, F and as).
- Moreover, it is important to know that only classifiers belonging to the action set are considered for reproduction.
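
As a rough data-structure sketch (not a full XCS implementation), the bookkeeping above might be kept per classifier roughly as follows; the initial values and the deletion-vote heuristic are illustrative assumptions, and the as parameter is renamed as_size because `as` is a Python keyword.

```python
# Per-classifier bookkeeping mirroring the XCS parameters described above;
# initial values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class XCSClassifier:
    condition: str        # e.g. "01#1"
    action: str           # e.g. "1"
    p: float = 10.0       # expected payoff (prediction)
    epsilon: float = 0.0  # error made in predicting the payoff
    F: float = 0.01       # fitness: normalized accuracy, derived from epsilon
    as_size: float = 1.0  # average size of the action sets it has belonged to
    exp: int = 0          # experience: how often it has been in an action set

    def deletion_vote(self):
        """Classifiers with a large as_size/F ratio are more likely to be deleted."""
        return self.as_size / max(self.F, 1e-9)
```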

Symbolic Empirical Learning (SEL)
- SEL's topic: creating symbolic descriptions whose structure is unknown a priori. Its most important subfield is "learning symbolic concept descriptions from sets of examples". Popular systems include:
  - Systems of the ID3/C4 family that employ decision trees (originated from the work of Quinlan and his co-workers); C4.5 is one of the most popular and powerful inductive learning systems (see the sketch below).
  - Systems of the AQ family, which originated from the work of Michalski and his co-workers.
- On the other hand, various systems that employ numerical empirical learning have been proposed to obtain classifications from sets of examples; these include:
  - neural networks
  - systems that employ statistical and/or probabilistic reasoning, and fuzzy techniques
- GA-style systems (in between the numerical and symbolic approaches)
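
To illustrate the core computation behind the ID3/C4.5 family, here is a minimal sketch of entropy and information gain as used by ID3 (C4.5 refines this into a gain ratio); the toy data set is an illustrative assumption.

```python
# Entropy and information gain for choosing a decision-tree split attribute;
# the toy weather-style data set is illustrative only.
from collections import Counter
from math import log2

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(examples, attribute):
    """Reduction in label entropy obtained by splitting on one attribute.
    examples: list of (feature_dict, label) pairs."""
    labels = [label for _, label in examples]
    before = entropy(labels)
    after = 0.0
    for value in {feats[attribute] for feats, _ in examples}:
        subset = [label for feats, label in examples if feats[attribute] == value]
        after += len(subset) / len(examples) * entropy(subset)
    return before - after

data = [({"outlook": "sunny"}, "no"), ({"outlook": "rain"}, "yes"),
        ({"outlook": "sunny"}, "no"), ({"outlook": "overcast"}, "yes")]
print(information_gain(data, "outlook"))   # 1.0 bit: this split separates the labels perfectly
```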