Computational Intelligence Winter Term 2012/13 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund

Lecture 08: Plan for Today
● Approximate Reasoning
● Fuzzy Control

Approximate Reasoning
So far:
● p: IF X is A THEN Y is B  →  R(x, y) = Imp( A(x), B(y) )   (rule as relation; fuzzy implication)
● rule: IF X is A THEN Y is B,  fact: X is A',  conclusion: Y is B'  →  B'(y) = sup_{x ∈ X} t( A'(x), R(x, y) )   (composition rule of inference)
Thus:
● B'(y) = sup_{x ∈ X} t( A'(x), Imp( A(x), B(y) ) )
given: fuzzy rule; input: fuzzy set A'; output: fuzzy set B'
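Over finite (discretized) universes the composition rule of inference is a max of a t-norm over x. A minimal sketch in Python, assuming the minimum t-norm and the Gödel implication as one possible choice; all names are illustrative, not taken from the slides.

```python
import numpy as np

def goedel_imp(a, b):
    # Goedel implication: Imp(a, b) = 1 if a <= b else b
    return np.where(a <= b, 1.0, b)

def infer(A_prime, A, B, t_norm=np.minimum, imp=goedel_imp):
    """B'(y) = sup_x t( A'(x), Imp( A(x), B(y) ) ) on discretized universes X, Y."""
    R = imp(A[:, None], B[None, :])                        # R(x, y), shape (|X|, |Y|)
    return np.max(t_norm(A_prime[:, None], R), axis=0)     # sup over x

# tiny example with |X| = 3, |Y| = 4
A = np.array([0.2, 1.0, 0.5])
B = np.array([0.0, 0.3, 1.0, 0.6])
A_prime = np.array([0.0, 1.0, 0.0])                        # crisp input at the middle point
print(infer(A_prime, A, B))                                # equals Imp(1, B(y)) = B(y)
```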

Approximate Reasoning
here: A'(x) = 1 for x = x₀ and A'(x) = 0 otherwise   (crisp input!)
B'(y) = sup_{x ∈ X} t( A'(x), Imp( A(x), B(y) ) )
      = sup { t( 0, Imp( A(x), B(y) ) ) for x ≠ x₀ ;  t( 1, Imp( A(x₀), B(y) ) ) for x = x₀ }
      = Imp( A(x₀), B(y) )
since t(0, a) = 0 (case x ≠ x₀) and t(a, 1) = a (case x = x₀).

Approximate Reasoning
Lemma:
a) t(a, 1) = a
b) t(a, b) ≤ min { a, b }
c) t(0, a) = 0
Proof:
ad a) Identical to axiom 1 of t-norms.
ad b) From monotonicity (axiom 2) it follows for b ≤ 1 that t(a, b) ≤ t(a, 1) = a by a). Commutativity (axiom 3) and monotonicity lead, in case a ≤ 1, to t(a, b) = t(b, a) ≤ t(b, 1) = b. Thus, t(a, b) is less than or equal to both a and b, which in turn implies t(a, b) ≤ min{ a, b }.
ad c) From b) it follows that 0 ≤ t(0, a) ≤ min { 0, a } = 0 and therefore t(0, a) = 0. ■
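These properties are easy to sanity-check numerically for the standard t-norms. A small sketch (illustrative, not part of the lecture) that tests the lemma for the minimum, product, and Łukasiewicz t-norms on a grid:

```python
import itertools
import numpy as np

t_norms = {
    "minimum":     lambda a, b: min(a, b),
    "product":     lambda a, b: a * b,
    "lukasiewicz": lambda a, b: max(0.0, a + b - 1.0),
}

grid = np.linspace(0.0, 1.0, 11)
for name, t in t_norms.items():
    for a, b in itertools.product(grid, grid):
        assert abs(t(a, 1.0) - a) < 1e-12        # a) t(a, 1) = a
        assert t(a, b) <= min(a, b) + 1e-12      # b) t(a, b) <= min{a, b}
        assert abs(t(0.0, a)) < 1e-12            # c) t(0, a) = 0
    print(f"{name}: lemma holds on the grid")
```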

Approximate Reasoning
Multiple rules:
IF X is A_1 THEN Y is B_1  →  R_1(x, y) = Imp_1( A_1(x), B_1(y) )
IF X is A_2 THEN Y is B_2  →  R_2(x, y) = Imp_2( A_2(x), B_2(y) )
IF X is A_3 THEN Y is B_3  →  R_3(x, y) = Imp_3( A_3(x), B_3(y) )
…
IF X is A_n THEN Y is B_n  →  R_n(x, y) = Imp_n( A_n(x), B_n(y) )
fact: X is A'   conclusion: Y is B'
→ aggregation of rules or of local inferences necessary!
Multiple rules for crisp input: x₀ is given
B_1'(y) = Imp_1( A_1(x₀), B_1(y) ), …, B_n'(y) = Imp_n( A_n(x₀), B_n(y) )   ⇒ aggregate!
⇒ B'(y) = aggr { B_1'(y), …, B_n'(y) }, where aggr ∈ { min, max }

Approximate Reasoning
FITA: “First inference, then aggregate!”
1. Each rule of the form IF X is A_k THEN Y is B_k is transformed by an appropriate fuzzy implication Imp_k(·, ·) into a relation R_k: R_k(x, y) = Imp_k( A_k(x), B_k(y) ).
2. Determine B_k'(y) = R_k(x, y) ∘ A'(x) for all k = 1, …, n (local inference).
3. Aggregate to B'(y) = aggr( B_1'(y), …, B_n'(y) ) with an aggregating function aggr(·).
FATI: “First aggregate, then inference!”
1. Each rule of the form IF X is A_k THEN Y is B_k is transformed by an appropriate fuzzy implication Imp_k(·, ·) into a relation R_k: R_k(x, y) = Imp_k( A_k(x), B_k(y) ).
2. Aggregate R_1, …, R_n to a superrelation with an aggregating function aggr(·): R(x, y) = aggr( R_1(x, y), …, R_n(x, y) ).
3. Determine B'(y) = R(x, y) ∘ A'(x) w.r.t. the superrelation (inference).
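Both schemes are straightforward to implement over discretized universes. The sketch below uses illustrative names, the minimum t-norm for the sup-t composition, and max as the aggregating function, so FITA and FATI can be compared directly; it is a sketch under these assumptions, not the lecture's reference implementation.

```python
import numpy as np

def sup_t_compose(A_prime, R, t_norm=np.minimum):
    # B'(y) = sup_x t( A'(x), R(x, y) )
    return np.max(t_norm(A_prime[:, None], R), axis=0)

def fita(A_prime, relations, aggr=np.maximum):
    # first inference per rule, then aggregate the local results
    locals_ = [sup_t_compose(A_prime, R) for R in relations]
    out = locals_[0]
    for b in locals_[1:]:
        out = aggr(out, b)
    return out

def fati(A_prime, relations, aggr=np.maximum):
    # first aggregate the relations into a superrelation, then one inference
    R = relations[0]
    for Rk in relations[1:]:
        R = aggr(R, Rk)
    return sup_t_compose(A_prime, R)

# two toy rules over |X| = 3, |Y| = 4, Mamdani-style relations R_k(x, y) = min(A_k(x), B_k(y))
A1, B1 = np.array([1.0, 0.5, 0.0]), np.array([0.0, 1.0, 0.5, 0.0])
A2, B2 = np.array([0.0, 0.5, 1.0]), np.array([0.0, 0.0, 0.5, 1.0])
rels = [np.minimum(A1[:, None], B1[None, :]), np.minimum(A2[:, None], B2[None, :])]
A_prime = np.array([0.0, 1.0, 0.0])   # crisp input at the middle point
print(fita(A_prime, rels), fati(A_prime, rels))   # identical here: crisp input, max aggregation
```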

Approximate Reasoning
FITA: B'(y) = aggr( B_1'(y), …, B_n'(y) ) = aggr( R_1(x, y) ∘ A'(x), …, R_n(x, y) ∘ A'(x) )
FATI: B'(y) = R(x, y) ∘ A'(x) = aggr( R_1(x, y), …, R_n(x, y) ) ∘ A'(x)
1. Which principle is better, FITA or FATI?
2. Are FITA and FATI equivalent?

Approximate Reasoning
On the equivalence of FITA and FATI:
special case (crisp input!): A'(x) = 1 for x = x₀ and A'(x) = 0 otherwise
FITA: B'(y) = aggr( B_1'(y), …, B_n'(y) ) = aggr( Imp_1( A_1(x₀), B_1(y) ), …, Imp_n( A_n(x₀), B_n(y) ) )
FATI: B'(y) = R(x, y) ∘ A'(x) = sup_{x ∈ X} t( A'(x), R(x, y) )   (from now on: special case)
            = R(x₀, y) = aggr( Imp_1( A_1(x₀), B_1(y) ), …, Imp_n( A_n(x₀), B_n(y) ) )
evidently: both coincide for the sup-t composition with an arbitrary t-norm, provided the same aggregating function aggr(·) is used in FITA and FATI

Approximate Reasoning
● AND-connected premises
IF X_1 = A_11 AND X_2 = A_12 AND … AND X_m = A_1m THEN Y = B_1
…
IF X_1 = A_n1 AND X_2 = A_n2 AND … AND X_m = A_nm THEN Y = B_n
reduce to a single premise for each rule k: A_k(x_1, …, x_m) = min { A_k1(x_1), A_k2(x_2), …, A_km(x_m) }, or in general: a t-norm
● OR-connected premises
IF X_1 = A_11 OR X_2 = A_12 OR … OR X_m = A_1m THEN Y = B_1
…
IF X_1 = A_n1 OR X_2 = A_n2 OR … OR X_m = A_nm THEN Y = B_n
reduce to a single premise for each rule k: A_k(x_1, …, x_m) = max { A_k1(x_1), A_k2(x_2), …, A_km(x_m) }, or in general: an s-norm
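Reducing a multi-input premise to a single activation value is a one-liner per rule. A minimal sketch (names illustrative), assuming min and max as the default t-norm and s-norm:

```python
from functools import reduce

def and_premise(memberships, t_norm=min):
    # AND-connected premise: combine the per-input membership degrees with a t-norm
    return reduce(t_norm, memberships)

def or_premise(memberships, s_norm=max):
    # OR-connected premise: combine the per-input membership degrees with an s-norm
    return reduce(s_norm, memberships)

# rule k with three inputs: A_k1(x1) = 0.7, A_k2(x2) = 0.4, A_k3(x3) = 0.9
print(and_premise([0.7, 0.4, 0.9]))   # 0.4
print(or_premise([0.7, 0.4, 0.9]))    # 0.9
```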

Approximate Reasoning
important:
● if rules of the form IF X is A THEN Y is B are interpreted as logical implications, then R(x, y) = Imp( A(x), B(y) ) makes sense
● we obtain: B'(y) = sup_{x ∈ X} t( A'(x), R(x, y) )
→ the worse the match of the premise A'(x), the larger the fuzzy set B'(y)
→ follows immediately from axiom 1: a ≤ b implies Imp(a, z) ≥ Imp(b, z)
interpretation of the output set B'(y):
● B'(y) is the set of values that are still possible
● each rule leads to an additional restriction of the values that are still possible
→ the fuzzy sets B_k'(y) obtained from the single rules must therefore be mutually intersected!
→ aggregation via B'(y) = min { B_1'(y), …, B_n'(y) }

Approximate Reasoning
important:
● if rules of the form IF X is A THEN Y is B are not interpreted as logical implications, then the function Fct(·) in R(x, y) = Fct( A(x), B(y) ) can be chosen as required for the desired interpretation.
● frequent choices (especially in fuzzy control):
- R(x, y) = min { A(x), B(y) }   Mamdani “implication”
- R(x, y) = A(x) · B(y)   Larsen “implication”
→ of course, these are not implications but special t-norms!
→ thus, if the relation R(x, y) is given, the composition rule of inference B'(y) = A'(x) ∘ R(x, y) = sup_{x ∈ X} min { A'(x), R(x, y) } can still lead to a conclusion via fuzzy logic.
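For crisp input, the Mamdani and Larsen choices reduce to clipping or scaling each rule's output set by its activation level. A short sketch over a discretized output universe (illustrative names, not from the slides):

```python
import numpy as np

def mamdani_rule(activation, B):
    # Mamdani "implication": clip the output set at the activation level
    return np.minimum(activation, B)

def larsen_rule(activation, B):
    # Larsen "implication": scale the output set by the activation level
    return activation * B

B = np.array([0.0, 0.5, 1.0, 0.5, 0.0])    # discretized output set B(y)
print(mamdani_rule(0.75, B))                # [0.   0.5  0.75 0.5  0.  ]
print(larsen_rule(0.75, B))                 # [0.    0.375 0.75  0.375 0.   ]
```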

Approximate Reasoning
example: [JM96, p. 244ff.] industrial drill machine → control of cooling supply
modelling
linguistic variable: rotation speed
linguistic terms: very low, low, medium, high, very high
ground set: X with 0 ≤ x ≤ … [rpm]
(figure: membership functions of the terms vl, l, m, h, vh over the rotation-speed axis, each with maximum membership 1)

Approximate Reasoning
example (continued): industrial drill machine → control of cooling supply
modelling
linguistic variable: cooling quantity
linguistic terms: very small, small, normal, much, very much
ground set: Y with 0 ≤ y ≤ 18 [liter / time unit]
(figure: membership functions of the terms vs, s, n, m, vm over the cooling-quantity axis, each with maximum membership 1)

Approximate Reasoning
example (continued): industrial drill machine → control of cooling supply
rule base:
IF rotation speed IS very low THEN cooling quantity IS very small
IF rotation speed IS low THEN cooling quantity IS small
IF rotation speed IS medium THEN cooling quantity IS normal
IF rotation speed IS high THEN cooling quantity IS much
IF rotation speed IS very high THEN cooling quantity IS very much
notation: fuzzy sets S_vl, S_l, S_m, S_h, S_vh for “rotation speed”; fuzzy sets C_vs, C_s, C_n, C_m, C_vm for “cooling quantity”

Approximate Reasoning
example (continued): industrial drill machine → control of cooling supply
1. input: crisp value x₀ = … min⁻¹ (no fuzzy set!)
→ fuzzification = determine the membership in each fuzzy set over X
→ yields S' = (0, 0, ¾, ¼, 0) via x₀ ↦ ( S_vl(x₀), S_l(x₀), S_m(x₀), S_h(x₀), S_vh(x₀) )
2. FITA: local inference ⇒ since Imp(0, a) = 0 we only need to consider:
S_m: C'_n(y) = Imp( ¾, C_n(y) )
S_h: C'_m(y) = Imp( ¼, C_m(y) )
3. aggregation: C'(y) = aggr { C'_n(y), C'_m(y) } = max { Imp( ¾, C_n(y) ), Imp( ¼, C_m(y) ) }
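The fuzzification and local-inference steps of this example can be reproduced numerically. The sketch below uses invented triangular membership functions and an assumed crisp input x₀; only the resulting activation pattern (0, 0, ¾, ¼, 0) is meant to mirror the slide, the exact term shapes and the value of x₀ are not from the lecture. Imp is realized by Mamdani clipping.

```python
import numpy as np

def tri(x, a, b, c):
    # triangular membership function with support [a, c] and peak at b
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

# hypothetical term definitions for "rotation speed" (breakpoints invented for illustration)
speed_terms = {
    "very low": (0, 1500, 4500), "low": (3000, 6000, 9000), "medium": (6000, 9000, 12000),
    "high": (9000, 12000, 15000), "very high": (12000, 15000, 18000),
}
# hypothetical term definitions for "cooling quantity" on Y = [0, 18] liter / time unit
y = np.linspace(0.0, 18.0, 181)
cooling_terms = {
    "very small": (0, 0.1, 4.5), "small": (3, 6, 9), "normal": (6, 9, 12),
    "much": (9, 12, 15), "very much": (13.5, 17.9, 18.0),
}
rules = {"very low": "very small", "low": "small", "medium": "normal",
         "high": "much", "very high": "very much"}

x0 = 9750.0                                            # assumed crisp rotation speed
activations = {t: tri(x0, *p) for t, p in speed_terms.items()}
print(activations)                                     # medium: 0.75, high: 0.25, rest 0

# FITA with Mamdani clipping, then max aggregation
clipped = [np.minimum(a, tri(y, *cooling_terms[rules[t]]))
           for t, a in activations.items() if a > 0.0]
C_prime = np.max(clipped, axis=0)                      # aggregated output set C'(y)
print(C_prime.max())                                   # 0.75
```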

Approximate Reasoning
example (continued): industrial drill machine → control of cooling supply
C'(y) = max { Imp( ¾, C_n(y) ), Imp( ¼, C_m(y) ) }
in case of a control task, typically no logic-based interpretation:
→ max-aggregation, and
→ the relation R(x, y) is not interpreted as an implication
often: R(x, y) = min { A(x), B(y) }   (“Mamdani controller”)
thus: C'(y) = max { min { ¾, C_n(y) }, min { ¼, C_m(y) } }
→ graphical illustration

Approximate Reasoning
example (continued): industrial drill machine → control of cooling supply
C'(y) = max { min { ¾, C_n(y) }, min { ¼, C_m(y) } } at x₀ = … rpm
(figure: membership functions of the rotation-speed terms vl, l, m, h, vh with the crisp input x₀ marked, and of the cooling-quantity terms vs, s, n, m, vm with C_n clipped at ¾ and C_m clipped at ¼)

Fuzzy Control
open-loop and closed-loop control: affect the dynamic behavior of a system in a desired manner
● open-loop control
the controller is aware of the reference values and has a model of the system
⇒ control values can be adjusted such that the process value of the system equals the reference value
problem: noise! ⇒ deviations from the reference value are not detected
● closed-loop control
now: detection of deviations from the reference value is possible (by means of measurements / sensors), and new control values can take the amount of deviation into account

Fuzzy Control
open-loop control
(block diagram: reference value w → controller → control value u → process → process value y)
assumption: undisturbed operation ⇒ process value = reference value

Fuzzy Control
closed-loop control
(block diagram: reference value w → controller → control value u → process, disturbed by noise d → process value y, fed back to the controller)
control deviation = reference value − process value

Fuzzy Control
required: a model of the system / process
→ as differential equations or difference equations (DEs)
→ well-developed theory available
so, why fuzzy control?
● there exists no process model in the form of DEs etc. (an operator / human being has realized the control by hand)
● process with high-dimensional nonlinearities → no classical methods available
● control goals are vaguely formulated (“soft” gear changes in cars)

Fuzzy Control
fuzzy description of the control behavior (similar to approximate reasoning):
IF X is A_1 THEN Y is B_1
IF X is A_2 THEN Y is B_2
IF X is A_3 THEN Y is B_3
…
IF X is A_n THEN Y is B_n
fact: X is A'   conclusion: Y is B'
but: the fact A' is not a fuzzy set but a crisp input → actually, it is the current process value
the fuzzy controller executes the inference step → yields the fuzzy output set B'(y)
but: a crisp control value is required for the process / system → defuzzification (= “condense” the fuzzy set to a crisp value)

Fuzzy Control
defuzzification
● maximum method
- only the active rule with the largest activation level is taken into account
→ suitable for pattern recognition / classification
→ decision for a single alternative among finitely many alternatives
- selection is independent of the activation level of the rule (0.05 vs. 0.95)
- if used for control: discontinuous curve of output values (leaps)
Def.: rule k is active if A_k(x₀) > 0
(figures: three example output sets B'(y) illustrating the method)

Fuzzy Control
defuzzification
● maximum mean value method (mean of maxima)
- all active rules with the largest activation level are taken into account: Y* = { y ∈ Y : B'(y) = hgt(B') }
→ interpolations possible, but need not be useful
→ obviously only useful for neighboring rules with maximal activation
- selection is independent of the activation level of the rule (0.05 vs. 0.95)
- if used in control: discontinuous curve of output values (leaps)
(figures: two example output sets B'(y); for the second one: useful solution?)

Fuzzy Control
defuzzification
● center-of-maxima method (COM)
- only the extreme active rules with the largest activation level are taken into account: Y* = { y ∈ Y : B'(y) = hgt(B') }
→ interpolations possible, but need not be useful
→ obviously only useful for neighboring rules with maximal activation level
- selection is independent of the activation level of the rule (0.05 vs. 0.95)
- in case of control: discontinuous curve of output values (leaps)
(figures: three example output sets B'(y) with questionable defuzzified values)
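On a discretized output universe, the three maximum-based defuzzifiers differ only in how they reduce the set Y* of maximizing points. A compact sketch (illustrative, assuming the height hgt(B') is attained on the grid):

```python
import numpy as np

def max_set(y, B):
    # Y* = { y in Y : B'(y) = hgt(B') }, up to a small numerical tolerance
    return y[np.isclose(B, B.max())]

def maximum_method(y, B):
    # pick one (here: the first) point of maximal membership
    return float(max_set(y, B)[0])

def mean_of_maxima(y, B):
    # average over all maximizing points
    return float(np.mean(max_set(y, B)))

def center_of_maxima(y, B):
    # midpoint between the leftmost and rightmost maximizing point
    ys = max_set(y, B)
    return float((ys[0] + ys[-1]) / 2.0)

# output set: a triangle around 9 clipped at 0.75, plus one around 12 clipped at 0.25
y = np.linspace(0.0, 18.0, 181)
B = np.maximum(np.minimum(0.75, np.maximum(0, 1 - np.abs(y - 9) / 3)),
               np.minimum(0.25, np.maximum(0, 1 - np.abs(y - 12) / 3)))
print(maximum_method(y, B), mean_of_maxima(y, B), center_of_maxima(y, B))
```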

Fuzzy Control
defuzzification
● center-of-gravity method (COG)
- all active rules are taken into account
→ but numerically expensive … (an objection that, today, only holds for hardware solutions!)
→ the borders of the output range cannot appear in the output (∃ work-around)
- if only a single rule is active: independent of its activation level
- continuous curve of output values

Fuzzy Control
Excursion: COG
(figures: a triangular membership function with corner points y_1, y_2, y_3 and a trapezoidal membership function with corner points y_1, y_2, y_3, y_4 over the output axis y, membership B'(y) up to 1; the slide also shows a worked numerical example with COG ≈ 3.77)
counterpart in probability theory: the expectation value
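The formulas on this slide did not survive the transcript; the general definition and the triangle case are standard and can be restated as a hedged reconstruction (not copied from the original slide):

```latex
% center of gravity of a fuzzy output set B'(y)
\hat{y} \;=\; \frac{\int_Y y \, B'(y)\, \mathrm{d}y}{\int_Y B'(y)\, \mathrm{d}y}

% for a triangular membership function with corner points y_1, y_2, y_3 (peak at y_2),
% the region under B' is exactly that triangle, so its centroid gives
\hat{y} \;=\; \frac{y_1 + y_2 + y_3}{3}
```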

Fuzzy Control
(figure: a piecewise linear output set z = B'(y) with breakpoints y_1, …, y_7)
assumption: the fuzzy membership functions are piecewise linear
⇒ the output set B'(y) is represented by a sequence of points (y_1, z_1), (y_2, z_2), …, (y_n, z_n)
⇒ the area under B'(y) and the weighted area can be determined additively, piece by piece
⇒ linear equation z = m·y + b; inserting (y_i, z_i) and (y_{i+1}, z_{i+1}) yields m and b for each of the n−1 linear sections
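A direct implementation of this piecewise scheme: on each linear section z = m·y + b, the area ∫ z dy and the weighted area ∫ y·z dy have closed forms, and the COG is their summed ratio. A sketch with illustrative names, integrating the polyline exactly:

```python
def cog_piecewise_linear(points):
    """COG of a piecewise linear output set given as [(y1, z1), ..., (yn, zn)], y ascending."""
    area = 0.0
    weighted = 0.0
    for (y0, z0), (y1, z1) in zip(points, points[1:]):
        if y1 == y0:                            # skip degenerate (vertical) sections
            continue
        m = (z1 - z0) / (y1 - y0)               # slope of z = m*y + b on this section
        b = z0 - m * y0                         # intercept
        area     += m / 2 * (y1**2 - y0**2) + b * (y1 - y0)            # integral of B'(y)
        weighted += m / 3 * (y1**3 - y0**3) + b / 2 * (y1**2 - y0**2)  # integral of y*B'(y)
    return weighted / area

# clipped triangle (a trapezoid) symmetric about y = 9, so the COG must be 9.0
pts = [(6.0, 0.0), (8.25, 0.75), (9.75, 0.75), (12.0, 0.0)]
print(cog_piecewise_linear(pts))   # 9.0
```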

Fuzzy Control
defuzzification
● center-of-area method (COA)
developed as an approximation of COG
let ŷ_k be the COGs of the individual output sets B'_k(y):
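The formula itself was lost in the transcript; a commonly used form of this approximation (an assumption, not necessarily the slide's exact formula) weights each ŷ_k by the area of its output set:

```latex
% area-weighted average of the per-rule centers of gravity (assumed reconstruction)
\hat{y}_{\mathrm{COA}} \;=\;
\frac{\sum_{k=1}^{n} \hat{y}_k \int_Y B'_k(y)\, \mathrm{d}y}
     {\sum_{k=1}^{n} \int_Y B'_k(y)\, \mathrm{d}y}
```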