Slide 1: Feature Selection Based on Relative Attribute Dependency: An Experimental Study
Jianchao Han¹, Ricardo Sanchez¹, Xiaohua Hu²
¹ Computer Science Dept., California State University Dominguez Hills
² College of Information Science and Technology, Drexel University
RSFDGrC 2005, Regina, Canada, August 2005
Slide 2: Agenda
- Introduction
- Rough Set Approach
- Relative Attribute Dependency Based on Rough Set Theory
- A Heuristic Algorithm for Finding Optimal Reducts
- Experiment Results
- Related Work
- Summary and Future Work
Slide 3: Introduction
- Data reduction
  - Horizontal reduction: sampling
  - Vertical reduction: feature selection
- Feature selection
  - Statistical feature selection
  - Significant feature selection
  - Rough set feature selection
- Search process
  - Top-down search
  - Bottom-up search
  - Exhaustive search
  - Heuristic search
Slide 4: Rough Set Based Feature Selection
- Search strategies: bottom-up search, brute-force search, heuristic search
- Rough Set Theory
  - Introduced by Pawlak in the 1980s
  - An efficient tool for data mining, concept generation, induction, and classification
Slide 5: Rough Set Theory: Information System (IS)
An information system is a quadruple $IS = (U, C \cup D, V, f)$, where:
- $U = \{u_1, u_2, \ldots, u_n\}$ is a non-empty set of tuples, called the data table
- C is a non-empty set of condition attributes
- D is a non-empty set of decision attributes, with $C \cap D = \emptyset$
- $V_a$ is the domain of attribute a, with at least two elements, and $V = \bigcup_{a \in C \cup D} V_a$
- f is a function $f: U \times (C \cup D) \to V$ with $f(u, a) \in V_a$
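To make the notation concrete, here is a minimal Python sketch of an information system. The attribute names and rows are illustrative assumptions, not data from the paper.

```python
# Toy information system IS = (U, C ∪ D, V, f).
# U: the data table, one dict per tuple; attribute names are hypothetical.
U = [
    {"headache": "yes", "temp": "high",   "flu": "yes"},
    {"headache": "yes", "temp": "normal", "flu": "no"},
    {"headache": "no",  "temp": "high",   "flu": "yes"},
    {"headache": "no",  "temp": "normal", "flu": "no"},
]
C = {"headache", "temp"}   # condition attributes
D = {"flu"}                # decision attributes (C and D are disjoint)

def f(u, a):
    """Information function f: U x (C ∪ D) -> V, with f(u, a) in V_a."""
    return u[a]

def domain(a):
    """V_a, the domain of attribute a, recovered from the table."""
    return {f(u, a) for u in U}
```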
Slide 6: Approximation
Let $A \subseteq C \cup D$, let $t_i, t_j \in U$, and let $X \subseteq U$.
- Define $R_A = \{(t_i, t_j) \in U \times U : \forall a \in A,\; t_i[a] = t_j[a]\}$. The indiscernibility relation, denoted IND, is an equivalence relation on U, so $R_A$ is an equivalence relation on U.
- The approximation space (U, IND) partitions U into the equivalence classes $[A] = \{A_1, A_2, \ldots, A_m\}$ induced by $R_A$.
- Lower approximation, or positive region, of X: $Low_A(X) = \bigcup \{A_i \in [A] \mid A_i \subseteq X,\ 1 \le i \le m\}$
- Upper approximation of X based on A: $Upp_A(X) = \bigcup \{A_i \in [A] \mid A_i \cap X \ne \emptyset,\ 1 \le i \le m\}$
- Boundary area of X: $Boundary_A(X) = Upp_A(X) - Low_A(X)$
- Negative region of X: $Neg_A(X) = \bigcup \{A_i \in [A] \mid A_i \subseteq U - X,\ 1 \le i \le m\}$
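A small sketch of these operators, reusing the tuple-of-dicts table above and representing X and the equivalence classes as sets of row indices; it recomputes the partition on every call, so it is deliberately naive.

```python
from collections import defaultdict

def partition(U, A):
    """U/IND(A): equivalence classes (sets of row indices) of tuples
    that agree on every attribute in A."""
    groups = defaultdict(set)
    for i, u in enumerate(U):
        groups[tuple(u[a] for a in sorted(A))].add(i)
    return list(groups.values())

def lower_approx(U, A, X):
    """Low_A(X): union of classes entirely contained in X."""
    return set().union(*(c for c in partition(U, A) if c <= X))

def upper_approx(U, A, X):
    """Upp_A(X): union of classes that intersect X."""
    return set().union(*(c for c in partition(U, A) if c & X))

def boundary(U, A, X):
    """Boundary_A(X) = Upp_A(X) - Low_A(X)."""
    return upper_approx(U, A, X) - lower_approx(U, A, X)

def negative_region(U, A, X):
    """Neg_A(X): union of classes contained in U - X."""
    complement = set(range(len(U))) - X
    return set().union(*(c for c in partition(U, A) if c <= complement))
```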
Slide 7: Core Attributes and Reducts
Let $[D] = \{D_1, D_2, \ldots, D_k\}$ be the set of elementary sets partitioned by $R_D$.
- Approximation aggregation: $Low_A([D]) = \bigcup_{j=1}^{k} Low_A(D_j)$ and $Upp_A([D]) = \bigcup_{j=1}^{k} Upp_A(D_j)$
- $a \in C$ is a core attribute of C if $Low_{C-\{a\}}([D]) \ne Low_C([D])$, and a dispensable attribute otherwise
- $R \subseteq C$ is a reduct of C in U w.r.t. D if $Low_R([D]) = Low_C([D])$ and, for every proper subset $B \subset R$, $Low_B([D]) \ne Low_C([D])$
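Building on the same representation, a self-contained sketch of the core-attribute test (the partition helper is repeated so the snippet runs on its own):

```python
from collections import defaultdict

def partition(U, A):
    """U/IND(A) as sets of row indices."""
    groups = defaultdict(set)
    for i, u in enumerate(U):
        groups[tuple(u[a] for a in sorted(A))].add(i)
    return list(groups.values())

def low_of_decision(U, A, D):
    """Low_A([D]): union of Low_A(D_j) over all decision classes D_j."""
    classes_A = partition(U, A)
    return set().union(*(c for Dj in partition(U, D)
                           for c in classes_A if c <= Dj))

def core_attributes(U, C, D):
    """Core of C w.r.t. D: attributes whose removal shrinks Low_C([D])."""
    full = low_of_decision(U, C, D)
    return {a for a in C if low_of_decision(U, set(C) - {a}, D) != full}
```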
Slide 8: Calculation of Reducts
- Finding all reducts is NP-hard
- Traditional method: decision matrix
- Some newer methods still suffer from intensive computation of either discernibility functions or positive regions
- Our method:
  - A new, equivalent definition of reducts
  - Count distinct tuples (rows) in the IS table
  - An efficient algorithm for finding reducts
Slide 9: Relative Attribute Dependency
- Let $P \subseteq C \cup D$; $\Pi_P(U)$ denotes the projection of U on P.
- Let $Q \subseteq C$. The degree of relative dependency of Q on D over U is $\delta_Q(D) = |\Pi_Q(U)| \,/\, |\Pi_{Q \cup D}(U)|$, where $|\Pi_X(U)|$ is the number of equivalence classes in U/IND(X).
- Theorem. Assume U is consistent. $Q \subseteq C$ is a reduct of C with respect to D if and only if
  1) $\delta_Q(D) = \delta_C(D)$, and
  2) for every attribute $q \in Q$, $\delta_{Q - \{q\}}(D) \ne \delta_Q(D)$.
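The point of this definition is that the dependency degree reduces to counting distinct projected rows. A minimal sketch, assuming the tuple-of-dicts table used in the earlier snippets:

```python
def count_distinct(U, attrs):
    """|Pi_attrs(U)|: distinct tuples in the projection of U on attrs,
    i.e. the number of equivalence classes in U/IND(attrs)."""
    return len({tuple(u[a] for a in sorted(attrs)) for u in U})

def dependency(U, Q, D):
    """delta_Q(D) = |Pi_Q(U)| / |Pi_(Q ∪ D)(U)|; it equals 1 exactly when
    appending the decision attributes splits no equivalence class of Q."""
    return count_distinct(U, Q) / count_distinct(U, set(Q) | set(D))
```

Since the projected rows can be hashed or radix-sorted, each evaluation stays near-linear in the number of tuples, with no discernibility matrix or positive-region computation.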
Slide 10: Computation Model (RAD)
- Input: a decision table U, a condition attribute set C, and a decision attribute set D
- Output: a minimum reduct R of the condition attribute set C with respect to D in U
- Computation: find a subset R of C such that $\delta_R(D) = \delta_C(D)$ and no proper subset of R satisfies this equality.
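Combining the model with the theorem from the previous slide, a candidate subset can be verified directly; `is_reduct` below is a hypothetical helper, with the distinct-row count inlined so it runs standalone.

```python
def _distinct(U, attrs):
    """Number of distinct rows in the projection of U on attrs."""
    return len({tuple(u[a] for a in sorted(attrs)) for u in U})

def is_reduct(U, R, C, D):
    """Check both theorem conditions for a candidate R ⊆ C."""
    dep = lambda Q: _distinct(U, Q) / _distinct(U, set(Q) | set(D))
    if dep(R) != dep(C):          # condition 1: same dependency as full C
        return False
    # condition 2: dropping any single attribute changes the dependency
    return all(dep(set(R) - {q}) != dep(R) for q in R)
```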
Slide 11: A Heuristic Algorithm for Finding Optimal Reducts
Given the partition $U/IND(D) = \{D_1, \ldots, D_k\}$ of U by D, the entropy, or expected information based on the partition $U/IND(\{q\}) = \{X_1, \ldots, X_m\}$ of U by an attribute $q \in C$, is given by
$$E(q) = \sum_{i=1}^{m} \frac{|X_i|}{|U|}\, I(X_i), \qquad \text{where } I(X_i) = -\sum_{j=1}^{k} p_{ij} \log_2 p_{ij} \text{ and } p_{ij} = \frac{|X_i \cap D_j|}{|X_i|}.$$
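A sketch of E(q) under this ID3-style reading of "expected information"; since the slide's formula did not survive extraction, the exact weighting shown here is an assumption.

```python
import math
from collections import defaultdict

def entropy(U, q, D):
    """E(q): expected information of the partition U/IND({q}) with
    respect to the decision classes U/IND(D)."""
    blocks = defaultdict(list)
    for u in U:                                   # partition U by q
        blocks[u[q]].append(tuple(u[a] for a in sorted(D)))
    e = 0.0
    for labels in blocks.values():
        weight = len(labels) / len(U)             # |X_i| / |U|
        for d in set(labels):
            p = labels.count(d) / len(labels)     # p_ij
            e -= weight * p * math.log2(p)
    return e
```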
Slide 12: Algorithm Design
1. R ← C, Q ← Ø
2. For each attribute q ∈ C do
3.   Compute the entropy E(q) of q
4.   Q ← Q ∪ {q}
5. While Q ≠ Ø do
6.   q ← the attribute in Q with maximum entropy E(q)
7.   Q ← Q − {q}
8.   If $\delta_{R - \{q\}}(D) = \delta_C(D)$ then
9.     R ← R − {q}
10. Return R

Algorithm complexity:
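A runnable sketch of the whole algorithm, with the helpers inlined. The dequeue order (highest-entropy, i.e. least decision-relevant, attribute first) is an assumption where the pseudocode is ambiguous.

```python
import math
from collections import defaultdict

def _distinct(U, attrs):
    return len({tuple(u[a] for a in sorted(attrs)) for u in U})

def _dependency(U, Q, D):
    return _distinct(U, Q) / _distinct(U, set(Q) | set(D))

def _entropy(U, q, D):
    blocks = defaultdict(list)
    for u in U:
        blocks[u[q]].append(tuple(u[a] for a in sorted(D)))
    e = 0.0
    for labels in blocks.values():
        w = len(labels) / len(U)
        for d in set(labels):
            p = labels.count(d) / len(labels)
            e -= w * p * math.log2(p)
    return e

def find_reduct(U, C, D):
    """Heuristic reduct search: start from R = C and greedily drop any
    attribute whose removal keeps delta_R(D) equal to delta_C(D)."""
    R = set(C)
    target = _dependency(U, R, D)
    # Assumption: try the highest-entropy (least informative about D)
    # attributes first; the slide does not state the queue order.
    for q in sorted(C, key=lambda a: _entropy(U, a, D), reverse=True):
        if _dependency(U, R - {q}, D) == target:
            R -= {q}                               # q is dispensable
    return R
```

On the toy table from the first snippet, `find_reduct(U, C, D)` returns `{'temp'}`, since temp alone already determines flu there.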
Slide 13: Experiments
- Data sets: 10 data sets from the UCI repository
- Various sizes (number of tuples and attributes)
- Categorical attributes
- Preprocessing: remove all inconsistent tuples
Slide 14: Experiment Results
[Bar charts comparing each original data set with its computed reduct across the ten UCI data sets: a) number of rows; b) number of condition attributes.]
Slide 15: Classification Accuracy
[Accuracy chart; content not captured in the transcript.]
Slide 16: Related Work
- Grzymala-Busse: LERS; its rough measure of a rule, compared with our dependency measure
- Nguyen et al.: similar in using radix sorting; ours maintains neither a discernibility relation nor lower and upper approximations
- Others
Slide 17: Conclusion
- Summary
  - Relative attribute dependency
  - Computation model
  - Algorithm implementation
  - Experiments
- Future work
  - Refinement
  - Application
  - Extension to numerical attributes