
Rough Sets, Their Extensions and Applications

1. Introduction

- Rough set theory offers one of the most distinctive recent approaches for dealing with incomplete or imperfect knowledge.
- Rough set theory has given rise to various extensions of the original theory and to an increasingly wide field of applications.
- This paper gives a concise overview of the basic ideas of rough set theory and of its major extensions.

2. Rough set theory

- Rough set theory (RST) is an extension of conventional set theory that supports approximations in decision making.
- A rough set is itself the approximation of a vague concept (set) by a pair of precise concepts, called the lower and upper approximations.

ISA Lab., CU, Korea

- The lower approximation is a description of the domain objects that are known with certainty to belong to the subset of interest.
- The upper approximation is a description of the objects that possibly belong to the subset.

2.1 Information and decision systems

- An information system can be viewed as a table of data, consisting of objects (rows in the table) and attributes (columns).
- An information system may be extended by the inclusion of decision attributes; the result is called a decision system.

Table 1: An example decision system

  x ∈ U |  a  b  c  d  |  e
  ------+--------------+----
    0   |  S  R  T  T  |  R
    1   |  R  S  S  S  |  T
    2   |  T  R  R  S  |  S
    3   |  S  S  R  T  |  T
    4   |  S  R  T  R  |  S
    5   |  T  T  R  S  |  S
    6   |  T  S  S  S  |  T
    7   |  R  S  S  R  |  S

The table consists of four conditional features (a, b, c, d), a decision feature (e) and eight objects.

- An information system is a pair I = (U, A), where U is a non-empty finite set of objects (the universe of discourse) and A is a non-empty finite set of attributes such that a: U → V_a for every a ∈ A. V_a is the set of values that attribute a may take.

2.2 Indiscernibility

- With any P ⊆ A there is an associated equivalence relation IND(P):

    IND(P) = {(x, y) ∈ U × U | ∀a ∈ P, a(x) = a(y)}

- The partition of U determined by IND(P) is denoted U/IND(P) or U/P, which is simply the set of equivalence classes generated by IND(P):

    U/P = ⊗{U/IND({a}) | a ∈ P}

  where

    A ⊗ B = {X ∩ Y | X ∈ A, Y ∈ B, X ∩ Y ≠ ∅}

- The equivalence classes of the indiscernibility relation with respect to P are denoted [x]_P, x ∈ U.
- Example: for P = {b, c},

    U/IND(P) = U/IND({b}) ⊗ U/IND({c})
             = {{0, 2, 4}, {1, 3, 6, 7}, {5}} ⊗ {{2, 3, 5}, {1, 6, 7}, {0, 4}}
             = {{2}, {0, 4}, {3}, {1, 6, 7}, {5}}.

2.3 Lower and upper approximations

- Let X ⊆ U. X can be approximated using only the information contained within P by constructing the P-lower and P-upper approximations of the classical crisp set X:

    PX  = {x | [x]_P ⊆ X}          (the P-lower approximation)
    P̄X  = {x | [x]_P ∩ X ≠ ∅}      (the P-upper approximation)

- The tuple ⟨PX, P̄X⟩ is called a rough set. Consider the approximation of concept X in Fig. 1. Each square in the diagram represents an equivalence class, generated by indiscernibility between object values.
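As a concrete illustration, the partition and approximation operations above can be sketched in Python over the Table 1 data (the names `TABLE`, `partition`, `lower_approx` and `upper_approx` are illustrative, not from the paper):

```python
# Decision table from Table 1: objects 0-7, conditional attributes a-d, decision e.
TABLE = {
    0: dict(a='S', b='R', c='T', d='T', e='R'),
    1: dict(a='R', b='S', c='S', d='S', e='T'),
    2: dict(a='T', b='R', c='R', d='S', e='S'),
    3: dict(a='S', b='S', c='R', d='T', e='T'),
    4: dict(a='S', b='R', c='T', d='R', e='S'),
    5: dict(a='T', b='T', c='R', d='S', e='S'),
    6: dict(a='T', b='S', c='S', d='S', e='T'),
    7: dict(a='R', b='S', c='S', d='R', e='S'),
}

def partition(P):
    """U/IND(P): equivalence classes of objects indiscernible on the attributes in P."""
    classes = {}
    for x, row in TABLE.items():
        classes.setdefault(tuple(row[a] for a in sorted(P)), set()).add(x)
    return list(classes.values())

def lower_approx(P, X):
    """P-lower approximation of X: union of the classes wholly contained in X."""
    return {x for c in partition(P) if c <= X for x in c}

def upper_approx(P, X):
    """P-upper approximation of X: union of the classes that intersect X."""
    return {x for c in partition(P) if c & X for x in c}

# U/IND({b, c}) yields the classes {2}, {0,4}, {3}, {1,6,7}, {5}, as in the example.
print(sorted(sorted(c) for c in partition({'b', 'c'})))
```

Representing each object as a row dictionary keeps the indiscernibility test a simple tuple comparison, which is all the example requires.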

Fig 1. A rough set

2.4 Positive, negative and boundary regions

- Let P and Q be equivalence relations over U. Then the positive, negative and boundary regions are defined as

    POS_P(Q) = ∪_{X ∈ U/Q} PX
    NEG_P(Q) = U − ∪_{X ∈ U/Q} P̄X
    BND_P(Q) = ∪_{X ∈ U/Q} P̄X − ∪_{X ∈ U/Q} PX

- The positive region comprises all objects of U that can be classified to classes of U/Q using the information contained within attributes P.
- The boundary region is the set of objects that can possibly, but not certainly, be classified in this way.
- The negative region is the set of objects that cannot be classified to classes of U/Q.
- For example, let P = {b, c} and Q = {e}; then

    POS_P(Q) = {2, 3, 5}
    NEG_P(Q) = ∅
    BND_P(Q) = {0, 1, 4, 6, 7}

2.5 Attribute dependency and significance

- An important issue in data analysis is discovering dependencies between attributes.
- A set of attributes Q depends totally on a set of attributes P, denoted P ⇒ Q, if all attribute values from Q are uniquely determined by values of attributes from P.
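The three regions for the worked example can be checked with a short Python sketch over the Table 1 data (the helper names are illustrative, not from the paper):

```python
# Decision table from Table 1: objects 0-7, conditional attributes a-d, decision e.
TABLE = {
    0: dict(a='S', b='R', c='T', d='T', e='R'),
    1: dict(a='R', b='S', c='S', d='S', e='T'),
    2: dict(a='T', b='R', c='R', d='S', e='S'),
    3: dict(a='S', b='S', c='R', d='T', e='T'),
    4: dict(a='S', b='R', c='T', d='R', e='S'),
    5: dict(a='T', b='T', c='R', d='S', e='S'),
    6: dict(a='T', b='S', c='S', d='S', e='T'),
    7: dict(a='R', b='S', c='S', d='R', e='S'),
}

def partition(P):
    """U/IND(P): equivalence classes of objects indiscernible on the attributes in P."""
    classes = {}
    for x, row in TABLE.items():
        classes.setdefault(tuple(row[a] for a in sorted(P)), set()).add(x)
    return list(classes.values())

def lower_approx(P, X):
    return {x for c in partition(P) if c <= X for x in c}

def upper_approx(P, X):
    return {x for c in partition(P) if c & X for x in c}

def regions(P, Q):
    """POS, NEG and BND regions of the Q-partition with respect to P."""
    U = set(TABLE)
    lowers, uppers = set(), set()
    for X in partition(Q):
        lowers |= lower_approx(P, X)
        uppers |= upper_approx(P, X)
    return lowers, U - uppers, uppers - lowers

pos, neg, bnd = regions({'b', 'c'}, {'e'})
print(sorted(pos), sorted(neg), sorted(bnd))
```

Running this reproduces the example: POS = {2, 3, 5}, NEG = ∅ and BND = {0, 1, 4, 6, 7}.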

- In rough set theory, dependency is defined in the following way: for P, Q ⊆ A, it is said that Q depends on P in a degree k (0 ≤ k ≤ 1), denoted P ⇒_k Q, if

    k = γ_P(Q) = |POS_P(Q)| / |U|

  where |S| stands for the cardinality of the set S.
- In the example, the degree of dependency of attribute {e} on the attributes {b, c} is

    γ_{b,c}({e}) = |{2, 3, 5}| / |U| = 3/8.

- Given P, Q and an attribute a ∈ P, the significance of attribute a upon Q is defined by

    σ_P(Q, a) = γ_P(Q) − γ_{P−{a}}(Q)

- For example, if P = {a, b, c} and Q = {e}, then

    γ_{a,b,c}({e}) = |{2, 3, 5, 6}| / |U| = 4/8.

- Calculating the significance of the three attributes gives

    σ_P(Q, a) = 4/8 − 3/8 = 1/8
    σ_P(Q, b) = 4/8 − 4/8 = 0
    σ_P(Q, c) = 4/8 − 4/8 = 0

  From this it follows that attribute a is indispensable, but attributes b and c can be dispensed with when considering the dependency between the decision attribute and the given individual conditional attributes.
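The dependency degree γ and the significance σ can be computed directly from the definitions, again as an illustrative Python sketch over the Table 1 data:

```python
# Decision table from Table 1: objects 0-7, conditional attributes a-d, decision e.
TABLE = {
    0: dict(a='S', b='R', c='T', d='T', e='R'),
    1: dict(a='R', b='S', c='S', d='S', e='T'),
    2: dict(a='T', b='R', c='R', d='S', e='S'),
    3: dict(a='S', b='S', c='R', d='T', e='T'),
    4: dict(a='S', b='R', c='T', d='R', e='S'),
    5: dict(a='T', b='T', c='R', d='S', e='S'),
    6: dict(a='T', b='S', c='S', d='S', e='T'),
    7: dict(a='R', b='S', c='S', d='R', e='S'),
}

def partition(P):
    """U/IND(P): equivalence classes of objects indiscernible on the attributes in P."""
    classes = {}
    for x, row in TABLE.items():
        classes.setdefault(tuple(row[a] for a in sorted(P)), set()).add(x)
    return list(classes.values())

def lower_approx(P, X):
    return {x for c in partition(P) if c <= X for x in c}

def gamma(P, Q):
    """Dependency degree gamma_P(Q) = |POS_P(Q)| / |U|."""
    pos = set()
    for X in partition(Q):
        pos |= lower_approx(P, X)
    return len(pos) / len(TABLE)

def significance(P, Q, a):
    """sigma_P(Q, a) = gamma_P(Q) - gamma_{P - {a}}(Q)."""
    return gamma(P, Q) - gamma(P - {a}, Q)

P, Q = {'a', 'b', 'c'}, {'e'}
print(gamma({'b', 'c'}, Q))                         # degree of dependency of {e} on {b, c}
print([significance(P, Q, x) for x in sorted(P)])   # significance of a, b, c
```

This reproduces the worked values: γ_{b,c}({e}) = 0.375, γ_{a,b,c}({e}) = 0.5, and significances 1/8, 0 and 0 for a, b and c respectively.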

2.6 Reducts

- To search for a minimal representation of the original dataset, the concept of a reduct is introduced: a reduct is a minimal subset R of the initial attribute set C such that, for a given set of decision attributes D, γ_R(D) = γ_C(D).
- R is a minimal subset if γ_{R−{a}}(D) ≠ γ_R(D) for all a ∈ R. This means that no attribute can be removed from the subset without affecting the dependency degree.
- The collection of all reducts is denoted by

    R_all = {X | X ⊆ C, γ_X(D) = γ_C(D); γ_{X−{a}}(D) ≠ γ_X(D), ∀a ∈ X}

- The intersection of all the sets in R_all is called the core, the elements of which are those attributes that cannot be eliminated without introducing more contradictions to the representation of the dataset.
- The QuickReduct algorithm attempts to calculate reducts for a decision problem.

QuickReduct(C, D)
C: the set of all conditional attributes; D: the set of decision attributes.

1) R ← {}
2) do
3)   T ← R
4)   for each x ∈ (C − R)
5)     if γ_{R∪{x}}(D) > γ_T(D)
6)       T ← R ∪ {x}
7)   R ← T
8) until γ_R(D) == γ_C(D)
9) return R

2.7 Discernibility matrix

- Many applications of rough sets make use of discernibility matrices for finding rules or reducts.
- A discernibility matrix of a decision table is a symmetric |U| × |U| matrix with entries defined by

    c_ij = {a ∈ C | a(x_i) ≠ a(x_j)},  i, j = 1, …, |U|

- Each c_ij contains those attributes that differ between objects i and j.
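The QuickReduct pseudocode above translates almost line for line into Python. This is a minimal sketch over the Table 1 data; the `sorted` call is added only to make the greedy choice deterministic and is not part of the algorithm as given:

```python
# Decision table from Table 1: objects 0-7, conditional attributes a-d, decision e.
TABLE = {
    0: dict(a='S', b='R', c='T', d='T', e='R'),
    1: dict(a='R', b='S', c='S', d='S', e='T'),
    2: dict(a='T', b='R', c='R', d='S', e='S'),
    3: dict(a='S', b='S', c='R', d='T', e='T'),
    4: dict(a='S', b='R', c='T', d='R', e='S'),
    5: dict(a='T', b='T', c='R', d='S', e='S'),
    6: dict(a='T', b='S', c='S', d='S', e='T'),
    7: dict(a='R', b='S', c='S', d='R', e='S'),
}

def partition(P):
    """U/IND(P): equivalence classes of objects indiscernible on the attributes in P."""
    classes = {}
    for x, row in TABLE.items():
        classes.setdefault(tuple(row[a] for a in sorted(P)), set()).add(x)
    return list(classes.values())

def lower_approx(P, X):
    return {x for c in partition(P) if c <= X for x in c}

def gamma(P, Q):
    """Dependency degree gamma_P(Q) = |POS_P(Q)| / |U|."""
    pos = set()
    for X in partition(Q):
        pos |= lower_approx(P, X)
    return len(pos) / len(TABLE)

def quickreduct(C, D):
    """Greedy forward selection: repeatedly add the attribute that most increases gamma."""
    R = set()
    while gamma(R, D) != gamma(C, D):
        T = set(R)
        for x in sorted(C - R):  # sorted only so this sketch is deterministic
            if gamma(R | {x}, D) > gamma(T, D):
                T = R | {x}
        R = T
    return R

print(quickreduct({'a', 'b', 'c', 'd'}, {'e'}))
```

On this table the greedy search first picks d (γ = 2/8), then b (γ = 1), returning the reduct {b, d}. Note that QuickReduct is a heuristic: it finds one reduct, not necessarily all of them.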

Table 2. The decision-relative discernibility matrix

  x ∈ U |    0    |    1    |   2   |   3   |    4    |  5  |  6
  ------+---------+---------+-------+-------+---------+-----+------
    1   | a,b,c,d |         |       |       |         |     |
    2   | a,c,d   | a,b,c   |       |       |         |     |
    3   | b,c     |         | a,b,d |       |         |     |
    4   | d       | a,b,c,d |       | b,c,d |         |     |
    5   | a,b,c,d | a,b,c   |       | a,b,d |         |     |
    6   | a,b,c,d |         | b,c   |       | a,b,c,d | b,c |
    7   | a,b,c,d | d       |       | a,c,d |         |     | a,d

(Empty cells correspond to pairs of objects with the same decision value.)

- Grouping all entries containing single attributes forms the core of the dataset (those attributes appearing in every reduct). Here, the core of the dataset is {d}.
- A discernibility function f_D is a Boolean function of m Boolean variables a*_1, …, a*_m (corresponding to the attributes a_1, …, a_m) defined as

    f_D(a*_1, …, a*_m) = ∧{∨c*_ij | 1 ≤ j < i ≤ |U|, c_ij ≠ ∅}

  where c*_ij = {a* | a ∈ c_ij}.

- The decision-relative discernibility function for the example is

    f_D = (a∨b∨c∨d) ∧ (a∨c∨d) ∧ (a∨b∨c) ∧ (b∨c) ∧ (a∨b∨d) ∧ d ∧ (b∨c∨d) ∧ (a∨d)

- Further simplification can be performed by removing those clauses that are subsumed by others:

    f_D = (b∨c) ∧ d

- Hence, the minimal reducts are {b, d} and {c, d}.
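The matrix, core and reduct computation above can be sketched in Python for the Table 1 data. The absorption step keeps only clauses not subsumed by (i.e. not supersets of) other clauses, then expands the remaining conjunction of disjunctions into minimal attribute sets:

```python
from itertools import product

# Decision table from Table 1: objects 0-7, conditional attributes a-d, decision e.
TABLE = {
    0: dict(a='S', b='R', c='T', d='T', e='R'),
    1: dict(a='R', b='S', c='S', d='S', e='T'),
    2: dict(a='T', b='R', c='R', d='S', e='S'),
    3: dict(a='S', b='S', c='R', d='T', e='T'),
    4: dict(a='S', b='R', c='T', d='R', e='S'),
    5: dict(a='T', b='T', c='R', d='S', e='S'),
    6: dict(a='T', b='S', c='S', d='S', e='T'),
    7: dict(a='R', b='S', c='S', d='R', e='S'),
}

def discernibility_matrix():
    """Decision-relative matrix: for each pair of objects with different
    decisions, record the condition attributes on which they differ."""
    m = {}
    for i in TABLE:
        for j in TABLE:
            if j < i and TABLE[i]['e'] != TABLE[j]['e']:
                m[(i, j)] = frozenset(a for a in 'abcd' if TABLE[i][a] != TABLE[j][a])
    return m

m = discernibility_matrix()

# Core: attributes that appear as singleton entries of the matrix.
core = {a for entry in m.values() if len(entry) == 1 for a in entry}

# Absorption: drop clauses that strictly contain another clause, then expand
# the remaining clauses and keep only the minimal attribute sets (the reducts).
clauses = set(m.values())
minimal_clauses = [c for c in clauses if not any(o < c for o in clauses)]
candidates = {frozenset(p) for p in product(*minimal_clauses)}
reducts = {r for r in candidates if not any(o < r for o in candidates)}

print(sorted(core), sorted(sorted(r) for r in reducts))
```

Running this reproduces the result above: the core is {d} and the minimal reducts are {b, d} and {c, d}. Brute-force expansion of the discernibility function is exponential in general; it is shown here only because the example has two clauses after absorption.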