
Machine Learning - Ramya Karri, Rushin Barot

Machine Learning: Rough Set Theory in Machine Learning
- Knower's knowledge: Closed World Assumption vs. Open World Assumption
- How is the learner's knowledge affected by the knowledge of the knower?

Learning from Examples
Two agents:
- Knower
- Learner
Closed World Assumption:
- Universe of discourse U
- The knower has complete knowledge about the universe
- The universe is closed, i.e. nothing exists besides U

Quality of Learning
The learner's knowledge consists of attributes of objects.
- Can the learner's knowledge match the knower's knowledge?
- Is the learner able to learn the concepts demonstrated by the knower?

Quality of Learning (contd.)
The quality of learning can be defined as the degree of dependency between the sets of the knower's and the learner's attributes, i.e. how exactly the knower's knowledge can be learned.

Example
A decision table over the universe U = {1, ..., 10}, with the learner's attributes a, b, c, d and the knower's attribute e (the table itself appears as an image in the original slides).

Example (contd.)
B = {a, b, c, d} is the set of the learner's attributes, and e is the knower's attribute. The knower's knowledge consists of the following concepts:
- |e0| = {3, 7, 10} = X0
- |e1| = {1, 2, 4, 5, 8} = X1
- |e2| = {6, 9} = X2

Example (contd.)
The learner's knowledge consists of the following basic concepts: the equivalence classes of the indiscernibility relation determined by B (listed as an image in the original slides).

To learn the knower's knowledge means to express each of the knower's basic concepts by means of the learner's basic concepts. This is done by computing, for each knower's concept X, its B-lower approximation {x ∈ U : [x]_B ⊆ X} (instances certainly in X) and its B-upper approximation {x ∈ U : [x]_B ∩ X ≠ ∅} (instances possibly in X), where [x]_B is the B-elementary class of x.
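These approximations can be sketched in a few lines of Python. The toy table below is hypothetical (the original example table is an image), but the lower/upper logic is the standard rough set definition:

```python
# B-lower and B-upper approximations of a concept X.
# Hypothetical toy table: object -> its values on the learner's attributes B.
table = {
    1: ('y', 0), 2: ('y', 0),   # objects 1 and 2 are B-indiscernible
    3: ('n', 1),
    4: ('y', 1), 5: ('y', 1),   # objects 4 and 5 are B-indiscernible
}

def b_classes(table):
    """Equivalence classes of IND(B): objects with identical B-values."""
    classes = {}
    for obj, values in table.items():
        classes.setdefault(values, set()).add(obj)
    return list(classes.values())

def lower(table, X):
    """Union of B-classes entirely contained in X."""
    return {o for c in b_classes(table) if c <= X for o in c}

def upper(table, X):
    """Union of B-classes that intersect X."""
    return {o for c in b_classes(table) if c & X for o in c}

X = {1, 3, 4}                    # a hypothetical knower's concept
print(sorted(lower(table, X)))   # [3]              -- certainly in X
print(sorted(upper(table, X)))   # [1, 2, 3, 4, 5]  -- possibly in X
```

Objects between the upper and lower approximations form the boundary region: the learner cannot decide whether they belong to X.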

Inferences
- Concept X0 is exact and can be learned fully.
- Concept X1 is roughly B-definable: only instances 1, 2 and 8 can be learned as belonging to it; instances 3, 7 and 10 do not belong to the concept; for instances 4, 5, 6 and 9 the learner cannot decide whether they belong to X1 or not.
- Concept X2 is internally B-undefinable, since there are no positive instances of the concept.

Derive the quality of learning
POS_B({e}) = the set of instances properly classified by the learner = {1, 2, 3, 7, 8, 10}. Therefore the quality of learning is γ_B({e}) = |POS_B({e})| / |U| = 6/10 = 0.6.
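This computation can be sketched directly. The B-classes below are reconstructed to be consistent with the inferences above ({1, 2, 8}, {3}, {7}, {10} are pure; {4, 6} and {5, 9} are an assumed split of the mixed boundary instances, since the original table is an image):

```python
# Positive region POS_B({e}) and quality of learning gamma = |POS| / |U|.
# B-classes reconstructed from the example; {4,6} and {5,9} are an assumed
# split of the mixed instances (any mixed grouping gives the same POS).
B_classes = [{1, 2, 8}, {3}, {7}, {10}, {4, 6}, {5, 9}]
decision = {3: 'e0', 7: 'e0', 10: 'e0',
            1: 'e1', 2: 'e1', 4: 'e1', 5: 'e1', 8: 'e1',
            6: 'e2', 9: 'e2'}

# Group instances by their decision value (the knower's concepts X0, X1, X2).
concepts = {}
for obj, d in decision.items():
    concepts.setdefault(d, set()).add(obj)

# POS_B({e}) = union of the lower approximations of all decision concepts.
pos = {o for X in concepts.values() for c in B_classes if c <= X for o in c}
gamma = len(pos) / len(decision)
print(sorted(pos))   # [1, 2, 3, 7, 8, 10]
print(gamma)         # 0.6
```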

Decision algorithm
The decision rules derived from the table, and an alternative decision algorithm, are shown as images in the original slides.

Are all the instances necessary to learn the knower's knowledge? Some instances are crucial for concept learning, but some are not. Removing instance 10 gives the following table:

(Table with attributes a, b, c, d, e and instance 10 removed; shown as an image in the original slides.)

Let us remove instances 4 and 8. (The resulting table is shown as an image in the original slides.)
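Whether a given instance is superfluous can be checked mechanically: drop it and compare the certain decision rules before and after. A sketch with hypothetical attribute signatures (the real attribute values are in the image tables):

```python
# An instance is superfluous if removing it leaves the certain decision
# rules unchanged. Signatures s1, s2, ... stand for rows of B-values.
def certain_rules(table):
    """Map each consistent B-signature to its (unique) decision value."""
    groups = {}
    for sig, d in table.values():
        groups.setdefault(sig, set()).add(d)
    return {sig: min(ds) for sig, ds in groups.items() if len(ds) == 1}

table = {1: ('s1', 'e1'), 2: ('s1', 'e1'),
         3: ('s2', 'e0'), 10: ('s2', 'e0'),
         4: ('s3', 'e1'), 6: ('s3', 'e2')}

before = certain_rules(table)
after = certain_rules({k: v for k, v in table.items() if k != 10})
print(before == after)   # True: instance 10 repeats instance 3's row
```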

Case of an Imperfect Teacher
- How does a lack of knowledge on the knower's part affect the learner's ability to learn?
- Would the learner be able to discover the knower's deficiency?

(Decision table with the learner's attributes a, b and the knower's attribute c; shown as an image in the original slides.)

B = {a, b} is the set of the learner's attributes and c is the knower's attribute. There are two concepts, X+ and X-, denoted by the values + and - (instances the knower could not classify are marked 0 and form X0). Compute whether the sets X+, X- and X0 are definable in terms of the attributes a and b.

For every substitution of the value 0 in attribute c by + or -, the boundary region remains unchanged. The knower's lack of knowledge is therefore inessential: the fact that he failed to classify examples 4 and 5 does not disturb the learning process.
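That invariance can be verified by brute force: fill the 0-valued instances with every combination of + and - and check that the boundary of X+ never changes. The signatures below are hypothetical, chosen so that the unclassified examples 4 and 5 fall into B-classes that are mixed anyway, which is what makes the lack of knowledge inessential:

```python
# Imperfect teacher: the boundary region of X+ is the same no matter how
# the unknown decisions (value 0) are filled in, because the unknown
# instances lie in already-mixed B-classes. Signatures are hypothetical.
from itertools import product

table = {1: ('s1', '+'), 2: ('s1', '+'),
         3: ('s2', '+'), 4: ('s2', '0'), 8: ('s2', '-'),
         5: ('s3', '0'), 6: ('s3', '-'), 9: ('s3', '+'),
         7: ('s4', '-')}

def boundary_plus(t):
    """Upper minus lower approximation of X+ = objects labelled '+'."""
    groups = {}
    for obj, (sig, d) in t.items():
        groups.setdefault(sig, []).append((obj, d))
    lower, upper = set(), set()
    for g in groups.values():
        objs = {o for o, _ in g}
        labels = {d for _, d in g}
        if labels == {'+'}:
            lower |= objs
        if '+' in labels:
            upper |= objs
    return upper - lower

unknown = [o for o, (_, d) in table.items() if d == '0']
results = set()
for fill in product('+-', repeat=len(unknown)):
    patch = dict(zip(unknown, fill))
    t = {o: (s, patch.get(o, d)) for o, (s, d) in table.items()}
    results.add(frozenset(boundary_plus(t)))
print(len(results) == 1)   # True: boundary unchanged for every filling
```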

(A second decision table with attributes a, b and c; shown as an image in the original slides.)

X+ = {1, 2, 3, 4}, X- = {7, 8, 9}, X0 = {5, 6}

In this case the learner can discover that the knower is unable to classify object 6.

Inductive Learning
Assumptions:
- U is not constant and changes during the learning process.
- Every new instance is classified by the knower, and the learner is supposed to classify it too, on the basis of his actual knowledge.

Open World Assumption (OWA)
The whole concept is unknown to the knower; only certain instances of the concept are known.

Possibilities:
- The new instance confirms actual knowledge.
- The new instance contradicts actual knowledge.
- The new instance is a completely new case.
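A new instance can be sorted into these three cases by comparing it with the learner's current table; a minimal sketch (the signatures and decision values are hypothetical):

```python
# Classify an incoming (B-signature, decision) pair under the OWA.
def kind_of_instance(table, sig, d):
    """'confirms', 'contradicts', or 'new case' w.r.t. current knowledge."""
    seen = {}
    for s, dec in table.values():
        seen.setdefault(s, set()).add(dec)
    if sig not in seen:
        return 'new case'                 # a completely new case
    return 'confirms' if d in seen[sig] else 'contradicts'

table = {1: ('s1', 'e1'), 2: ('s2', 'e0')}
print(kind_of_instance(table, 's1', 'e1'))   # confirms
print(kind_of_instance(table, 's1', 'e0'))   # contradicts (quality drops)
print(kind_of_instance(table, 's9', 'e2'))   # new case
```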

New instance confirms actual knowledge

New instance contradicts actual knowledge: the quality of learning decreases.

Conclusion
- If the decision table is consistent, it provides the highest quality of learning.
- If the decision table is inconsistent, a new confirming instance increases the learner's knowledge, while a new contradicting instance decreases the quality of learning.