INC 551 Artificial Intelligence Lecture 11 Machine Learning (Continue)


1 INC 551 Artificial Intelligence Lecture 11 Machine Learning (Continue)

2 Bayes Classifier Bayes Rule: P(A | B) = P(B | A) P(A) / P(B)

3 Play Tennis Example John wants to play tennis every day. However, on some days the conditions are not good, so he decides not to play. The following table is the record for the last 14 days.

4
Outlook  | Temperature | Humidity | Wind   | PlayTennis
Sunny    | Hot         | High     | Weak   | No
Sunny    | Hot         | High     | Strong | No
Overcast | Hot         | High     | Weak   | Yes
Rain     | Mild        | High     | Weak   | Yes
Rain     | Cool        | Normal   | Weak   | Yes
Rain     | Cool        | Normal   | Strong | No
Overcast | Cool        | Normal   | Strong | Yes
Sunny    | Mild        | High     | Weak   | No
Sunny    | Cool        | Normal   | Weak   | Yes
Rain     | Mild        | Normal   | Weak   | Yes
Sunny    | Mild        | Normal   | Strong | Yes
Overcast | Mild        | High     | Strong | Yes
Overcast | Hot         | Normal   | Weak   | Yes
Rain     | Mild        | High     | Strong | No

5 Question: Given today's condition (the specific feature values were shown on the slide), do you think John will play tennis?

6 Find P(PlayTennis | condition). We need the naïve Bayes assumption: assume that all features are conditionally independent given the class. Now, let's look at each feature.

7 (The slide listed the conditional probability of each feature value given PlayTennis = Yes and PlayTennis = No, counted from the table.)

8 Using Bayes rule: P(Yes | condition) ∝ P(Yes) × P(Outlook | Yes) × P(Temperature | Yes) × P(Humidity | Yes) × P(Wind | Yes), and likewise for No.

9 Since P(condition) is the same for both classes, comparing the numerators is enough, and we can conclude that John is more likely to play tennis today. Note that we do not need to compute P(condition) to get the answer. However, if you want the actual number, P(condition) is obtained by normalizing the two unnormalized posteriors so that they sum to 1.

10 Therefore, John is more likely to play tennis today, with a 58% chance.
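As a sketch of how these counts turn into a decision, the following Python snippet (illustrative, not from the slides) implements the naïve Bayes computation directly on the table above. Since the slide's actual query condition appeared only as an image, the condition below is a made-up example, so its normalized probability will differ from the 58% figure.

    # Naive Bayes from counts, sketched for the PlayTennis table above.
    # The query condition here is illustrative, not the slide's condition.
    records = [
        # (Outlook, Temperature, Humidity, Wind, PlayTennis)
        ("Sunny", "Hot", "High", "Weak", "No"),
        ("Sunny", "Hot", "High", "Strong", "No"),
        ("Overcast", "Hot", "High", "Weak", "Yes"),
        ("Rain", "Mild", "High", "Weak", "Yes"),
        ("Rain", "Cool", "Normal", "Weak", "Yes"),
        ("Rain", "Cool", "Normal", "Strong", "No"),
        ("Overcast", "Cool", "Normal", "Strong", "Yes"),
        ("Sunny", "Mild", "High", "Weak", "No"),
        ("Sunny", "Cool", "Normal", "Weak", "Yes"),
        ("Rain", "Mild", "Normal", "Weak", "Yes"),
        ("Sunny", "Mild", "Normal", "Strong", "Yes"),
        ("Overcast", "Mild", "High", "Strong", "Yes"),
        ("Overcast", "Hot", "Normal", "Weak", "Yes"),
        ("Rain", "Mild", "High", "Strong", "No"),
    ]

    def score(label, condition):
        """Unnormalized posterior: P(label) * product of P(feature | label)."""
        rows = [r for r in records if r[-1] == label]
        s = len(rows) / len(records)          # prior P(label)
        for i, value in enumerate(condition):
            s *= sum(1 for r in rows if r[i] == value) / len(rows)
        return s

    condition = ("Sunny", "Cool", "High", "Strong")   # illustrative query
    yes, no = score("Yes", condition), score("No", condition)
    print("P(Yes | condition) =", yes / (yes + no))   # normalized, as on the slide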

11 Learning and Bayes Classifier Learning is the adjustment of the probability values used to compute the posterior probability as new data is added.
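A minimal sketch of this idea, assuming the classifier is represented by event counts as in the tennis example: adding a new observation just updates the counts, and the probabilities used in the posterior follow automatically.

    # Learning as count adjustment: probabilities are ratios of counts,
    # so recording a new day immediately changes the estimated prior.
    counts = {"Yes": 9, "No": 5}              # PlayTennis counts from the table

    def prior(label):
        return counts[label] / sum(counts.values())

    print(prior("Yes"))                       # 9/14 ~ 0.643 before the new day
    counts["Yes"] += 1                        # John played on a newly observed day
    print(prior("Yes"))                       # 10/15 ~ 0.667 after updating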

12 Classifying Object Example Suppose we want to classify objects into two classes, A and B. There are two features that we can measure from each object, f1 and f2. We sample four objects at random as a database and classify them by hand.

Sample | f1  | f2  | Class
1      | 5.2 | 1.2 | B
2      | 2.3 | 5.4 | A
3      | 1.5 | 4.4 | A
4      | 4.5 | 2.1 | B

Now we have another sample with f1 = 3.2 and f2 = 4.2, and we want to know its class.

13 We want to find P(Class | f1, f2). Using Bayes rule: P(Class | f1, f2) = P(f1, f2 | Class) P(Class) / P(f1, f2). From the table, we count events: two of the four samples are Class A and two are Class B, so P(A) = P(B) = 2/4.

14 Find P(f1, f2 | Class). Again, we use the naïve Bayes assumption: assume the features are independent, so P(f1, f2 | Class) = P(f1 | Class) P(f2 | Class). To find P(f | Class) we need to assume a probability distribution, because the features take continuous values. The most common choice is the Gaussian (normal) distribution.

15 Gaussian distribution There are two parameters, the mean µ and the variance σ²: p(x) = (1 / (σ √(2π))) exp(−(x − µ)² / (2σ²)). Using the maximum likelihood principle, the mean and the variance can be estimated from the samples in the database: µ = (1/N) Σ xᵢ and σ² = (1/N) Σ (xᵢ − µ)².

16 Class A
f1: Mean = (2.3 + 1.5)/2 = 1.9, SD = 0.4
f2: Mean = (5.4 + 4.4)/2 = 4.9, SD = 0.5
Class B
f1: Mean = (5.2 + 4.5)/2 = 4.85, SD = 0.35
f2: Mean = (1.2 + 2.1)/2 = 1.65, SD = 0.45
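These values can be checked with a short Python sketch of the maximum likelihood estimates (note the division by N, not N − 1):

    import math

    def mle_mean_sd(xs):
        """Maximum likelihood estimates: sample mean and SD (divide by N)."""
        m = sum(xs) / len(xs)
        var = sum((x - m) ** 2 for x in xs) / len(xs)
        return m, math.sqrt(var)

    print(mle_mean_sd([2.3, 1.5]))   # Class A, f1 -> (1.9, 0.4)
    print(mle_mean_sd([5.4, 4.4]))   # Class A, f2 -> (4.9, 0.5)
    print(mle_mean_sd([5.2, 4.5]))   # Class B, f1 -> (4.85, 0.35)
    print(mle_mean_sd([1.2, 2.1]))   # Class B, f2 -> (1.65, 0.45)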

17 The object that we want to classify has f1 = 3.2 and f2 = 4.2. Evaluating the Gaussian density of each class at these values gives the likelihoods P(f1 | Class) and P(f2 | Class).

18 Therefore, from Bayes rule, the posterior for Class A is far larger than the posterior for Class B, so we should classify the sample as Class A.
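A minimal sketch of the whole Gaussian naïve Bayes computation, using the estimates from slide 16 and equal priors P(A) = P(B) = 2/4:

    import math

    def gauss_pdf(x, mean, sd):
        """Gaussian density N(mean, sd^2) evaluated at x."""
        return math.exp(-(x - mean) ** 2 / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

    # (mean, sd) per feature, as estimated on slide 16
    params = {
        "A": [(1.9, 0.4), (4.9, 0.5)],
        "B": [(4.85, 0.35), (1.65, 0.45)],
    }
    x = (3.2, 4.2)
    scores = {
        c: 0.5 * gauss_pdf(x[0], *p[0]) * gauss_pdf(x[1], *p[1])
        for c, p in params.items()
    }
    print(scores)                       # Class A's score dominates
    print(max(scores, key=scores.get))  # -> "A"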

19 Nearest Neighbor Classification NN is considered a model-free classification method: no probability model is fitted. Nearest Neighbor's Principle: the unknown sample is classified to the same class as the sample at the closest distance.

20 (Figure: the samples plotted on the Feature 1 vs. Feature 2 plane, with the closest distance marked.) We classify the sample as a circle.

21 Distance between Samples Samples X and Y have multi-dimensional feature values. The distance between samples X and Y can be calculated by the Minkowski formula: d(X, Y) = (Σᵢ |xᵢ − yᵢ|ᵏ)^(1/k).

22 If k = 1, the distance is called the Manhattan distance. If k = 2, the distance is called the Euclidean distance. If k = ∞, the distance is the maximum absolute difference over the features (the Chebyshev distance). Euclidean is the best-known and the preferred one.
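A sketch of the general formula in Python, treating k = ∞ as the limiting case (the maximum coordinate difference):

    import math

    def minkowski(x, y, k=2):
        """Minkowski distance: k=1 Manhattan, k=2 Euclidean, k=inf Chebyshev."""
        diffs = [abs(a - b) for a, b in zip(x, y)]
        if math.isinf(k):
            return max(diffs)
        return sum(d ** k for d in diffs) ** (1 / k)

    print(minkowski((3.2, 4.2), (2.3, 5.4), k=1))         # Manhattan: 2.1
    print(minkowski((3.2, 4.2), (2.3, 5.4), k=2))         # Euclidean: 1.5
    print(minkowski((3.2, 4.2), (2.3, 5.4), k=math.inf))  # max difference: 1.2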

23 Classifying Object with NN

Sample | f1  | f2  | Class
1      | 5.2 | 1.2 | B
2      | 2.3 | 5.4 | A
3      | 1.5 | 4.4 | A
4      | 4.5 | 2.1 | B

Now, we have another sample with f1 = 3.2 and f2 = 4.2, and we want to know its class.

24 Compute the Euclidean distance from it to all the other samples: d₁ ≈ 3.61, d₂ = 1.50, d₃ ≈ 1.71, d₄ ≈ 2.47. The unknown sample has the closest distance to the second sample. Therefore, we classify it to the same class as the second sample, which is Class A.
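The same nearest-neighbor decision as a short Python sketch over the four-sample database:

    import math

    database = [((5.2, 1.2), "B"), ((2.3, 5.4), "A"),
                ((1.5, 4.4), "A"), ((4.5, 2.1), "B")]
    query = (3.2, 4.2)

    def euclidean(x, y):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

    # Nearest neighbor: the sample with the smallest distance decides the class
    nearest = min(database, key=lambda s: euclidean(s[0], query))
    print(euclidean(nearest[0], query), nearest[1])   # 1.5, "A"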

25 K-Nearest Neighbor (KNN) Instead of letting only the closest sample decide the class, we let the k closest samples decide the class by majority vote, as sketched after the two figures below.

26 (Figure: example with k = 3 on the Feature 1 vs. Feature 2 plane.) The data is classified as a circle.

27 (Figure: example with k = 5 on the Feature 1 vs. Feature 2 plane.) The data is classified as a star.
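A minimal kNN sketch along these lines (the helper name knn_classify is hypothetical; ties can be handled by reducing k or preferring closer neighbors):

    import math
    from collections import Counter

    def knn_classify(database, query, k=3):
        """Classify `query` by majority vote among its k nearest neighbors."""
        by_distance = sorted(
            database,
            key=lambda s: math.dist(s[0], query),  # Euclidean distance
        )
        votes = Counter(label for _, label in by_distance[:k])
        return votes.most_common(1)[0][0]

    database = [((5.2, 1.2), "B"), ((2.3, 5.4), "A"),
                ((1.5, 4.4), "A"), ((4.5, 2.1), "B")]
    print(knn_classify(database, (3.2, 4.2), k=3))   # -> "A" (2 of 3 neighbors)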

