CS539 Project Report -- Evaluating Hypotheses
Mingyu Feng, Feb 24th, 2004
Learning J48 Decision Tree
Dataset: S1, 45 nominal and 10 real attributes; no preprocessing.
Result:
  Correctly Classified Instances     … %
  Incorrectly Classified Instances   … %
  Kappa statistic                    …
  Mean absolute error                …
  Root mean squared error            …
  Relative absolute error            … %
  Root relative squared error        … %
Sample error: error_S1(t) = 0.367
95% confidence interval: … / …
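The interval reported here follows the standard normal approximation to the binomial distribution: error_S(h) ± z · sqrt(error_S(h)(1 − error_S(h)) / n), with z = 1.96 for a 95% two-sided interval. The slide does not give the test-set size, so the sketch below treats n as an assumed parameter (n = 1000 is purely illustrative).

```python
import math

def error_confidence_interval(error_rate, n, z=1.96):
    """Two-sided confidence interval for a sample error rate.

    Uses the normal approximation to the binomial distribution:
        error +/- z * sqrt(error * (1 - error) / n)
    z = 1.96 corresponds to a 95% two-sided interval.
    """
    half_width = z * math.sqrt(error_rate * (1.0 - error_rate) / n)
    return error_rate - half_width, error_rate + half_width

# error_S1(t) = 0.367 from the J48 run; n = 1000 is an assumed
# test-set size, since the actual size is not given on the slide.
low, high = error_confidence_interval(0.367, n=1000)
print(f"95% CI for error_S1(t): [{low:.3f}, {high:.3f}]")
```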
Learning Neural Networks
Dataset: S1, 45 nominal and 10 real attributes; no preprocessing.
Result:
  Correctly Classified Instances     … %
  Incorrectly Classified Instances   … %
  Kappa statistic                    …
  Mean absolute error                …
  Root mean squared error            …
  Relative absolute error            … %
  Root relative squared error        … %
Sample error: error_S2(nn) = 0.352
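The same normal-approximation interval applies to the network's sample error. A minimal inline sketch, again assuming a hypothetical test-set size of n = 1000 since the slide does not state it:

```python
import math

# error_S2(nn) = 0.352 from the neural network run; n = 1000 is assumed.
err, n, z = 0.352, 1000, 1.96
half = z * math.sqrt(err * (1 - err) / n)
print(f"95% CI for error_S2(nn): [{err - half:.3f}, {err + half:.3f}]")
```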
Difference between True Errors
error_S1(t) = 0.367, error_S2(nn) = 0.352
d = error_S1(t) − error_S2(nn) = 0.015, σ = …
95% two-sided confidence interval for the difference between the two true errors: … / …
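Because the two errors are estimated on independent test sets, the variance of the difference is approximated by the sum of the two binomial variances, σ_d ≈ sqrt(e1(1 − e1)/n1 + e2(1 − e2)/n2), and the interval is d ± z · σ_d. The slide's σ and interval bounds did not survive the export, so this sketch recomputes them under assumed test-set sizes n1 and n2 (both hypothetical):

```python
import math

def difference_confidence_interval(e1, n1, e2, n2, z=1.96):
    """95% two-sided CI for the difference between two true errors,
    estimated from independent test sets S1 and S2.

    sigma_d ~= sqrt(e1*(1-e1)/n1 + e2*(1-e2)/n2)
    """
    d = e1 - e2
    sigma_d = math.sqrt(e1 * (1 - e1) / n1 + e2 * (1 - e2) / n2)
    return d, d - z * sigma_d, d + z * sigma_d

# error_S1(t) = 0.367 and error_S2(nn) = 0.352 from the slides;
# n1 = n2 = 1000 are assumed test-set sizes (not given on the slide).
d, low, high = difference_confidence_interval(0.367, 1000, 0.352, 1000)
print(f"d = {d:.3f}, 95% CI: [{low:.3f}, {high:.3f}]")
```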
Compare Learning Algorithms
k = 11: the 1100 instances in D0 are divided into 11 partitions (T1 … T11) of 100 instances each.
For i = 1 to 11: use Ti as the test set, and train the J48 decision tree and the neural network on the remaining data.
Sample mean of the per-fold differences δ_i = …, sample standard deviation = …
95% confidence interval for estimating the difference in error between the J48 decision tree and the neural network: … / …

  i              1      2      3      4      5      6      7      8      9      10     11
  error_Ti(t)    0.37   0.28   0.39   0.29   0.32   0.33   …
  error_Ti(nn)   0.40   0.34   0.31   0.38   0.30   0.36   0.35   …
  δ_i            0.04   -0.03  0.03   0.01   -0.01  -0.04  -0.02  …
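This is the paired-t procedure for comparing two learners on the same k test partitions: compute δ_i = error_Ti(t) − error_Ti(nn) on each fold, take the mean, estimate its standard deviation as sqrt(Σ(δ_i − mean)² / (k(k − 1))), and report mean ± t_{95%, k−1} times that deviation. Only some of the 11 per-fold differences are legible on the slide, so the list below is a hypothetical, incomplete stand-in used purely to illustrate the computation:

```python
import math
from scipy import stats

def paired_t_confidence_interval(deltas, confidence=0.95):
    """Two-sided CI for the mean difference in error between two
    learners compared on the same k test partitions (paired t test)."""
    k = len(deltas)
    mean = sum(deltas) / k
    # Standard deviation of the estimated mean difference.
    s_mean = math.sqrt(sum((d - mean) ** 2 for d in deltas) / (k * (k - 1)))
    t_crit = stats.t.ppf(0.5 + confidence / 2, df=k - 1)
    return mean, mean - t_crit * s_mean, mean + t_crit * s_mean

# Hypothetical stand-in: only part of the 11-fold table is recoverable.
deltas = [0.04, -0.03, 0.03, 0.01, -0.01, -0.04, -0.02]
mean, low, high = paired_t_confidence_interval(deltas)
print(f"mean delta = {mean:.3f}, 95% CI: [{low:.3f}, {high:.3f}]")
```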