CS 9633 Machine Learning Inductive-Analytical Methods


1 CS 9633 Machine Learning Inductive-Analytical Methods

2 Inductive and Analytical Methods
Inductive methods seek general hypotheses that fit the observed training data.
- Can fail with insufficient data
- May be misled by an incorrect bias
Analytical methods seek general hypotheses that fit the observed training data and prior knowledge.
- Can generalize more accurately from less data
- Can be misled by incorrect or insufficient prior knowledge
Combining the two approaches offers the possibility of more powerful learning methods.

3 Justifications
Hypotheses output by analytical methods have logical justifications; hypotheses output by inductive methods have statistical justifications.

4 Spectrum of Learning Tasks
Inductive learning: plentiful data, no prior knowledge
Analytical learning: perfect prior knowledge, scarce data

5 Desirable Characteristics
Given no domain theory, learn at least as effectively as purely inductive methods.
Given a perfect domain theory, learn at least as effectively as purely analytical methods.
Given imperfect data and an imperfect domain theory, combine the two approaches to outperform either pure method.
Accommodate an unknown level of error in the training data.
Accommodate an unknown level of error in the domain theory.

6 Learning Problem
Given:
A set of training examples, D, possibly containing errors
A domain theory, B, possibly containing errors
A space of candidate hypotheses, H
Determine:
A hypothesis that best fits the training examples and the domain theory

7 Fitting the Domain Theory and Training Examples
errorD(h) is the proportion of examples from D that are misclassified by h.
errorB(h) is the probability that h will disagree with B on the classification of a randomly drawn instance.
How do we combine the two measures of error?
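The two error measures above can be sketched in Python. This is a minimal illustration: the function names are invented here, errorB(h) is approximated by a Monte Carlo estimate over a sample of instances, and the weighted-sum combination is just one plausible answer to the slide's open question, not one the text commits to.

```python
def error_D(h, D):
    """Proportion of training examples in D that h misclassifies.
    D is a list of (instance, label) pairs; h maps instance -> label."""
    return sum(1 for x, label in D if h(x) != label) / len(D)

def error_B(h, B, instances):
    """Estimate of the probability that h disagrees with the domain
    theory B on a randomly drawn instance (estimated over a sample)."""
    return sum(1 for x in instances if h(x) != B(x)) / len(instances)

def combined_error(h, D, B, instances, k_D=1.0, k_B=1.0):
    """One possible combination: a weighted sum, where k_D and k_B
    trade off trust in the data against trust in the theory."""
    return k_D * error_D(h, D) + k_B * error_B(h, B, instances)
```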

8 Using Prior Knowledge
Use prior knowledge to derive an initial hypothesis from which to begin the search.
Use prior knowledge to alter the objective of the hypothesis space search.
Use prior knowledge to alter the available search steps.

9 Knowledge Based Artificial Neural Network (KBANN)
General approach:
Initialize the hypothesis to perfectly fit the domain theory
Inductively refine the hypothesis as the training data require
KBANN:
Initialize a neural network to perfectly fit the domain theory
Refine the network with backpropagation to fit the training data

10 See Table 12.3 for Example
Includes both training examples and a domain theory. The domain theory and the training examples are not completely consistent: examples 2 and 3 are not predicted as positive by the domain theory.

11 Constructing Initial NN
Create a sigmoid unit for each Horn clause in the domain theory.
- Sigmoid output > 0.5 is interpreted as true; output < 0.5 as false.
- An input is created for each antecedent of the clause.
- Weights are set so the unit computes the logical AND of its inputs:
- For each input corresponding to a non-negated antecedent, set the weight to a positive constant W.
- For each input corresponding to a negated antecedent, set the weight to -W.
- Set the threshold weight w0 to -(n - 0.5)W, where n is the number of non-negated antecedents.
Each unit is thereby constructed so that its output is greater than 0.5 exactly when its Horn clause is satisfied; with 0/1 inputs, correct output is guaranteed.
Additional input units are added to each threshold unit, with their weights set to values near zero.
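The weight-setting rule above can be sketched as follows. The function names and the particular value W = 4.0 are illustrative choices made here, not taken from the text; the slide only requires W to be a positive constant.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def horn_clause_unit(antecedents, W=4.0):
    """Build the weights of a sigmoid unit that computes the AND of a
    Horn clause's antecedents. `antecedents` maps each input name to
    True for a non-negated literal and False for a negated one."""
    weights = {name: (W if positive else -W)
               for name, positive in antecedents.items()}
    n = sum(1 for positive in antecedents.values() if positive)
    w0 = -(n - 0.5) * W  # threshold weight: -(n - 0.5)W
    return weights, w0

def unit_output(weights, w0, inputs):
    """Sigmoid output for 0/1 inputs; > 0.5 is interpreted as true."""
    z = w0 + sum(weights[name] * inputs[name] for name in weights)
    return sigmoid(z)
```

For a clause with one non-negated antecedent `a` and one negated antecedent `b`, the unit outputs above 0.5 only when `a = 1` and `b = 0`, as intended.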

12 Tuning the neural network
After the initial neural network is constructed, refine its weights inductively using backpropagation on the training examples. The tuning process can learn dependencies not captured by the domain theory.
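A minimal single-unit sketch of such refinement: gradient descent on squared error, starting from theory-derived weights rather than random ones. The function names, learning rate, and the single-unit simplification are assumptions made here; KBANN refines the entire network with backpropagation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def tune_unit(weights, w0, examples, lr=0.5, epochs=1000):
    """Refine one sigmoid unit's weights by gradient descent on squared
    error. `weights` and `w0` come from the theory-derived network;
    `examples` is a list of (input_tuple, target) pairs with 0/1 targets."""
    for _ in range(epochs):
        for x, t in examples:
            o = sigmoid(w0 + sum(w * xi for w, xi in zip(weights, x)))
            delta = (t - o) * o * (1 - o)  # squared-error gradient term
            weights = [w + lr * delta * xi for w, xi in zip(weights, x)]
            w0 += lr * delta
    return weights, w0
```

For instance, a unit initialized to compute AND (weights 4, 4 and threshold weight -6) can be tuned toward OR when the training data demand it, illustrating how tuning revises dependencies the theory got wrong.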

13 Summary of KBANN
KBANN typically generalizes more accurately than pure backpropagation, especially when training data are scarce.
Methods have been developed for mapping the refined network back to Horn clauses.

