
Max-Margin Matching for Semantic Role Labeling David Vickrey James Connor Daphne Koller Stanford University.


1 Max-Margin Matching for Semantic Role Labeling David Vickrey James Connor Daphne Koller Stanford University

2 Overview
- We consider two complementary models for the same task: in our case, one discriminative and one generative
- Combine the models by feeding their output predictions into a discriminative classifier
- Try two different classifiers for combining: a multi-class SVM and max-margin matching

3 Semantic Role Labeling
- Label the arguments of a verb in context:
  "I [Giver] gave the dog [Recipient] a bone [Gift]."
- PropBank: 1M labeled words (Wall Street Journal)

4 Syntactic Model
- Most useful features:
  - Argument word / part of speech
  - Path from the argument to the verb in the parse tree
  [Parse tree of "I gave the dog a bone": (S (NP I) (VP gave (NP the dog) (NP a bone)))]
- Use a standard classifier, e.g. an SVM
- One-vs-all classifier for each possible argument type
- Trained across all verbs at once
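The path feature above can be sketched in a few lines (a toy reconstruction, not the authors' code: the nested-tuple tree representation and the ↑/↓ path notation are illustrative assumptions, and the parse is simplified to omit determiner and POS nodes):

```python
def paths_to_root(tree, target, prefix=()):
    """Labels from the root down to the leaf whose label is `target`."""
    label, children = tree
    path = prefix + (label,)
    if not children:                          # leaf node
        return path if label == target else None
    for child in children:
        found = paths_to_root(child, target, path)
        if found:
            return found
    return None

def path_feature(tree, arg_word, verb_word):
    """Constituent-label path from the argument word up to the lowest
    common ancestor ('↑' steps), then down to the verb ('↓' steps)."""
    up = paths_to_root(tree, arg_word)
    down = paths_to_root(tree, verb_word)
    i = 0                                     # index of the lowest common ancestor
    while i + 1 < min(len(up), len(down)) and up[i + 1] == down[i + 1]:
        i += 1
    ups = list(reversed(up[i:-1]))            # argument's constituents up to the LCA
    downs = list(down[i + 1:-1])              # LCA's children down to the verb
    return "↑".join(ups) + "↓" + "↓".join(downs)

# Simplified parse of "I gave the dog a bone"
tree = ("S", [("NP", [("I", [])]),
              ("VP", [("VBD", [("gave", [])]),
                      ("NP", [("dog", [])]),
                      ("NP", [("bone", [])])])])
print(path_feature(tree, "dog", "gave"))      # → NP↑VP↓VBD
```

The resulting path string (e.g. NP↑VP↓VBD) is used as a categorical feature value in the one-vs-all classifier.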

5 Semantic Model
- Data set: words occurring as "Eater" for the verb "eat": dog, cat, he, ...
- Usually the Eater will be either a person or an animal
- We want to generalize to unseen animals or people (the Google Sets problem)
- Our idea: use a word hierarchy to find categories of words
  - Used WordNet as the hierarchy
  - Selected categories using a Bayesian score
  - Will be presented tomorrow (Bayesian Methods for Natural Language Processing workshop)
- On its own, improves the log-likelihood of held-out PropBank test sets
- Train one model for each argument of each verb
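The category-selection idea can be illustrated with a toy hand-built hierarchy standing in for WordNet; the scoring function below is a crude coverage-vs-size proxy, not the Bayesian score from the talk, and all names and values are made up for illustration:

```python
# Toy hypernym hierarchy: category -> children (leaves are words).
hierarchy = {
    "entity":       ["living_thing", "artifact"],
    "living_thing": ["animal", "person"],
    "animal":       ["dog", "cat", "wolf"],
    "person":       ["he", "she"],
    "artifact":     ["rock", "chair"],
}

def members(cat):
    """All leaf words under a category."""
    kids = hierarchy.get(cat)
    if kids is None:
        return {cat}
    return set().union(*(members(k) for k in kids))

def category_score(cat, seen):
    """Crude stand-in for the Bayesian score: reward covering the seen
    words, penalize large categories (precision times recall)."""
    m = members(cat)
    covered = len(m & seen)
    return (covered / len(m)) * (covered / len(seen))

# Words seen as "Eater" of "eat" (from the slide)
seen = {"dog", "cat", "he"}
best = max(hierarchy, key=lambda c: category_score(c, seen))
print(best, "wolf" in members(best))
```

Picking a compact covering category (here "living_thing") is what lets the model assign probability to unseen members such as "wolf", which is exactly the generalization the slide asks for.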

6 Combining Models
- For each word w in a sentence with verb v, and for each possible argument (Giver, Gift, Recipient, etc.), compute:
  - the margin of the one-vs-all classifier using syntactic features
  - log P_v(Arg | w) using the semantic model trained on Arg and verb v
- Use these as inputs to a multi-class SVM
- One weight for each argument for each model (not specific to v)

Example: "I wanted my dog to eat the pickle."

Syntax (one-vs-all margins):
           I       dog     pickle
  Eater            1.0
  Food     -1.5    1.2

Semantic (log-probabilities):
           I       dog     pickle
  Eater    -0.05   -0.29   -4.61
  Food     -3.0    -1.39   -0.01

Combined score for "dog":
  Eater: 1.0·a + (-0.29)·b
  Food:  1.2·c + (-1.39)·d
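The combination step for the word "dog" can be sketched directly from the slide's numbers (the SVM weights a, b, c, d below are made-up values for illustration; in the talk they are learned by the multi-class SVM and shared across verbs):

```python
# Hypothetical learned weights: (syntax weight, semantic weight) per argument.
weights = {"Eater": (0.8, 0.5),   # (a, b)
           "Food":  (0.7, 0.6)}   # (c, d)

# Scores for the word "dog" from the slide: syntactic one-vs-all margin
# and semantic log-probability log P_eat(Arg | "dog").
scores = {"Eater": {"syntax": 1.0, "semantic": -0.29},
          "Food":  {"syntax": 1.2, "semantic": -1.39}}

def combined_score(arg):
    """Weighted sum of the two models' confidences for one argument type."""
    w_syn, w_sem = weights[arg]
    s = scores[arg]
    return w_syn * s["syntax"] + w_sem * s["semantic"]

best = max(scores, key=combined_score)
print(best)  # with these weights, "dog" is labeled Eater
```

Note that the weights are per argument and per model but not per verb, which keeps the number of combination parameters small.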

7 Results
- Tested on the first 500 frames (~half of the data)

  Features            Training          F1
  Syntax              None              78.9
  Syntax + Semantic   None              78.9
  Syntax              Multi-class SVM   80.2
  Syntax + Semantic   Multi-class SVM   80.4

8 Max-Margin Matching
- Each argument should be assigned to only one word
- Complete bipartite graph; weight of the edge from a word w to an argument a:
  - Syntax only: margin of the classifier for a applied to w
  - Syntax and semantic: (weighted) sum of the confidences of each
- Same set of weights as in the multi-class SVM
- Apply max-margin matching learning for these weights

[Figure: bipartite graph between words {I, dog, pickle} and arguments {Eater, Food}; e.g. the dog–Eater edge has weight 1.0·a + (-0.29)·b and the dog–Food edge 1.2·c + (-1.39)·d]
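The matching constraint can be illustrated with a brute-force sketch (the edge weights below are made up; a real system would run a proper max-weight bipartite matching algorithm rather than enumerating permutations, which is only feasible for tiny graphs like this one):

```python
from itertools import permutations

def best_matching(weights):
    """weights[arg][word] = combined edge weight.
    Returns the arg -> word assignment (distinct words) with maximum
    total weight, i.e. the max-weight bipartite matching."""
    args = list(weights)
    words = sorted({w for row in weights.values() for w in row})
    best, best_score = None, float("-inf")
    for assignment in permutations(words, len(args)):
        score = sum(weights[a][w] for a, w in zip(args, assignment))
        if score > best_score:
            best, best_score = dict(zip(args, assignment)), score
    return best

# Made-up combined scores for "I ... dog ... pickle". Greedy per-argument
# argmax would assign BOTH Eater (0.66) and Food (0.5) to "dog"; the
# one-word-per-argument constraint forces the correct joint assignment.
w = {"Eater": {"I": 0.5,  "dog": 0.66, "pickle": -2.0},
     "Food":  {"I": -1.5, "dog": 0.5,  "pickle": 0.4}}
print(best_matching(w))  # → {'Eater': 'dog', 'Food': 'pickle'}
```

This is the point of the matching step: the per-word classifiers can each prefer the same word, and the matching resolves the conflict globally.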

9 Results

  Training of Weights   Matching?   Syntax   Both
  None                  No          78.9
  None                  Yes         80.2     79.9
  Multi-class SVM       No          80.2     80.4
  Multi-class SVM       Yes         81.4     82.5
  Max-Margin Matching   Yes         81.4     82.4

- Matching can be applied for any type of weight training

10 Results Summary
- Classifying using the confidences of one-vs-all classifiers can help
  - Improving the one-vs-all classifier may remove this benefit
- Combining the models in a classifier worked: we were able to improve using WordNet
  - However, this only worked with both matching and training!
- Matching helped, but training the max-margin matching did not
  - Why? Not that much data (weights are shared across all verbs), and not that many parameters

11 Future Directions
- Previous work* used a Markov random field over classification decisions
  - Can't do exact inference
  - Can include potentials besides "one word per argument"
- We could try to extend max-margin matching
- Unlabeled data: bootstrap between the different classifiers
  - When to include an example? High confidence under a single classifier, or high confidence under the combined classifier?

* Toutanova, Haghighi, Manning, ACL 2005

