
1 An Asymptotic Analysis of Generative, Discriminative, and Pseudolikelihood Estimators by Percy Liang and Michael Jordan (ICML 2008) Presented by Lihan He ECE, Duke University June 27, 2008

2 Outline  Introduction  Exponential family estimators: generative, fully discriminative, pseudolikelihood discriminative  Asymptotic analysis  Experiments  Conclusions

3 Introduction  Data points are not assumed to be drawn independently; there are correlations between data points.  Given the data, we have to consider the joint distribution over all the data points.  Correspondingly, the overall likelihood is not the product of the likelihoods of the individual data points.

4 Introduction Generative vs. Discriminative Generative model: a model for randomly generating the observed data; learns a joint probability distribution over both observations and labels. Discriminative model: a model of the label variables conditioned on the observed data; learns a conditional distribution over labels given observations.
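
As a minimal statement of the contrast, with observations x and labels y (these are the standard maximum-likelihood objectives, not formulas copied from the slide):

```latex
% Generative: fit the joint distribution of observations and labels
\hat{\theta}_{\mathrm{gen}} = \arg\max_{\theta} \sum_{i=1}^{n} \log p_{\theta}(x^{(i)}, y^{(i)})

% Fully discriminative: fit only the conditional distribution of labels given observations
\hat{\theta}_{\mathrm{dis}} = \arg\max_{\theta} \sum_{i=1}^{n} \log p_{\theta}(y^{(i)} \mid x^{(i)})
```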

5 Introduction Full Likelihood vs. Pseudolikelihood Full likelihood: could be intractable; computationally inefficient. Pseudolikelihood: an approximation of the full likelihood; computationally more efficient. [Figure: a set of dependencies between data points.]
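
The formulas on the slide were images; as a sketch of the standard definitions, writing y = (y_1, ..., y_m) for the coupled labels and x for the observations (notation assumed, not taken from the slide):

```latex
% Full (conditional) likelihood of all coupled labels at once
\ell_{\mathrm{full}}(\theta) = \log p_{\theta}(y_1, \dots, y_m \mid x)

% Pseudolikelihood: each label conditioned on all of the others
\ell_{\mathrm{pseudo}}(\theta) = \sum_{j=1}^{m} \log p_{\theta}(y_j \mid y_{-j}, x)
```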

6 Estimators Exponential Family Estimators The model is an exponential family defined by features, model parameters, and a normalization term. Example: conditional random field.
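
The annotated density on the slide was an image; the standard exponential-family form it labels, with features phi, parameters theta, and log-normalizer A(theta), is:

```latex
% Exponential family over z = (x, y): features \phi(z), parameters \theta,
% and log-partition (normalization) term A(\theta)
p_{\theta}(z) = \exp\{\phi(z)^{\top}\theta - A(\theta)\},
\qquad
A(\theta) = \log \sum_{z} \exp\{\phi(z)^{\top}\theta\}
```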

7 Estimators Composite Likelihood Estimators [Lindsay 1988]  A class of pseudolikelihood estimators.  Consists of a weighted sum of component likelihoods, each of which is the probability of one subset of data points conditioned on another.  Partitions the output space (indexed by r) according to a fixed distribution P_r and obtains the component likelihoods.  Defines a criterion function that reflects the quality of the estimator.  The maximum composite likelihood estimator maximizes this criterion summed over the training data.
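
The criterion and estimator on the slide were images; a generic composite-likelihood form in the spirit of Lindsay (1988), with component index r drawn from P_r and z_r, z_{\neg r} denoting the conditioned subsets (the notation here is illustrative, not the paper's exact symbols):

```latex
% Composite likelihood criterion: expected component log-likelihood
m(\theta; z) = \mathbb{E}_{r \sim P_r}\!\left[ \log p_{\theta}\!\left( z_{r} \mid z_{\neg r} \right) \right]

% Maximum composite likelihood estimator over training examples z^{(1)}, ..., z^{(n)}
\hat{\theta}_n = \arg\max_{\theta} \sum_{i=1}^{n} m\!\left(\theta; z^{(i)}\right)
```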

8 Estimators Three estimators to be compared in the paper:  Generative: one component, the joint likelihood of the observations and labels.  Fully discriminative: one component, the conditional likelihood of the labels given the observations.  Pseudolikelihood discriminative: one component per output variable, its conditional likelihood given the observations and the remaining outputs.
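
A brute-force sketch of the three objectives on a tiny model (two binary labels coupled to one binary observation); the feature choices and names below are illustrative, not taken from the paper:

```python
import itertools
import math

import numpy as np

# Feature vector phi(x, y1, y2) for a tiny exponential family:
# one observation feature per label, plus one label-label coupling feature.
def phi(x, y1, y2):
    return np.array([x * y1, x * y2, y1 * y2], dtype=float)

def log_joint(theta, x, y1, y2):
    """Generative objective: log p_theta(x, y1, y2), normalized over everything."""
    scores = [theta @ phi(xx, a, b)
              for xx, a, b in itertools.product([0, 1], repeat=3)]
    log_z = math.log(sum(math.exp(s) for s in scores))
    return theta @ phi(x, y1, y2) - log_z

def log_conditional(theta, x, y1, y2):
    """Fully discriminative objective: log p_theta(y1, y2 | x), normalized over labels."""
    scores = [theta @ phi(x, a, b) for a, b in itertools.product([0, 1], repeat=2)]
    log_zx = math.log(sum(math.exp(s) for s in scores))
    return theta @ phi(x, y1, y2) - log_zx

def log_pseudo(theta, x, y1, y2):
    """Pseudolikelihood objective: sum_j log p_theta(y_j | x, y_-j)."""
    total = 0.0
    num = theta @ phi(x, y1, y2)
    # p(y1 | x, y2): vary the first label, hold the second fixed.
    den = [theta @ phi(x, a, y2) for a in (0, 1)]
    total += num - math.log(sum(math.exp(s) for s in den))
    # p(y2 | x, y1): vary the second label, hold the first fixed.
    den = [theta @ phi(x, y1, b) for b in (0, 1)]
    total += num - math.log(sum(math.exp(s) for s in den))
    return total

# Evaluate the three objectives on a small illustrative dataset.
data = [(1, 1, 1), (0, 0, 1), (1, 1, 0)]
theta = np.array([0.5, -0.2, 0.3])
for name, fn in [("generative", log_joint),
                 ("fully discriminative", log_conditional),
                 ("pseudolikelihood", log_pseudo)]:
    print(name, sum(fn(theta, *z) for z in data))
```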

9 Estimators Risk Decomposition The expected risk decomposes into the Bayes risk (which does not depend on the estimator), an approximation error (the intrinsic suboptimality of the estimator), and an estimation error (which arises from having only finite data). The Bayes risk and the approximation error are deterministic quantities, unrelated to the data samples z; only the estimation error depends on the samples.
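
The decomposition itself was shown as an equation image; schematically, writing theta* for the estimator's limiting parameter and theta_hat_n for the estimate from n samples (notation illustrative, not the paper's exact symbols):

```latex
% Risk in excess of the Bayes risk, split at the limiting parameter \theta^{*}
\mathrm{risk}(\hat{\theta}_n)
  = \underbrace{\mathrm{risk}(\theta^{*})}_{\text{approximation error (intrinsic suboptimality)}}
  + \underbrace{\mathrm{risk}(\hat{\theta}_n) - \mathrm{risk}(\theta^{*})}_{\text{estimation error (finite data)}}
```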

10 Asymptotic Analysis Well-specified model: every estimator achieves an O(n^-1) convergence rate. Misspecified model: only the fully discriminative estimator achieves the O(n^-1) rate.
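
As a sanity check on what an O(n^-1) rate means, here is a small Monte Carlo sketch using a plain Bernoulli maximum-likelihood estimate (a stand-in, not one of the paper's estimators): in this well-specified setting the expected excess log-loss times n should settle around a constant.

```python
import numpy as np

rng = np.random.default_rng(0)

def excess_risk_bernoulli(p_true, n, trials=2000):
    """Monte Carlo estimate of E[KL(p_true || p_hat)] for the Bernoulli MLE
    from n samples; for this well-specified model it decays like O(1/n)."""
    risks = []
    for _ in range(trials):
        x = rng.binomial(1, p_true, size=n)
        # Clip the MLE away from 0 and 1 so the KL divergence stays finite.
        p_hat = np.clip(x.mean(), 1e-3, 1 - 1e-3)
        kl = (p_true * np.log(p_true / p_hat)
              + (1 - p_true) * np.log((1 - p_true) / (1 - p_hat)))
        risks.append(kl)
    return float(np.mean(risks))

for n in [100, 200, 400, 800, 1600]:
    r = excess_risk_bernoulli(0.3, n)
    # n * risk hovering near a constant is the signature of an O(1/n) rate.
    print(f"n={n:5d}  excess risk={r:.5f}  n * risk={n * r:.3f}")
```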

11 Asymptotic Analysis

12 Experiments Toy example: a four-node binary-valued graphical model. [The true model and the learned model were shown as diagrams on the slide.] When the true distribution lies within the learned model family, the learned model is well-specified; otherwise it is misspecified.
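
A brute-force sketch of a four-node binary-valued model in exponential-family form (chain edges and agreement features chosen purely for illustration; this is not the paper's exact true or learned model):

```python
import itertools

import numpy as np

# Four binary nodes with chain edges (0-1, 1-2, 2-3) and one agreement
# feature per edge -- an illustrative stand-in for the slide's toy model.
EDGES = [(0, 1), (1, 2), (2, 3)]

def phi(y):
    """Edge-agreement features for a configuration y in {0,1}^4."""
    return np.array([1.0 if y[i] == y[j] else 0.0 for i, j in EDGES])

def joint(theta):
    """Brute-force joint distribution over all 2^4 configurations."""
    configs = list(itertools.product([0, 1], repeat=4))
    scores = np.array([theta @ phi(y) for y in configs])
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    return dict(zip(configs, probs))

p = joint(np.array([1.0, 1.0, 1.0]))
print(p[(0, 0, 0, 0)], p[(0, 1, 0, 1)])  # agreeing configurations get more mass
```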

13 Experiments [Figure: toy-example results for the well-specified and misspecified settings.]

14 Experiments Part-of-speech (POS) tagging: Input: a sequence of words. Output: a sequence of POS tags, e.g. noun, verb, etc. (45 tags total). Specified model: Node features: indicator functions pairing each word with its tag. Edge features: indicator functions pairing consecutive tags. Training: Wall Street Journal, 38K sentences. Testing: Wall Street Journal, 5.5K sentences, from different sections than the training data.
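
A small sketch of what such indicator features look like for one tagged sentence (the exact feature templates are assumed, not copied from the slide):

```python
from collections import Counter

def chain_features(words, tags):
    """Count indicator features for one tagged sentence: node features pair
    each word with its tag, edge features pair adjacent tags."""
    feats = Counter()
    for w, t in zip(words, tags):
        feats[("node", w.lower(), t)] += 1
    for t_prev, t_next in zip(tags, tags[1:]):
        feats[("edge", t_prev, t_next)] += 1
    return feats

print(chain_features(["The", "dog", "barks"], ["DT", "NN", "VBZ"]))
# e.g. ('node', 'the', 'DT'), ('node', 'dog', 'NN'), ('node', 'barks', 'VBZ'),
#      ('edge', 'DT', 'NN'), ('edge', 'NN', 'VBZ'), each with count 1
```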

15 Experiments Use the learned generative model to draw 1000 training samples and 1000 test samples as synthetic data.

16 Conclusions  When the model is well-specified: all three estimators achieve the O(n^-1) convergence rate; there is no approximation error; the asymptotic estimation errors are ordered generative < fully discriminative < pseudolikelihood discriminative.  When the model is misspecified: the fully discriminative estimator still achieves the O(n^-1) convergence rate, but the other two estimators only achieve an O(n^-1/2) rate; the approximation error and asymptotic estimation error of the fully discriminative estimator are lower than those of the generative and pseudolikelihood discriminative estimators.

