Multitask Learning Using Dirichlet Process


1 Multitask Learning Using Dirichlet Process
Ya Xue, July 1, 2005

2 Outline
- Task defined: infinite mixture of priors
  - Multitask learning
  - Dirichlet process
- Task undefined: expert network
  - Finite expert network
  - Infinite expert network

3 Multitask Learning - Common Prior Model
M classification tasks: Shared prior of w:
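The slide's equations were lost in transcription. A plausible reconstruction of the common-prior model, in line with standard multitask classification formulations (the symbols μ, Σ and the logistic link σ are assumptions, not taken from the slide), is:

```latex
% M classification tasks; task m has weight vector w_m.
% All tasks share a single Gaussian prior:
P(y_{mn} = 1 \mid \mathbf{x}_{mn}, \mathbf{w}_m)
  = \sigma(\mathbf{w}_m^{\top} \mathbf{x}_{mn}),
\qquad
\mathbf{w}_m \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma}),
\quad m = 1, \dots, M.
```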

4 Drawback of This Model
Assume each w_m is a two-dimensional vector.

5 Proposed Model w is drawn from a Gaussian mixture model:
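The mixture-model equation on this slide did not survive transcription. A minimal sketch of drawing w from a Gaussian mixture prior — pick a component with probability π_k, then sample from that Gaussian — is below; the two-component mixture at the bottom is a hypothetical example, not the slide's parameters.

```python
import random

def sample_w(weights, means, stds, rng):
    """Draw w from a (diagonal) Gaussian mixture prior: pick component k
    with probability pi_k, then draw each coordinate from N(mu_k, sigma_k^2)."""
    k = rng.choices(range(len(weights)), weights=weights)[0]
    return [rng.gauss(m, s) for m, s in zip(means[k], stds[k])]

# Hypothetical two-component mixture over two-dimensional w:
rng = random.Random(0)
w = sample_w(weights=[0.5, 0.5],
             means=[[0.0, 0.0], [5.0, 5.0]],
             stds=[[1.0, 1.0], [1.0, 1.0]],
             rng=rng)
```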

6 Two Special Cases
- Common prior model: a single Gaussian.
- Piecewise linear classifier: point-mass components ("similar" vs. "identical" tasks).

7 Clustering
Unknown parameters. Another uncertainty: the number of clusters K.
Model selection: compute the evidence (marginal likelihood):
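The evidence computation the slide refers to can be written as follows (the generic θ, D notation is an assumption; the slide's own symbols are lost):

```latex
% Evidence / marginal likelihood of a K-component model,
% integrating out the component parameters:
p(\mathcal{D} \mid K) = \int p(\mathcal{D} \mid \theta, K)\, p(\theta \mid K)\, d\theta,
\qquad
\hat{K} = \arg\max_{K}\, p(\mathcal{D} \mid K).
```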

8 Clustering with DP: No Model Selection
We rewrite the model in another form and define a Dirichlet process prior for the parameters:
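The rewritten hierarchy can be reconstructed in standard DP notation (the symbols α, G₀, and F are assumptions following the usual presentation):

```latex
% Each task draws its prior parameters from a random measure G,
% and G itself has a Dirichlet process prior:
\mathbf{w}_m \sim F(\theta_m),
\qquad
\theta_m \sim G,
\qquad
G \sim \mathrm{DP}(\alpha, G_0).
```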

9 Stick-Breaking View of DP
Finally we get
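In the stick-breaking construction, β_k ~ Beta(1, α) and π_k = β_k ∏_{j&lt;k}(1 − β_j), so G = Σ_k π_k δ_{θ_k}. Truncating the infinite sum gives a simple sampler for the weights; the truncation level and α below are illustrative.

```python
import random

def stick_breaking(alpha, num_atoms, seed=0):
    """Sample mixture weights pi_k from a truncated stick-breaking
    construction of DP(alpha): beta_k ~ Beta(1, alpha),
    pi_k = beta_k * prod_{j<k} (1 - beta_j)."""
    rng = random.Random(seed)
    remaining = 1.0  # length of the stick not yet broken off
    weights = []
    for _ in range(num_atoms):
        beta_k = rng.betavariate(1.0, alpha)
        weights.append(beta_k * remaining)
        remaining *= 1.0 - beta_k
    return weights

weights = stick_breaking(alpha=2.0, num_atoms=50)
# The weights sum to just under 1; the remainder is the unbroken stick.
```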

10 Prediction Rule of DP for Posterior Inference
Let θ_{n+1} be a new data point. Assuming there are K distinct values θ*_1, …, θ*_K among θ_1, …, θ_n:
- θ_{n+1} belongs to an existing cluster k with probability proportional to n_k, the number of points already in that cluster;
- θ_{n+1} belongs to a new cluster with probability proportional to α.
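This prediction rule (join existing cluster k with probability n_k/(n + α), open a new cluster with probability α/(n + α)) can be simulated directly as a sequential sampler; the sample size and α below are illustrative choices.

```python
import random

def crp_assign(cluster_sizes, alpha, rng):
    """DP prediction rule (Chinese restaurant process): return the index
    of an existing cluster with prob n_k / (n + alpha), or
    len(cluster_sizes) to open a new cluster with prob alpha / (n + alpha)."""
    n = sum(cluster_sizes)
    r = rng.uniform(0.0, n + alpha)
    for k, n_k in enumerate(cluster_sizes):
        if r < n_k:
            return k
        r -= n_k
    return len(cluster_sizes)  # new cluster

def crp_sample(num_points, alpha, seed=0):
    """Assign num_points points to clusters sequentially."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(num_points):
        k = crp_assign(sizes, alpha, rng)
        if k == len(sizes):
            sizes.append(1)
        else:
            sizes[k] += 1
    return sizes

sizes = crp_sample(num_points=100, alpha=1.0)
```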

11 Toy Problem


13 Task 1

14 Task 2

15 Task 3

16 Task 4

17 Task 5

18 Task 6

19 Task 7

20 Task 8


22 Expert Network

23 Mathematical Model
Gating node j:
Likelihood:
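The gating and likelihood equations here were lost with the slide images. In a mixture-of-experts model the likelihood is p(y | x) = Σ_m g_m(x) p(y | x, expert m). The sketch below uses flat softmax gating with Gaussian experts as a stand-in (the slides describe a tree of gating nodes, where g_m is a product of branch probabilities along the path to expert m); every parameter value is hypothetical.

```python
import math

def softmax(zs):
    """Numerically stable softmax over a list of scores."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_likelihood(x, y, gate_vs, experts):
    """p(y | x) = sum_m g_m(x) * p(y | x, expert m), with softmax gating
    over linear scores v_m^T x and Gaussian experts y ~ N(w_m^T x, sigma_m^2)."""
    gates = softmax([sum(vi * xi for vi, xi in zip(v, x)) for v in gate_vs])
    p = 0.0
    for g, (w, sigma) in zip(gates, experts):
        mean = sum(wi * xi for wi, xi in zip(w, x))
        p += g * math.exp(-0.5 * ((y - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    return p

# Hypothetical two-expert network on 2-D inputs:
p = moe_likelihood(x=[1.0, 0.5], y=0.8,
                   gate_vs=[[1.0, 0.0], [0.0, 1.0]],
                   experts=[([0.5, 0.5], 1.0), ([1.0, -1.0], 0.5)])
```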

24 Mathematical Model is the unique path from the root node to expert m.
where

25 Example

26 Infinite Expert Network
An infinite number of gating nodes.

