Local factors in a graphical model


1 Local factors in a graphical model
First, a familiar example: a Conditional Random Field (CRF) for POS tagging. One possible tagging (i.e., an assignment to the remaining variables) is "v v v" over the observed input sentence (shaded), "find preferred tags".

2 Local factors in a graphical model
Another possible tagging for the same CRF is "v a n" over the observed input sentence "find preferred tags".

3 Local factors in a graphical model
A "binary" factor measures the compatibility of two adjacent tags; its values form a small table over tag pairs drawn from {v, n, a} (the slide shows this table). The model reuses the same parameters at this position of "find preferred tags".
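To make this concrete, here is a minimal sketch of how such a binary factor could be stored; the tag set {v, n, a} comes from the slide, but the numeric entries are placeholders, not the slide's actual table.

```python
# A "binary" (transition) factor over two adjacent tags, stored as a nested
# dict: binary_factor[left_tag][right_tag] -> nonnegative compatibility score.
# The tag set comes from the slide; the numeric entries are placeholders.
TAGS = ["v", "n", "a"]

binary_factor = {
    "v": {"v": 2.0, "n": 1.0, "a": 1.0},
    "n": {"v": 2.0, "n": 1.0, "a": 3.0},
    "a": {"v": 2.0, "n": 3.0, "a": 1.0},
}

# The CRF reuses this same table between every pair of adjacent positions.
print(binary_factor["v"]["a"])  # compatibility of the adjacent tag pair (v, a)
```

Because the same table is reused between every pair of adjacent positions, the number of transition parameters does not grow with the length of the sentence.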

4 Local factors in a graphical model
A "unary" factor evaluates a single tag; its values depend on the corresponding word (the slide shows the table of values for v/n/a, e.g. v = 0.2, and notes that this word can't be an adjective).

5 Local factors in a graphical model
A "unary" factor evaluates this tag; its values depend on the corresponding word (e.g. v = 0.2), and could be made to depend on the entire observed sentence.

6 Local factors in a graphical model
There is a different unary factor at each position of "find preferred tags"; the slide shows a separate value table at each word (e.g. v = 0.3 and n = 0.02 in one table, v = 0.3 and a = 0.1 in another, v = 0.2 in the third).
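A matching sketch for the unary factors, assuming (a guess from the flattened slide text) that the three value tables belong to "find", "preferred", and "tags" in that order; entries the slide does not show are placeholders.

```python
# One "unary" (emission) factor per position; its values depend on the word
# at that position.  Entries marked as placeholders were not readable here.
unary_factors = {
    "find":      {"v": 0.3, "n": 0.02, "a": 0.1},  # a is a placeholder
    "preferred": {"v": 0.3, "n": 0.02, "a": 0.1},  # n is a placeholder
    "tags":      {"v": 0.2, "n": 0.2,  "a": 0.1},  # n and a are placeholders
}
```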

7 Local factors in a graphical model
p(v a n) is proportional to the product of all the factors' values on the tagging "v a n" (the slide highlights the relevant entry of each binary and unary factor table).

8 Local factors in a graphical model
p(v a n) is proportional to the product of all the factors' values on "v a n": … 1 * 3 * 0.3 * 0.1 * 0.2 …
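Putting the two kinds of factor together, the unnormalized score of a tagging is simply the product of every factor's value on it. The sketch below reuses the hypothetical unary_factors and binary_factor dicts from above; their placeholder entries were chosen so the product comes out as 1 * 3 * 0.3 * 0.1 * 0.2 = 0.018, matching the shape of the slide's computation (the slide does not say which factor contributed which number).

```python
# Unnormalized p(tagging) = product of all factor values on that tagging.
# Assumes the unary_factors and binary_factor dicts sketched above.
def score(words, tags, unary_factors, binary_factor):
    s = 1.0
    for word, tag in zip(words, tags):
        s *= unary_factors[word][tag]       # one unary factor per position
    for left, right in zip(tags, tags[1:]):
        s *= binary_factor[left][right]     # one binary factor per adjacent pair
    return s

words = ["find", "preferred", "tags"]
print(score(words, ["v", "a", "n"], unary_factors, binary_factor))  # ~0.018
```

Dividing this score by its sum over all 3^3 possible taggings would give the actual probability p(v a n).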

9 Great Ideas in ML: Message Passing
Count the soldiers: each soldier in the line knows "there's 1 of me" and passes counts along, "1 before you", "2 before you", "3 before you", "4 before you", "5 before you" in one direction and "5 behind you", "4 behind you", "3 behind you", "2 behind you", "1 behind you" in the other. Adapted from MacKay (2003) textbook.

10 Great Ideas in ML: Message Passing
Count the soldiers: a soldier who hears "2 before you" and "3 behind you", and knows "there's 1 of me", forms the belief "must be 2 + 1 + 3 = 6 of us", even though he only sees his incoming messages. Adapted from MacKay (2003) textbook.

11 Great Ideas in ML: Message Passing
Count the soldiers: the next soldier hears "1 before you" and "4 behind you", knows "there's 1 of me", and likewise concludes "must be 1 + 1 + 4 = 6 of us" from his incoming messages alone; every soldier's belief agrees. Adapted from MacKay (2003) textbook.
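The soldier-counting trick is already a message-passing algorithm. A minimal sketch for the line of soldiers, where each soldier sees only the two incoming counts:

```python
# "Count the soldiers" on a line.
# forward[i]  = number of soldiers strictly before soldier i
# backward[i] = number of soldiers strictly behind soldier i
# belief[i]   = forward[i] + 1 + backward[i]  (the total, same for everyone)
def count_soldiers_in_line(n):
    forward = [0] * n
    backward = [0] * n
    for i in range(1, n):                # counts passed front to back
        forward[i] = forward[i - 1] + 1
    for i in range(n - 2, -1, -1):       # counts passed back to front
        backward[i] = backward[i + 1] + 1
    return [forward[i] + 1 + backward[i] for i in range(n)]

print(count_soldiers_in_line(6))  # every soldier believes "must be 6 of us"
```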

12 Great Ideas in ML: Message Passing
Each soldier receives reports from all branches of the tree: hearing "3 here" and "7 here" from two branches, plus "1 of me", he reports "11 here (= 7 + 3 + 1)" down the remaining branch. Adapted from MacKay (2003) textbook.

13 Great Ideas in ML: Message Passing
Each soldier receives reports from all branches of the tree: hearing "3 here" and "3 here", plus 1 of himself, he reports "7 here (= 3 + 3 + 1)" down the remaining branch. Adapted from MacKay (2003) textbook.

14 Great Ideas in ML: Message Passing
Each soldier receives reports from all branches of the tree: the messages "11 here (= 7 + 3 + 1)", "7 here", and "3 here" flow along the branches. Adapted from MacKay (2003) textbook.

15 Great Ideas in ML: Message Passing
Each soldier receives reports from all branches of the tree: hearing "3 here", "7 here", and "3 here", plus 1 of himself, he forms the belief "must be 14 of us". Adapted from MacKay (2003) textbook.

16 Great Ideas in ML: Message Passing
Each soldier receives reports from all branches of the tree and, as before, forms the belief "must be 14 of us". This wouldn't work correctly with a "loopy" (cyclic) graph. Adapted from MacKay (2003) textbook.
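On a tree the same idea works branch by branch: the count a soldier reports down one branch is himself plus everything he heard from his other branches, and his belief is himself plus everything he heard from all branches. A small sketch (the tree shape below is made up, not the slide's picture):

```python
# "Count the soldiers" on a tree, given as an adjacency list.
# message(src, dst): the count soldier src reports to neighbor dst =
#   1 (himself) + the counts he heard from all of his *other* neighbors.
def count_on_tree(adj):
    def message(src, dst):
        return 1 + sum(message(other, src)
                       for other in adj[src] if other != dst)
    # belief at each node: himself + all incoming counts = size of the tree
    return {v: 1 + sum(message(u, v) for u in adj[v]) for v in adj}

adj = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0, 4, 5], 4: [3], 5: [3]}
print(count_on_tree(adj))  # every node believes there are 6 soldiers
```

On a cyclic graph the same recursion never bottoms out (and any truncated version counts some soldiers more than once), which is exactly why the slide warns that the trick breaks on "loopy" graphs.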

17 Great ideas in ML: Forward-Backward
In the CRF, message passing = forward-backward: an α message arrives from the left, a β message arrives from the right, and the belief at a position combines the two with that position's unary factor (the slide shows the α, β, and belief value tables over v/n/a at one position of "find preferred tags", e.g. a belief with v = 1.8 and a = 4.2).
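For the chain CRF, this tree-style message passing is exactly forward-backward: α messages accumulate from the left, β messages from the right, and each position's belief multiplies the two incoming messages with its own unary factor. A sketch reusing the hypothetical factor dicts from above:

```python
# Forward-backward = sum-product message passing on the chain CRF.
# Reuses the hypothetical TAGS / unary_factors / binary_factor from above.
def forward_backward(words, tags, unary, binary):
    n = len(words)
    # alpha[i][t]: total weight of everything left of position i, given tag t there.
    alpha = [{t: 1.0 for t in tags}] + [None] * (n - 1)
    for i in range(1, n):
        alpha[i] = {t: sum(alpha[i - 1][s] * unary[words[i - 1]][s] * binary[s][t]
                           for s in tags)
                    for t in tags}
    # beta[i][t]: the mirror-image message from everything right of position i.
    beta = [None] * (n - 1) + [{t: 1.0 for t in tags}]
    for i in range(n - 2, -1, -1):
        beta[i] = {t: sum(binary[t][s] * unary[words[i + 1]][s] * beta[i + 1][s]
                          for s in tags)
                   for t in tags}
    # belief at position i = alpha message * unary factor * beta message
    return [{t: alpha[i][t] * unary[words[i]][t] * beta[i][t] for t in tags}
            for i in range(n)]

beliefs = forward_backward(["find", "preferred", "tags"], TAGS,
                           unary_factors, binary_factor)
print(beliefs[1])  # unnormalized tag marginals at the middle position
```

Each position's belief sums to the same normalizing constant, so dividing by that sum gives the marginal distribution over that position's tag.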

18 Great ideas in ML: Forward-Backward
Extend the CRF to a "skip chain" to capture a non-local factor: more influences on the belief (good). The slide shows the belief values changing (e.g. v = 5.4, a = 25.2) as the extra skip-edge message arrives.

19 Great ideas in ML: Forward-Backward
Extend the CRF to a "skip chain" to capture a non-local factor: more influences on the belief (good), but the graph becomes loopy (bad). The red messages arriving at a variable are not independent? Pretend they are! The slide writes the resulting belief values with a prime (e.g. v = 5.4′, a = 25.2′) to mark them as approximate.
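Loopy belief propagation takes "pretend they are" literally: with the skip edge the graph has a cycle, so instead of one forward and one backward sweep we iterate the same message updates a fixed number of times and read off approximate beliefs. A rough sketch under the same hypothetical tables (reusing binary_factor for the skip edge is an assumption, not the slide's model):

```python
import math

# Loopy BP on a pairwise model: keep applying sum-product message updates
# even though the graph (chain + skip edge) contains a cycle.  Approximate.
def loopy_bp(variables, tags, unary, psi, edges, n_iters=10):
    nbrs = {v: [] for v in variables}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    msg = {(i, j): {t: 1.0 for t in tags} for i in variables for j in nbrs[i]}
    for _ in range(n_iters):
        msg = {(i, j): {t: sum(unary[i][s] * psi[(i, j)][s][t] *
                               math.prod(msg[(k, i)][s]
                                         for k in nbrs[i] if k != j)
                               for s in tags)
                        for t in tags}
               for (i, j) in msg}
    # belief at each variable: its unary factor times all incoming messages
    return {v: {t: unary[v][t] * math.prod(msg[(k, v)][t] for k in nbrs[v])
                for t in tags}
            for v in variables}

# Skip chain over "find preferred tags": chain edges plus the non-local (0, 2) edge.
words = ["find", "preferred", "tags"]
unary = {i: unary_factors[w] for i, w in enumerate(words)}
edges = [(0, 1), (1, 2), (0, 2)]
psi = {}
for i, j in edges:
    psi[(i, j)] = binary_factor
    psi[(j, i)] = {t: {s: binary_factor[s][t] for s in TAGS} for t in TAGS}
print(loopy_bp([0, 1, 2], TAGS, unary, psi, edges)[0])  # approx. marginals for "find"
```

Because the cycle makes the incoming messages dependent, these beliefs are only approximations, which is presumably why the slide writes them with a prime.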

