1
LING 575, Lecture 5: Tree-based translation
Kristina Toutanova (MSR & UW), April 27, 2010
With materials borrowed from Philipp Koehn, Chris Quirk, David Chiang, Dekai Wu, and Aria Haghighi
2
Overview
- Motivation: examples of reordering/translation phenomena
- Synchronous context-free grammars: example derivations
- ITG grammars: reordering in ITG grammars, applications of bracketing ITG grammars, ITGs for word alignment
- Hierarchical phrase-based translation with Hiero: rule extraction, model features
- Decoding for SCFGs and integrating a LM
3
Motivation for tree-based translation
Phrases capture contextual translation and local reordering surprisingly well. However, this information is brittle: "author of the book 本書的作者" tells us nothing about how to translate "author of the pamphlet" or "author of the play". The Chinese pattern "NOUN1 的 NOUN2" becomes "NOUN2 of NOUN1" in English.
4
Motivation for tree-based translation
There are general principles a phrase-based system is not using:
- Some languages have adjectives before nouns, some after
- Some languages place prepositions before nouns, some after
- Some languages put PPs before the head, others after
- Some languages place relative clauses before the head, others after
Discontinuous translations are not handled well by phrase-based systems, e.g. "ne ... pas" in French, or the German verb split.
5
Types of tree-based systems
- Formally tree-based but not using linguistic syntax: can still model the hierarchical nature of language and capture hierarchical reordering. Examples: phrase-based ITGs and Hiero (the focus of this lecture)
- Using linguistic syntax on the source side, the target side, or both: phrase structure trees, dependency trees (next lecture)
6
Synchronous context-free grammars
7
A generalization of context-free grammars Slide from David Chiang, ACL 2006 tutorial
8
Context-free grammars (example in Japanese) Slide from David Chiang, ACL 2006 tutorial
9
Synchronous CFGs Slide from David Chiang, ACL 2006 tutorial
10
Synchronous CFGs Slide from David Chiang, ACL 2006 tutorial
11
Synchronous CFGs Slide from David Chiang, ACL 2006 tutorial
12
Rules with probabilities: joint probability of the source and target language re-writes, given the non-terminal on the left. Could also use the conditional probability of target given source, or of source given target (see the sketch below).
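As a sketch (notation assumed here, not from the slides): for a synchronous rule $X \to \langle \gamma, \alpha \rangle$ with source side $\gamma$ and target side $\alpha$, the options are

$$p(\gamma, \alpha \mid X) \;\text{(joint)}, \qquad p(\alpha \mid \gamma, X), \qquad p(\gamma \mid \alpha, X)$$

each of which can be estimated by relative frequency over extracted rule counts.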
13
Synchronous CFGs Slide from David Chiang, ACL 2006 tutorial
22
Inversion Transduction Grammars (ITGs)
23
Stochastic Inversion Transduction Grammars [Wu 97]
24
Bracketing ITG grammars
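The slide content is not reproduced here. For reference, from Wu (1997): a bracketing ITG uses a single non-terminal $X$, straight and inverted binary rules, and lexical rules:

$$X \to [X\,X] \qquad X \to \langle X\,X \rangle \qquad X \to e/f \qquad X \to e/\epsilon \qquad X \to \epsilon/f$$

where $[\cdot]$ keeps the two children in the same order on both sides, $\langle\cdot\rangle$ swaps them on the target side, and $e/\epsilon$, $\epsilon/f$ handle unaligned words.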
25
Reordering in bracketing ITG grammar
26
Example re-ordering with ITG
27
Are there other synchronous parses of this sentence pair? [1,2,3,4]
28
Example re-ordering with ITG: other re-orderings with parses. A horizontal bar over a node means the non-terminals are swapped (an inverted rule).
29
But some re-orderings are not allowed: those where words move "inside-out". 22 of the 24 permutations of 4 words are parsable by the bracketing ITG; the two that are not are the inside-out permutations (2,4,1,3) and (3,1,4,2), as the check below verifies.
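A minimal sketch (not from the slides) that checks which permutations a bracketing ITG can parse: a permutation is ITG-parsable iff it can be recursively split into two contiguous source spans whose target images are also contiguous, kept in order (straight rule) or swapped (inverted rule).

    from itertools import permutations

    def itg_parsable(perm):
        """True iff perm can be built with binary straight/inverted splits."""
        n = len(perm)
        if n == 1:
            return True
        for k in range(1, n):                      # try every split point
            left, right = perm[:k], perm[k:]
            # both halves must map to contiguous target spans
            if (max(left) - min(left) + 1 == k and
                    max(right) - min(right) + 1 == n - k):
                if itg_parsable(left) and itg_parsable(right):
                    return True
        return False

    perms = list(permutations(range(1, 5)))
    bad = [p for p in perms if not itg_parsable(p)]
    print(len(perms) - len(bad), "of", len(perms), "parsable; not parsable:", bad)
    # -> 22 of 24 parsable; not parsable: [(2, 4, 1, 3), (3, 1, 4, 2)]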
30
Number of permutations compared to ones parsable by ITG
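The figure is not reproduced here. For reference (a known result, not read off the slide): the number of ITG-parsable permutations of n words is given by the large Schröder numbers, which grow far more slowly than n!:

    n              1   2   3    4     5     6      7       8
    n!             1   2   6   24   120   720   5040   40320
    ITG-parsable   1   2   6   22    90   394   1806    8558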
31
Application of ITGs: they have been applied to word alignment and translation in many previous works. One recent interesting work is Haghighi et al.'s 2009 paper on supervised word alignment with block ITGs: Aria Haghighi, John Blitzer, John DeNero, and Dan Klein, "Better word alignments with supervised ITG models".
32
Comparison of oracle alignment error rate (AER) for different alignment spaces: the space of all alignments, the space of 1-to-1 alignments, and the space of ITG alignments. From Haghighi et al. (2009).
33
Block ITG: adding one-to-many alignments. From Haghighi et al. (2009).
34
Comparison of oracle alignment error rate (AER) for different alignment spaces. From Haghighi et al. (2009).
35
Alignment performance using a discriminative model. From Haghighi et al. (2009).
36
Training for maximum likelihood
So far, results were with MIRA, which requires only finding the best alignment under the model; this is efficient under both the 1-to-1 and ITG models. Training for maximum likelihood under a log-linear model instead requires summing over all possible alignments. This is tractable for ITGs (bitext parsing, discussed in a bit), and is one of their big advantages (see the sketch below).
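A sketch of the log-linear alignment model in question (notation assumed, not from the slides): with features $\phi(a, s, t)$ over an alignment $a$ of source $s$ and target $t$,

$$p_\theta(a \mid s, t) = \frac{\exp\{\theta^\top \phi(a, s, t)\}}{\sum_{a'} \exp\{\theta^\top \phi(a', s, t)\}}$$

where the normalizer, a sum over all ITG alignments, can be computed in polynomial time with the inside algorithm over the bitext parse chart.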
37
MIRA versus maximum likelihood training
38
Algorithms for SCFGs
- Translation with synchronous CFGs
- Bi-text parsing with synchronous CFGs
39
Review: CKY parsing for CFGs in CNF Start with spans of length one and construct possible constituents Slide from David Chiang, ACL 2006 tutorial
40
Review: CKY parsing for CFGs in CNF Continue with spans of length 2 and construct constituents using words and previously constructed constituents Slide from David Chiang, ACL 2006 tutorial
41
Review: CKY parsing for CFGs in CNF Spans of length 3 Slide from David Chiang, ACL 2006 tutorial
42
Review: CKY parsing for CFGs in CNF Spans of length 4 Slide from David Chiang, ACL 2006 tutorial
43
Review: CKY parsing for CFGs in CNF The best S constituent covering the whole sentence is the final output Slide from David Chiang, ACL 2006 tutorial
44
Review: complexity of CKY Slide from David Chiang, ACL 2006 tutorial
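As a concrete reference, a minimal CKY recognizer for a CFG in Chomsky Normal Form (a sketch with an assumed dict-based grammar encoding, not from the slides):

    from collections import defaultdict

    def cky(words, lexical, binary):
        """lexical: word -> set of A with rule A -> word
        binary: (B, C) -> set of A with rule A -> B C
        Returns chart with chart[i, j] = nonterminals deriving words[i:j]."""
        n = len(words)
        chart = defaultdict(set)
        for i, w in enumerate(words):                  # spans of length 1
            chart[i, i + 1] |= lexical.get(w, set())
        for length in range(2, n + 1):                 # longer spans, bottom-up
            for i in range(n - length + 1):
                j = i + length
                for k in range(i + 1, j):              # every split point
                    for B in chart[i, k]:
                        for C in chart[k, j]:
                            chart[i, j] |= binary.get((B, C), set())
        return chart

The three nested span/split loops give the familiar O(n^3) time in sentence length, times a grammar-dependent constant.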
45
Translation with SCFG Slide from David Chiang, ACL 2006 tutorial
46
Translation Slide from David Chiang, ACL 2006 tutorial
47
Bi-text parsing Slide from David Chiang, ACL 2006 tutorial
48
Bi-text parsing We consider SCFGs with at most two symbols on the right-hand side (rank 2) Slide from David Chiang, ACL 2006 tutorial
49
Bi-text parsing Slide from David Chiang, ACL 2006 tutorial
50
Bi-text parsing Slide from David Chiang, ACL 2006 tutorial
51
Bi-text parsing Slide from David Chiang, ACL 2006 tutorial
52
Bi-text parsing Slide from David Chiang, ACL 2006 tutorial
53
Bi-text parsing for grammars with higher rank
There is no CNF for synchronous CFGs of rank greater than or equal to 4 in the general case. With higher-rank grammars we can still translate efficiently by converting the source-side CFG to CNF, parsing, flattening the trees back, and translating. Not so for bi-text parsing: in general it is exponential in the rank of the grammar and polynomial in sentence length.
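For concreteness (a standard result from Wu (1997), not stated on the slides): with a rank-2 grammar, bi-text parsing a sentence pair of lengths $n$ and $m$ takes $O(|G|\, n^3 m^3)$ time, since each chart item pairs a source span with a target span and each combination step chooses a split point on both sides.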
54
Hierarchical phrase-based translation
David Chiang (ISI, USC)
55
Hierarchical phrase-based translation overview
- Motivation
- Extracting rules
- Scoring derivations
- Decoding without an LM
- Decoding with an LM
56
Motivation
Review of phrase-based models:
- Segment the input into a sequence of phrases
- Translate each phrase
- Re-order phrases depending on distortion and perhaps the lexical content of the phrases
Properties of phrase-based models:
- Local re-ordering is captured within phrases for frequently occurring groups of words
- Global re-ordering is not modeled well
- Only contiguous translations are learned
57
Chinese-English example: "Australia is one of the few countries that have diplomatic relations with North Korea." The output from the phrase-based system captured some reordering through phrase translation and phrase re-ordering, but did not re-order the relative clause and the noun phrase.
58
Idea: Hierarchical phrases
59
Other examples of hierarchical phrases
60
A synchronous CFG for the example
61
General approach
- Align the parallel training data using word-alignment models (e.g. GIZA++)
- Extract hierarchical phrase pairs, which can be represented as SCFG rules
- Assign probabilities (scores) to rules: as in log-linear models for phrase-based MT, various features on rules can be defined to produce rule scores
- Translate new sentences by parsing with the SCFG grammar and integrating a language model
62
Example derivation
63
Extracting hierarchical phrases
Start with contiguous phrase pairs, as in phrasal SMT models (called initial phrase pairs). Make rules for these phrase pairs and add them to the rule set extracted from this sentence pair.
64
Extracting hierarchical phrase pairs
For every rule of the sentence pair, and every initial phrase pair contained in it, replace the initial phrase pair with a non-terminal and add the new rule (see the sketch below).
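A minimal one-gap sketch of this subtraction step (simplified; real Hiero allows two gaps and applies the constraints on the next slide). Spans are half-open, and the span pairs in `initial` are assumed to already be alignment-consistent:

    def extract_hiero_rules(f, e, initial):
        """f, e: source/target token lists.
        initial: list of initial phrase pairs as ((fi, fj), (ei, ej)) spans.
        Returns rules (source_side, target_side); 'X1' marks the linked gap."""
        rules = set()
        for (fi, fj), (ei, ej) in initial:
            # the initial phrase pair itself becomes a terminal-only rule
            rules.add((tuple(f[fi:fj]), tuple(e[ei:ej])))
            # replace any strictly smaller contained initial pair with X1
            for (si, sj), (ti, tj) in initial:
                if (fi <= si and sj <= fj and ei <= ti and tj <= ej
                        and sj - si < fj - fi):
                    src = tuple(f[fi:si]) + ('X1',) + tuple(f[sj:fj])
                    tgt = tuple(e[ei:ti]) + ('X1',) + tuple(e[tj:ej])
                    rules.add((src, tgt))
        return rules

For instance, with f = e = ['a', 'b', 'c'] and initial = [((0, 3), (0, 3)), ((1, 2), (1, 2))], this yields the two phrase rules plus the hierarchical rule ('a', 'X1', 'c') -> ('a', 'X1', 'c').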
65
Another example: traditional phrases vs. a hierarchical phrase
66
Constraining the grammar rules
This method generates too many phrase pairs and leads to spurious ambiguity. Constraints are placed on the set of allowable rules for robustness/speed; in Chiang (2007) these include limits such as at most two non-terminals per rule, no adjacent non-terminals on the source side, at least one pair of aligned words, and length limits on initial phrases and rules.
67
Adding glue rules
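The slide content is not reproduced here. For reference, from Chiang (2007): the glue rules let the decoder concatenate hierarchical phrases monotonically when no hierarchical rule applies:

S → ⟨ S_1 X_2 , S_1 X_2 ⟩
S → ⟨ X_1 , X_1 ⟩

where matching subscripts link non-terminal occurrences across the two sides.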
68
Assigning scores to derivations
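A sketch of the scoring model, roughly following Chiang (2007) (the details below are assumptions, not read off the slide): a derivation $D$ is scored log-linearly, combining the language model with per-rule features such as $P(\gamma \mid \alpha)$, $P(\alpha \mid \gamma)$, lexical weights in both directions, and word/rule penalties:

$$w(D) = P_{LM}(e)^{\lambda_{LM}} \times \prod_{r \in D} \prod_i \phi_i(r)^{\lambda_i}$$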
70
Estimating feature values and feature weights
71
Finding the best translation: decoding
72
Finding the best translation including an LM
73
Parsing with Hiero grammars
- A modification of CKY which does not require conversion to Chomsky Normal Form
- Parsing as weighted deduction (without an LM, using the source-side grammar), as sketched below
- Goal: prove [S, 0, n]
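A sketch of the deduction step (notation assumed, not from the slides): an item $[X, i, j]$ asserts that non-terminal $X$ derives the source words from position $i$ to $j$. A rule whose source side $\gamma$ matches $w_{i+1} \dots w_j$, with its non-terminal gaps $Y_1, \dots, Y_k$ filled by already-proven items, proves a new item:

$$\frac{[Y_1, i_1, j_1] \;\cdots\; [Y_k, i_k, j_k]}{[X, i, j]} \quad (X \to \gamma)$$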
74
Pseudo-code for parsing
If two items are equivalent (same span, same non-terminal), they are merged. For k-best generation, keep pointers to all the ways of generating each item, plus their weights.
75
K-best derivation generation
To generate a k-best list for some item in the chart (e.g. X from 5 to 8), we need to consider the top k combinations of rules used to form X, plus the sub-items used by those rules; e.g. the top 4 rules applying at span 5 to 8 and the target sides of the top 3 derivations from 6 to 8. The naïve method generates all combinations, sorts them, and returns the top k. A faster method avoids generating all combinations to get the top k (a sketch follows on the next slide).
76
K-best combinations of two lists
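A minimal sketch of the faster method (frontier/heap enumeration, the same idea underlying cube pruning; the function name and higher-is-better score convention are assumptions):

    import heapq

    def kbest_sums(a, b, k):
        """Top-k values of a[i] + b[j], for descending-sorted score lists
        a and b, without enumerating all len(a) * len(b) combinations."""
        heap = [(-(a[0] + b[0]), 0, 0)]                # max-heap via negation
        seen = {(0, 0)}
        out = []
        while heap and len(out) < k:
            neg, i, j = heapq.heappop(heap)
            out.append((-neg, i, j))
            for ni, nj in ((i + 1, j), (i, j + 1)):    # push the two neighbors
                if ni < len(a) and nj < len(b) and (ni, nj) not in seen:
                    seen.add((ni, nj))
                    heapq.heappush(heap, (-(a[ni] + b[nj]), ni, nj))
        return out

    # kbest_sums([9, 7, 4], [6, 3, 1], 4)
    # -> [(15, 0, 0), (13, 1, 0), (12, 0, 1), (10, 0, 2)]

Only a frontier of candidates is ever on the heap, so each of the k pops costs O(log k) rather than sorting the full grid of combinations.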
77
Integrating an LM with cube pruning
79
Results using different LM integration methods
80
Comparison to a phrase-based system
81
Summary
82
Described hierarchical phrase-based translation:
- Uses hierarchical rules encoding phrase re-ordering and discontinuous lexical correspondence
- Rules include traditional contiguous phrase pairs
- Can translate efficiently without an LM using SCFG parsing
- Outperforms phrase-based models for several languages
- Hiero is implemented in Moses
83
References
- David Chiang. Hierarchical phrase-based translation. Computational Linguistics, 2007.
- David Chiang. An introduction to synchronous grammars. Notes and slides from the ACL 2006 tutorial.
- Dekai Wu. Stochastic inversion transduction grammars and bilingual parsing of parallel corpora. Computational Linguistics, 1997.
- Aria Haghighi, John Blitzer, John DeNero, and Dan Klein. Better word alignments with supervised ITG models. ACL 2009.
Many other interesting papers use ITGs and extensions to Hiero; some will be added to the web page.
84
Next lecture
Chris Quirk will talk about SMT systems using linguistic syntax:
- Using syntax on the source and/or target side
- Different types of syntactic analysis
- Other types of synchronous grammars
The list of readings will be updated.