Generation-Heavy Hybrid Machine Translation — Nizar Habash, Postdoctoral Researcher, Center for Computational Learning Systems, Columbia University

1 Generation-Heavy Hybrid Machine Translation Nizar Habash Postdoctoral Researcher Center for Computational Learning Systems Columbia University Columbia University NLP Colloquium October 28, 2004

2 The Intuition Generation-Heavy Machine Translation: Español, عربي → English dictionary gist

3 Introduction Research Contributions
– A general, reusable, and extensible Machine Translation (MT) model that transcends the need for large amounts of deep symmetric knowledge
– Development of reusable large-scale resources for English
– A large-scale Spanish-English MT system, Matador, which is more robust across genres and produces more grammatical output than simple statistical or symbolic techniques

4 Roadmap  Introduction  Generation-Heavy Machine Translation  Evaluation  Conclusion  Future Work

5 Introduction MT Pyramid Source word Source syntax Source meaning | Target meaning Target syntax Target word Analysis Generation Interlingua Gisting Transfer

6 Introduction MT Pyramid Source word Source syntax Source meaning | Target meaning Target syntax Target word Analysis Generation Interlingual Lexicons Dictionaries/Parallel Corpora Transfer Lexicons

7 Introduction MT Pyramid Gisting Transfer Source word Source syntax Source meaning | Target meaning Target syntax Target word

8 Introduction Why gisting is not enough
– SP: Sobre la base de dichas experiencias se estableció en 1988 una metodología.
– Gist: Envelope her basis out speak experiences them settle at 1988 one methodology.
– EN: On the basis of these experiences, a methodology was arrived at in 1988.

9 Introduction Translation Divergences 35% of sentences in TREC El Norte Corpus (Dorr et al 2002)
Divergence Types
– Categorial (X tener hambre ↔ X be hungry)
– Conflational (X dar puñaladas a Z ↔ X stab Z)
– Structural (X entrar en Y ↔ X enter Y)
– Head Swapping (X cruzar Y nadando ↔ X swim across Y)
– Thematic (X gustar a Y ↔ Y like X)

10 Roadmap  Introduction  Generation-Heavy Machine Translation  Evaluation  Conclusion  Future Work

11 Generation-Heavy Hybrid Machine Translation
Problem: asymmetric resources
– High-quality, broad-coverage semantic resources for target language
– Low-quality resources for source language
– Low-quality (many-to-many) translation lexicon
Thesis: we can approximate interlingual MT without the use of symmetric interlingual resources

12 Relevant Background Work
– Hybrid Natural Language Generation: Constrained Overgeneration → Statistical Ranking; Nitrogen (Langkilde and Knight 1998), Halogen (Langkilde 2002), FERGUS (Rambow and Bangalore 2000)
– Lexical Conceptual Structure (LCS) based MT (Jackendoff 1983), (Dorr 1993)

13 LCS-based MT Example (Dorr, 1993)

14 Generation-Heavy Hybrid Machine Translation Analysis → Translation → Generation: Theta Linking → Expansion → Assignment → Pruning → Linearization → Ranking → …

15 Matador Spanish-English GHMT Spanish → Analysis → Translation → Generation (Theta Linking → Expansion → Assignment → Pruning → Linearization → Ranking) → English; EXERGE: Expansive Rich Generation for English

16 GHMT Analysis Source-language syntactic dependency
Example: Yo le di puñaladas a Juan. → dar(Yo :subj, puñalada :obj, a Juan :mod)
Features of representation
– Approximation of predicate-argument structure
– Long-distance dependencies
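The dependency analysis above can be sketched as a small tree structure; this is an illustrative mock-up, not Matador's actual data structure:

```python
# An illustrative mock-up of the dependency analysis for
# "Yo le di puñaladas a Juan": dar heads Yo (:subj),
# puñalada (:obj), and the modifier a(Juan).

class Node:
    """A lexeme node in a syntactic dependency tree."""
    def __init__(self, lexeme, pos, relation=None, children=None):
        self.lexeme = lexeme          # lemma, not surface form
        self.pos = pos                # part of speech
        self.relation = relation      # dependency relation to the parent
        self.children = children or []

    def pretty(self, depth=0):
        rel = f"{self.relation} " if self.relation else ""
        lines = ["  " * depth + f"{rel}{self.lexeme}/{self.pos}"]
        for child in self.children:
            lines.extend(child.pretty(depth + 1))
        return lines

tree = Node("dar", "V", children=[
    Node("Yo", "PRO", ":subj"),
    Node("puñalada", "N", ":obj"),
    Node("a", "P", ":mod", [Node("Juan", "N", ":obj")]),
])

print("\n".join(tree.pretty()))
```

Note that the representation keeps lexemes rather than surface forms, which is what allows the later expansion and linking steps to operate on word meaning rather than inflection.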

17 GHMT Translation Lexical transfer but NO structural change
Translation Lexicon
(tener V) → ((have V) (own V) (possess V) (be V))
(deber V) → ((owe V) (should AUX) (must AUX))
(soler V) → ((tend V) (usually AV))
dar(Yo :subj, puñalada :obj, a Juan :mod) → {ADMINISTER, CONFER, DELIVER, EXTEND, GIVE, GRANT, HAND, LAND, RENDER}({I, MY, MINE} :subj, {STAB, KNIFE_WOUND} :obj, {AT, BY, INTO, THROUGH, TO} {JOHN} :mod)
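This transfer step can be sketched as follows; the lexicon and the dictionary-based tree encoding are illustrative stand-ins, not Matador's internal formats:

```python
# Sketch of lexical transfer: each source lexeme is replaced by its full
# set of target candidates from a many-to-many translation lexicon, while
# the dependency structure is left untouched. Entries copied from the slide.

LEXICON = {
    "dar":      ["administer", "confer", "deliver", "extend", "give",
                 "grant", "hand", "land", "render"],
    "Yo":       ["I", "my", "mine"],
    "puñalada": ["stab", "knife_wound"],
    "a":        ["at", "by", "into", "through", "to"],
    "Juan":     ["John"],
}

def transfer(node):
    """Return a copy of the tree with candidate sets instead of lexemes."""
    return {
        "candidates": LEXICON.get(node["lexeme"], [node["lexeme"]]),
        "relation": node["relation"],
        "children": [transfer(c) for c in node["children"]],
    }

source = {"lexeme": "dar", "relation": None, "children": [
    {"lexeme": "Yo", "relation": ":subj", "children": []},
    {"lexeme": "puñalada", "relation": ":obj", "children": []},
    {"lexeme": "a", "relation": ":mod", "children": [
        {"lexeme": "Juan", "relation": ":obj", "children": []}]},
]}

target = transfer(source)
```

Keeping every candidate at this point is deliberate: disambiguation is deferred to the later statistical pruning and ranking steps.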

18 GHMT Thematic Linking Syntactic Dependency → Thematic Dependency
{ADMINISTER, CONFER, DELIVER, EXTEND, GIVE, GRANT, HAND, LAND, RENDER}({I, MY, MINE} :subj, {STAB, KNIFE_WOUND} :obj, {AT, BY, INTO, THROUGH, TO} {JOHN} :mod) → {EXTEND, GIVE, GRANT, RENDER}(Agent: {I, MY, MINE}, Theme: {STAB, KNIFE_WOUND}, Goal: {JOHN})

19 GHMT Thematic Linking Resources
Word Class Lexicon
(:NUMBER "V.13.1.a.ii"
 :NAME "Give - No Exchange"
 :POS V
 :THETA_ROLES (((ag obl) (th obl) (goal obl to))
               ((ag obl) (goal obl) (th obl)))
 :LCS_PRIMS (cause go)
 :WORDS (feed give pass pay peddle refund render repay serve))
Syntactic-Thematic Linking Map
(:subj → ag instr th exp loc src goal perc mod-poss poss)
(:obj2 → goal src th perc ben)
(across → goal loc)
(in → loc mod-poss perc goal poss prop)
(to → prop goal ben info th exp perc pred loc time)
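A minimal sketch of how these two resources interact, assuming simplified linking-map entries and theta grids modeled on the slides; the `:obj` mapping, the backtracking matcher, and the requirement that relations map to distinct roles are assumptions of this sketch, not Matador's exact algorithm:

```python
# Each syntactic relation maps to the thematic roles it can realize;
# a verb candidate survives only if every observed relation can be
# linked to a distinct role licensed by that verb's theta grid.

LINKING_MAP = {
    ":subj": {"ag", "instr", "th", "exp", "loc", "src", "goal"},
    ":obj":  {"th", "goal", "src", "perc"},   # assumed entry
    "to":    {"goal", "ben", "prop", "th"},
}

# (role, obligatory?) grids, simplified from the word-class lexicon.
THETA_GRIDS = {
    "give":   [("ag", True), ("th", True), ("goal", True)],
    "extend": [("ag", True), ("th", True), ("goal", False)],
    "land":   [("ag", True), ("th", True)],   # no goal slot
}

def links(relations, grid_roles, used=frozenset()):
    """Backtracking: map each relation to a distinct licensed role."""
    if not relations:
        return True
    rel, rest = relations[0], relations[1:]
    for role in LINKING_MAP[rel] & (grid_roles - used):
        if links(rest, grid_roles, used | {role}):
            return True
    return False

observed = [":subj", ":obj", "to"]   # Yo :subj, puñalada :obj, a Juan
survivors = [verb for verb, grid in THETA_GRIDS.items()
             if links(observed, {role for role, _ in grid})]
print(survivors)
```

Under these toy grids, GIVE and EXTEND survive while LAND is filtered out because it cannot account for the goal-bearing prepositional dependent, mirroring how the candidate verb set shrinks on the next slide.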

20 GHMT Thematic Linking Syntactic Dependency → Thematic Dependency
Candidate theta grids for {ADMINISTER, CONFER, DELIVER, EXTEND, GIVE, GRANT, HAND, LAND, RENDER}({I, MY, MINE} :subj, {STAB, KNIFE_WOUND} :obj, {AT, BY, INTO, THROUGH, TO} {JOHN} :mod):
((ADMINISTER V.13.2 ((AG OBL) (TH OBL) (GOAL OPT TO)))
 (CONFER V.37.6.b ((EXP OBL)))
 (DELIVER V.11.1 ((AG OBL) (GOAL OBL) (TH OBL) (SRC OPT FROM)))
 (EXTEND V.47.1 ((TH OBL) (MOD-LOC OPT. T)))
 (EXTEND V.13.3 ((AG OBL) (TH OBL) (GOAL OPT TO)))
 (EXTEND V.13.3 ((AG OBL) (GOAL OBL) (TH OBL)))
 (EXTEND V.13.2 ((AG OBL) (TH OBL) (GOAL OPT TO)))
 (GIVE V.13.1.a.ii ((AG OBL) (TH OBL) (GOAL OBL TO)))
 (GIVE V.13.1.a.ii ((AG OBL) (GOAL OBL) (TH OBL)))
 (GRANT V.29.5.e ((AG OBL) (INFO OBL THAT)))
 (GRANT V.29.5.d ((AG OBL) (TH OBL) (PROP OBL TO)))
 (GRANT V.13.3 ((AG OBL) (TH OBL) (GOAL OPT TO)))
 (GRANT V.13.3 ((AG OBL) (GOAL OBL) (TH OBL)))
 (HAND V.11.1 ((AG OBL) (TH OBL) (GOAL OPT TO) (SRC OPT FROM)))
 (HAND V.11.1 ((AG OBL) (GOAL OBL) (TH OBL) (SRC OPT FROM)))
 (LAND V.9.10 ((AG OBL) (TH OBL)))
 (RENDER V.13.1.a.ii ((AG OBL) (TH OBL) (GOAL OBL TO)))
 (RENDER V.13.1.a.ii ((AG OBL) (GOAL OBL) (TH OBL)))
 (RENDER V.10.6.a ((AG OBL) (TH OBL) (MOD-POSS OPT OF)))
 (RENDER V.10.6.a.LOCATIVE ((AG OPT) (SRC OBL) (TH OPT OF))))

22 GHMT Thematic Linking Syntactic Dependency → Thematic Dependency
{ADMINISTER, CONFER, DELIVER, EXTEND, GIVE, GRANT, HAND, LAND, RENDER}({I, MY, MINE} :subj, {STAB, KNIFE_WOUND} :obj, {AT, BY, INTO, THROUGH, TO} {JOHN} :mod) → {EXTEND, GIVE, GRANT, RENDER}(Agent: {I, MY, MINE}, Theme: {STAB, KNIFE_WOUND}, Goal: {JOHN})

23 Interlingua Approximation through Expansion Operations
– Relation Conflation / Inflation, Relation Variation: enter(John :subj, room :obj) ↔ enter(John :subj, in room) ↔ go(John :subj, in room)
– Categorial Variation: development N ↔ develop V
– Node Conflation / Inflation: put(butter N) ↔ butter V
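The node-conflation operation can be sketched with a toy categorial-variation table; the `CATVAR` entries and the `LIGHT_VERBS` set here are illustrative stand-ins, not the real database:

```python
# Sketch of node conflation: a "light" head verb and its nominal
# dependent collapse into a single verb when the noun has a verbal
# categorial variant, e.g. give(stab N) -> stab V.

CATVAR = {
    ("stab", "N"):        {"V": "stab"},
    ("crossing", "N"):    {"V": "cross"},
    ("development", "N"): {"V": "develop"},
}

LIGHT_VERBS = {"give", "take", "make"}   # assumed eligibility set

def conflate(head, obj_noun):
    """If head is a light verb and its object noun has a verbal
    categorial variant, return that verb; otherwise None."""
    if head in LIGHT_VERBS:
        return CATVAR.get((obj_noun, "N"), {}).get("V")
    return None

print(conflate("give", "stab"))      # conflates
print(conflate("deliver", "stab"))   # not a light verb: no conflation
```

The inverse operation, inflation, would run the same table in the other direction, expanding a verb into a light verb plus its nominal variant.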

24 Interlingua Approximation 2nd Degree Expansion
cross(John :subj, river :obj, swimming :mod) → [Relation Inflation] → go(John :subj, across river, swimming :mod) → [Node Conflation] → swim(John :subj, across river)

25 GHMT Structural Expansion Conflation Example
GIVE V(Agent: I, Theme: STAB N, Goal: JOHN) → STAB V(Agent: I, Goal: JOHN)

26 GHMT Structural Expansion Conflation and Inflation
Structural Expansion Resources
– Word Class Lexicon
(:NUMBER "V.42.2"
 :NAME "Poison Verbs"
 :POS V
 :THETA_ROLES (((ag obl) (goal obl)))
 :LCS_PRIMS (cause go)
 :WORDS (crucify electrocute garrotte hang knife poison shoot smother stab strangle))
– Categorial Variation Database (Habash and Dorr 2003)
(:V (hunger) :N (hunger hungriness) :AJ (hungry))
(:V (validate) :N (validation validity) :AJ (valid))
(:V (cross) :N (crossing cross) :P (across))
(:V (stab) :N (stab))

27 GHMT Structural Expansion Conflation Example
GIVE V(Agent: I, Theme: STAB N, Goal: JOHN) — the theme STAB N has the verbal categorial variant STAB V

28 GHMT Structural Expansion Conflation Example
GIVE V(Agent: I, Theme: STAB N, Goal: JOHN) matches STAB V(Agent: *, Goal: *) via the shared LCS primitives [CAUSE GO]

29 GHMT Structural Expansion Conflation Example
GIVE V(Agent: I, Theme: STAB N, Goal: JOHN) → STAB V(Agent: I, Goal: JOHN)

30 GHMT Syntactic Assignment Thematic → Syntactic Mapping
STAB V(Agent: I, Goal: JOHN) and GIVE V(Agent: I, Theme: STAB N, Goal: JOHN) →
– STAB V(Subject: {I, MY, …}, Object: JOHN)
– GIVE V(Subject: {I, MY, …}, Object: {STAB N, KNIFE_WOUND N}, IObject: JOHN)
– GIVE V(Subject: {I, MY, …}, Object: {STAB N, KNIFE_WOUND N}, Mod: {TO, AT, …} JOHN)

31 GHMT Structural N-gram Pruning Statistical lexical selection
– STAB V(Subject: {I, MY, …}, Object: JOHN) → STAB V(Subject: I, Object: JOHN)
– GIVE V(Subject: {I, MY, …}, Object: {STAB N, KNIFE_WOUND N}, IObject: JOHN) → GIVE V(Subject: I, Object: STAB N, IObject: JOHN)
– GIVE V(Subject: {I, MY, …}, Object: {STAB N, KNIFE_WOUND N}, Mod: {TO, AT, …} JOHN) → GIVE V(Subject: I, Object: STAB N, Mod: TO JOHN)

32 GHMT Target Statistical Resources
– Structural N-gram Model: long-distance, over lexemes
– Surface N-gram Model: local, over surface forms
Example: every cloud has a silver lining — surface bigrams pair adjacent word forms, while structural bigrams pair the head lexeme (have) with its dependents (cloud, lining), however far apart they surface
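The contrast between the two models can be sketched as follows; the dependency tree assumed for the example sentence is an assumption of this sketch:

```python
# Surface bigrams are adjacent word-form pairs; structural bigrams are
# (head lexeme, dependent lexeme) pairs read off the dependency tree,
# so they capture relations regardless of surface distance.

def surface_bigrams(words):
    return list(zip(words, words[1:]))

def structural_bigrams(tree):
    """Collect (head, dependent) lexeme pairs from a (head, children) tree."""
    head, children = tree
    pairs = []
    for child in children:
        pairs.append((head, child[0]))
        pairs.extend(structural_bigrams(child))
    return pairs

sentence = "every cloud has a silver lining".split()
# Assumed analysis: have heads cloud (subj) and lining (obj).
tree = ("have", [("cloud", [("every", [])]),
                 ("lining", [("a", []), ("silver", [])])])

print(surface_bigrams(sentence))
print(structural_bigrams(tree))
```

Note that (have, cloud) never occurs as a surface bigram in this sentence, which is exactly the kind of dependency a surface model misses and a structural model scores directly.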

33 GHMT Linearization & Ranking Oxygen Linearization (Habash 2000), Halogen Statistical Ranking (Langkilde 2002)
I stabbed John. [-1.670270]
I gave a stab at John. [-2.175831]
I gave the stab at John. [-3.969686]
I gave an stab at John. [-4.489933]
I gave a stab by John. [-4.803054]
I gave a stab to John. [-5.045810]
I gave a stab into John. [-5.810673]
I gave a stab through John. [-5.836419]
I gave a knife wound by John. [-6.041891]
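The final ranking step amounts to sorting the overgenerated candidates by language-model score, most probable first (scores copied from the slide; higher log-probability is better):

```python
# Rank overgenerated candidate sentences by their log-probability scores.
candidates = [
    ("I gave a stab at John.",   -2.175831),
    ("I stabbed John.",          -1.670270),
    ("I gave a stab to John.",   -5.045810),
    ("I gave the stab at John.", -3.969686),
]

ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
for sentence, score in ranked:
    print(f"{sentence} [{score:f}]")
```

The conflated realization "I stabbed John." outscores all the give-a-stab paraphrases, which is the point of the expansion step: the statistical ranker can only pick a divergence-resolving output if symbolic overgeneration put it in the candidate pool.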

34 Roadmap  Introduction  Generation-Heavy Machine Translation  Evaluation  Overall Evaluation  Component Evaluation  Conclusion  Future Work

35 Overall Evaluation Systems
– Gisting (GIST) (Resnik 1997): Symbolic Word-based; Translation Model: 400K surface-lexeme pairs; Language Model: unigrams, Brown Corpus, 1M words; Development Time: 1 person-month
– Systran (SYST): Symbolic Transfer-based; Translation Model: 120K lexeme-lexeme pairs and large transfer lexicon; Development Time: hundreds of person-years
– IBM Model 4 (IBM4) (Brown et al 1990), (Al-Onaizan et al 1999), (Germann and Marcu 2000): Statistical Word-based; Translation Model: Model 4, Giza-trained on 50K UN sentence pairs; Language Model: bigrams, 3M words (UN); Development Time: 1 person-month
– Matador (MTDR): Hybrid Generation-Heavy; Translation Model: 50K lexeme-lexeme pairs; Language Model: bigrams, 3M words (UN), and structural bigrams, 1.5M words (UN); Development Time: 1 person-year

36 Overall Evaluation Bleu Metric
Bleu – BiLingual Evaluation Understudy (Papineni et al 2001)
– Modified n-gram precision with length penalty
– Quick, inexpensive, and language-independent
– Correlates highly with human evaluation
– Bias against synonyms and inflectional variations

37 Overall Evaluation Test Sets
– UN: United Nations documents; 2,000 Spanish-English sentence pairs; average sentence length 15.39 words
– FBIS: News broadcast; 1,000 sentence pairs; 19.27 words
– Bible: Religious; 1,000 sentence pairs; 16.38 words

38 Overall Evaluation Results

39 Overall Evaluation Results
– Systran is overall best
– Gist is overall worst
– Matador is more robust than IBM4
– Matador is more grammatical than IBM4
– Matador has less information loss than IBM4

40 Overall Evaluation Grammaticality Example
– SP: Además dijo que solamente una inyección masiva de capital extranjero...
– EN: Further, he said that only a massive injection of foreign capital...
– IBM4: further stated that only a massive inyección of capital abroad...
– MTDR: Also he spoke only a massive injection of foreign capital...
Parsed all sentences (Spanish, English reference, and English output)
– Can we find the main verb?
– Pro-drop restoration

41 Overall Evaluation Grammaticality: Verb Determination

42 Overall Evaluation Grammaticality: Subject Realization

43 Overall Evaluation Loss of Information Example
– SP: El daño causado al pueblo de Sudáfrica jamás debe subestimarse.
– EN: The damage caused to the people of his country should never be underestimated.
– IBM4: the damage * the people of south * must never underestimated.
– MTDR: Never the causado damage to the people of South Africa should be underestimated.
Output length relative to reference length (Gisting (GIST), Systran (SYST), IBM Model 4 (IBM4), Matador (MTDR)): 109%, 94%, 104%

44 Component Evaluation Conducted several component evaluations
– Parser: ~75% correct (labeled dependency links)
– Categorial Variation Database: 81% precision-recall
– Structural Expansion
– Structural N-grams

45 Component Evaluation Structural Expansion
– Insignificant increase in Bleu score
– 40% of divergences pragmatic
– LCS lexicon coverage issues
– Minimal handling of nominal divergences
– Over-expansion
Examples:
– SP: Además, destruyó totalmente sus cultivos de subsistencia…
– EN: It had totally destroyed Samoa's staple crops...
– MTDR: Furthermore, it totaled their cultivations of subsistence…
– SP: Dicha adición se publica sólo en años impares.
– EN: That addendum is issued in odd-numbered years only.
– MTDR: concerned addendum is excluded in odd years.

46 Component Evaluation Structural N-grams 60% speed-up with no effect on quality

47 Roadmap  Introduction  Generation-Heavy Machine Translation  Evaluation  Conclusion  Future Work

48 Conclusion Research Contributions
– A general, reusable, and extensible MT model that transcends the need for large amounts of symmetric knowledge
– A systematic non-interlingual/non-transfer framework for handling translation divergences
– Extending the concept of symbolic overgeneration to include conflation and head-swapping of structural variations
– A model for language-independent syntactic-to-thematic linking

49 Conclusion Research Contributions
– Development of reusable large-scale modules and resources: Exerge, Categorial Variation Database, etc.
– A large-scale Spanish-English GHMT implementation
– An evaluation of Matador against four models of machine translation found it to be robust across genres and to produce more grammatical output

50 Ongoing Work
– Retargetability to new languages: Chinese, Arabic
– Extending system to use bi-texts: phrase dictionary; weighted translation pairs
– Generation-Heavy parsing: small dependency grammar for foreign language; English structural n-grams to rank parses
– Extending system with new optional modules: cross-lingual headline generation; DepTrimmer (work with Bonnie Dorr) extending Trimmer (Dorr et al. 2003) to dependency representation

51 Future Work
– Categorial Variation Database: improving word-cluster correctness
– Structural Expansion: extending to nominal divergences; improving thematic linking with a statistical model
– Structural N-grams: enriching with syntactic/thematic relations

52 Thank you! Questions?

53 Overall Evaluation Bleu Metric
Test Sentence: colorless green ideas sleep furiously
Gold Standard References:
– all dull jade ideas sleep irately
– drab emerald concepts sleep furiously
– colorless immature thoughts nap angrily

54 Overall Evaluation Bleu Metric
Test Sentence: colorless green ideas sleep furiously
Gold Standard References:
– all dull jade ideas sleep irately
– drab emerald concepts sleep furiously
– colorless immature thoughts nap angrily
Unigram precision = 4/5

55 Overall Evaluation Bleu Metric
Test Sentence: colorless green ideas sleep furiously
Gold Standard References:
– all dull jade ideas sleep irately
– drab emerald concepts sleep furiously
– colorless immature thoughts nap angrily
Unigram precision = 4/5 = 0.8
Bigram precision = 2/4 = 0.5
Bleu Score = (a_1 a_2 … a_n)^(1/n) = (0.8 × 0.5)^(1/2) = 0.6325
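The slide's arithmetic can be checked with a minimal implementation of modified n-gram precision (length penalty omitted, as on the slide):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def modified_precision(candidate, references, n):
    """Candidate n-gram counts, clipped to the max count in any reference."""
    cand = Counter(ngrams(candidate, n))
    max_ref = Counter()
    for ref in references:
        for gram, count in Counter(ngrams(ref, n)).items():
            max_ref[gram] = max(max_ref[gram], count)
    clipped = sum(min(count, max_ref[g]) for g, count in cand.items())
    return clipped / sum(cand.values())

cand = "colorless green ideas sleep furiously".split()
refs = ["all dull jade ideas sleep irately".split(),
        "drab emerald concepts sleep furiously".split(),
        "colorless immature thoughts nap angrily".split()]

p1 = modified_precision(cand, refs, 1)   # 4 of 5 unigrams matched
p2 = modified_precision(cand, refs, 2)   # 2 of 4 bigrams matched
bleu = math.sqrt(p1 * p2)                # geometric mean of p1 and p2
```

The matched unigrams are colorless, ideas, sleep, and furiously (only green has no reference match), and the matched bigrams are "ideas sleep" and "sleep furiously", giving (0.8 × 0.5)^(1/2) ≈ 0.6325 as on the slide.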

56 Overall Evaluation Investigating BLEU's bias towards inflectional variants
– SP: Los programas de ajuste estructural se han aplicado rigurosamente.
– EN: Structural adjustment programmes had been rigorously implemented.
– IBM4: structural adjustment programmes have been applied strictly.
– MTDR: programmes of structural adjustment have been added rigurosament.

57 Overall Evaluation Inflectional Normalization

