Computing & Information Sciences, Kansas State University
CIS 530 / 730: Artificial Intelligence
Lecture 39 of 42, Friday, 05 December 2008


Lecture 39 of 42, Friday, 05 December 2008
William H. Hsu, Department of Computing and Information Sciences, KSU
KSOL course page:
Course web site:
Instructor home page:
Reading for next class: Sections 22.1, , Russell & Norvig, 2nd edition
NLP and Philosophical Issues
Discussion: Machine Translation (MT)

(Hidden) Markov Models: Review

Definition of Hidden Markov Models (HMMs)
- Stochastic state transition diagram (in HMMs, the states, aka nodes, are hidden)
- Compare: probabilistic finite state automaton (Mealy/Moore model)
- Annotated transitions (aka arcs, edges, links): an output alphabet (the observable part) and a probability distribution over outputs

Forward Problem: one step in ML estimation
- Given: model h, observations (data) D
- Estimate: P(D | h)

Backward Problem: prediction step
- Given: model h, observations D
- Maximize: P(h(X) = x | h, D) for a new X

Forward-Backward (Learning) Problem
- Given: model space H, data D
- Find: h ∈ H such that P(h | D) is maximized (i.e., the MAP hypothesis)

HMMs are also a case of LSQ (f values in [Roth, 1999]).
[Figure: HMM state transition diagram annotated with output probabilities such as A 0.4 / B 0.6, E 0.1 / F 0.9, C 0.8 / D 0.2.]
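The forward problem above, computing P(D | h), can be sketched with the standard forward recursion. A minimal sketch, assuming a toy two-state HMM whose transition and emission tables are invented for illustration (they are not the model in the slide's figure):

```python
# Forward algorithm: compute P(observations | model) for a toy HMM.
# All states, outputs, and probability tables below are illustrative only.
states = ["s1", "s2"]
start = {"s1": 0.6, "s2": 0.4}                      # initial distribution
trans = {"s1": {"s1": 0.7, "s2": 0.3},              # P(next state | current)
         "s2": {"s1": 0.4, "s2": 0.6}}
emit = {"s1": {"A": 0.5, "B": 0.5},                 # P(output | state)
        "s2": {"A": 0.1, "B": 0.9}}

def forward(observations):
    """Return P(observations | model) by summing over all hidden state paths."""
    # alpha[s] = P(observations so far, current state = s)
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: sum(alpha[r] * trans[r][s] for r in states) * emit[s][obs]
                 for s in states}
    return sum(alpha.values())
```

Because each step sums over predecessor states instead of enumerating whole paths, the cost is O(T |S|^2) rather than exponential in the sequence length T.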

NLP Hierarchy: Review

Problem Definition
- Given: m sentences containing untagged words
- Example: "The can will rust."
- Label (one per word, out of ~30-150): v_j ∈ s, e.g., (art, n, aux, vi)
- Representation: labeled examples
- Return: classifier f: X → V that tags x = (w_1, w_2, …, w_n)
- Applications: WSD, dialogue acts (e.g., "That sounds OK to me." → ACCEPT)

Solution Approaches: Transformation-Based Learning (TBL)
- [Brill, 1995]: TBL is a mistake-driven algorithm that produces sequences of rules
- Each rule has the form (t_i, v): a test condition (a constructed attribute) and a tag
- t_i: "w occurs within ±k words of w_i" (context words); collocations (windows)
- For more info, see [Roth, 1998], [Samuel, Carberry, Vijay-Shankar, 1998]

Recent Research
- E. Brill's page:
- K. Samuel's page:

[Figure: NLP hierarchy, top to bottom: discourse labeling, speech acts, natural language parsing / POS tagging, lexical analysis.]
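The TBL idea can be made concrete on the slide's own example sentence. A minimal sketch: the lexicon and the single hand-written rule below are invented for illustration (Brill's learner induces such rules automatically from a tagged corpus):

```python
# Transformation-based tagging in miniature: start from each word's most
# frequent tag, then apply error-driven rewrite rules in sequence.
# The lexicon and the rule below are illustrative, not learned.
default_tag = {"the": "art", "can": "n", "will": "aux", "rust": "n"}

def rule_aux_then_verb(tags):
    """Rule: retag a noun as a verb when it directly follows an auxiliary."""
    return ["vi" if t == "n" and i > 0 and tags[i - 1] == "aux" else t
            for i, t in enumerate(tags)]

def tag(sentence):
    words = sentence.lower().rstrip(".").split()
    tags = [default_tag.get(w, "n") for w in words]   # baseline tagging
    return list(zip(words, rule_aux_then_verb(tags)))
```

On "The can will rust.", the baseline mis-tags "rust" as a noun; the rule corrects it to a verb because it follows the auxiliary "will", while "can" (after "the") is left as a noun.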

Statistical Machine Translation
Kevin Knight
USC/Information Sciences Institute
USC/Computer Science Department

Spanish/English Parallel Corpora: Review

Clients do not sell pharmaceuticals in Europe => Clientes no venden medicinas en Europa

1a. Garcia and associates. / 1b. Garcia y asociados.
2a. Carlos Garcia has three associates. / 2b. Carlos Garcia tiene tres asociados.
3a. his associates are not strong. / 3b. sus asociados no son fuertes.
4a. Garcia has a company also. / 4b. Garcia tambien tiene una empresa.
5a. its clients are angry. / 5b. sus clientes estan enfadados.
6a. the associates are also angry. / 6b. los asociados tambien estan enfadados.
7a. the clients and the associates are enemies. / 7b. los clientes y los asociados son enemigos.
8a. the company has three groups. / 8b. la empresa tiene tres grupos.
9a. its groups are in Europe. / 9b. sus grupos estan en Europa.
10a. the modern groups sell strong pharmaceuticals. / 10b. los grupos modernos venden medicinas fuertes.
11a. the groups do not sell zenzanine. / 11b. los grupos no venden zanzanina.
12a. the small groups are not modern. / 12b. los grupos pequenos no son modernos.
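One way to see how such a corpus supports translation: count how often word pairs co-occur in aligned sentence pairs. A minimal sketch over a two-pair excerpt of the corpus above (plain co-occurrence counting, not the EM-trained alignment models that real SMT systems use):

```python
from collections import Counter

# Count (english_word, spanish_word) co-occurrences over aligned pairs.
# With enough data, true translations (e.g., groups/grupos) dominate.
corpus = [
    ("the company has three groups", "la empresa tiene tres grupos"),
    ("its groups are in Europe", "sus grupos estan en Europa"),
]

cooc = Counter()
for english, spanish in corpus:
    for e in english.split():
        for s in spanish.split():
            cooc[(e, s)] += 1

# "groups" co-occurs with "grupos" in both pairs, more often than with
# any Spanish word that appears in only one of them.
```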

Data for Statistical MT and Data Preparation

Ready-to-Use Online Bilingual Data

(Data stripped of formatting, in sentence-pair format, available from the Linguistic Data Consortium at UPenn.)

[Chart: millions of words (English side) per available corpus; 1M-20M words exist for many language pairs. Could we reach one billion?]

From No Data to Sentence Pairs

Easy way: Linguistic Data Consortium (LDC).
Really hard way: pay $$$
- Suppose one billion words of parallel data were sufficient
- At 20 cents/word, that's $200 million
Pretty hard way: find it, and then earn it!
- De-formatting
- Removing strange characters
- Character code conversion
- Document alignment
- Sentence alignment
- Tokenization (also called segmentation)

Sentence Alignment

English: The old man is happy. He has fished many times. His wife talks to him. The fish are jumping. The sharks await.

Spanish: El viejo está feliz porque ha pescado muchas veces. Su mujer habla con él. Los tiburones esperan.

Sentence Alignment

1. The old man is happy.
2. He has fished many times.
3. His wife talks to him.
4. The fish are jumping.
5. The sharks await.

1. El viejo está feliz porque ha pescado muchas veces.
2. Su mujer habla con él.
3. Los tiburones esperan.

Sentence Alignment

1. The old man is happy. He has fished many times.
2. His wife talks to him.
3. The sharks await.

1. El viejo está feliz porque ha pescado muchas veces.
2. Su mujer habla con él.
3. Los tiburones esperan.

Note that unaligned sentences are thrown out, and sentences are merged in n-to-m alignments (n, m > 0).
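Such alignments are typically produced by dynamic programming over sentence lengths, in the spirit of Gale and Church (1993). A much-simplified sketch: only 1-1, 2-1, and 1-2 beads (no deletion beads, so unaligned sentences must already have been dropped), scored by a crude character-length ratio rather than the Gale-Church probabilistic cost:

```python
def align(src, tgt):
    """Align two sentence lists by minimizing a length-mismatch cost."""
    def cost(a, b):
        # Penalize beads whose character lengths differ; real aligners
        # use a probabilistic cost, this ratio is a crude stand-in.
        la, lb = sum(map(len, a)), sum(map(len, b))
        return abs(la - lb) / max(la, lb)

    INF = float("inf")
    n, m = len(src), len(tgt)
    # best[i][j] = (cost of best alignment of src[:i] with tgt[:j], last bead)
    best = [[(INF, None)] * (m + 1) for _ in range(n + 1)]
    best[0][0] = (0.0, None)
    beads = [(1, 1), (2, 1), (1, 2)]          # allowed alignment shapes
    for i in range(n + 1):
        for j in range(m + 1):
            for di, dj in beads:
                if i >= di and j >= dj and best[i - di][j - dj][0] < INF:
                    c = best[i - di][j - dj][0] + cost(src[i - di:i], tgt[j - dj:j])
                    if c < best[i][j][0]:
                        best[i][j] = (c, (di, dj))
    # Trace back the chosen beads.
    out, i, j = [], n, m
    while (i, j) != (0, 0):
        di, dj = best[i][j][1]
        out.append((src[i - di:i], tgt[j - dj:j]))
        i, j = i - di, j - dj
    return list(reversed(out))
```

On the slide's example (with the unaligned fish sentence already removed), the cheapest path merges the first two English sentences against the first Spanish one, then aligns the rest 1-1.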

Tokenization (or Segmentation)

English
- Input (some byte stream): "There," said Bob.
- Output (7 "tokens" or "words"): " There , " said Bob .

Chinese
- Input (byte stream): 美国关岛国际机场及其办公室均接获一名自称沙地阿拉伯富商拉登等发出的电子邮件。
- Output (segmented into words): 美国 关岛 国际 机场 及其 办公室 均 接获 一名 自称 沙地 阿拉伯 富商 拉登 等 发出 的 电子邮件 。

Lower-Casing

English
- Input (7 words): " There , " said Bob .
- Output (7 words): " there , " said bob .

The idea of tokenizing and lower-casing: variants such as The, the, "The, and "the all collapse to the single token the. This gives a smaller vocabulary size, and more robust counting and learning.
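The two preprocessing steps above can be sketched together. A minimal sketch for English (real tokenizers handle many more cases, e.g., abbreviations, numbers, and hyphens):

```python
import re

def tokenize(text):
    """Split punctuation away from words, then lower-case everything."""
    # Insert spaces around punctuation so each mark becomes its own token.
    spaced = re.sub(r'([.,!?"])', r' \1 ', text)
    return spaced.lower().split()

tokens = tokenize('"There," said Bob.')
# tokens: ['"', 'there', ',', '"', 'said', 'bob', '.']
```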

It Is Possible to Draw Learning Curves: How Much Data Do We Need?

[Chart: quality of an automatically trained machine translation system vs. amount of bilingual training data.]

MT Evaluation

MT Evaluation

Manual:
- SSER (subjective sentence error rate)
- Correct/incorrect judgments
- Error categorization

Testing in an application that uses MT as one sub-component:
- e.g., question answering from foreign-language documents

Automatic:
- WER (word error rate)
- BLEU (Bilingual Evaluation Understudy)

BLEU Evaluation Metric (Papineni et al., ACL-2002)

Reference (human) translation: The U.S. island of Guam is maintaining a high state of alert after the Guam airport and its offices both received an e-mail from someone calling himself the Saudi Arabian Osama bin Laden and threatening a biological/chemical attack against public places such as the airport.

Machine translation: The American [?] international airport and its the office all receives one calls self the sand Arab rich business [?] and so on electronic mail, which sends out ; The threat will be able after public place and so on the airport to start the biochemistry attack, [?] highly alerts after the maintenance.

N-gram precision (score is between 0 and 1)
- What percentage of machine n-grams can be found in the reference translation?
- An n-gram is a sequence of n words
- Not allowed to use the same portion of the reference translation twice (can't cheat by typing out "the the the the the")

Brevity penalty
- Can't just type out the single word "the" (precision 1.0!)

It is amazingly hard to "game" the system (i.e., find a way to change machine output so that BLEU goes up, but quality doesn't).

BLEU Evaluation Metric (Papineni et al., ACL-2002)

(Same reference translation and machine translation as on the previous slide.)

BLEU4 formula (counts n-grams up to length 4):

BLEU4 = exp(1.0 * log p1 + 0.5 * log p2 + 0.25 * log p3 + 0.125 * log p4 - max(words-in-reference / words-in-machine - 1, 0))

p1 = 1-gram precision
p2 = 2-gram precision
p3 = 3-gram precision
p4 = 4-gram precision
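A runnable sketch of the metric. Note this uses the uniform 1/4 n-gram weights of the original Papineni et al. paper rather than the weighted variant in the slide formula, scores against a single reference, and applies no smoothing:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU against a single reference (no smoothing)."""
    cand, ref = candidate.split(), reference.split()
    log_prec_sum = 0.0
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clipped counts: each reference n-gram occurrence may be matched
        # only once, so "the the the the the" cannot inflate precision.
        clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        if clipped == 0:
            return 0.0          # any zero n-gram precision zeroes BLEU
        log_prec_sum += math.log(clipped / total) / max_n
    # Brevity penalty: punish candidates shorter than the reference.
    bp = min(0.0, 1.0 - len(ref) / len(cand))
    return math.exp(bp + log_prec_sum)
```

A perfect match scores 1.0; a degenerate candidate like "the the the the" scores 0.0 because clipping kills its higher-order precisions.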

Multiple Reference Translations

Reference translation 1: The U.S. island of Guam is maintaining a high state of alert after the Guam airport and its offices both received an e-mail from someone calling himself the Saudi Arabian Osama bin Laden and threatening a biological/chemical attack against public places such as the airport.

Reference translation 2: Guam International Airport and its offices are maintaining a high state of alert after receiving an e-mail that was from a person claiming to be the wealthy Saudi Arabian businessman Bin Laden and that threatened to launch a biological and chemical attack on the airport and other public places.

Reference translation 3: The US International Airport of Guam and its office has received an e-mail from a self-claimed Arabian millionaire named Laden, which threatens to launch a biochemical attack on such public places as airport. Guam authority has been on alert.

Reference translation 4: US Guam International Airport and its office received an e-mail from Mr. Bin Laden and other rich businessman from Saudi Arabia. They said there would be biochemistry air raid to Guam Airport and other public places. Guam needs to be in high precaution about this matter.

Machine translation: The American [?] international airport and its the office all receives one calls self the sand Arab rich business [?] and so on electronic mail, which sends out ; The threat will be able after public place and so on the airport to start the biochemistry attack, [?] highly alerts after the maintenance.

BLEU Tends to Predict Human Judgments

[Chart: NIST score (a variant of BLEU) vs. human judgments; slide from G. Doddington (NIST).]

BLEU in Action

枪手被警方击毙。 (Foreign original)
the gunman was shot to death by the police. (Reference translation)

#1 the gunman was police kill.
#2 wounded police jaya of
#3 the gunman was shot dead by the police.
#4 the gunman arrested by police kill.
#5 the gunmen were killed.
#6 the gunman was shot to death by the police.
#7 gunmen were killed by police
#8 al by the police.
#9 the ringer is killed by the police.
#10 police killed the gunman.

green = 4-gram match (good!), red = word not matched (bad!)

Sample Learning Curves

[Chart: BLEU score vs. number of sentence pairs used in training, for Swedish/English, French/English, German/English, and Finnish/English. Experiments by Philipp Koehn.]

Word-Based Statistical MT

Statistical MT Systems

[Diagram: Spanish/English bilingual text feeds statistical analysis to build a Spanish-to-"broken English" translation component; English text feeds statistical analysis to build an English fluency component. Input "Que hambre tengo yo" yields candidates such as "What hunger have I", "Hungry I am so", "I am so hungry", "Have I that hunger", …; the system outputs "I am so hungry".]

Statistical MT Systems

[Diagram: the same pipeline with the components labeled: the Spanish/English bilingual text yields the translation model P(s|e); the English text yields the language model P(e); a decoding algorithm computes argmax_e P(e) * P(s|e), translating "Que hambre tengo yo" into "I am so hungry".]

Bayes Rule

[Diagram: translation model P(s|e), language model P(e), and a decoding algorithm computing argmax_e P(e) * P(s|e), e.g., "Que hambre tengo yo" to "I am so hungry".]

Given a source sentence s, the decoder should consider many possible translations and return the target string e that maximizes P(e | s).

By Bayes rule, we can also write this as P(e) * P(s | e) / P(s) and maximize that instead. P(s) never changes while we compare different e's, so we can equivalently maximize P(e) * P(s | e).

Three Problems for Statistical MT

Language model
- Given an English string e, assigns P(e) by formula
- Good English string: high P(e)
- Random word sequence: low P(e)

Translation model
- Given a pair of strings, assigns P(f | e) by formula
- Strings that look like translations: high P(f | e)
- Strings that don't look like translations: low P(f | e)

Decoding algorithm
- Given a language model, a translation model, and a new sentence f, find the translation e maximizing P(e) * P(f | e)
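The interplay of the three components can be sketched at toy scale. A minimal sketch: every probability below is invented for illustration, the "models" are hand-made unigram tables, and the decoder merely ranks a fixed candidate list instead of searching the real (enormous) space of translations:

```python
# Noisy-channel ranking: pick the candidate e maximizing P(e) * P(f | e),
# working in log space. Both models here are toy tables, not trained.

# Language model: rough unigram log-probabilities of English words.
lm = {"i": -1.0, "am": -1.5, "so": -2.0, "hungry": -3.0,
      "what": -2.0, "hunger": -5.0, "have": -1.5}

# Translation model: log P(foreign word | english word), toy values.
tm = {("hambre", "hungry"): -0.5, ("hambre", "hunger"): -0.7,
      ("tengo", "have"): -0.5, ("tengo", "am"): -1.0,
      ("yo", "i"): -0.2, ("que", "what"): -0.5, ("que", "so"): -1.5}

def score(foreign, english):
    """log P(e) + log P(f | e) under crude word-for-word assumptions."""
    f_words, e_words = foreign.split(), english.split()
    total = sum(lm.get(e, -10.0) for e in e_words)
    for f in f_words:
        # Credit each foreign word with its best-matching English word.
        total += max(tm.get((f, e), -10.0) for e in e_words)
    return total

candidates = ["what hunger have i", "i am so hungry"]
best = max(candidates, key=lambda e: score("que hambre tengo yo", e))
```

The word-for-word literal gloss fits the translation model well but is penalized by the language model, so the fluent candidate wins: exactly the division of labor the slide describes.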

Terminology

Simple Bayes, aka Naïve Bayes
- Zero counts: the case where an attribute value never occurs with a label in D
- No-match approach: assign a small fixed probability (on the order of c/m) to P(x_ik | v_j)
- m-estimate, aka Laplace approach: assign a Bayesian estimate to P(x_ik | v_j)

Learning in Natural Language Processing (NLP)
- Training data: text corpora (collections of representative documents)
- Statistical Queries (SQ) oracle: answers queries about P(x_ik, v_j) for x ~ D
- Linear Statistical Queries (LSQ) algorithm: classification using f(oracle response)
  - Includes: Naïve Bayes, BOC
  - Other examples: Hidden Markov Models (HMMs), maximum entropy
- Problems: word sense disambiguation, part-of-speech tagging
- Applications
  - Spelling correction, conversational agents
  - Information retrieval: web and digital library searches
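The m-estimate mentioned above can be shown directly. A minimal sketch; the prior p and equivalent sample size m in the example call are chosen arbitrarily for illustration:

```python
def m_estimate(n_c, n, p, m):
    """Bayesian estimate of P(attribute value | label).

    n_c: count of examples with this label and this attribute value
    n:   count of examples with this label
    p:   prior estimate of the probability
    m:   equivalent sample size (weight given to the prior)
    """
    return (n_c + m * p) / (n + m)

# With m = k (the number of attribute values) and p = 1/k this reduces to
# Laplace smoothing: a value never seen with the label (n_c = 0) still
# receives a small nonzero probability instead of a fatal zero count.
# m_estimate(0, 10, p=0.2, m=5) == 1/15
```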