CSCI 5832 Natural Language Processing


CSCI 5832 Natural Language Processing Lecture 10 Jim Martin 11/20/2018 CSCI 5832 Spring 2007

Today: 2/20
Review: POS tagging, HMMs and Viterbi
Break
Syntax and context-free grammars

Review
Parts of speech: the basic syntactic/morphological categories that words belong to
Part-of-speech tagging: assigning a part of speech to every word in a sentence

Probabilities We want the best set of tags for a sequence of words (a sentence), where W is a sequence of words and T is a sequence of tags.

So… We start with the probability of a tag sequence given the word sequence and, by applying Bayes' rule and dropping the constant denominator, get a product of tag-transition and word-emission probabilities.
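A reconstruction of the standard Bayesian derivation this slide walks through (presumably what the original figures showed), consistent with the bigram-transition, word-per-state model described on the next slide:

```latex
\hat{T} = \operatorname*{argmax}_{T} P(T \mid W)
        = \operatorname*{argmax}_{T} \frac{P(W \mid T)\, P(T)}{P(W)}
        = \operatorname*{argmax}_{T} P(W \mid T)\, P(T)
  \approx \operatorname*{argmax}_{t_1 \ldots t_n} \prod_{i=1}^{n} P(w_i \mid t_i)\, P(t_i \mid t_{i-1})
```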

HMMs This is an HMM. The states in the model are the tags, and the observations are the words. The state-to-state transitions are driven by the bigram statistics, and the observed words depend only on the state you're in.

State Transitions
[Figure: state-transition diagram over the tags Det, Noun, Aux, and Verb, with probabilities (e.g., 0.5) on the arcs.]

State Transitions and Observations
[Figure: the same transition diagram with observation words attached to each state: Noun emits dog, cat, bark; Verb emits bark, run, bite; Det emits the, a, that; Aux emits can, will, did.]

The State Space
[Figure: trellis for "The dog can run", with a start state <s>, an end state </s>, and a column of candidate tags (Det, Noun, Aux, Verb) for each word.]

Viterbi Efficiently returns the most likely path: sweep through the columns, multiplying the probabilities in one column by the transition probabilities to the next column and by the appropriate observation probabilities, and store the max at each state.
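The sweep described above can be sketched in Python. The toy model below (tags, transition probabilities, emission probabilities) is a made-up illustration in the spirit of the earlier diagrams, not the actual numbers from the slides:

```python
# Minimal Viterbi sketch over the lecture's toy tag set (Det/Noun/Aux/Verb).
# All probabilities are illustrative assumptions, not estimated from data.
TRANS = {
    '<s>':  {'Det': 0.8, 'Noun': 0.2},
    'Det':  {'Noun': 0.9, 'Verb': 0.1},
    'Noun': {'Aux': 0.4, 'Verb': 0.4, 'Noun': 0.2},
    'Aux':  {'Verb': 0.9, 'Noun': 0.1},
    'Verb': {},
}
EMIT = {
    'Det':  {'the': 0.6, 'a': 0.3, 'that': 0.1},
    'Noun': {'dog': 0.3, 'cat': 0.3, 'bark': 0.2, 'can': 0.1, 'run': 0.1},
    'Aux':  {'can': 0.5, 'will': 0.3, 'did': 0.2},
    'Verb': {'run': 0.3, 'bark': 0.3, 'bite': 0.2, 'left': 0.2},
}
TAGS = ['Det', 'Noun', 'Aux', 'Verb']

def viterbi(words):
    """Return the most probable tag sequence for `words`."""
    # Each column maps tag -> (prob of best path ending there, backpointer).
    cols = [{t: (TRANS['<s>'].get(t, 0.0) * EMIT[t].get(words[0], 0.0), None)
             for t in TAGS}]
    for w in words[1:]:
        prev, col = cols[-1], {}
        for t in TAGS:
            # Best predecessor: max over (path prob * transition prob).
            prob, back = max(((prev[p][0] * TRANS[p].get(t, 0.0), p) for p in TAGS),
                             key=lambda x: x[0])
            col[t] = (prob * EMIT[t].get(w, 0.0), back)
        cols.append(col)
    # Follow backpointers from the best final state.
    tag = max(TAGS, key=lambda t: cols[-1][t][0])
    path = [tag]
    for c in reversed(cols[1:]):
        tag = c[tag][1]
        path.append(tag)
    return path[::-1]

viterbi(['the', 'dog', 'can', 'run'])  # -> ['Det', 'Noun', 'Aux', 'Verb']
```

Storing only the max per state (rather than every path) is exactly what keeps the sweep efficient: the work per word is quadratic in the number of tags, not exponential in sentence length.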

Viterbi
[Figure: worked example of the Viterbi computation on the trellis.]

Break Changing the schedule a bit: we're going to move on to Chapters 11 and 12 starting today, and then go back to cover the relevant aspects of Chapter 6. The next quiz will cover Chapters 5, 6, 11, 12 and 13.

Talks
CS Colloquium (Thursday 3:30): Fernando Pereira, Penn. "Learning to Analyze Sequences." Basically Chapter 6 on steroids.
ICS Colloquium (Friday noon): Christer Samuelsson. "A Computational Linguist on Wall Street": how to use HMMs to do market prediction. Using Chapter 6 to make gobs of money.

Syntax By syntax (or grammar) I mean the kind of implicit knowledge of your native language that you had mastered by the time you were 2 or 3 years old, without explicit instruction. Not the kind of stuff you were later taught in school.

Syntax Why should you care? Grammar checkers, question answering, information extraction, machine translation.

Search? On Friday, PARC is announcing a deal that underscores that strategy. It is licensing a broad portfolio of patents and technology to a well-financed start-up with an ambitious and potentially lucrative goal: to build a search engine that could some day rival Google. The start-up, Powerset, is licensing PARC's natural language technology, the art of making computers understand and process languages like English… Powerset hopes the technology will be the basis of a new search engine that allows users to type queries in plain English, rather than using keywords.

Search "For a lot of things, keyword search works well," said Barney Pell, chief executive of Powerset. "But I think we are going to look back in 10 years and say, remember when we used to search using keywords."

Search In a November interview, Marissa Mayer, Google’s vice president for search and user experience, said: “Natural language is really hard. I don’t think it will happen in the next five years.”

Search “My general feeling about natural-language processing in search is that I’m a bit of a skeptic in the sense that even the best systems, and I include there the systems from PARC, make many mistakes,” said Mr. Pereira of the University of Pennsylvania.

Context-Free Grammars Capture constituency and ordering. Ordering is easy: what are the rules that govern the ordering of words and bigger units in the language? What's constituency? How words group into units, and how the various kinds of units behave.

CFG Examples
S -> NP VP
NP -> Det NOMINAL
NOMINAL -> Noun
VP -> Verb
Det -> a
Noun -> flight
Verb -> left
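These rules can be written down directly as data. A minimal sketch in Python; since each non-terminal in this toy grammar happens to have exactly one expansion, a leftmost derivation is deterministic:

```python
# The slide's toy grammar as a Python dict: non-terminal -> list of rules,
# where each rule is a list of right-hand-side symbols.
GRAMMAR = {
    'S':       [['NP', 'VP']],
    'NP':      [['Det', 'NOMINAL']],
    'NOMINAL': [['Noun']],
    'VP':      [['Verb']],
    'Det':     [['a']],
    'Noun':    [['flight']],
    'Verb':    [['left']],
}

def expand(symbol):
    """Leftmost derivation: expand non-terminals until only words remain."""
    if symbol not in GRAMMAR:
        return [symbol]                     # terminal word
    words = []
    for child in GRAMMAR[symbol][0]:        # each symbol here has one rule
        words.extend(expand(child))
    return words

expand('S')  # -> ['a', 'flight', 'left']
```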

CFGs S -> NP VP says that there are units called S, NP, and VP in this language, and that an S consists of an NP followed immediately by a VP. It doesn't say that that's the only kind of S, nor that this is the only place that NPs and VPs occur.

Generativity As with FSAs and FSTs, you can view these rules as either analysis or synthesis machines: they can generate strings in the language, reject strings not in the language, and impose structures (trees) on strings in the language.

Derivations A derivation is a sequence of rule applications that accounts for a string: it covers all the elements in the string, and only the elements in the string.

Derivations as Trees
[Figure: the derivation drawn as a parse tree.]

Parsing Parsing is the process of taking a string and a grammar and returning one (or many) parse tree(s) for that string. It is completely analogous to running a finite-state transducer with a tape; it's just more powerful. Remember, this means that there are languages we can capture with CFGs that we can't capture with finite-state methods.
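To make the string-plus-grammar-in, tree-out idea concrete, here is a minimal top-down (recursive-descent) parser sketch. The grammar and lexicon are illustrative assumptions, not the book's grammar, and this naive strategy would loop on left-recursive rules, which this grammar avoids:

```python
GRAMMAR = {'S': [['NP', 'VP']], 'NP': [['Det', 'Noun']], 'VP': [['Verb']]}
LEXICON = {'Det': {'a', 'the'}, 'Noun': {'flight', 'dog'}, 'Verb': {'left', 'ran'}}

def parse(symbol, words, i):
    """Try to derive `symbol` from words[i:]; return (tree, next_position) or None."""
    if symbol in LEXICON:                       # pre-terminal: match one word
        if i < len(words) and words[i] in LEXICON[symbol]:
            return (symbol, words[i]), i + 1
        return None
    for rule in GRAMMAR.get(symbol, []):        # try each expansion in turn
        children, j = [], i
        for child in rule:
            result = parse(child, words, j)
            if result is None:
                break                           # this rule fails; try the next
            subtree, j = result
            children.append(subtree)
        else:                                   # every child matched
            return (symbol, children), j
    return None

tree, end = parse('S', ['the', 'flight', 'left'], 0)
# tree imposes the structure (S [(NP ...), (VP ...)]) on the string
```

A full parse additionally requires that `end` equals the sentence length, i.e. the derivation covers all and only the words.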

Other Options Regular languages (expressions): too weak. Context-sensitive or Turing-equivalent formalisms: too powerful (maybe).

Context? The notion of context in CFGs has nothing to do with the ordinary meaning of the word "context" in language. All it really means is that the non-terminal on the left-hand side of a rule stands there all by itself (free of context): A -> B C means that I can rewrite an A as a B followed by a C regardless of the context in which the A is found.

Key Constituents (English) Sentences, noun phrases, verb phrases, prepositional phrases.

Sentence-Types
Declaratives: A plane left. S -> NP VP
Imperatives: Leave! S -> VP
Yes-no questions: Did the plane leave? S -> Aux NP VP
Wh-questions: When did the plane leave? S -> WH Aux NP VP

Recursion We’ll have to deal with rules where the non-terminal on the left also appears (directly) somewhere on the right:
Nominal -> Nominal PP   [[flight] [to Boston]]
VP -> VP PP   [[departed Miami] [at noon]]

Recursion Of course, this is what makes syntax interesting:
flights from Denver
flights from Denver to Miami
flights from Denver to Miami in February
flights from Denver to Miami in February on a Friday
flights from Denver to Miami in February on a Friday under $300
flights from Denver to Miami in February on a Friday under $300 with lunch

Recursion Of course, this is what makes syntax interesting:
[[flights] [from Denver]]
[[[flights] [from Denver]] [to Miami]]
[[[[flights] [from Denver]] [to Miami]] [in February]]
[[[[[flights] [from Denver]] [to Miami]] [in February]] [on a Friday]]
Etc.
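The left-nesting in those bracketings comes from repeatedly applying Nominal -> Nominal PP: each new PP wraps the whole tree built so far. A tiny sketch using lists as trees (the helper name is mine, not from the slides):

```python
def nest(head, modifiers):
    """Apply Nominal -> Nominal PP once per modifier, left-nesting the result."""
    tree = [head]
    for pp in modifiers:
        tree = [tree, pp]       # new Nominal = [old Nominal, PP]
    return tree

nest('flights', [['from', 'Denver'], ['to', 'Miami']])
# -> [[['flights'], ['from', 'Denver']], ['to', 'Miami']]
```

Because the rule is recursive, there is no bound on how many PPs can stack up, which is exactly why a finite list of flat rules can't capture this pattern.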

The Point If you have a rule like VP -> V NP, it only cares that the thing after the verb is an NP; it doesn't have to know about the internal affairs of that NP.

The Point
[Figure: a parse tree showing that VP -> V NP is indifferent to the NP's internal structure.]

Conjunctive Constructions
S -> S and S: John went to NY and Mary followed him
NP -> NP and NP
VP -> VP and VP
…
In fact, the right rule for English is X -> X and X.

Problems Agreement, subcategorization, movement (for want of a better term).

Agreement
This dog / Those dogs
This dog eats / Those dogs eat
*This dogs / *Those dogs eats

Subcategorization
Sneeze: John sneezed
Find: Please find [a flight to NY]NP
Give: Give [me]NP [a cheaper fare]NP
Help: Can you help [me]NP [with a flight]PP
Prefer: I prefer [to leave earlier]TO-VP
Told: I was told [United has a flight]S
…

Subcategorization
*John sneezed the book
*I prefer United has a flight
*Give with a flight
Subcategorization expresses the constraints that a predicate (a verb, for now) places on the number and syntactic types of the arguments it wants to take (occur with).
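One way to state such constraints is to pair each verb with the argument frames it licenses. The frames below are illustrative guesses keyed to the examples on these slides, not a real lexicon:

```python
# Hypothetical subcategorization frames: verb -> list of allowed argument frames.
SUBCAT = {
    'sneeze': [[]],                      # intransitive only
    'find':   [['NP']],
    'give':   [['NP', 'NP']],
    'help':   [['NP'], ['NP', 'PP']],    # a verb may license several frames
    'prefer': [['TO-VP']],
    'tell':   [['S']],
}

def licenses(verb, args):
    """True if one of the verb's frames matches this argument sequence."""
    return args in SUBCAT.get(verb, [])

licenses('sneeze', ['NP'])   # False: *John sneezed the book
licenses('find', ['NP'])     # True:  find [a flight to NY]NP
```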

So? So the various rules for VPs overgenerate: they permit strings containing verbs and arguments that don't go together. For example, VP -> V NP makes "sneezed the book" a VP, since "sneeze" is a verb and "the book" is a valid NP.

Next Time We're now covering Chapters 11 and 12. Next time: project discussion. Come prepared to talk about project ideas; we'll break into preliminary groups on Thursday.