For Wednesday: Finish Chapter 22. Program 4 due.

Program 4: Any questions?

Input/Output Coding Appropriate coding of inputs and outputs can make the learning problem easier and improve generalization. Best to encode each binary feature as a separate input unit and, for multi-valued features, include one binary unit per value, rather than trying to encode input information in fewer units using binary coding or continuous values.
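A minimal sketch of this advice (the feature and its values are made up for illustration): one input unit per value of a multi-valued feature, rather than a packed binary code.

    def one_hot(value, values):
        # One binary input unit per possible value of a multi-valued feature.
        return [1.0 if value == v else 0.0 for v in values]

    colors = ["red", "green", "blue"]
    print(one_hot("green", colors))   # [0.0, 1.0, 0.0]
    # A packed binary coding (e.g. green = 01) would share bits across
    # unrelated values, making the mapping harder to learn.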

I/O Coding cont. Continuous inputs can be handled by a single input unit by scaling them between 0 and 1. For disjoint categorization problems, it is best to have one output unit per category rather than encoding n categories into log n bits. Continuous output values then represent certainty in the various categories. Assign test cases to the category with the highest output. Continuous outputs (regression) can also be handled by scaling between 0 and 1.
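Continuing the sketch (the ranges and category names are illustrative, not from the slides): scale a continuous input into [0, 1], and read the predicted category off the highest output unit.

    def scale(x, lo, hi):
        # Map a continuous feature from its known range [lo, hi] to [0, 1].
        return (x - lo) / (hi - lo)

    print(scale(98.6, 90.0, 110.0))       # 0.43

    outputs = [0.12, 0.81, 0.33]          # one output unit per category
    categories = ["noun", "verb", "adj"]
    best = max(range(len(outputs)), key=lambda i: outputs[i])
    print(categories[best])               # verb: the highest-certainty category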

Neural Net Conclusions Learned concepts can be represented by networks of linear threshold units and trained using gradient descent. Analogy to the brain and numerous successful applications have generated significant interest. Generally much slower to train than other learning methods, but explores a rich hypothesis space that seems to work well in many domains. Potential to model biological and cognitive phenomena and increase our understanding of real neural systems. –Backprop itself is not very biologically plausible

Natural Language Processing What’s the goal?

Communication Communication for the speaker: –Intention: Deciding why, when, and what information should be transmitted. May require planning and reasoning about agents' goals and beliefs. –Generation: Translating the information to be communicated into a string of words. –Synthesis: Output of the string in the desired modality, e.g. text on a screen or speech.

Communication (cont.) Communication for the hearer: –Perception: Mapping the input modality to a string of words, e.g. optical character recognition or speech recognition. –Analysis: Determining the information content of the string. Syntactic interpretation (parsing): Find the correct parse tree showing the phrase structure. Semantic interpretation: Extract the (literal) meaning of the string in some representation, e.g. FOPC. Pragmatic interpretation: Consider the effect of overall context on the meaning of the sentence. –Incorporation: Decide whether or not to believe the content of the string and add it to the KB.

Ambiguity Natural language sentences are highly ambiguous and must be disambiguated.
I saw the man on the hill with the telescope.
I saw the Grand Canyon flying to LA.
I saw a jet flying to LA.
Time flies like an arrow.
Horse flies like a sugar cube.
Time runners like a coach.
Time cars like a Porsche.

Syntax Syntax concerns the proper ordering of words and its effect on meaning.
The dog bit the boy.
The boy bit the dog.
* Bit boy the dog the
Colorless green ideas sleep furiously.

Semantics Semantics concerns the meaning of words, phrases, and sentences. Generally restricted to “literal meaning” –“plant” as a photosynthetic organism –“plant” as a manufacturing facility –“plant” as the act of sowing

Pragmatics Pragmatics concerns the overall communicative and social context and its effect on interpretation. –Can you pass the salt? –Passerby: Does your dog bite? Clouseau: No. Passerby: (pets dog) Chomp! I thought you said your dog didn't bite!! Clouseau: That, sir, is not my dog!

Modular Processing A pipeline of stages: acoustic/phonetic, syntax, semantics, pragmatics.
Sound waves -> (speech recognition) -> words -> (parsing) -> parse trees -> (semantic analysis) -> literal meaning -> (pragmatic analysis) -> meaning

Examples Phonetics
“grey twine” vs. “great wine”
“youth in Asia” vs. “euthanasia”
“yawanna” -> “do you want to”
Syntax
I ate spaghetti with a fork.
I ate spaghetti with meatballs.

More Examples Semantics
I put the plant in the window.
Ford put the plant in Mexico.
The dog is in the pen.
The ink is in the pen.
Pragmatics
The ham sandwich wants another beer.
John thinks vanilla.

Formal Grammars A grammar is a set of production rules which generates a set of strings (a language) by rewriting the top symbol S. Nonterminal symbols are intermediate results that are not contained in strings of the language.
S -> NP VP
NP -> Det N
VP -> V NP

Terminal symbols are the final symbols (words) that compose the strings in the language. Production rules for generating words from part-of-speech categories constitute the lexicon.
N -> boy
V -> eat
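To make the rewriting idea concrete, here is a small sketch that generates strings by repeatedly expanding S with the rules above (a second noun and verb are added so the output varies; otherwise the rules are as given):

    import random

    RULES = {
        "S":   [["NP", "VP"]],
        "NP":  [["Det", "N"]],
        "VP":  [["V", "NP"]],
        "Det": [["the"]],
        "N":   [["boy"], ["dog"]],
        "V":   [["eat"], ["bit"]],
    }

    def generate(symbol):
        # Terminal symbols (words) have no rules; everything else is rewritten.
        if symbol not in RULES:
            return [symbol]
        expansion = random.choice(RULES[symbol])
        return [word for s in expansion for word in generate(s)]

    print(" ".join(generate("S")))   # e.g. "the dog bit the boy"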

Context-Free Grammars A context-free grammar only has productions with a single symbol on the left-hand side.
CFG:
S -> NP VP
NP -> Det N
VP -> V NP
Not CFG:
A B -> C
B C -> F G

Simplified English Grammar
S -> NP VP
S -> VP
NP -> Det Adj* N
NP -> ProN
NP -> Name
VP -> V
VP -> V NP
VP -> VP PP
PP -> Prep NP
Adj* -> e (the empty string)
Adj* -> Adj Adj*
Lexicon:
ProN -> I; ProN -> you; ProN -> he; ProN -> she
Name -> John; Name -> Mary
Adj -> big; Adj -> little; Adj -> blue; Adj -> red
Det -> the; Det -> a; Det -> an
N -> man; N -> telescope; N -> hill; N -> saw
Prep -> with; Prep -> for; Prep -> of; Prep -> in
V -> hit; V -> took; V -> saw; V -> likes
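For later reference, the same grammar written out as Python data (a direct transcription of the rules above; Adj* -> e becomes an empty expansion):

    GRAMMAR = {
        "S":    [["NP", "VP"], ["VP"]],
        "NP":   [["Det", "Adj*", "N"], ["ProN"], ["Name"]],
        "VP":   [["V"], ["V", "NP"], ["VP", "PP"]],
        "PP":   [["Prep", "NP"]],
        "Adj*": [[], ["Adj", "Adj*"]],   # [] is the empty expansion e
    }
    LEXICON = {
        "ProN": ["I", "you", "he", "she"],
        "Name": ["John", "Mary"],
        "Adj":  ["big", "little", "blue", "red"],
        "Det":  ["the", "a", "an"],
        "N":    ["man", "telescope", "hill", "saw"],
        "Prep": ["with", "for", "of", "in"],
        "V":    ["hit", "took", "saw", "likes"],
    }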

Parse Trees A parse tree shows the derivation of a sentence in the language from the start symbol to the terminal symbols. If a given sentence has more than one possible derivation (parse tree), it is said to be syntactically ambiguous.
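One way to represent such a tree (a sketch, using nested tuples): the first element of each tuple is a symbol, the rest are its children, and the leaves are the words.

    tree = ("S",
            ("NP", ("ProN", "I")),
            ("VP", ("V", "saw"),
                   ("NP", ("Det", "the"), ("Adj*",), ("N", "man"))))

    def leaves(t):
        # Read the sentence back off the tree's terminal symbols.
        if isinstance(t, str):
            return [t]
        return [w for child in t[1:] for w in leaves(child)]

    print(" ".join(leaves(tree)))   # I saw the man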

Syntactic Parsing Given a string of words, determine if it is grammatical, i.e. if it can be derived from a particular grammar. The derivation itself may also be of interest. Normally want to determine all possible parse trees and then use semantics and pragmatics to eliminate spurious parses and build a semantic representation.

Parsing Complexity Problem: Many sentences have many parses. An English sentence with n prepositional phrases at the end has at least 2^n parses. I saw the man on the hill with a telescope on Tuesday in Austin... The actual number of parses is given by the Catalan numbers: 1, 2, 5, 14, 42, 132, 429, 1430, 4862, ...
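Quick check of those counts (the slide's list is truncated after 4862):

    from math import comb

    def catalan(n):
        # n-th Catalan number: C(n) = C(2n, n) / (n + 1)
        return comb(2 * n, n) // (n + 1)

    print([catalan(n) for n in range(1, 10)])
    # [1, 2, 5, 14, 42, 132, 429, 1430, 4862] -- matching the list above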

Parsing Algorithms Top Down: Search the space of possible derivations of S (e.g. depth-first) for one that matches the input sentence.
I saw the man.
S -> NP VP
NP -> Det Adj* N
Det -> the
Det -> a
Det -> an
NP -> ProN
ProN -> I
VP -> V NP
V -> hit
V -> took
V -> saw
NP -> Det Adj* N
Det -> the
Adj* -> e
N -> man
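A compact sketch of this top-down search as a recursive-descent recognizer, with backtracking over rule choices (a self-contained fragment of the grammar above; the left-recursive VP -> VP PP rule is omitted, since naive top-down search would loop on it):

    RULES = {
        "S":  [["NP", "VP"]],
        "NP": [["ProN"], ["Det", "N"]],
        "VP": [["V", "NP"], ["V"]],
    }
    LEXICON = {"ProN": ["I"], "Det": ["the"], "N": ["man"], "V": ["saw"]}

    def derives(symbols, words):
        # Can this sequence of symbols derive exactly these words?
        if not symbols:
            return not words
        first, rest = symbols[0], symbols[1:]
        if first in LEXICON:   # part-of-speech category: match one word
            return bool(words) and words[0] in LEXICON[first] \
                   and derives(rest, words[1:])
        return any(derives(expansion + rest, words)
                   for expansion in RULES[first])

    print(derives(["S"], "I saw the man".split()))   # True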

Parsing Algorithms (cont.) Bottom Up: Search upward from words, finding larger and larger phrases until a sentence is found.
I saw the man.
ProN saw the man        ProN -> I
NP saw the man          NP -> ProN
NP N the man            N -> saw (dead end)
NP V the man            V -> saw
NP V Det man            Det -> the
NP V Det Adj* man       Adj* -> e
NP V Det Adj* N         N -> man
NP V NP                 NP -> Det Adj* N
NP VP                   VP -> V NP
S                       S -> NP VP

Bottom-up Parsing Algorithm
function BOTTOM-UP-PARSE(words, grammar) returns a parse tree
  forest <- words
  loop do
    if LENGTH(forest) = 1 and CATEGORY(forest[1]) = START(grammar)
      then return forest[1]
    else
      i <- choose from {1...LENGTH(forest)}
      rule <- choose from RULES(grammar)
      n <- LENGTH(RULE-RHS(rule))
      subsequence <- SUBSEQUENCE(forest, i, i+n-1)
      if MATCH(subsequence, RULE-RHS(rule))
        then forest[i...i+n-1] <- [MAKE-NODE(RULE-LHS(rule), subsequence)]
      else fail
  end
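The pseudocode is nondeterministic: "choose" is assumed to guess correctly. A runnable sketch replaces the two choices with backtracking over every position and rule (same grammar fragment as before; words enter the forest as one-node trees):

    RULES = [
        ("S", ["NP", "VP"]), ("NP", ["ProN"]), ("NP", ["Det", "N"]),
        ("VP", ["V", "NP"]),
        ("ProN", ["I"]), ("Det", ["the"]), ("N", ["man"]), ("V", ["saw"]),
    ]

    def bottom_up_parse(forest):
        # Each tree is a tuple (category, children...); a bare word is (word,).
        if len(forest) == 1 and forest[0][0] == "S":
            return forest[0]
        for i in range(len(forest)):
            for lhs, rhs in RULES:
                n = len(rhs)
                if [t[0] for t in forest[i:i + n]] == rhs:
                    node = (lhs, *forest[i:i + n])
                    result = bottom_up_parse(forest[:i] + [node] + forest[i + n:])
                    if result:
                        return result
        return None   # no reduction sequence reaches a single S

    words = "I saw the man".split()
    print(bottom_up_parse([(w,) for w in words]))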