Logic Form Representations
Reading: Chap 14, Jurafsky & Martin
Slide set adapted from Vasile Rus, U. Memphis
Instructor: Rada Mihalcea

Problem Description

There is a need for knowledge bases, e.g. for question answering:
1. Find the answer to Q471 "What year did Hitler die?" in a collection of documents.
   A: "Hitler committed suicide in 1945"
2. How would one justify that this is the right answer? Using world knowledge:
   suicide – {kill yourself}
   kill – {cause to die}

Knowledge bases also support intelligent interfaces to databases, e.g.:
"Where can I eat Italian food?" or "I'd like some pizza for dinner. Where can I go?"

How to Build Knowledge Bases?

Manually:
- building common-sense knowledge bases (see Cyc, Open Mind Common Sense)

Automatically:
- from open text
- from dictionaries such as WordNet

Logic Form Representation

What representation should we use? Logic Form (LF) is a knowledge representation introduced by Jerry Hobbs (1983). A logic form is a first-order representation derived from natural language.

First Order Representations

First-order representations fulfill the five main desiderata for representing meaning:

1. Verifiability
   Does Maharani serve vegetarian food? -> Serves(Maharani, VegetarianFood)
   A representation can be matched as a proposition against a knowledge base.

2. Unambiguous representations
   "I would like to eat someplace close to UNT."
   = eat in a place near UNT, or = eat a place?
   Ambiguity is removed by assigning a sense to words, or by adding information that rules out all but one reading. A representation should be free of ambiguity.

3. Canonical form
   Does Maharani serve vegetarian food?
   Are vegetarian dishes served at Maharani?
   Do they have vegetarian food at Maharani?
   Texts that have the same meaning should have the same representation.

4. Inference and variables
   The ability to draw inferences from the representations:
   Serves(x, VegetarianFood) -> EatAt(Vegetarians, x)

5. Expressiveness
   Representations should be expressive enough to handle a wide range of subject matter.
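
To make verifiability and canonical form concrete, here is a minimal Python sketch (my illustration, not part of the slides): each paraphrase maps to one canonical tuple, and verification is a membership test against a toy knowledge base.

    # Propositions are stored as predicate tuples; all contents are invented.
    KB = {("serve", "Maharani", "VegetarianFood")}

    # Canonical form: paraphrases share one representation (hand-built here;
    # a real system would derive it from a parse).
    CANONICAL = {
        "Does Maharani serve vegetarian food?":      ("serve", "Maharani", "VegetarianFood"),
        "Are vegetarian dishes served at Maharani?": ("serve", "Maharani", "VegetarianFood"),
        "Do they have vegetarian food at Maharani?": ("serve", "Maharani", "VegetarianFood"),
    }

    def verify(question: str) -> bool:
        """Verifiability: match the proposition against the knowledge base."""
        return CANONICAL[question] in KB

    assert all(verify(q) for q in CANONICAL)  # every paraphrase checks out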

Induction, Abduction

How can FOP (first-order predicates) be used for automatic reasoning? Through induction and abduction.

Logic Form Transformations

Logic form transformations are first-order representations:
- they have the characteristics of FOP
- they add some extra information (e.g. POS, word sense)
- they are derived automatically from text, starting with parse trees
- they are used for the automatic construction of knowledge bases, e.g. starting with WordNet

WordNet as a Source of World Knowledge [review]

WordNet, developed at Princeton by Prof. Miller, is an electronic semantic network whose main element is the synset:
- a synset is a set of synonymous words that define a concept, e.g. {cocoa, chocolate, hot chocolate}
- a word may belong to more than one synset
- WordNet contains synsets for four parts of speech: noun, verb, adjective, and adverb
- synsets are related to each other via a set of relations: hypernymy (ISA), hyponymy (reverse ISA), cause, entailment, meronymy (PART-OF), and others
- hypernymy is the most important relation; it organizes concepts into a hierarchy
- adjectives and adverbs are organized in clusters based on similarity and antonymy relations

WordNet Glosses

Each synset includes a short textual definition and one or more examples, which together form a gloss. E.g.:
- {suicide:n#1} – {killing yourself}
- {kill:v#1} – {cause to die}
- {extremity, appendage, member} – {an external body part that projects from the body; "it is important to keep the extremities warm"}

Glosses are a rich source of world knowledge, and they can be transformed into a computational representation.

Logic Form Representation

A predicate is a concatenation of the morpheme's base form, its part of speech, and its WordNet semantic sense:
- morpheme:POS#sense(list_of_arguments)

There are two types of arguments:
- x – for entities
- e – for events

The position of the arguments is important:
- verb:v#sense(e, subject, direct_object, indirect_object)
- preposition(head, prepositional_object)

A predicate is generated for each noun, verb, adjective, and adverb. Complex nominals are represented using the predicate nn:
- e.g. "goat hair" – nn(x1, x2, x3) & goat(x2) & hair(x3)

The logic form of a sentence is the conjunction of its individual predicates.
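
As a small illustration of this notation, a hypothetical helper (not the authors' code) that renders predicates in the morpheme:POS#sense(arguments) format:

    def lf_predicate(base: str, pos: str, sense: int, *args: str) -> str:
        """Render one LF predicate, e.g. break:v#6(e1, x1, x2)."""
        return f"{base}:{pos}#{sense}({', '.join(args)})"

    # "goat hair": the nn predicate links the compound (x1) to its parts.
    goat_hair_lf = " & ".join([
        "nn(x1, x2, x3)",
        lf_predicate("goat", "n", 1, "x2"),   # sense numbers are assumptions
        lf_predicate("hair", "n", 1, "x3"),
    ])
    print(goat_hair_lf)  # nn(x1, x2, x3) & goat:n#1(x2) & hair:n#1(x3)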

An Example

{lawbreaker, violator}: (someone who breaks the law)

someone:n#1(x1) & break:v#6(e1, x1, x2) & law:n#1(x2)

The annotations on this slide label the parts of the logic form: the part-of-speech and WordNet-sense tags carry categorial and semantic information, while the subject (x1) and direct-object (x2) argument positions carry functional information.

Logic Form Notation (cont'd)

The notation ignores plurals and sets, verb tenses, auxiliaries, negation, quantifiers, and comparatives. Consequence: glosses with comparatives cannot be fully transformed into logic forms.

The original notation also does not handle special cases of postmodifiers (modifiers placed after the word they modify) or relative adverbs (where, when, how, why).

Comparatives

{tower}: (structure taller than its diameter)

Does taller/JJR modify structure or diameter? Both? Solution: introduce a relation between structure and diameter.

LF: structure(x1) & taller(x1, x2) & diameter(x2)

Postmodifiers

{achromatic_lens}: (a compound lens system that forms an image free from chromatic_aberration)

Is free a modifier of image? What is the prepositional head of from? Solution: introduce a new predicate, free_from.

LF: image(x1) & free_from(x1, x2) & chromatic_aberration(x2)

Relative Adverbs

{airdock}: (a large building at an airport where aircraft can be stored)

Equivalent to: (aircraft can be stored in a large building at an airport)

LF: large(x1) & building(x1) & at(x1, x2) & airport(x2) & where(x1, e1) & aircraft(x3) & store(e1, x4, x3)

Logic Form Identification

Take advantage of the structural information embedded in a parse tree. For S -> NP VP, the subject comes from the NP; where the direct object is found depends on whether the VP is active (VP-ACT) or passive (VP-PASS).

Architecture: Preprocess (Extract Defs, Tokenize) -> POS Tag -> Parse -> LF Transformer
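
A skeleton of this pipeline in Python (stubs only, an assumed structure; the real system's tagger and parser are not reproduced here):

    def preprocess(entry: str) -> list[str]:
        """Extract the definition text and tokenize it."""
        return entry.strip("() ").split()

    def pos_tag(tokens: list[str]) -> list[tuple[str, str]]:
        """Stub tagger: a real POS tagger goes here."""
        return [(t, "NN") for t in tokens]

    def parse(tagged):
        """Stub parser: a real parser would return a full tree."""
        return ("NP", tagged)

    def lf_transform(tree) -> str:
        """Stub transformer: emit one predicate per leaf."""
        _, leaves = tree
        return " & ".join(f"{w}:n(x{i})" for i, (w, _) in enumerate(leaves, 1))

    print(lf_transform(parse(pos_tag(preprocess("(monastery)")))))  # monastery:n(x1)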

Example of Logic Form

"a monastery ruled by an abbot"

Parse tree:
(NP (NP (DT a) (NN monastery))
    (VP (VBN ruled)
        (PP (IN by)
            (NP (DT an) (NN abbot)))))

LF: monastery:n(x1) & rule:v(e1, x2, x1) & abbot:n(x2)

Logic Form Derivation

Take advantage of the syntactic information from the parser: for each grammar rule, derive one or more LF identification rules. For {abbey:n#3}:

Synset       Phrase                       Identification rule                                        Grammar rule
{abbey:n#3}  (NP (a/DT monastery/NN))     noun/NN -> noun(x)                                         NP -> DT NN
{abbey:n#3}  (VP (ruled/VBN by/PP ...))   verb(e,-,-)/VP-PASS & by/PP(-,x) -> verb(e,x,-) & by(e,x)  VP -> VP PP
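
A simplified sketch (my invention, not the paper's rule set) of pairing grammar rules with LF identification templates, as in the table above:

    LF_RULES = {
        # NP -> DT NN: the determiner is dropped, the noun yields noun(x)
        "NP -> DT NN": lambda leaves: f"{leaves[-1]}:n(x1)",
        # VP -> VP PP (passive, by-PP): the by-object fills the subject slot
        "VP -> VP PP": lambda leaves: f"{leaves[0]}:v(e1, x2, x1) & by(e1, x2)",
    }

    print(LF_RULES["NP -> DT NN"](["a", "monastery"]))       # monastery:n(x1)
    print(LF_RULES["VP -> VP PP"](["rule", "by", "abbot"]))  # rule:v(e1, x2, x1) & by(e1, x2)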

Building Logic Forms from WordNet

From definitions to axioms: WordNet glosses are transformed into axioms to enable automated reasoning. Specific rules derive axioms for each part of speech:

- Nouns: the noun definition consists of a genus and differentia. The generic axiom is concept(x) -> genus(x) & differentia(x).
  E.g.: abbey(x1) -> monastery(x1) & rule(e1, x2, x1) & abbot(x2)
- Verbs: trickier, since syntactic functional changes can occur from the left-hand side to the right-hand side.
  E.g.: kill:v#1(e1, x1, x2, x3) -> cause(e2, x1, e3, x3) & die(e3, x2)
- Adjectives: borrow a virtual argument representing the head they modify.
  E.g.: american:a#1(x1) -> of(x1, x2) & United_States_Of_America(x2)
- Adverbs: borrow a virtual event argument, as they usually modify an event.
  E.g.: fast:r#1(e1) -> quickly:r#1(e1)
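
The noun pattern is mechanical enough to sketch in a few lines (assuming the split of a gloss into genus and differentia predicates is already done):

    def noun_axiom(concept: str, genus: str, differentia: list[str]) -> str:
        """Instantiate concept(x) -> genus(x) & differentia(x)."""
        body = " & ".join([f"{genus}(x1)"] + differentia)
        return f"{concept}(x1) -> {body}"

    print(noun_axiom("abbey", "monastery", ["rule(e1, x2, x1)", "abbot(x2)"]))
    # abbey(x1) -> monastery(x1) & rule(e1, x2, x1) & abbot(x2)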

Building a Knowledge Base from WordNet

Parse all glosses and extract all grammar rules embedded in the parse trees. The grammar is large, and since one grammar rule can map to more than one LF rule, the effort to analyze and implement all of them would be tremendous.

Part of speech   Rules
Noun             5,392
Verb             1,837
Adjectives       1,958
Adverbs            639
Total            9,826

Coverage Issue

Group the grammar rules by the non-terminal on their left-hand side (LHS); the most frequent rules in each class cover most of the occurrences of rules belonging to that class. The table shows the coverage of the 10 most frequent grammar rules per phrase type, measured on 10,000 noun glosses. What does this remind you of?

Phrase on the LHS   Occurrences   Unique rules   Coverage of top ten
Base NP             33,643        857            69%
NP                  11,408        244            95%
VP                  19,415        450            70%
S                   14,740         35            99%
PP                  12,315         40            99%
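
The measurement itself is straightforward; a sketch with invented counts:

    from collections import Counter

    def top_k_coverage(occurrences: list[str], k: int = 10) -> float:
        """Fraction of rule occurrences accounted for by the k most frequent rules."""
        counts = Counter(occurrences)
        covered = sum(n for _, n in counts.most_common(k))
        return covered / len(occurrences)

    sample = ["NP -> DT NN"] * 700 + ["NP -> DT JJ NN"] * 200 + ["NP -> NN CC NN"] * 100
    print(f"{top_k_coverage(sample, k=2):.0%}")  # 90%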

Coverage Issue (cont'd)

Two phases:
- Phase 1: develop LF rules for the most frequent grammar rules and ignore the others
- Phase 2: select additional valuable rules

The accuracy of each LF rule is almost perfect; the real performance question is how many glosses are entirely transformed into LF, i.e. how many glosses the selected grammar rules fully map into logic form.

Reduce the Number of Candidate Grammar Rules (1)

The selected grammar rules for base NPs (non-recursive NPs) cover only 69% of occurrences, and the selected VP rules only 70%. Before selecting rules for base NPs, we therefore apply transformations that reduce more complex rules to simpler ones. In particular, coordinated base NPs are transformed into coordinations of simple base NPs:

"a ruler or institution": (NP (DT a) (NN ruler) (CC or) (NN institution)) becomes (NP (NP (DT a) (NN ruler)) (CC or) (NP (NN institution)))

Reduce the Number of Candidate Grammar Rules (2)

Base NPs:
- determiners are ignored (an 11% increase in coverage for the selected base-NP grammar rules)
- plurals are ignored
- everything in prenominal position plays the role of a modifier

VPs:
- negation is ignored
- tenses are ignored (auxiliaries and modals)

Base NP rules collapsed by these transformations include:
- NP -> DT VBN NN|NNS|NNP|NNPS
- NP -> DT VBG NN|NNS|NNP|NNPS
- NP -> DT JJ NN|NNS|NNP|NNPS
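
A sketch of these base-NP normalizations (assumed token format: (word, POS) pairs; plural stripping is deliberately naive):

    def normalize_base_np(tagged: list[tuple[str, str]]) -> list[tuple[str, str]]:
        out = []
        for word, pos in tagged:
            if pos == "DT":                 # determiners are ignored
                continue
            if pos in ("NNS", "NNPS"):      # plurals are ignored (naive stemming)
                word, pos = word.rstrip("s"), pos[:-1]
            out.append((word, pos))
        return out

    print(normalize_base_np([("the", "DT"), ("rulers", "NNS")]))  # [('ruler', 'NN')]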

Map Grammar Rules into LF Rules

Selected grammar rules map into one or more logic form rules.

Case 1: the grammar rule maps into exactly one LF rule.
- Grammar rule: PP -> IN NP
- LFT: prep(_, x) -> prep(_, x) & headNP(x)

Case 2: the grammar rule maps into more than one LF rule.
- Grammar rule: VP -> VP PP
- LFT 1: verb(e, x1, _) -> verb-PASS(e, x1, _) & prep-By(e, x1)
- LFT 2: verb(e, _, x2) -> verb-PASS(e, _, x2) & prep-nonBy(e, x2)
- To differentiate between the two LF rules we use two features: the mood of the VP (active or passive) and the type of preposition (by or non-by)
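
A sketch of the two-feature dispatch (my simplification of the rules above; the feature values are assumed to come from the parser):

    def lf_for_vp_pp(verb: str, mood: str, prep: str, pobj: str) -> str:
        if mood == "passive" and prep == "by":
            # LFT 1: the by-object becomes the verb's subject argument
            return f"{verb}:v(e1, {pobj}, x1) & by(e1, {pobj})"
        # LFT 2: a non-by PP attaches to the event instead
        return f"{verb}:v(e1, x1, x2) & {prep}(e1, {pobj})"

    print(lf_for_vp_pp("rule", "passive", "by", "x2"))
    # rule:v(e1, x2, x1) & by(e1, x2)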

Logic Form Derivation Results

Phase 1:
- From a corpus of 10,000 noun glosses, extract grammar rules, sort them by the non-terminal on the LHS, select the most frequent grammar rules, and generate LF rules for them
- Manually develop a test corpus of 400 glosses
- Test the implemented LF rules on the 400 noun glosses
- Result: 72% coverage (with almost 100% accuracy)

Phase 2:
- Iteratively select additional rules that each increase coverage by at least some threshold ε; for glosses, ε was set at 1%
- This resulted in a total of 70 selected grammar rules
- The new coverage achieved is 81%

Open issue: how to fully cover the remaining 19% of glosses that are not fully transformed. One option is a set of heuristics, e.g.: if the subject argument of a verb is missing, use the first preceding noun as its subject.
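
The missing-subject heuristic can be sketched directly (assumed predicate format; not the authors' implementation):

    def fill_missing_subjects(preds: list[dict]) -> list[dict]:
        """If a verb lacks a subject, reuse the most recent noun's variable."""
        last_noun = None
        for p in preds:
            if p["pos"] == "n":
                last_noun = p["args"][0]
            elif p["pos"] == "v" and p["args"][1] is None:
                p["args"][1] = last_noun      # args = [event, subject, ...]
        return preds

    preds = [{"pred": "monastery", "pos": "n", "args": ["x1"]},
             {"pred": "rule", "pos": "v", "args": ["e1", None, "x2"]}]
    print(fill_missing_subjects(preds)[1]["args"])  # ['e1', 'x1', 'x2']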

Question Answering Application

Given a question, the task is to select the answer from a set of candidate answers and to automatically justify that it is the right answer.

Ideal case: all the keywords from the question, together with their syntactic relationships, appear in the answer.
- Question: What year did Hitler die?
- Perfect answer: "Hitler died in 1945."

Real case:
- Real answer: "Hitler committed suicide in 1945."
- This requires extra resources to link suicide to die: use WordNet as a knowledge base.
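
The suicide-to-die link can be inspected with NLTK's WordNet interface (requires nltk and its WordNet data; the exact sense numbers and relation contents are assumptions to verify locally):

    from nltk.corpus import wordnet as wn

    print(wn.synset("suicide.n.01").definition())  # gloss mentions killing yourself
    print(wn.synset("kill.v.01").causes())         # expected: [Synset('die.v.01')]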