Attempto Controlled English. Norbert Fuchs et al., Dept. of Computer Science, Univ. of Zurich, Switzerland. As (mis-)interpreted by Peter Clark.


Background

Funded continuously from 1989 to 2003 by the Swiss NSF.
Employed in two master's theses (robot control and querying an ontological database). Also taught to students at the Univ. of Dresden.
Informal collaborations with a hospital (use of controlled language for patient records)
–Demand for past tense and passive voice
Now: funding from the EC as part of the "Network of Excellence"
–ACE was chosen as (the basis of) the controlled English for the EU Network of Excellence REWERSE (2004-8)
–Initial requirements include: verbalization of formal languages in ACE, NAF as well as logical negation, a decidable ACE subset
Norbert Fuchs is now a senior research fellow with the Univ. of Zurich, heading Zurich's part of REWERSE for the next 4 years.
Not a stepping stone to full NL; rather, full NL is considered largely out of reach.

ACE: Overview and Approach

ACE is completely (and deliberately) knowledge-poor
–(Exception: numbers, groups, equality)
Parsing in ACE is deterministic
–"predictability is better than smarts"
–users learn the disambiguation rules, e.g., PPs attach right-to-left, rules for coordination
More like an English-like programming language than a "natural" language
–Though: Norbert claims it can achieve both
–"Only suitable for logic-like domains, not common-sense reasoning"

The ACE System

Three main bits of software:
–Language-to-logic translation
–Lexicon editor
–Theorem prover

ACE Grammar (overview)

Vocabulary
–Words have only one sense (per part of speech)
–No ontology (sense taxonomy)
Basic sentence:
–subject + verb + complements + adjuncts
–e.g., "The driver stops the train at the station"
Composite sentences:
–coordination (and, or)
–subordination (if, then)
–verb phrase negation (does not, is not)
–noun phrase negation (no)
–quantification (a, there is, for every)
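The fixed slot order of the basic sentence (subject + verb + complements + adjuncts) can be segmented mechanically, which is part of what makes ACE parsing deterministic. A toy sketch in Python — the lexicon and function names here are hypothetical illustrations, not the real ACE parser:

```python
# Toy one-sense-per-word lexicon (hypothetical; real ACE uses its own lexicon).
LEXICON = {
    "driver": "noun", "train": "noun", "station": "noun",
    "stops": "verb", "the": "det", "a": "det", "at": "prep",
}

def parse_basic(sentence):
    """Split a simple ACE-style sentence into its fixed slots:
    subject + verb + complements + adjuncts (PPs)."""
    tokens = sentence.lower().rstrip(".").split()
    # The single main verb separates subject from complements.
    verb_pos = next(i for i, t in enumerate(tokens) if LEXICON.get(t) == "verb")
    # The first preposition (if any) starts the adjuncts.
    prep_pos = next((i for i, t in enumerate(tokens) if LEXICON.get(t) == "prep"),
                    len(tokens))
    return {
        "subject": tokens[:verb_pos],
        "verb": tokens[verb_pos],
        "complement": tokens[verb_pos + 1:prep_pos],
        "adjuncts": tokens[prep_pos:],
    }

print(parse_basic("The driver stops the train at the station"))
# {'subject': ['the', 'driver'], 'verb': 'stops',
#  'complement': ['the', 'train'], 'adjuncts': ['at', 'the', 'station']}
```

Because every word has exactly one part of speech, the segmentation needs no backtracking.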

ACE

Words have only one sense (per part of speech)
Compound nouns are not decomposed
–enter them explicitly in the lexicon
Anaphora resolution:
–simple search of an object stack + recency
No inheritance hierarchy
Coordination
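The object-stack-plus-recency strategy for anaphora is simple enough to sketch directly. A hypothetical illustration (not the actual ACE code):

```python
def resolve_anaphor(noun, object_stack):
    """Resolve a definite NP like 'the train' by searching the stack of
    previously introduced objects, most recent first (recency)."""
    for referent in reversed(object_stack):
        if referent["noun"] == noun:
            return referent
    return None  # no antecedent found: a new object would be introduced

# After "A driver enters a train. The train stops.":
stack = [{"id": 1, "noun": "driver"}, {"id": 2, "noun": "train"}]
print(resolve_anaphor("train", stack))  # {'id': 2, 'noun': 'train'}
```

With two candidate antecedents of the same noun, recency deterministically picks the later one — no world knowledge is consulted.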

Ambiguity Resolution

Fixed interpretation rules, e.g.:
–Relative phrases always attach to the immediately preceding noun phrase
–PPs always attach to the verb. If there is a misattachment, e.g., in "The driver stops the train with the defective brake", the user rewrites with a relative phrase, e.g., "the driver stops the train that has a defective brake"
–Multiple PPs attach right-to-left
–Plurals have a collective interpretation by default: "Three men lift a table"
Use of "cue" phrases to override defaults, e.g., "Each of three men lifts a table", to get the distributive interpretation
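These fixed interpretation rules amount to a lookup table rather than a search over candidate parses. A toy sketch with hypothetical names:

```python
# Deterministic attachment policy (a sketch of the rules, not ACE's code).
ATTACHMENT_RULES = {
    "pp": "verb",           # PPs always attach to the verb
    "relative": "nearest_np",  # relative phrases attach to the preceding NP
}

def attach(modifier_kind, verb, preceding_np):
    """Return the attachment site. No search, no ambiguity: the rule table
    fully determines the result."""
    target = ATTACHMENT_RULES[modifier_kind]
    return verb if target == "verb" else preceding_np

# "The driver stops the train with the defective brake"
print(attach("pp", "stops", "train"))        # stops  (misattached: user must rewrite)
# Rewrite: "... the train that has a defective brake"
print(attach("relative", "stops", "train"))  # train
```

The burden of fixing a "wrong" attachment falls on the author, who rewrites the sentence rather than relying on the parser to guess.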

ACE

No semantic roles:
–"the person hits the nail" → event(E,hit(person,nail))
–"the hammer hits the nail" → event(E,hit(hammer,nail))
–User must be careful! Interpretation is context-free
Verbs with multiple alternations are not allowed
–e.g., "X gave Y Z", "X gave Y"
–no guessing of a missing object
Ditransitive verbs are not allowed
–instead use a preposition (e.g., "John gives a book to Mary", rather than "John gave Mary a book")
Prepositions need to be used consistently by the user
–e.g., "in the morning" vs. "during the morning"
–with one exception: "in the bed" vs. "in the morning", disambiguated based on object type (physical vs. temporal)
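The role-free, context-free translation can be sketched as a single template: every transitive clause maps to the same event shape whether the subject is an agent or an instrument. Hypothetical illustration:

```python
import itertools

_event_ids = itertools.count(1)  # fresh event variables e1, e2, ...

def translate(subject, verb, obj):
    """Context-free, role-free translation of a transitive clause to an
    event term. No semantic roles: the subject slot is filled the same
    way for 'person' (agent) and 'hammer' (instrument)."""
    e = f"e{next(_event_ids)}"
    return f"event({e}, {verb}({subject}, {obj}))"

print(translate("person", "hit", "nail"))  # event(e1, hit(person, nail))
print(translate("hammer", "hit", "nail"))  # event(e2, hit(hammer, nail))
```

The identical output shape for both inputs is exactly why the user must be careful: the logic does not record that a hammer is an instrument rather than an agent.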

Background Knowledge in ACE

ACE deliberately has little prior, built-in knowledge, e.g.:
–No built-in knowledge that "give" and "give_to" are related
–No built-in knowledge of inter-word relationships (e.g., "has" vs. "owns")
–No built-in knowledge of how verbs and nominalizations relate
Rather, the philosophy is to let the user specify such knowledge with axioms, so the user has control, e.g.:
–"If a person X gives an object Y to a person Z, then the person X gives the object Y."
–"If a person X has an object Y then the person X owns the object Y"; or simply "If a person has an object then the person owns the object"

Multiple ways of saying things…

"The box is red"
–box(A) & property(A,red)
"The color of the box is red"
–box(A) & color-of(A,B) & property(B,red)
"The red box"
–box(A) & property(A,red)

2. The Lexicon Editor

Simple TTY interface; the user enters a word, then:
–part of speech
–nouns: singular/plural, mass/count, person/location/time/object
–verbs: 3rd person singular & plural, transitive/intransitive/ditransitive
Accessible via the Web. A fancier Web-based interface is also under development.
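The features the editor collects suggest a simple record per word. A hypothetical sketch of what such entries might look like (field names are illustrative, not ACE's actual schema):

```python
# Hypothetical lexicon records mirroring the features the editor asks for.
LEXICON = {
    "train": {
        "pos": "noun",
        "forms": {"singular": "train", "plural": "trains"},
        "countability": "count",   # mass vs. count
        "sort": "object",          # person / location / time / object
    },
    "give": {
        "pos": "verb",
        "forms": {"third_sg": "gives", "plural": "give"},
        "valency": "transitive",   # transitive / intransitive / ditransitive
    },
}

def lookup(word):
    """One sense per word (per part of speech), so lookup is a plain get:
    no sense disambiguation is ever needed."""
    return LEXICON.get(word)

print(lookup("train")["sort"])  # object
```

Because a word carries exactly one sense per part of speech, the lexicon never has to rank or choose among readings.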

3. Theorem Proving (Reasoning)

ACE translates to first-order logic, not Prolog
–wanted non-Horn rules
No temporal reasoning
Use a theorem prover to spot contradictions
–spent 2 years wrestling with it (!)
–Will help users debug the KB (in theory)
Events are reified.
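The prover's contradiction-spotting role can be illustrated very loosely with a ground-literal check. This is only a toy: the real system runs a full first-order theorem prover, which is far more than the sketch below.

```python
def find_contradiction(facts):
    """Toy stand-in for the prover's debugging role: report a ground fact
    asserted both positively and negatively ('-' marks negation here)."""
    positives = {f for f in facts if not f.startswith("-")}
    for f in facts:
        if f.startswith("-") and f[1:] in positives:
            return f[1:]
    return None  # no (ground-literal) contradiction found

kb = ["moves(train1)", "red(signal1)", "-moves(train1)"]
print(find_contradiction(kb))  # moves(train1)
```

Surfacing the offending fact, rather than just "inconsistent", is what makes this useful for debugging a KB.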

Also…

Feedback to help the user:
–Feedback on failed parses: ACE will identify the first sentence that failed to parse, and try to give more information.
–Paraphrase of successful parses: helpful, provided that the paraphrase does not itself contain the ambiguity of the original sentence.