
Slide 1: Expressing Probabilistic Context-Free Grammars in the Relaxed Unification Formalism
Tony Abou-Assaleh, Nick Cercone & Vlado Keselj
Faculty of Computer Science, Dalhousie University
© 2003 T. Abou-Assaleh, N. Cercone, & V. Keselj

Slide 2: Agenda
- Relaxed Unification
- RU System
- PCFG in PROLOG
- PCFG in RU System
- Conclusion
- Future Work

Slide 3: Relaxed Unification

Classical unification:
  t1 = f(a, h(b))
  t2 = f(a, h(c))
  t1 U t2 = fail

Relaxed unification:
  t1 = {f( {a}, {h( {b} )} )}
  t2 = {f( {a}, {h( {c} )} )}
  t1 U_R t2 = {f( {a}, {h( {b,c} )} )}
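The merge on this slide can be sketched in a few lines of Python. This is an illustrative toy, not the RU System's actual implementation: the encoding (a term position is a Python list of alternatives; a compound term is a `(functor, [arg_alternatives, ...])` tuple) is my own. Atoms that differ simply coexist in the result, and compound terms with the same functor and arity are merged argument-wise, so unification never fails.

```python
def relaxed_unify(s1, s2):
    """Merge two alternative lists (the {...} sets on the slide).
    Distinct atoms coexist; compound terms sharing functor/arity
    are merged argument-wise."""
    merged = []
    for t in s1 + s2:
        for i, u in enumerate(merged):
            if isinstance(t, str) and t == u:
                break                      # same atom already present
            if (isinstance(t, tuple) and isinstance(u, tuple)
                    and t[0] == u[0] and len(t[1]) == len(u[1])):
                # same functor and arity: merge each argument position
                merged[i] = (t[0], [relaxed_unify(a, b)
                                    for a, b in zip(u[1], t[1])])
                break
        else:
            merged.append(t)               # genuinely new alternative
    return merged

# t1 = {f({a}, {h({b})})},  t2 = {f({a}, {h({c})})}
t1 = [("f", [["a"], [("h", [["b"]])]])]
t2 = [("f", [["a"], [("h", [["c"]])]])]
print(relaxed_unify(t1, t2))
# -> [('f', [['a'], [('h', [['b', 'c']])]])]  i.e. {f({a}, {h({b,c})})}
```

Where classical unification fails on the b/c clash, the relaxed result keeps both alternatives at that position, matching the slide's {f( {a}, {h( {b,c} )} )}.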

Slide 4: RU System
- Relaxed unification inference engine
- Syntax based on first-order logic
- Relaxed predicates and queries
- Degree of belief
- Correctness function

Slide 5: PCFG in PROLOG

Sample PCFG:
  NP -> D N   /0.8
  NP -> N     /0.2
  D  -> a     /0.6
  D  -> the   /0.4
  N  -> fox   /1.0

Slide 6: PCFG in PROLOG

Probabilistic DCG in PROLOG:
  np(X) --> d(Y), n(Z), {X is 0.8*Y*Z}.
  np(X) --> n(Y), {X is 0.2*Y}.
  d(X) --> [a], {X is 0.6}.
  d(X) --> [the], {X is 0.4}.
  n(X) --> [fox], {X is 1.0}.

Slide 7: PCFG in PROLOG

PROLOG internal representation:
  np(X, S0, S) :- d(Y, S0, S1), n(Z, S1, S), X is 0.8*Y*Z.
  np(X, S0, S) :- n(Y, S0, S), X is 0.2*Y.
  d(X, S0, S) :- 'C'(S0, a, S), X is 0.6.
  d(X, S0, S) :- 'C'(S0, the, S), X is 0.4.
  n(X, S0, S) :- 'C'(S0, fox, S), X is 1.0.
  'C'([X|S], X, S).

Slide 8: PCFG in PROLOG

Example queries:
  ?- np(Prob, [a, fox], []).
  Prob = 0.48

  ?- np(Prob, [fox], []).
  Prob = 0.2
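These results can be checked with a minimal Python mirror of the probabilistic DCG. This is an illustrative sketch, not part of the slides' PROLOG code; the function names simply mirror the nonterminals, and each one consumes a prefix of the word list the way the DCG threads difference lists.

```python
# Each nonterminal maps a word list to a list of
# (probability, remaining words) pairs.

def n(ws):
    return [(1.0, ws[1:])] if ws[:1] == ["fox"] else []

def d(ws):
    if ws[:1] == ["a"]:
        return [(0.6, ws[1:])]
    if ws[:1] == ["the"]:
        return [(0.4, ws[1:])]
    return []

def np(ws):
    parses = []
    for p1, r1 in d(ws):          # NP -> D N  /0.8
        for p2, r2 in n(r1):
            parses.append((0.8 * p1 * p2, r2))
    for p1, r1 in n(ws):          # NP -> N    /0.2
        parses.append((0.2 * p1, r1))
    return parses

# Analogue of ?- np(Prob, [a, fox], []).
print([round(p, 2) for p, rest in np(["a", "fox"]) if not rest])  # [0.48]
print([round(p, 2) for p, rest in np(["fox"]) if not rest])       # [0.2]
```

The first query multiplies 0.8 × 0.6 × 1.0 = 0.48, matching the slide.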

Slide 9: PCFG in RU System

Slide 10: PCFG in RU System

Probabilistic DCG in the RU System:
  (1.0) s  --> np, vp.
  (0.4) np --> n.
  (0.2) np --> n, n.
  (0.4) np --> d, n.
  (0.5) vp --> v, np.
  (0.5) vp --> v, pp.
  (1.0) pp --> p, np.
  (0.5) n --> time.
  (0.3) n --> arrow.
  (0.2) n --> flies.
  (1.0) d --> an.
  (0.3) v --> like.
  (0.7) v --> flies.
  (1.0) p --> like.

Slide 11: PCFG in RU System

RU System internal representation:
  (1.0) s(X0, X, s(T1, T2)) :- np(X0, X1, T1), vp(X1, X, T2).
  (0.4) np(X0, X, np(T)) :- n(X0, X, T).
  (0.2) np(X0, X, np(T1, T2)) :- n(X0, X1, T1), n(X1, X, T2).
  (0.4) np(X0, X, np(T1, T2)) :- d(X0, X1, T1), n(X1, X, T2).
  (0.5) vp(X0, X, vp(T1, T2)) :- v(X0, X1, T1), np(X1, X, T2).
  (0.5) vp(X0, X, vp(T1, T2)) :- v(X0, X1, T1), pp(X1, X, T2).
  (1.0) pp(X0, X, pp(T1, T2)) :- p(X0, X1, T1), np(X1, X, T2).
  (0.5) n(X0, X, n(time)) :- 'C'(X0, time, X).
  (0.3) n(X0, X, n(arrow)) :- 'C'(X0, arrow, X).
  (0.2) n(X0, X, n(flies)) :- 'C'(X0, flies, X).
  (1.0) d(X0, X, d(an)) :- 'C'(X0, an, X).
  (0.3) v(X0, X, v(like)) :- 'C'(X0, like, X).
  (0.7) v(X0, X, v(flies)) :- 'C'(X0, flies, X).
  (1.0) p(X0, X, p(like)) :- 'C'(X0, like, X).
  (1.0) 'C'([X|Y], X, Y).

Slide 12: PCFG in RU System

Query:
  ?- s([time, flies, like, an, arrow], [], T).

  (0.0084) T = s(np(n(time)),
                 vp(v(flies), pp(p(like), np(d(an), n(arrow)))))

  (0.00036) T = s(np(n(time), n(flies)),
                  vp(v(like), np(d(an), n(arrow))))
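Both parse probabilities are just the product of the probabilities of the rules used in each derivation, so they can be checked directly against the grammar on slide 10. A quick sketch (the rule lists below are transcribed by hand from that grammar):

```python
from math import prod

# Parse 1: "time" is a noun, "flies" is the verb, "like an arrow" a PP.
parse1 = [1.0,            # s  --> np, vp
          0.4, 0.5,       # np --> n;  n --> time
          0.5,            # vp --> v, pp
          0.7,            # v  --> flies
          1.0, 1.0,       # pp --> p, np;  p --> like
          0.4, 1.0, 0.3]  # np --> d, n;  d --> an;  n --> arrow

# Parse 2: "time flies" is a compound noun, "like" is the verb.
parse2 = [1.0,            # s  --> np, vp
          0.2, 0.5, 0.2,  # np --> n, n;  n --> time;  n --> flies
          0.5, 0.3,       # vp --> v, np;  v --> like
          0.4, 1.0, 0.3]  # np --> d, n;  d --> an;  n --> arrow

print(round(prod(parse1), 6))   # 0.0084
print(round(prod(parse2), 6))   # 0.00036
```

As expected, the reading with "flies" as the main verb is roughly 23 times more probable than the "time flies" compound-noun reading under this grammar.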

Slide 13: Conclusion
- Implemented a relaxed unification inference engine
- Expressed PCFGs in the RU System
- The encoding is simple and also yields parse trees

Slide 14: Future Work
- Express HPSG in the relaxed unification formalism
- Express probabilistic HPSG in the RU System
- Apply the RU System to Question Answering

Slide 15: Thank you! Questions or comments?