1 The logic of default inheritance in Word Grammar
Richard Hudson
Lexington, May 2012

2 Plan
1. Word Grammar
2. WG morphology
3. Six problems for DI, and the WG solutions
4. Conclusions

3 1: Word Grammar
Aims
– to model language in cognition ('graceful integration')
– to discover if anything is unique to language
Assumptions
– cognition is a network
– the network includes a taxonomy
– the model must include processes as well as structures.

4 WG and language structure
Language is a network.
The network links sounds to meanings
– but NOT directly
– so the units are not 'symbols' or 'constructions'.
The network defines:
– 'levels'
– various sub-taxonomies.

5 'Levels' in a network
[Diagram of 'levels' as network relations: a word (syntax) is linked to a concept by 'meaning' (semantics) and to a morph by 'realization' (morphology); the morph is linked to a sound by 'pronunciation' or to a letter by 'spelling' (phon/graph-ology).]

6 Taxonomy in a network
[Diagram: 'concept' divides into 'entity' and 'relation' via 'is-a' links; entities include letter, word, morph and sound; relations include realization, pronunciation, spelling and meaning.]

7 WG and processes
Spreading activation
– including attention
Node-creation for tokens/exemplars
– e.g. word-tokens in recognition or planning
Binding
– e.g. token to token in parsing
Default inheritance

8 An example outside language: hearing some purring
[Diagram: the sound activates 'purring' and 'cat'; node-creation builds tokens (token 1, token 2, token 3), inheritance supplies a 'purrer', and binding links the tokens together.]

9 An example inside language: hearing /kat/
[Diagram: the same processes applied to a word: the sound token /kat/ activates the lexeme CAT via its 'realization' link, and node-creation and binding connect token 1, token 2 and token 3.]
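
A deliberately minimal sketch (my own illustration, not from the slides; the lexicon and function names are assumptions) of this recognition loop, treating activation as lookup, node-creation as building a token record, and binding as the isa link:

    # Minimal sketch of slides 8-9: hearing a form activates a stored
    # type, a new token node is created, and binding links the token to
    # the type so that further properties can be inherited from it.

    lexicon = {"/kat/": "CAT", "/dog/": "DOG"}   # realization -> lexeme

    def recognise(sound: str) -> dict:
        lexeme = lexicon.get(sound)              # spreading activation finds CAT
        return {"isa": lexeme, "heard": sound}   # node-creation + binding

    print(recognise("/kat/"))  # {'isa': 'CAT', 'heard': '/kat/'}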

10 2: WG morphology
Inferential and realizational
Morphosyntax
– words realized by meaningless morphs
– any degree of generality: word-class – lexeme – sub-lexeme
Morphophonology
– morphs realized by phones or letters

11 Morphosyntax in a network
[Diagram: the lexeme BOY has the base {boy}; the plural of BOY takes its full form from the s-variant of the base (inflectional morphology), while the adjective BOYISH is the 'adj-of' BOY and builds on its base (derivational morphology).]

12 Morphophonology in a network
[Diagram: the s-variant adds the morph {z} to the base; {z} is pronounced /z/ by default, /ɪz/ when the preceding segment is a sibilant, and /s/ when it is voiceless.]
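
The allomorph choice in this diagram is itself a small default hierarchy, so a toy implementation may help; this is a rough sketch under simplified assumptions (the phoneme sets and function name are mine):

    # Sketch of the {z} allomorphy in slide 12: /z/ is the default,
    # overridden to /ɪz/ after sibilants and to /s/ after voiceless
    # non-sibilant consonants. Phoneme sets are simplified assumptions.

    SIBILANTS = ("s", "z", "ʃ", "ʒ", "tʃ", "dʒ")
    VOICELESS = ("p", "t", "k", "f", "θ")

    def s_variant(base: str) -> str:
        """Return the base plus the appropriate allomorph of {z}."""
        if base.endswith(SIBILANTS):   # sibilant context overrides: /ɪz/
            return base + "ɪz"
        if base.endswith(VOICELESS):   # voiceless context overrides: /s/
            return base + "s"
        return base + "z"              # the default

    print(s_variant("dɒg"))  # dɒgz
    print(s_variant("kat"))  # kats
    print(s_variant("bʌs"))  # bʌsɪz

The ordering mirrors the network's logic: the more specific context is consulted first, and the default only applies when nothing overrides it.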

13 3: Six problems for default inheritance
The basic idea is very simple:
– defaults generalise except when overridden.
But there are challenges:
a. generality: going beyond morphology
b. reliability: non-monotonicity
c. certainty: recognising clashes and winners
d. relevance: avoiding irrelevant enrichment
e. economy: avoiding storage of outcomes
f. sensibleness: avoiding silly categorization

14 DI in WG
Since 1980 I've tried several different theories of DI
– e.g. 1990: exceptions are marked with 'NOT …'
2007: a solution:
– Default inheritance only applies to tokens.
– DI is driven by spreading activation.
– The network defines conflicts and outcomes.

15 a: Generality
Default inheritance is part of general cognition.
– Hence: prototype effects.
So the mechanism used for morphology must generalize
– beyond ready-made AVMs
– e.g. 'birds fly' but 'penguins don't fly'.

16 Generality in WG
Properties are links to other concepts
– not AVMs
– e.g. 'Birds fly' is a property of 'bird' as well as of 'fly'.
– 'Not P' conflicts with 'P' and overrides it.
'not P' in a network: 'quantity of P = 0'

17 Penguins
[Diagram: 'bird' is-a 'flier' with the property 'flying'; 'penguin' is-a 'bird' but sets the quantity of 'flying' to 0 (# 0), so a penguin token inherits the override and doesn't fly.]
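
As a concrete illustration (my sketch; the node names and quantities are invented), 'not P' as 'quantity of P = 0' can be encoded by letting a token inherit the quantity from its nearest ancestor:

    # Sketch of slide 17: 'not P' is stored as 'quantity of P = 0'.
    # A token inherits the value from its nearest ancestor, so
    # penguin's 0 overrides the flier default of 1.

    taxonomy = {"pingu": "penguin", "tweety": "bird",
                "penguin": "bird", "bird": "flier"}
    quantity = {
        "flier": {"flying": 1},    # default: fliers fly
        "penguin": {"flying": 0},  # override: quantity of flying = 0
    }

    def inherit(node, relation):
        """Walk up the is-a chain; the first value found wins."""
        while node is not None:
            if relation in quantity.get(node, {}):
                return quantity[node][relation]
            node = taxonomy.get(node)
        return None

    print(inherit("pingu", "flying"))   # 0: penguins don't fly
    print(inherit("tweety", "flying"))  # 1: the default survives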

18 b: Reliability
Monotonicity: inferences are
– cumulative
– reliable, i.e. not liable to later retraction.
But DI is non-monotonic
– because default properties may be overridden
– unless special ordering restrictions are imposed
– but this is odd in a declarative system.

19 People or persons?
[Diagram: by default, a plural noun's full form is the s-variant of its base, which would allow 'persons'; but PERSON, plural stores the full form {people}.]

20 Reliability in WG
Inheritance only applies to tokens
– i.e. at the foot of the taxonomy.
And it applies recursively, working upwards
– so the first value inherited is always the winner.
So DI is, in fact, monotonic!
And DI is always part of the process of node-building.

21 People, not persons
[Diagram: a token of PERSON, plural searches upwards and finds the stored full form {people} before the default s-variant of {person}, so 'people' wins and 'persons' is never derived.]
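
A rough sketch (mine; the type names are shorthand) of why bottom-up search makes DI monotonic here: the token reaches the stored {people} before the default rule can apply, so nothing is inferred and later retracted:

    # Sketch of slide 21. Rules for 'full' live on types; a token
    # searches upwards and the first rule found wins, so the conclusion
    # never needs retracting: DI behaves monotonically.

    taxonomy = {
        "token": "PERSON,plural",
        "PERSON,plural": "plural noun",
        "plural noun": "noun",
    }
    full = {
        "plural noun": lambda base: base + "s",  # default: s-variant of base
        "PERSON,plural": lambda base: "people",  # stored exception
    }

    def full_form(node: str, base: str) -> str:
        while node is not None:
            if node in full:             # nearest rule pre-empts the default
                return full[node](base)
            node = taxonomy.get(node)
        raise LookupError("no rule for 'full'")

    print(full_form("token", "person"))  # 'people', never 'persons'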

22 c: Certainty
Inheritable properties may be uncertain:
– Multiple inheritance allows two outcomes.
– A node's identity may be defined directly (by a 'filler') or indirectly (by reentrancy).
In conflicts, which property wins?
WG: let the network decide.

23 Multiple inheritance
No special provisions needed. If conflicts arise, so be it.
– e.g. *I amn't = BE, 1sg, negative
1sg: realization = {am}
negative: realization = {aren't}
– unresolved, because 1sg and negative are sisters.
Reentrancy (loops) is harder.

24 Is people the s-variant of PERSON?
[Diagram: the default s-variant (x) adds {z} to the base {person}; PERSON, plural also has the stored full form (y).]
Does the token inherit s-variant?
– If {people} isa x, it does – with unwelcome consequences!
– But if not, then y inherits s-variant, and all is well.

25 Certainty in WG
If: the inheritor for token x finds [X, R, Y] (= [X R Y] or [Y R X], where upper-case dominates lower-case)
And if:
– x already has [x, r, …]: then do nothing.
– x already has [x, …, y]: then create [x, r, y], [r isa R].
– otherwise: create [x, r, y], [r isa R], [y isa Y].

26 Certainty in a network
[Diagram restating slide 25: if [X, R, Y] and X dominates x, then
– already [x, r, …]: do nothing;
– already [x, …, y]: create [x, r, y], [r isa R];
– otherwise: create [x, r, y], [y isa Y], [r isa R].]
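
A hypothetical implementation sketch of this procedure (all identifiers are mine, and the triple representation is an assumption about how a WG network might be encoded):

    # Sketch of the procedure in slides 25-26. links: set of
    # (source, relation, target) triples; isa: set of (sub, super)
    # pairs. Upper-case arguments are types, lower-case tokens.

    def inherit(links: set, isa: set, x: str, R: str, Y: str) -> None:
        """Enrich token x from the type-level fact [X, R, Y]."""
        # Case 1: x already has [x, r, ...] with r isa R -- do nothing.
        if any(s == x and (r, R) in isa for (s, r, t) in links):
            return
        # Case 2: x already has [x, ..., y] with y isa Y --
        # just add the relation: create [x, r, y] and [r isa R].
        for (s, _, t) in links:
            if s == x and (t, Y) in isa:
                links.add((x, f"r@{x}", t)); isa.add((f"r@{x}", R))
                return
        # Otherwise create everything: [x, r, y], [r isa R], [y isa Y].
        links.add((x, f"r@{x}", f"y@{x}"))
        isa.add((f"r@{x}", R)); isa.add((f"y@{x}", Y))

    links, isa = set(), {("token1", "CAT")}
    inherit(links, isa, "token1", "realization", "morph")
    print(sorted(links))  # [('token1', 'r@token1', 'y@token1')]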

27 d: Relevance
The inheritance explosion:
– each new node created is a token, so it inherits too, creating further nodes
– every node in the taxonomy is an inheritance source
– every property can be inherited.
Inheritance is resource-intensive
– e.g. 'Does a canary have skin?' takes longer to answer than 'Does a canary sing?'
So we need a rationing mechanism.

28 Relevance in WG
Do we inherit irrelevant properties?
– e.g. spelling, etymology
Relevance in WG:
– only inherit active properties
– activation levels vary, and reflect past experience and present concerns.

29 Context-dependent DI
[Diagram: the lexeme SCHOOL has a meaning and an etymology (Greek 'skhole'); under total inheritance the token 'school' inherits both, under relevant inheritance only what is active.]
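
One way to picture this rationing (a sketch of mine; the activation numbers and threshold are invented) is to gate inheritance on activation:

    # Sketch of relevance-rationed DI (slides 28-29): a token only
    # inherits properties whose activation exceeds a threshold, and
    # activation reflects past experience and present concerns.

    properties = {  # property -> (value, current activation)
        "meaning":   ("educational institution", 0.9),
        "spelling":  ("school", 0.7),
        "etymology": ("Greek 'skhole'", 0.1),  # rarely relevant
    }

    def inherit_relevant(props: dict, threshold: float = 0.5) -> dict:
        """Copy onto the token only properties active enough to matter."""
        return {p: v for p, (v, act) in props.items() if act >= threshold}

    print(inherit_relevant(properties))  # etymology is skipped here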

30 e: Economy
Do we store inherited properties?
– Not in general.
– Mixture of full and empty entries.
Economy in WG: only tokens inherit
– so types aren't enriched
– but tokens may become types, i.e. may be 'learned'.

31 Fringe activity
Memory changes slowly; experience changes fast.

32 f: Sensibleness
Problem: how to avoid silly classification?
– e.g. a block of wood isa bird, but overrides all bird properties.
Solution: a theory of learning and use:
– we only build isa links where properties are shared:
in creating new category nodes, and
in creating new token nodes.

33 4: Conclusions
DI is part of cognition
– and needs a cognitive model
– it mustn't be limited to AVMs.
It's part of token-building
– so it's monotonic.
It's limited to active relations
– so it only inherits what's relevant.

34 Thank you
This slideshow can be downloaded from:
For more on Word Grammar, see: