1 Dependency structure and cognition Richard Hudson Depling2013, Prague

2 The question What is syntactic structure like? –Does it include dependencies between words (dependency structure)? –Or does it only contain part-whole links (phrase structure)? [Diagrams: a dependency analysis and a phrase-structure analysis of 'She looked after him']

3 Relevant evidence: familiarity University courses teach only one approach. School grammar sometimes offers one. –Usually dependency structure –even in the USA (Reed-Kellogg sentence-diagramming) –especially in Europe –and especially in the Czech Republic!

4 What Czech children do at school [A schoolbook dependency diagram of a Czech sentence; the word glosses are roughly 'blossomed out', 'kingcups', 'by stream', 'near', 'yellow'] (Jirka Hana & Barbora Hladká 2012)

5 or even …

6 Relevant evidence: convenience Dependency structure is popular in computational linguistics. Maybe because of its simplicity: –few nodes –little but orthographic words Good for lexical cooccurrence relations
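
To make the 'few nodes, orthographic words only' point concrete, here is a minimal sketch (not from the slides) of a dependency analysis stored as a flat word-per-row table, using the sentence from slide 2; the relation labels are illustrative, not any parser's official output.

```python
# A minimal, CoNLL-style sketch of a dependency analysis:
# one row per orthographic word, each pointing at its head.
# (Illustrative relation labels; not an official format.)

sentence = [
    # (index, form, head_index, relation); head 0 = sentence root
    (1, "She",    2, "subject"),
    (2, "looked", 0, "root"),
    (3, "after",  2, "complement"),
    (4, "him",    3, "object"),
]

# Lexical co-occurrence pairs fall straight out of the head links.
for idx, form, head, rel in sentence:
    if head:
        print(f"{form} --{rel}--> {sentence[head - 1][1]}")
```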

7 Relevant evidence: cognition Language competence is memory. Language processing is thinking. Memory and thinking are part of cognition. So what do we know about cognition? A. Very generally, cognition is not simple –so maybe syntactic structures aren't in fact simple?

8 B. Knowledge is a network [Network diagram: 'me' linked to Gaynor, Lucy, Peter, John, Gretta and Colin]

9 C. Links are classified relations [Diagram: 'is-a' links among person, woman, man and among the relations relative, parent, child, mother, father]

10 D. Nodes are richly related [Network diagram: me, Gaynor, Lucy, Peter, John, Gretta and Colin connected by labelled kinship relations (m, f, s, d, gf, b, w, h)]
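
As a rough illustration of slides 8-10 (not Hudson's own formalism), a network of this kind can be stored as labelled triples, with the links themselves carrying classified relation names:

```python
# A toy semantic network: nodes are concepts, links are classified relations.
# The relation labels are illustrative, not Word Grammar's actual notation.

network = set()          # triples: (node, relation, node)

def add(a, relation, b):
    network.add((a, relation, b))

add("me", "mother", "Gaynor")
add("me", "father", "Colin")
add("Lucy", "sister", "me")
add("mother", "is-a", "parent")
add("parent", "is-a", "relative")

# Because every link is classified, we can ask which relation types
# a given node participates in.
def relations_of(node):
    return {rel for a, rel, b in network if node in (a, b)}

print(relations_of("me"))     # {'mother', 'father', 'sister'}
```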

11 E. Is-a allows default inheritance Is-a forms taxonomies. – e.g. 'linguist is-a person', 'Dick is-a linguist' Properties 'inherit' down a taxonomy. But only 'by default' – exceptions are ok. – e.g. birds (normally) fly –but penguins don't.

12 Penguins [Is-a diagram: bird 'flies'; robin and robin* 'flies'; penguin and penguin* 'doesn't fly']
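
A minimal sketch of default inheritance as in slides 11-12, assuming a simple "climb the is-a chain and stop at the first answer" lookup; this is a toy illustration, not Word Grammar's actual inference procedure.

```python
# Toy default inheritance over an is-a taxonomy.
# A node inherits every property of its is-a ancestors,
# unless it (or a nearer ancestor) overrides that property.

isa = {"robin": "bird", "penguin": "bird"}
properties = {
    "bird":    {"flies": True},    # the default
    "penguin": {"flies": False},   # the exception wins
}

def inherit(node, prop):
    while node is not None:
        if prop in properties.get(node, {}):
            return properties[node][prop]
        node = isa.get(node)        # climb the taxonomy
    return None

print(inherit("robin", "flies"))    # True  (inherited from 'bird')
print(inherit("penguin", "flies"))  # False (overridden locally)
```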

13 Cognitivism 'Cognitivism' –'Language is an example of ordinary cognition' So all our general cognitive abilities are available for language –and we have no special language abilities. Cognitivism matters for linguistic theory.

14 Some consequences of cognitivism 1. Word-word dependencies are real. 2. 'Deep' and 'surface' properties combine. 3. Mutual dependency is ok. 4. Dependents create new word tokens. 5. Extra word tokens allow raising. 6. But lowering may be ok too.

15 1. Word-word dependencies are real Do word-word dependencies exist (in our minds)? –Why not? –Compare social relations between individuals. What about phrases? –Why not? –But maybe only their boundaries are relevant? –They're not classified, so no unary branching.

16 Punctuation marks boundaries At the end of the road, turn right. Not: –At the end of the, road turn right. –At the end, of the road turn right. –At the end of the road turn right, How do we learn to punctuate if we can't recognise boundaries?

17 No unary branching If S → NP + VP, then: [Phrase-structure tree: S over NP and VP, with N 'Cows' and V 'moo' below them] But if a verb's subject is a noun: [Dependency diagram: N 'Cows' linked directly to V 'moo', with no phrasal nodes]

18 2. 'Deep' and 'surface' properties combine. Dependencies are relational concepts. Concepts record bundles of properties that tend to coincide –e.g. 'bird': beak, flying, feathers, two legs, eggs –'mother': bearer, carer So one dependency has many properties: –semantic, syntactic, morphosyntactic –e.g. 'subject' ….

19 'subject' The typical subject is defined by –meaning: typically 'actor' or … –word order and/or case: typically before the verb and/or nominative –agreement: typically the verb agrees with it –status: obligatory or optional, according to finiteness

20 So … Cognition suggests that 'deep' and 'surface' properties should be combined –not separated. They are in harmony by default –but exceptionally they may be out of harmony –this is allowed by default inheritance
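
One way to picture slides 19-20 is as a single 'subject' bundle whose 'deep' and 'surface' properties hold by default and can be overridden individually; the field names and the imperative example below are my own illustration, not the talk's.

```python
# A sketch of 'subject' as one relation carrying both 'deep' (semantic)
# and 'surface' (word order, case, agreement) properties by default.
# Field names and the imperative case are illustrative assumptions.

default_subject = {
    "semantic_role": "actor",
    "position": "before verb",
    "case": "nominative",
    "agreement": "verb agrees with it",
    "status": "obligatory",
}

# Default inheritance lets an exception override just one property
# while the rest of the bundle stays in place.
imperative_subject = {**default_subject, "status": "optional"}

print(imperative_subject["semantic_role"])  # still 'actor' by default
print(imperative_subject["status"])         # 'optional' (the exception)
```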

21 3. Mutual dependency is ok. Mutual dependency is formally impossible in standard notation, and is formally impossible in phrase structure theory. So if it exists, we need to –resist PS theory –change the standard notation

22 Mutual dependency exists I wonder who came. Who is subject of came, –so who depends on came. But who depends on wonder, and came can be omitted: –e.g. Someone came – I wonder who. So came depends on who.

23 Standard notation [Diagram: A drawn above B, and B drawn above A] A 'dominates' B, so A is above B, so B cannot 'dominate' A.
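
If dependencies are stored simply as directed, labelled links, as sketched below, nothing formally rules out the two-way dependency of slide 22, unlike in dominance-based notation; the relation labels on the non-subject links are my guesses, not the talk's analysis.

```python
# Dependencies stored as directed edges: dependent --relation--> head.
# Unlike a tree, nothing in this representation forbids a cycle.

edges = {
    ("who",  "subject",    "came"),    # who is the subject of came
    ("came", "complement", "who"),     # and came depends on who (label is a guess)
    ("who",  "complement", "wonder"),  # who also depends on wonder (label is a guess)
}

heads = {}
for dep, rel, head in edges:
    heads.setdefault(dep, set()).add(head)

# Mutual dependency shows up as a two-node cycle:
print("came" in heads["who"] and "who" in heads["came"])   # True
```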

24 4. Dependents create new word tokens. General cognition: –every exemplar needs a mental node. –no node carries contradictory properties. –so some exemplars need two nodes. E.g. when we re-classify things. –NB we can remember both classifications

25 What kind of bird? [Diagram: 'bird', 'blackbird', an exemplar B and its re-classified token B*, with a 'mate' link marked '?']

26 And in language … [Diagram: 'word', 'LIKE-verb', 'like' and the token like*, with 'I' linked as subject] NB like* is a token of a token

27 The effect of a dependent When we recognise a dependent for W, we change W into a new token W*. The classification of W* may change. W* also has a new meaning –normally a hyponym of W –but may be idiomatic. If we add dependents singly, this gives a kind of phrase structure!

28 typical French house [Diagram: the lexeme HOUSE and tokens house, house* and house**, with dependents 'French' and 'typical' and the meanings 'house', 'French house' and 'typical French house']

29 Notation [Diagram: the same analysis in compact dependency notation, with house* and house** and the dependents 'French' and 'typical']
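
A hedged sketch of the token-creation idea in slides 27-29: each added dependent produces a new token that is-a the previous one and carries the enlarged meaning. The class and field names are my own illustration, not Word Grammar notation.

```python
# Adding dependents one at a time, each creating a new token of the head
# ('house' -> 'house*' -> 'house**').  Data layout is an assumption.

class Token:
    def __init__(self, form, isa=None, dependents=(), meaning=None):
        self.form = form
        self.isa = isa                    # the token/lexeme this is a token of
        self.dependents = list(dependents)
        self.meaning = meaning

def add_dependent(head, dependent, new_meaning):
    """Return a new token of `head` with one more dependent and a richer meaning."""
    return Token(head.form, isa=head,
                 dependents=head.dependents + [dependent],
                 meaning=new_meaning)

house  = Token("house", meaning="house")
house1 = add_dependent(house,  Token("French"),  "French house")          # house*
house2 = add_dependent(house1, Token("typical"), "typical French house")  # house**

print(house2.isa.isa is house)   # True: house** is-a house* is-a house
print(house2.meaning)            # 'typical French house'
```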

30 5. Extra word tokens allow raising. [Diagrams: 'it rains', with 'it' as subject of 'rains'; 'it keeps raining', with 'it' as subject of 'keeps', 'raining' as predicative, and the token it* as subject of 'raining']

31 Raising in the grammar [Diagram: A and its token A*, with a higher parent and a lower parent sharing the dependent] A* is-a A, so A* wins.
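
The raising pattern of slides 30-31 might be sketched as below, assuming the extra token simply lets one word serve both a higher and a lower parent; the relation labels follow the slide, but the data layout is my own.

```python
# 'It keeps raining': the same word serves two parents via an extra token.
# it  is the subject of keeps (the higher parent);
# it* (a token of it) is the subject of raining (the lower parent).

dependencies = [
    ("it",      "subject",     "keeps"),    # higher parent
    ("raining", "predicative", "keeps"),
    ("it*",     "subject",     "raining"),  # lower parent, shared dependent
]

tokens = {"it*": "it"}   # it* is-a it, so both nodes describe one word

for dep, rel, head in dependencies:
    word = tokens.get(dep, dep)
    print(f"{word} ({dep}) --{rel}--> {head}")
```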

32 6. But lowering may be ok too. Raising is helpful for processing –the higher parent is nearer to the sentence root. But sometimes lowering is helpful too –e.g. if it allows a new meaning-unit. Eine Concorde gelandet ist hier nie. (a Concorde landed has here never) 'A-Concorde-landing has never happened here.'

33 German Partial VP fronting [Dependency diagram of 'Eine Concorde gelandet ist hier nie': Eine Concorde and its token Eine Concorde*, with a higher parent, a lower parent, and the 'lowered' link]

34 Conclusions Language is just part of cognition. So syntactic dependencies are: –psychologically real –rich (combining 'deep' and 'surface' properties) –complex (e.g. mutual, multiple). And dependency combines with –default inheritance –multiple tokens