
1 Extraction: dependency and word order
Dick Hudson, Berlin, October 2018

2 Plan
- Conceptual networks and sequential order
- The taxonomy of relational concepts
- Default inheritance
- Dependencies and word order
- Rich dependencies
- The ‘extractee’ dependency
- A new middle way in syntactic theory?

3 1. Conceptual networks and sequential order
We start with a non-linguistic network. It includes:
- entities (= rectangles)
- relations (= ellipses and arrows)
- ‘isa’ relations (= triangles), defining:
  - a taxonomy of entities
  - a taxonomy of relations
- an example of agreement, so agreement in syntax doesn’t require special mental apparatus.
[Diagram: Dick, a man, has the surname Hudson and the son John; John, also a man, has the father Dick and, by agreement, the surname Hudson.]
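A minimal Python sketch of such a network (the class and relation names are illustrative, not Word Grammar’s formal notation):

# A toy conceptual network: nodes plus labelled relations and 'isa' links.
# All class and relation names here are illustrative assumptions.

class Node:
    def __init__(self, name, isa=None):
        self.name = name
        self.isa = isa            # parent in the taxonomy, or None
        self.relations = {}       # relation label -> target Node

    def set(self, label, target):
        self.relations[label] = target

man = Node('man')
dick = Node('Dick', isa=man)
john = Node('John', isa=man)
hudson = Node('Hudson')

dick.set('son', john)
john.set('father', dick)
dick.set('surname', hudson)
john.set('surname', hudson)      # agreement: two paths reach one value

# 'Agreement' is just identity of values reached by different paths:
assert john.relations['surname'] is john.relations['father'].relations['surname']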

4 So what?
This kind of mental structure appears to be real, so let’s assume that we use it for language too. If we can model language in the same way, we’ve succeeded; so for every linguistic construct we have to provide an analogue outside language (e.g. ‘agreement’ isn’t confined to language). The network:
- includes conceptual nodes of two types: entities and relations
- allows default inheritance via the ‘isa’ taxonomies.

5 Sequential order
A network has no left-right dimension. Most syntactic theories build on the written tradition, where order comes for free, but a network model needs an alternative. Sequential ordering is a relationship in time or space:
- between an entity and a landmark (Langacker)
- it’s a property of the entity
- which we may not remember.
[Diagram: the same A-B network drawn with different left-right arrangements, plus the ‘landmark’ relation.]

6 So what?
Sequential ordering is a relationship between two entities. So it’s a property of each entity, just like its other properties. It can be factored into two relations:
- a landmark
- a position defined in relation to the landmark.
So this mental apparatus is available for defining word order in language.
[Diagram: Dick’s meeting has landmark Berlin and position ‘in’; his breakfast has landmark ‘meeting’ and position ‘before’.]
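Read computationally, ‘landmark’ and ‘position’ are just two more labelled relations on an entity. A hedged sketch, with the entities from the diagram assumed:

# Sequential order factored into two ordinary relations: 'landmark'
# and 'position'. The entity names are illustrative.

meeting = {'landmark': 'Berlin', 'position': 'in'}
breakfast = {'landmark': 'meeting', 'position': 'before'}

# Order is a property of each entity, not a global left-right axis:
for name, entity in [('meeting', meeting), ('breakfast', breakfast)]:
    print(f"{name}: {entity['position']} {entity['landmark']}")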

7 2. The taxonomy of relational concepts
A taxonomy is different from a partonomy:
- [finger : hand] is a partonomy: a finger is not a hand, but is part of one.
- [little finger : finger] is a taxonomy: a little finger is a finger, but is not part of one.
A taxonomy allows default inheritance, but a partonomy doesn’t. Like entity concepts, relational concepts form a taxonomy:
- [son : child : relative]
- [to-the-left-of : beside : near]
They can be created as needed, and learned. The set of relations is open-ended, so we can create them freely. They are the ‘attributes’ that link entities in the network.
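A small sketch of why the difference matters computationally: property lookup follows ‘isa’ links but not part-of links (the dictionaries below are illustrative):

# Taxonomy ('isa') vs partonomy ('part-of'): only the isa chain is
# followed when looking up inherited properties. Names are illustrative.

isa = {'little finger': 'finger', 'son': 'child', 'child': 'relative'}
part_of = {'finger': 'hand'}

def ancestors(concept):
    """Walk the isa chain; part_of links are deliberately ignored."""
    chain = []
    while concept in isa:
        concept = isa[concept]
        chain.append(concept)
    return chain

print(ancestors('little finger'))  # ['finger'] - a little finger IS a finger
print(ancestors('son'))            # ['child', 'relative']
# 'hand' never appears: being part of a hand licenses no inheritance.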

8 So what?
We’re good at creating and recognising:
- relations between entities
- including taxonomies
- including relation taxonomies.
This ability is available for language as well.

9 3. Default inheritance
In Word Grammar, default inheritance is monotonic because it’s part of the process of creating new token nodes. It applies to every taxonomy; it defines the ‘isa’ relation. If A isa B, then A inherits every relation and value of B except those which already have a value for A.
[Diagram: bird has a default value for ‘flying’, which sparrow inherits but penguin overrides.]
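The inheritance step lends itself to a short sketch: climb the ‘isa’ chain and stop at the first node that supplies its own value. This is only an illustration of the idea, not Word Grammar’s actual mechanism:

# Default inheritance as a lookup: A inherits every property of its
# isa-ancestors except those A already specifies.

isa = {'sparrow': 'bird', 'penguin': 'bird'}
props = {
    'bird':    {'flying': True},
    'penguin': {'flying': False},   # local value overrides the default
    'sparrow': {},
}

def inherit(concept, attribute):
    while concept is not None:
        if attribute in props.get(concept, {}):
            return props[concept][attribute]
        concept = isa.get(concept)   # climb the taxonomy
    return None

print(inherit('sparrow', 'flying'))  # True  (default from bird)
print(inherit('penguin', 'flying'))  # False (the exception wins)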

10 So what?
Default inheritance and taxonomies formalise:
- prototypes versus exceptions
- typical versus atypical
- unmarked versus marked
- basic versus derived.
These notions are central to extraction and other syntactic processes.

11 4. Dependencies and word order
If language is a network, then syntactic structure must be a network. So we can:
- recognise relations between words (contrary to phrase structure, which denies word-word relations)
- recognise a taxonomy of grammatical functions
- distinguish word order from dependencies.
[Diagram: the dependency between big and book, and a taxonomy of grammatical functions: subject and complement isa valent; valent and adjunct isa dependent.]

12 Dependencies and default word order
- By default, every word depends on one other word.
- Every word has a position.
- By default, a word’s landmark is the word it depends on.
The notation can show the position and relations as a stemma, but with <, > for ‘before’, ‘after’.
[Diagram: I like red wine as a stemma, with s, o and a dependencies and <, > marking each word’s position relative to its landmark (lm).]
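A hedged encoding of the slide’s example, with the head doubling as the default landmark (field names are assumptions):

# Each word token records the word it depends on ('head'), its
# grammatical function, and its position relative to its landmark,
# which by default is simply its head.

words = {
    'I':    {'head': 'like', 'fn': 's', 'position': '<'},   # subject, before
    'wine': {'head': 'like', 'fn': 'o', 'position': '>'},   # object, after
    'red':  {'head': 'wine', 'fn': 'a', 'position': '<'},   # adjunct, before
    'like': {'head': None,   'fn': None, 'position': None}, # the root
}

for w, props in words.items():
    if props['head'] is not None:
        # Default: the landmark is the head.
        print(w, props['position'], props['head'])  # e.g. "I < like"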

13 Phrasal integrity
Why not *I red like wine?
- By default, a word stays as near to its landmark as possible: ‘one thing at a time’.
- ‘Projectivity’: no crossing lines.
So phrasal integrity is guaranteed without phrases.
[Diagram: *I red like wine, where the dependency lines cross.]

14 Phrase boundaries
But phrase boundaries are important in some languages, e.g. Welsh, where llawn > lawn by soft mutation at the start of a phrase:
  Dw i [lawn mor grac â chi].
  be.PRES.1S I full as angry as you
  ‘I’m just as angry as you.’
We can freely invent a relation ‘start’.
[Diagram: Dw i lawn mor grac â chi, with s, a and c dependencies and a ‘start’ relation picking out the first word of the bracketed phrase.]

15 Dependencies and exceptional word order
- By default, a word’s landmark is the word it depends on.
- By default, in English a word’s position is after its landmark: most dependents follow the head.
- But exceptions are allowed:
  - subjects are before their landmark
  - subjects of inverting auxiliaries are after the landmark.
[Diagram: the position defaults as a taxonomy: dependent >, subject <, subject of inverting auxiliary >.]
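These three levels form exactly the default-and-override pattern of section 3, so a sketch can reuse the same lookup (the labels come from the slide; the encoding is an assumption):

# Word-order defaults and exceptions as a small taxonomy: 'dependent'
# follows its landmark by default, 'subject' overrides this, and the
# subject of an inverting auxiliary overrides it back.

isa = {'subject': 'dependent', 'object': 'dependent',
       'inv-aux-subject': 'subject'}
position = {'dependent': '>', 'subject': '<', 'inv-aux-subject': '>'}

def inherited_position(fn):
    while fn not in position:
        fn = isa[fn]          # climb until some function supplies a value
    return position[fn]

print(inherited_position('object'))           # > : inherited default
print(inherited_position('subject'))          # < : overrides the default
print(inherited_position('inv-aux-subject'))  # > : overrides the override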

16 So what?
Dependencies predict word-order relations. But since a word’s position is one of its properties, it can be overridden by default inheritance just like any other property. Word-order relations are controlled by general principles of cognition:
- ‘one thing at a time’ bans discontinuous phrases
- phrase boundaries are available when needed.

17 5. Rich dependencies
In ‘plain vanilla’ dependency structure, a word can’t depend on more than one other word. But outside language we can recognise multiple connections, e.g. ‘descendant’ applies recursively. So why not in language? E.g. subject-raising: It has been raining. Similar to HPSG re-entrance.
[Diagram: ‘descendant’ links John recursively to Dick and William; It has been raining with three s (subject) links from it.]

18 Rich dependencies and word order
How do we know which verb is the landmark of it? Why not *Has been it raining? Because of default inheritance: the link to has overrides its links to been and raining. But how does this work?
[Diagram: It has been raining, with s links from it to has, been and raining.]

19 Multiple word tokens
The word it is actually three different word-tokens:
- it1, subject of has; landmark = has; isa it2, so its landmark overrides that of it2
- it2, subject of been; landmark = been; isa it3
- it3, subject of raining; landmark = raining.
[Diagram: It1 has been raining, with It1 isa It2 isa It3 and s links to has, been and raining.]
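A sketch of the token chain, in which the most specific token’s own landmark pre-empts anything it would inherit (the token names follow the slide; the encoding is assumed):

# Multiple word tokens for "It has been raining": it1 isa it2 isa it3,
# each the subject of a different verb.

isa = {'it1': 'it2', 'it2': 'it3', 'it3': None}
own_landmark = {'it1': 'has', 'it2': 'been', 'it3': 'raining'}

def landmark(token):
    # Default inheritance: use the token's own value if it has one,
    # otherwise climb the isa chain.
    while token is not None:
        if token in own_landmark:
            return own_landmark[token]
        token = isa[token]
    return None

# The pronounced token is it1, so 'has' fixes the word order:
print(landmark('it1'))  # has -> "It has been raining", not *"Has been it raining"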

20 A non-linguistic analogue for multiple tokens
Imagine yourself in the countryside. You see something in the sky; call it ‘x’ at first:
- x1 isa plane
- then x2 isa bird
- then x3 isa pair of bats.
They’re all tokens of the same exemplar. x1, x2 and x3 are distinct, and they all exist as memories, but x3 is correct, so x3 isa x2 isa x1.

21 So what?
A word may have multiple ‘heads’ (words on which it depends). When the properties predictable from these dependencies conflict, the conflict is resolved by default inheritance, normally in favour of the highest head, giving ‘raising’. To explain this conflict we need to recognise multiple word tokens, which separate the different bundles of properties.
NB multiple word tokens are needed for other reasons too, e.g. to explain the meaning of typical French house:
- house1 means ‘house’ and has no dependent
- house2 means ‘French house’ and is modified by French
- house3 means ‘typical French house’ and is modified by typical.

22 6. The ‘extractee’ dependency
Extraction ‘moves’ a postdependent into predependent position, e.g. You are here > Here you are. This is explained by:
- all the default dependencies
- an extra dependency, ‘extractee’ (‘x’)
- applied to multiple word tokens:
  - here1 = extractee, before are
  - here2 = predicative, after are
  - here1 isa here2.
[Diagram: you are here with s and p dependencies; here1 you are with here1 as extractee (x) before are, where here1 isa here2 (p).]
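A sketch of the two-token analysis, in which here1 overrides the position it would otherwise inherit from here2 (field names are assumptions):

# Extraction via two tokens of one word: here2 keeps the default
# predicative dependency (after its landmark), while here1, which isa
# here2, adds the 'extractee' dependency and sits before the verb.

tokens = {
    'here1': {'isa': 'here2', 'fn': 'x', 'position': '<'},  # extractee
    'here2': {'isa': None,    'fn': 'p', 'position': '>'},  # predicative
}

def surface(token):
    # The pronounced token's own position overrides any inherited one.
    t = tokens[token]
    order = 'before' if t['position'] == '<' else 'after'
    return f"{token} ({t['fn']}) comes {order} its landmark 'are'"

print(surface('here2'))  # default: "You are here."
print(surface('here1'))  # extracted: "Here you are."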

23 A non-linguistic analogue for extraction
Basic: fill a page with pictures.
1. Select a picture.
2. Select an empty space.
   2.1 Stick the picture in the space.
3. Repeat #2.1, or stop.
‘Extraction’: re-insert a photo in a page.
1. Select the correct location.
2. Stick the picture in that location.

24 A non-linguistic analogue for long-distance extraction
Aim: to stick a picture back into an album.
1. Try the first page.
2. Look for the correct location.
3. If success, stick the picture in.
4. Otherwise, turn the page and repeat #2.

25 Long-distance extraction
E.g. What do you think he did? The extractee is passed recursively down the dependency chain:
- what1 = extractee (x) of do
- what2 = extractee of think
- what3 = extractee and object (x, o) of did.
[Diagram: what1 do you think he did, with s and o dependencies and the extractee relation passed from do through think to did.]
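The recursion lends itself to a short sketch: each verb either uses the extractee as its own valent or hands a new token of it down the chain (the lexical tables are assumptions, not WG notation):

# Long-distance extraction: the extractee is handed down the chain of
# verbs until one can use it as its own valent (the object of 'did').

next_verb = {'do': 'think', 'think': 'did', 'did': None}
needs_object = {'do': False, 'think': False, 'did': True}

def place_extractee(verb, extractee, token_index=1):
    token = f"{extractee}{token_index}"
    if needs_object[verb]:
        return f"{token} = extractee and object of {verb}"
    # Pass a new token of the extractee down to the next verb.
    print(f"{token} = extractee of {verb}")
    return place_extractee(next_verb[verb], extractee, token_index + 1)

print(place_extractee('do', 'what'))
# what1 = extractee of do
# what2 = extractee of think
# what3 = extractee and object of did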

26 8. A new middle way in syntactic theory?
- Between phrase structure and dependency structure
- Between movement and surface constraints
- Between competence and performance
[Diagram: contrasting analyses of big book (a phrase node BOOK versus word tokens book1, book2) and of You are here / Here you are (movement versus tokens here1, here2).]

27 Thank you. This slideshow can be downloaded from dickhudson.com/talks

