Subjects, Predicates, and Systematicity
Paul M. Pietroski, University of Maryland, Dept. of Linguistics, Dept. of Philosophy




Origins of Propositional Thought
What are Thoughts? What is a Propositional (or non-Propositional) Thought? Origins? 35 minutes (Q&A included)?
Strategy:
Focus on two notions of "Sentential" Thoughts:
(1) thoughts that exhibit Subject-Predicate structure
(2) thoughts composed of Systematically Combinable concepts
Raise an "origin question" about each
Speculate about a role for the Human Language Faculty

Where does Subject-Predicate structure come from?
[Tree diagram: THOUGHT/SENTENCE branches into SUBJECT, (copula), and PREDICATE ("mortal"); the SUBJECT branches into QUANTIFIER ("every/some/no") and PREDICATE ("human")]
S → NP (aux) VP
but 'Subject' is not a notion of Generative Grammar; and neither is 'Sentence'; even NP-of-S was passé in the 1980s
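As a sketch of how a rewrite rule like S → NP (aux) VP generates sentences, here is a toy generative grammar in Python; the tiny lexicon and the helper `expand` are invented for illustration and are not part of the talk:

```python
import itertools

# Toy phrase-structure grammar for S -> NP (aux) VP.
# The optional "aux" is encoded as two productions for S.
# All lexical entries are invented for illustration.
RULES = {
    "S":   [["NP", "VP"], ["NP", "AUX", "VP"]],
    "NP":  [["every human"], ["Socrates"]],
    "AUX": [["will"]],
    "VP":  [["is mortal"], ["run"]],
}

def expand(symbol):
    """Yield every string the grammar derives from `symbol`."""
    if symbol not in RULES:  # terminal: a word or phrase with no rule
        yield symbol
        return
    for production in RULES[symbol]:
        # Combine every expansion of every daughter, left to right.
        for parts in itertools.product(*(expand(s) for s in production)):
            yield " ".join(parts)

sentences = sorted(set(expand("S")))
```

The grammar overgenerates odd combinations ("every human run"), which is the point: the rule applies to categories, not to particular words.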

Where does Subject-Predicate structure come from?
[Tree diagram: THOUGHT/SENTENCE branches into SUBJECT, (copula), and PREDICATE ("mortal"); the SUBJECT branches into QUANTIFIER ("every/some/no") and PREDICATE ("human")]
Darcy is easy to please. It is easy (for us) to please Darcy. It is raining. There will be few people at the beach.

‘Subject’ is not a notion of Generative Grammar But human languages seem to really like predicates – every father is mortal – some big brown cow which they chased was from Texas – every Chris we know was between a Smith and a Jones – every chase is an event in which something is chased – some event was a stabbing of Caesar that was done by Brutus in March compare: which [they chased ___ ] whonk [they _____ cows]

Where does Subject-Predicate structure come from?
[Tree diagram: THOUGHT/PREMISE branches into SUBJECT, (copula), and PREDICATE ("mortal"); the SUBJECT branches into QUANTIFIER ("every/some/no") and PREDICATE ("human")]
Thoughts (Gedanken) have Function-Argument structure. 'Subject' is not a notion of Serious Logic; and unary Functions are just special cases. Consider the Thought that Zero precedes every positive number.

∀x:PositiveNumber(x){Precedes(0, x)}
(x = 0) ∨ Precedes(0, x)
∃y:NumberOf[y, {z: ~(z = z)}]
∀F[F(x) → F(0)]
… Ancestral:Predecessor(0, x) …
The constituents of Fregean Thoughts are massively relational.

We introduce relational concepts like NumberOf[x, F]. But we can imagine thinkers who use such concepts to introduce Number(x) and Zero. A good language is not just for expressing concepts; it lets you use old concepts to introduce new ones, and thereby re-present contents in a new format.
Zero precedes every positive number ≡ ∀x:PositiveNumber(x){Precedes(Zero, x)} … … …
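The idea of introducing monadic concepts from relational ones can be written out in standard notation. The definitions below are a reconstruction in the Fregean style the earlier formulas suggest, not a quotation from the slides:

```latex
% Introducing monadic concepts from the relational NumberOf[x, F]
\mathrm{Number}(x) \;\equiv\; \exists F\,\mathrm{NumberOf}[x, F]
\qquad
\mathrm{Zero} \;\equiv\; \iota y\,\mathrm{NumberOf}[y, \{z : z \neq z\}]

% The re-presented content, now with a monadic restrictor:
\forall x{:}\,\mathrm{PositiveNumber}(x)\,\{\mathrm{Precedes}(\mathrm{Zero}, x)\}
```

On this picture the relational concept NumberOf does the original work, while the introduced monadic concepts Number and Zero give the same content a new, more combinable format.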

Origins of Propositional Thought The notion of a Subject seems to be rooted in pre-linguistic cognition, as opposed to Grammar or Logic (or Meaning) But human languages really like predicates, as Aristotle and the Medieval Logicians suggested, even if predicates are not essential to Thought/Logic A language can let you use old concepts to introduce new ones, and thereby re-present contents in a new format

Origins of Propositional Thought Minds house modules, whose characteristic representations exhibit distinctive formats. These “informationally encapsulated” cognitive systems provide inputs to at least one “central” system. Minds deploy concepts that are systematically combinable. But how did modular minds come to have such concepts?

Where do Systematically Combinable Concepts come from?
Minds house modules that employ proprietary vocabularies. Minds deploy concepts that are systematically combinable.
If you can think that … / you can also think that …:
Al is blue, and Bo is green / Al is green, and Bo is blue
Al saw Bo, and Cy heard Di / Di saw Al, and Bo heard Cy
A dog barked, and every cat ran / Every cat barked, and a dog ran
Someone fell, and Al saw Bo / Al fell, and Bo saw someone fall
Big new ideas emerge rarely / Colorless green ideas sleep furiously
But how did modular minds come to have concepts (or words) that are as combinable as words?
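The recombination pattern on this slide can be sketched computationally. The toy model below (the names and color predicates are the slide's examples; the code itself is an illustration, not the talk's formal system) builds the closure of a small concept set under rearrangement, so that having one pairing guarantees having its permutations:

```python
from itertools import permutations

# Systematicity sketch: a mind that can combine these concepts in one
# arrangement can combine the same concepts in every rearrangement.
names = ["Al", "Bo"]
predicates = ["is blue", "is green"]

def thoughts(names, predicates):
    """All thinkable conjunctions pairing distinct names with distinct predicates."""
    return {
        f"{n1} {p1}, and {n2} {p2}"
        for (n1, n2) in permutations(names, 2)
        for (p1, p2) in permutations(predicates, 2)
    }

T = thoughts(names, predicates)
```

Because `T` is generated from the parts rather than listed one by one, "Al is blue, and Bo is green" and "Al is green, and Bo is blue" come for free together, which is exactly the systematicity at issue.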

Putting the question crudely: if modules 21 and 28 employ their own vocabularies, how does any central system (say, system 31) support both productive combination (within the system) and interfaces with diverse modules?

How did there come to be a network of concepts that interface with disparate modules, and yet combine as freely as words?

How did there come to be lexical items that are linked (via concepts) to disparate modules, and yet combine so freely? How did there come to be a network of concepts that interface with disparate modules, and yet combine as freely as words?

LEXICALIZER
Maybe a Human Language lets a child use Prelexical Concepts (which may not be systematically combinable) to introduce Lexical Concepts, which exhibit a new format that supports systematic combinability.

LEXICALIZER No guarantee that any particular concept is lexicalizable. But many concepts can be lexicalized. (And on any plausible view, lexicalization is not merely a process of connecting concepts with pronunciations.)

Origins of Propositional Thought
[Tree diagram: SENTENCE branches into SUBJECT, (copula), and PREDICATE; the SUBJECT branches into QUANTIFIER and PREDICATE]
Maybe… Subject-Predicate is an ancient form of mental sentence. Our natural modes of relational thought are limited; to combine relational concepts systematically, Frege had to invent a language for this very purpose. But humans acquire words that let us build predicates and thereby use an ancient form of thought more systematically.

Subjects, Predicates, and Systematicity THANKS