Describing I-Junction Paul M. Pietroski University of Maryland Dept. of Linguistics, Dept. of Philosophy


Describing I-Junction Paul M. Pietroski University of Maryland Dept. of Linguistics, Dept. of Philosophy (extended version of the slides for this talk and a series of talks earlier this week)

A Large Project

Meanings are simple
    they compose systematically in ways that emerge naturally for humans
    they are perceived rapidly and automatically
    There are decent first-pass theories of meaning/understanding
    Meaning relies on rudimentary linking of unsaturated conceptual “slots”
Truth is complicated
    It depends on context in apparently diverse ways
    Making a claim truth-evaluable often requires work, especially if you
        want people to agree on which truth-evaluable claim got made
    There are paradoxes
    Truth requires fancy (Tarskian) variables

An Elementary Case Study

a phrase like ‘brown cow’ is somehow conjunctive
to a first approximation…
    ‘brown cow’ indicates a concept like BROWN(_) & COW(_)
    ‘brown cow’ applies to an individual thing x if and only if
        x is brown AND x is a cow
to a second approximation…
    ‘brown cow’ indicates COW(_) & BROWN-FOR-A-COW(_)
    ‘brown cow’ applies to an individual thing x if and only if
        x is a cow AND x is brown for a cow

An Elementary Case Study

a phrase like ‘brown cow’ is somehow conjunctive
to a first approximation…
    ‘brown cow’ indicates a concept like BROWN(_) & COW(_)
    ‘brown cow’ applies to an individual thing x if and only if
        x is brown AND x is a cow
to a second approximation…
    ‘big ant’ indicates ANT(_) & BIG-FOR-AN-ANT(_)
    ‘big ant’ applies to an individual thing x if and only if
        x is an ant AND x is big for an ant
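The contrast between the two approximations can be sketched in code. This is only an illustration: the toy domain, the entity names, and the sizes are invented, and "big for a K" is modeled, purely for concreteness, as having a size above the mean size of the Ks.

```python
# Toy domain; the entities and their sizes are invented for illustration.
animals = [
    {"name": "al", "kind": "ant", "size": 0.9},
    {"name": "bo", "kind": "ant", "size": 0.1},
    {"name": "cy", "kind": "cow", "size": 500.0},
]

def is_ant(x):
    return x["kind"] == "ant"

def big_for(kind, domain):
    # 'big for a K': modeled here, just for concreteness, as being
    # above the mean size of the Ks in the domain
    sizes = [x["size"] for x in domain if x["kind"] == kind]
    mean = sum(sizes) / len(sizes)
    return lambda x: x["size"] > mean

# second approximation: 'big ant' indicates ANT(_) & BIG-FOR-AN-ANT(_)
big_ant = lambda x: is_ant(x) and big_for("ant", animals)(x)

# al is big for an ant; cy is enormous, but not an ant
print([x["name"] for x in animals if big_ant(x)])  # ['al']
```

The point of the second approximation survives the toy modeling: ‘big ant’ is still a conjunction, but its second conjunct is relativized to the noun.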

An Elementary Case Study

a phrase like ‘brown cow’ is somehow conjunctive
to a first approximation…
    ‘brown cow’ indicates a concept like: BROWN(_) & COW(_)
    ‘Ernie speak yesterday’ (as in ‘I heard Ernie speak yesterday’)
        indicates a concept like: AGENT(_, ERNIE) & SPEAK(_) & YESTERDAY(_)
already many questions…
    what kind(s) of conjunction?
    how is such conjunction implemented in human psychology?

Kinds of Conjoiners

If P and P* are propositions (sentences with no free variables), then:
    &(P, P*) is true iff P is true and P* is true
If S and S* are sentential expressions (with zero or more free variables),
then for any sequence of domain entities σ:
    &(S, S*) is satisfied by σ iff S is satisfied by σ, and S* is satisfied by σ
If M and M* are monadic predicates, then for each entity x:
    ¹&(M, M*) applies to x iff M applies to x and M* applies to x
If D and D* are dyadic predicates, then for each ordered pair ⟨x, y⟩:
    ²&(D, D*) applies to ⟨x, y⟩ iff D applies to ⟨x, y⟩ and so does D*
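The typed conjoiners can be sketched as higher-order functions, with predicates modeled as Boolean-valued functions. This is an illustrative rendering, not the talk's formalism, and the toy extensions are invented.

```python
# &(P, P*): truth-functional conjunction of closed sentences (truth values)
def conj_prop(p, p_star):
    return p and p_star

# ¹&(M, M*): applies to an entity x iff both monadic predicates apply to x
def conj_monadic(m, m_star):
    return lambda x: m(x) and m_star(x)

# ²&(D, D*): applies to an ordered pair <x, y> iff both dyadic predicates do
def conj_dyadic(d, d_star):
    return lambda x, y: d(x, y) and d_star(x, y)

# toy extensions, invented for illustration
brown = lambda x: x in {"bessie"}
cow = lambda x: x in {"bessie", "clover"}

brown_cow = conj_monadic(brown, cow)
print(brown_cow("bessie"), brown_cow("clover"))  # True False
```

Note that each conjoiner is typed: ¹& only combines monadic predicates, ²& only dyadic ones, and neither offers any choice about which slots get identified.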

An Elementary Case Study

a phrase like ‘brown cow’ is somehow conjunctive
to a first approximation…
    ‘brown cow’ indicates a concept like: BROWN(_) & COW(_)
    ‘Ernie speak yesterday’ (as in ‘I heard Ernie speak yesterday’)
        indicates a concept like: AGENT(_, ERNIE) & SPEAK(_) & YESTERDAY(_)
already many questions…
    what kind(s) of conjunction are we appealing to here?
        if we don’t know, that’s bad
    if we’re appealing to Tarski’s conjunction, are we saying that humans
        use this kind of conjunction to understand ‘brown cow’?

‘I’ Before ‘E’

Frege: each Function determines a “Course of Values”
Church: function-in-intension vs. function-in-extension
    --a procedure that pairs inputs with outputs in a certain way
    --a set of ordered pairs (no instances of ⟨x, y⟩ and ⟨x, z⟩ where y ≠ z)
Chomsky: I-language vs. E-language
    --a procedure, implementable by child biology, that pairs phonological
      structures (PHONs) with semantic structures (SEMs)
    --a set of ⟨PHON, SEM⟩ pairs

I-Language/E-Language

function in Intension: an implementable procedure that pairs inputs with outputs
function in Extension: a set of input-output pairs

|x – 1|  and  +√(x² – 2x + 1)
    {…, (-2, 3), (-1, 2), (0, 1), (1, 0), (2, 1), …}

λx. |x – 1|  =  λx. +√(x² – 2x + 1) ?
    as procedures: λx. |x – 1|  ≠  λx. +√(x² – 2x + 1)
    but: Extension[λx. |x – 1|]  =  Extension[λx. +√(x² – 2x + 1)]
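The intension/extension contrast is easy to exhibit computationally: the two procedures below are written differently, and compute differently, yet determine the same input-output pairs, since √(x² – 2x + 1) = √((x – 1)²) = |x – 1|. The variable names are of course invented for illustration.

```python
import math

# two distinct procedures (functions-in-intension)...
f = lambda x: abs(x - 1)
g = lambda x: math.sqrt(x**2 - 2*x + 1)  # computes sqrt((x - 1)**2)

# ...with the same course of values on any inputs we sample
pairs_f = {(x, f(x)) for x in range(-2, 3)}
pairs_g = {(x, g(x)) for x in range(-2, 3)}
print(pairs_f == pairs_g)  # True: same extension, different procedures
```

Identifying a language with a set of pairs, as the E-language view does, erases exactly the difference between f and g above; the I-language view keeps it.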

Going Back To Church

given a procedure P1 that maps each α to a β,
and a procedure P2 that maps each β to an Ω,
there is a procedure P3 that maps each α to an Ω
    in this sense, procedures compose (and some can be compiled)
    but a mind might implement P1 via certain representations/operations,
    and implement P2 via different representations/operations,
    yet lack the capacity to use outputs of P1 as inputs to P2

if s1 and s2 are recursively specifiable sets, and s1 pairs each α with a β,
and s2 pairs each β with an Ω, then some recursively specifiable set s3
pairs each α with an Ω
    sets don’t compose: s3 is no more complex than s1 or s2
    but procedural descriptions of sets might compose
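The first point can be sketched directly; P1 and P2 here are placeholder procedures, invented for illustration.

```python
def compose(p2, p1):
    # P3 maps each α to an Ω by routing P1's β-output into P2
    return lambda a: p2(p1(a))

p1 = lambda a: a * 2      # maps each α to a β
p2 = lambda b: b + 1      # maps each β to an Ω
p3 = compose(p2, p1)
print(p3(10))  # 21
```

The existence of `compose` shows that procedures compose in the abstract sense; it does not show that a system which implements P1 and P2 separately can actually feed the outputs of one to the other, which is the psychological question.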

Going Back To Church

given a procedure P1 that maps each α to a β,
and a procedure P2 that maps each β to an Ω,
there is a procedure P3 that maps each α to an Ω

specifying a procedure in the lambda calculus (without cheating) tells us
that the outputs can be computed in the Church-Turing sense, given the
inputs (and any posited capacities/oracles)

this raises questions like those pressed by Marr (in the study of vision)
and Chomsky (in the study of language):
    what kind of algorithm is needed to compute the outputs from the inputs?

but MERELY specifying a procedure, by using familiar formal notation, tells
us nothing about how the procedure is represented/implemented by human
psychology

an I-language in Chomsky’s sense: the expression-generator generates
semantic instructions; and executing these instructions yields concepts
that can be used in thought → complex concepts that are available for use

e.g., BROWN(_) & COW(_)

But which concept of conjunction is invoked here?

The Bold Tarskian Ampersand

&(Fx, Gx) is satisfied by (a sequence) σ iff Fx is satisfied by σ, and Gx is satisfied by σ
&(Rxx’, Gx’) is satisfied by σ iff Rxx’ is satisfied by σ, and Gx’ is satisfied by σ
&(Fx, Gx’) is satisfied by σ iff Fx is satisfied by σ, and Gx’ is satisfied by σ
&(Rxx’, Gx’’) is satisfied by σ iff Rxx’ is satisfied by σ, and Gx’’ is satisfied by σ
&(Wxx’x’’, Rx’’’x’’’’) is satisfied by σ iff Wxx’x’’ is satisfied by σ, and Rx’’’x’’’’ is satisfied by σ

The adicity of &(S, S*) can exceed that of either conjunct
but think about ‘from under’, which does NOT have these readings:
    Fxx’ & Ux’’x’’’, Fxx’ & Ux’x, etc.

Frege-to-Tarski

Fregean Judgment: Unsaturated(saturated)
    Planet(Venus)
    Number(Two)
    Precedes(Two, Three)

First-Order Judgment-Frames:
    Unsaturated(_)
    Planet(_)
    Number(_)
    Precedes(_, Three); Precedes(Two, _); Precedes(_, _)

Frege-to-Tarski

Tarskian Variables (first-order): x, x', x'', …
Tarskian Sentences: Planet(x), Planet(x'), …
    Precedes(x, x'), Precedes(x', x), Precedes(x, x), …
    any variable can “fill” any slot of a first-order Judgment-Frame

Sentences (open or closed) are satisfied by sequences:
    σ satisfies Number(x'') iff σ(x'') is a number
    σ satisfies Precedes(x'', x''') iff σ(x'') precedes σ(x''')
    σ satisfies Precedes(x'', x''') & Number(x'') iff
        σ satisfies Precedes(x'', x''') and σ satisfies Number(x'')
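The Tarskian apparatus can be sketched over a finite toy domain: model a sequence σ as an assignment of entities to variable names, and an open sentence as a function from assignments to truth values. All names and the toy domain below are invented for illustration.

```python
numbers = {2, 3, 4}  # toy domain of "numbers"

def Number(v):
    # σ satisfies Number(v) iff σ(v) is a number
    return lambda sigma: sigma[v] in numbers

def Precedes(v, w):
    # σ satisfies Precedes(v, w) iff σ(v) precedes σ(w)
    return lambda sigma: sigma[v] < sigma[w]

def Conj(s, s_star):
    # Tarskian &: &(S, S*) is satisfied by σ iff both conjuncts are
    return lambda sigma: s(sigma) and s_star(sigma)

sigma = {"x''": 2, "x'''": 3}
sentence = Conj(Precedes("x''", "x'''"), Number("x''"))
print(sentence(sigma))  # True
```

Because any variable name can appear in any slot, this conjoiner happily builds Conj(Precedes("x", "x'"), Number("x''")), with a conjunct whose variable is disjoint from the other's; that freedom is exactly what the talk goes on to question.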

The Bold Tarskian Ampersand

&(Fx, Gx) is satisfied by σ iff Fx is satisfied by σ, and Gx is satisfied by σ
&(Rxx’, Gx’) is satisfied by σ iff Rxx’ is satisfied by σ, and Gx’ is satisfied by σ
&(Fx, Gx’) is satisfied by σ iff Fx is satisfied by σ, and Gx’ is satisfied by σ
&(Rxx’, Gx’’) is satisfied by σ iff Rxx’ is satisfied by σ, and Gx’’ is satisfied by σ
&(Wxx’x’’, Rx’’’x’’’’) is satisfied by σ iff Wxx’x’’ is satisfied by σ, and Rx’’’x’’’’ is satisfied by σ

The adicity of &(S, S*) can exceed that of either conjunct
Do humans naturally employ any such conjoiner?

Kinds of Conjoiners (now using y instead of x’)

Note the difference between ²&(D, D*) and &(Pxy, Qxy)
    no need for variables in the former, and hence no analogs of:
    &(Pxy, Qyx); &(Pyx, Qxy); &(Pxx, Qxx); &(Pxx, Qxy); …; &(Pyy, Qyy)

We could stipulate that ²+(D, D*) applies to ⟨x, y⟩ iff D applies to ⟨x, y⟩
and D* applies to ⟨y, x⟩. But this still leaves no freedom with regard to
variable positions.

There is a big difference between
    (1) a mind that can fill any unsaturated slot with any variable, and
    (2) a mind that has “unsaturated” concepts like D(_, _) but cannot fill
        the slots with variables and create open sentences

One More Conjoiner

If D is a dyadic predicate, and M is a monadic predicate, then for each entity x:
    ^(D, M) applies to x iff for some entity y, D applies to ⟨x, y⟩ and M applies to y

    ^(D, M)  ≡  [D(_, _)^M(_)], with D’s second slot linked to M’s slot

Note the difference between ^(D, M) and &(Pxy, Qy)
    in the former, the “slots” are not independent; no analogs of &(Pxy, Qz)
We could define other “mixed” conjunctions. But ^(D, M) is a simple one:
its monadic conjunct is closed, leaving another monadic predicate.
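Over a finite domain, the ^ conjoiner can be sketched directly: the dyadic predicate's second slot is existentially closed against M, leaving a monadic predicate of x. The toy entities below are invented for illustration.

```python
def caret(d, m, domain):
    # ^(D, M) applies to x iff for some y in the domain,
    # D applies to <x, y> and M applies to y
    return lambda x: any(d(x, y) and m(y) for y in domain)

# toy extensions: Bessie eats a particular bit of grass
domain = {"bessie", "grass1", "rock"}
eat = lambda x, y: (x, y) == ("bessie", "grass1")
grass = lambda y: y == "grass1"

eat_grass = caret(eat, grass, domain)   # a monadic predicate of x
print(eat_grass("bessie"), eat_grass("rock"))  # True False
```

Unlike the Tarskian ampersand, `caret` offers no choice of variable positions: which slot of D links to M is fixed by the conjoiner itself.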

A Versatile (but simple) Conjoiner

If D is a dyadic predicate, and M is a monadic predicate, then for each entity x:
    ^(D, M) applies to x iff for some entity y, D applies to ⟨x, y⟩ and M applies to y

    ^(D, M)  ≡  [D(_, _)^M(_)], with D’s second slot linked to M’s slot

a separate talk would show that, given plausible lexical meanings and a
limited form of abstraction that is required on any view, this can handle:
    ‘quickly eat (sm) grass’
    ‘saw cows eat grass’
    ‘think I saw most of the cows that ate every bit of grass in the field’

General Point

phrases indicate complex concepts
already many questions…
    what kinds of complexity?
    how is this complexity implemented in human psychology?

Summary: Elementary Case Study

a phrase like ‘brown cow’ is somehow conjunctive
to a first approximation…
    ‘brown cow’ indicates a concept like BROWN(_) & COW(_)
    ‘brown cow’ applies to an individual thing x if and only if
        x is brown AND x is a cow
already many questions…
    what kind of conjunction?
        Monadic: ¹&(M, M*) applies to x iff M applies to x and M* applies to x
        Tarskian: &(S, S*) is satisfied by σ iff S is satisfied by σ, and S* is satisfied by σ
        Another Suggestion: ^(Dyadic, Monadic)
    how is the conjunction implemented in human psychology?