The Symbol Grounding Problem


1 The Symbol Grounding Problem
Presenter: Ankur Garg - Good morning, everyone. I'll be talking about "The Symbol Grounding Problem," a paper originally written in support of cognitive theory, that is, the theory of the internal functioning of the brain that explains our observed behavior. But since we are supposedly living in the age of AI, which aims to build systems that mimic our brains, I felt this paper was quite an interesting read on how far we are from actual AI.

2 Motivation: The Chinese Room[1]
- The motivation for this paper stems from a famous thought experiment proposed by John Searle in the early 1980s, known as the Chinese Room experiment. First, let's see a glimpse of what it was. - As you saw in the video, this experiment countered the claims of GOFAI ("good old-fashioned AI"), which was popular at the time. I'll come back to this experiment later and show its relevance for today's AI systems. [1] Searle, John. "The Chinese Room." [2] Video Source:

3 Motivation: Chinese/Chinese Dictionary-Go-Round[1]
Endless looping between Chinese symbols No understanding of the symbols 马 → 骘 → 骑[2] - The author gives another example, termed the Chinese/Chinese Dictionary-Go-Round. Suppose you want to learn Chinese but only have access to a Chinese/Chinese dictionary. You would merely cycle from one symbol to another without ever learning what any symbol actually means in the real world. These two examples show that our learning of language, which is itself a set of symbols, cannot be based on symbols alone. [1] Harnad, Stevan. "The Symbol Grounding Problem." Physica D: Nonlinear Phenomena (1990). [2] Chinese synonyms fetched from Google Translate

4 Symbol Grounding Problem
"How can the semantic interpretation of a formal symbol system be made intrinsic to the system, rather than just parasitic on the meanings in our heads?"[1] Formally, the symbol grounding problem is defined through this statement in the paper. In other words, the symbols should carry their intended meaning within the system itself, by "connecting to the world," instead of relying on an external reader to interpret them. [1] Harnad, Stevan. "The Symbol Grounding Problem." Physica D: Nonlinear Phenomena (1990).

5 Why is it important? 马 → 骘 → 骑 Connects symbols to the world
The obvious benefit of solving this problem is that we can then ground the Chinese symbols from the earlier slide to actual horses, since they are all synonyms for "horse". [1] Chinese synonyms fetched from Google Translate [2] Image Credit (Creative Commons license):

6 Why is it important? Explain cognitive theory
How do we discriminate and identify? Showing cognition is more than just symbol manipulation The larger goal is to explain a cognitive theory of the human brain: what happens inside it that accounts for behaviors like discriminating between two objects and identifying the category an object belongs to. Once we see the symbol-system formalism, it will be clear that it is completely independent of the physical things its symbols refer to; the author attempts to bridge this gap through the symbol grounding problem.

7 Contributions Defining the Symbol Grounding Problem
Bottom-up solution to ground symbols in non-symbolic representations Defining the role of connectionism in symbol grounding These are the three main contributions of the work.

8 Background Modeling the mind Symbol System Connectionism
Now, let's take a step back and define some important terms mentioned in the previous slides. I'll be focusing on symbol systems and connectionism. Is there any other term you want me to disambiguate before moving on?

9 Symbol Systems Formal System: 8 properties Explicit Rules
Semantically Interpretable Composite A programming language is a symbol system

10 Symbol Systems Independent of Physical Realizations
Most successful in building AI systems at the time Better at formal, language-like tasks

11 Connectionist Systems
"Neural networks", "parallel distributed processing" Restricted here to their cognitive aspect: explaining observable behavior and causal interactions "Brainlike" is not a necessary criterion

12 Connectionist Systems
Better at sensory, motor, and learning tasks However, some higher-level tasks seem symbolic: logical reasoning, mathematics, chess-playing

13 Candidate Solution

14 Connecting to the World: Hybrid Approach
Symbol Systems → Symbolic Representations, composed from Elementary Symbols Connectionist Systems → Iconic Representations & Categorical Representations, which ground the Elementary Symbols

15 Iconic Representation
Projections of objects on sensory surfaces Helps in discriminating between two objects [1] Zebra Image Credit (Creative Commons license): [2] Horse Image Credit (Creative Commons license):

16 Categorical Representations
"Invariant features" of the sensory projection Helps in identifying a member of a category Example: different horses share invariant features [1] Horses Image Credit (Creative Commons license): [2] Features Image Credit (Creative Commons license):

17 Symbolic Representations
Compose grounded sets of elementary symbols Builds semantic interpretation Allows identification of unseen objects Horse & Stripes = Zebra [1] Zebra Image Credit (Creative Commons license): [2] Stripes Image Credit (Creative Commons license): [3] Horse Image Credit (Creative Commons license):
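The composition above can be sketched in a few lines of Python. This is a toy illustration, not the paper's mechanism: the symbol names, feature names, and helper functions are all invented, and the "grounding" of each elementary symbol is stubbed out as a set of detectable features that a sensory classifier is assumed to provide.

```python
# Grounded elementary symbols: each maps to the invariant features that a
# (stubbed) sensory classifier is assumed to detect.
grounded = {
    "horse": {"four_legs", "mane", "hooves"},
    "stripes": {"alternating_dark_light_bands"},
}

def define_composite(name, parts):
    """Ground a new symbol as the union of its parts' feature sets,
    so the composite inherits the grounding of its components."""
    grounded[name] = set().union(*(grounded[p] for p in parts))

# "Zebra = Horse & Stripes": a new symbol grounded without direct experience.
define_composite("zebra", ["horse", "stripes"])

def identify(observed):
    """Return the most specific grounded symbol whose features all appear
    in the observed sensory features (None if nothing matches)."""
    matches = [s for s, f in grounded.items() if f <= observed]
    return max(matches, key=lambda s: len(grounded[s]), default=None)

# A never-before-seen striped horse is identified as a zebra.
print(identify({"four_legs", "mane", "hooves", "alternating_dark_light_bands"}))
```

The point of the sketch is the last line: an object the system has no stored examples of is still identifiable, because the composite symbol's meaning bottoms out in grounded elementary symbols.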

18 Role of Connectionism Aid in establishing relations between symbols and icons Learn the "invariant features" for identification
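As a minimal sketch of this role, the toy perceptron below learns which input feature is invariant for a category from labelled sensory projections. The feature columns, data, and labels are all invented for illustration; the paper proposes connectionist learning of invariants in general, not this particular model.

```python
import numpy as np

# Columns: [has_stripes, body_size, leg_count]; label 1 = zebra, 0 = horse.
# Only the stripes feature actually separates the two categories.
X = np.array([[1, 0.9, 1.0],
              [1, 0.8, 1.0],
              [0, 0.9, 1.0],
              [0, 1.0, 1.0]], dtype=float)
y = np.array([1, 1, 0, 0])

w, b = np.zeros(3), 0.0
for _ in range(20):                       # classic perceptron updates
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += (yi - pred) * xi
        b += (yi - pred)

# After training, the weight on the stripes feature dominates:
# the network has "discovered" the invariant feature for the category.
print(w)
```

Here the learned weight vector plays the part of the categorical representation: it encodes which aspects of the sensory projection are invariant for category membership, and that is exactly the connection between icons and symbols the slide describes.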

19 Conclusion Bottom-up is the only viable route for grounding
A purely symbolic system cannot be grounded In the proposed system, symbol manipulation depends not only on the arbitrary shape of the symbol but also on the shape of the iconic/categorical representations of the grounded symbol The scheme is in the spirit of behaviorism

20 Critique

21 Chinese Room Axioms and Conclusions[1]
(A1) Programs are formal (syntactic). (A2) Minds have mental contents (semantics). (A3) Syntax by itself is neither constitutive of nor sufficient for semantics. (A4) Brains cause minds. [1] Source: [2] Searle, John. "The Chinese Room."

22 Importance of the Problem
Relevant to multiple disciplines: Cognitive Science, Philosophy, Computer Science, AI, Robotics Does AI understand symbols as we do? The problem still persists[1]: most approaches follow a brute-force grounding solution Even though it was intended only as a cognitive-modeling problem, it can be read from multiple perspectives, such as the consequences of an agent's actions [1] Lewis, Mike, et al. "Deal or no deal? End-to-end learning for negotiation dialogues." arXiv preprint arXiv: (2017).

23 Limitations of Solution
Missing details on generating symbolic representations No user study to verify the proposed cognitive theory

24 Research Directions 4,426 citations![2] Cognitive Theory
Psycholinguistics Artificial Intelligence Question Answering Interpretability from connectionism Robotics Teaching Language Self-driving cars - These are some of the research directions the paper has opened. [1] Image Credit: [2]

25 References Harnad, Stevan. "The Symbol Grounding Problem." Physica D: Nonlinear Phenomena (1990). The Chinese Room Experiment - The Hunt for AI - BBC. Cangelosi, Angelo. "Solutions and Open Challenges for the Symbol Grounding Problem." International Journal of Signs and Semiotic Systems (IJSSS) 1.1 (2011). Chinese Room Argument.

26 Discussion Clarifying questions about the content
Is the Chinese Room experiment still relevant for today's AI?

