Slide 1: The Chinese Room Argument, Part II
Joe Lau, Philosophy, HKU
Slide 2: The issues
- Certain computations are sufficient for cognition (computational sufficiency).
  - Objection: the Chinese room argument.
  - Evaluation: not valid.
- A more general argument: the argument from syntax and semantics.
Slide 3: The argument
- Computer programs are formal (syntactic).
- Human minds have mental contents (semantics).
- Syntax is neither constitutive of nor sufficient for semantics.
- Conclusion: programs are neither constitutive of nor sufficient for minds.
Slide 4: Initial comments
1. Programs are formal.
2. Minds have contents.
3. Formal syntax is not enough for contents.
Conclusion: 4. Programs are not enough for minds.
- Comment #1: The argument is valid; that is, if the premises are true, the conclusion must also be true. So we have to decide whether premises 1 to 3 are true or not.
- Comment #2: The second premise is obviously true. To have a mind, one must have mental states with content. Thoughts, beliefs, and desires all have content (intentionality, aboutness); this is Brentano’s “mark of the mental”.
- Comment #3: The Chinese room argument is supposed to provide independent support for premise #3.
Slide 5: First premise: “Programs are formal”
- True in the sense that:
  - Symbols are defined independently of meaning.
  - Computational operations are defined without reference to the meaning of symbols (see the toy sketch after this slide).
- False in the sense that:
  - Programs cannot / do not contain meaningful symbols.
  - The function of symbols is to encode content!
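To make the “true” reading concrete, here is a minimal Python sketch; the rule table, symbol names, and the process function are invented for illustration and are not from the lecture. The program rewrites uninterpreted tokens purely by table lookup, without ever referring to what, if anything, the tokens mean.

    # Hypothetical rule book, analogous to the Chinese room's instruction book:
    # it pairs uninterpreted tokens with other uninterpreted tokens.
    RULES = {
        ("S1", "S7"): ("S3",),
        ("S3",): ("S9", "S2"),
    }

    def process(symbols):
        # Rewrite a sequence of symbols by shape-matching alone;
        # the meaning of "S1", "S7", etc. (if any) plays no role.
        return RULES.get(tuple(symbols), tuple(symbols))

    print(process(["S1", "S7"]))  # ('S3',)

Whether such tokens also encode content, as the “false” reading stresses, is a further question about the system, not about the table-lookup mechanics shown here.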
Slide 6: Third premise: “Syntax is not sufficient for semantics”
- Question: do the symbols have meaning or not?
  - If so, then there is content / semantics.
  - The symbols in the Chinese room do have content.
  - Symbols in AI programs can have assigned content.
  - So programs with meaningful symbols might still be sufficient for minds.
Slide 7: What is “semantics” for Searle?
- “Having the symbols by themselves … is not sufficient for having the semantics. Merely manipulating symbols is not enough to guarantee knowledge of what they mean.”
- So having meaningful symbols in a system is not enough for mental content: the system must know what those symbols mean. But why?
Slide 8: Response
- Mental representations (symbols) are used to explain intentional mental states.
  - E.g. X believes that P = X has a mental representation M of type B with content P (see the sketch after this slide).
  - X is not required to “understand” M.
- Mental representations cannot play this explanatory role if they themselves have to be understood or interpreted.
  - Otherwise there would be an infinite regress.
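A schematic sketch of the analysis above; the class names and the believes helper are invented here, not taken from the lecture. Believing that P is modelled as having a stored representation of type “belief” whose content is P, and nothing in the model requires the agent to interpret that representation.

    from dataclasses import dataclass, field

    @dataclass
    class Representation:
        kind: str      # attitude type, e.g. "belief" or "desire"
        content: str   # the proposition P the state is about

    @dataclass
    class Agent:
        representations: list = field(default_factory=list)

        def believes(self, p: str) -> bool:
            # X believes that P  =  X has a representation of type "belief" with content P.
            # No further act of "understanding" the representation is built into the analysis.
            return any(r.kind == "belief" and r.content == p
                       for r in self.representations)

    x = Agent()
    x.representations.append(Representation("belief", "it is raining"))
    print(x.believes("it is raining"))  # True

The regress point is visible in this toy model: if believes itself had to consult some further interpreter of each Representation, that interpreter would need its own representations, and so on.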
Slide 9: Summary
- Searle thinks that the symbols in a system must be understood / interpreted by the system to generate meaning / understanding.
- This begs the question against the thesis of computational sufficiency:
  - On that thesis, understanding is having symbols that encode information in the right way.
  - The symbols do not require further understanding or interpretation.
Slide 10: Remaining issues
- Suppose formal operations on meaningful symbols can be sufficient for mental states.
- Q1: Where do the meanings of the symbols come from?
- Q2: Can formal operations be sufficient to give symbols meaning?
Slide 11: Where does meaning come from?
- The meaning of words (linguistic meaning) depends on conventions governing their use.
- Words are “voluntary signs” (John Locke).
Slide 12: A different theory is needed
- The theory of linguistic meaning does not apply to mental representations:
  - No conventions governing the use of mental representations.
  - Presumably we cannot change the meanings of mental representations arbitrarily through conventions.