
Searle says that syntax is neither sufficient for nor constitutive of semantics, and that all a computer gets (e.g. from sensors) is syntax (tokenised shapes); therefore computers will never understand the world. Searle: "There is no way to get from syntax to semantics" (Minds, Brains and Science, p34); "digital computers insofar as they are computers have, by definition, a syntax alone" (ibid); "semantics is not intrinsic to syntax" (Mystery of Consciousness, p17).

The premiss that semantics is not intrinsic to syntax crystallizes, for me, the powerful appeal of the Chinese room argument. And all a computer gets, according to Searle, is syntax: in the Chinese room, the shapes of the Chinese ideograms that drop through the slot in the door, and the simple manipulations based solely on symbol shape.

Strangely, Searle never discusses relationships between symbols. In his 1990 Scientific American article he calls a basket of symbols a "database"; elsewhere he calls input symbols "bunches". But a database is symbols related to one another, and an input stream is a temporal sequence - symbols related by their adjacency in time as they enter the machine.

So sure, shape is syntax, but what about the relations between tokenised shapes? Computers get the relations as well as the shapes. Why does Searle never discuss these relations? If they were part of syntax, surely he would have said so, but he hasn't, so they must be separate from syntax. A token has a shape - its syntax - but the relations between tokens are a different matter.

Computers get the relations as well as the tokens related. A computer can react purely to the temporal contiguity of input symbols and ignore their syntax: for example, alternate input symbols can be stored in alternate locations, so that the shape - the syntax - of each symbol is irrelevant to where it goes.
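A minimal sketch of that point (the function name and the routing rule are hypothetical, just an illustration): a program that stores alternating input tokens in two separate stores based purely on arrival order, without ever inspecting a token's shape.

```python
# Hypothetical sketch: a machine that reacts only to the temporal order
# of input tokens, never to their shapes.
def store_alternately(input_stream):
    """Route tokens to two stores based purely on arrival order."""
    even_slots, odd_slots = [], []
    for position, token in enumerate(input_stream):
        # The token's shape (its value) is never examined; only its
        # position in time decides where it is stored.
        if position % 2 == 0:
            even_slots.append(token)
        else:
            odd_slots.append(token)
    return even_slots, odd_slots

# Two streams with entirely different "shapes" get the same routing.
print(store_alternately(["一", "二", "三", "四"]))  # (['一', '三'], ['二', '四'])
```

The point of the sketch is that the branching condition mentions only `position`, a relational fact about the input stream, never the symbol itself.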

So Searle's premiss that "digital computers insofar as they are computers have, by definition, a syntax alone" must be false, and hence the Chinese room argument is unsound. Is this a rebuttal of the CRA? Since there are both symbols and relations between them inside the machine, we can grant that syntax is neither sufficient for nor constitutive of semantics, yet point out that there is more than syntax inside the machine; syntax could be just one component. True, syntax alone can't yield semantics - semantics is not intrinsic to syntax - but semantics could be a compound of syntax plus relations, and digital computers might think.

Roddus
  • Shape is syntax? Here's a better definition: "[Syntax](https://stackoverflow.com/a/17931183/1634046) is about the structure or the grammar of the language." –  Dec 22 '17 at 21:52
  • Syntax is often taken to be the union of grammar and relational (inferential) structure of the language. One can take an inferentialist position on semantics, but Searle clearly rejects it, to him meaning must involve intentionality with "live" connection of symbols to reality. Searle's contention is that computers are in principle incapable of maintaining such a connection, his opponents dispute it. The ambiguity in the post is that one can take "relations" formally, as abstractions, or as "real", "active" relations that go beyond symbolism, imputing those to computers is problematic. – Conifold Dec 22 '17 at 22:10
  • In a relational database, an n-m relationship between two entities is simply another entity, another table. Syntax. It's the humans who put meaning on the symbols. – user4894 Dec 22 '17 at 23:05
  • In the final analysis, it doesn't make any difference anyway because humans supply both the semantics *and* the syntax. Even if there is some sort of connection from symbol to reality, only humans can make any sense of such things. Computers just flip bits without understanding anything at all. Imagine reading a book when you had no means to grasp any two letters of the same word at the same time. Likewise, every bit is flipped in epistemic isolation from everything else that's going on in the system, because computers have no means to consciously relate one thing to another. –  Dec 23 '17 at 01:45
  • @Pé de Leão. The term 'syntax' I think was initially borrowed from linguistics, but Searle has a pretty precise sense that does not relate to grammar or language structure. Syntax to him is the *form*al character of the symbol, and "...all that 'formal' means here is that I can identify the symbols entirely by their shapes." (Minds, Brains and Programs, BBS 3(3),1980, p417). More generally, syntax is the value of a property of a token (a certain shape being a value of the property of shape). But there's more to this, eg, different shapes in different fonts can be the same symbol. – Roddus Dec 23 '17 at 20:53
  • @user4894. OK. So relational database relations are syntax. The relation "...between two entities is simply another entity, another table", and the other table is syntax so the relation is syntax. But aren't tables the things related, not the relationship itself? Say 2 input symbols s1, s2, arrive at the machine one after the other and are stored in field 1 of two 2-field records: R1= |s1| | , R2 = |s2| |. Then the relation (temporal contiguity) is stored by writing the address of R2 in F2 of R1, yielding |s1|aR2|. The relation is stored as a pointer, separate from the input symbols.... – Roddus Dec 23 '17 at 21:14
  • @user4894 Cont... In the Chinese room, the pointer could be a piece of string - not a symbol at all. The string could be glued to the two cards inscribed with the symbols. The piece of string is a permanent record of the relationship in time between the symbol tokens as they arrived in the room one after the other. – Roddus Dec 23 '17 at 21:18
  • @Conifold. "to [Searle] meaning must involve intentionality with "live" connection of symbols to reality." This mire has to be solved by AI of course. To me the problems with live connection are big: (a) why is solipsism unrebutted? (b) the causation is from outside to inside, and the caused things don't intrinsically reference or indicate the nature of the cause, so how is a "live connection" (or any inner-to-outer connection) possible? The aboutness of intentionality could be completely internal: one set of inner representations (of symbols) referencing another set (of the world). – Roddus Dec 23 '17 at 21:54
  • @Conifold Cont... To the algorithms running around inside the representations (i.e., to consciousness), it seems like symbols reference the world. Popular linguistic theory says symbols (the external things) reference, denote, refer to, the world (other external things). But the actual connection is between inner representations of symbols and inner representations of the world. There is no connection between (outer) symbols and the (outer) world, there is no inner-to-outer connection - only a causal outer-to-inner sensory connection and inner-to-inner (neural) connection. (bit radical) – Roddus Dec 23 '17 at 22:08
  • @Roddus I don't understand this part: "If they were part of syntax, surely he would have said so, but he hasn't. So they must be separate from syntax." If Searle does not mention that relations are part of syntax perhaps he felt this point was obvious. Why "must" they be separate from syntax? – Frank Hubeny Dec 25 '17 at 18:51
  • @Frank Hubeny Searle's published about a dozen papers and books and given many interviews about his Chinese room, but he's never discussed the great amount of structure in the brain. The room is supposed to be a brain but has no structure in it at all. There's no relational object in the room's ontology. The symbols are in piles or baskets or bunches and can't be connected together. His really key point is that syntax = symbol shape. I wanted to counter the objection that Searle regarded syntax as including structure. If he thought it did he would have said so, but he didn't. – Roddus Dec 26 '17 at 05:22
  • @Roddus As I understand his "Minds, Brains, and Programs" he is trying to make the Chinese room like a program, not like a brain. He is doing this on purpose. He is a physicalist. He believes that intentionality is "a product of causal features of the brain"--not a program. He can't have strong AI because that implies a mind-body dualism. If that strong AI dualism were true one could dump one's mind into a computer as is done in science fiction (see the movie Chappie). He is saying that is impossible. – Frank Hubeny Dec 26 '17 at 15:40
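The record-and-pointer scheme described in the comments above (two 2-field records, with the address of R2 written into field 2 of R1) can be sketched as follows. The `Record` class and field names are hypothetical, a minimal illustration of the comment's idea rather than anything from Searle:

```python
# Hypothetical sketch of the comment's scheme: each record holds a symbol
# in field 1 and, in field 2, a pointer to the record for the symbol that
# arrived next. The relation is stored separately from the symbols' shapes.
class Record:
    def __init__(self, symbol):
        self.f1 = symbol   # the input symbol (its "syntax")
        self.f2 = None     # pointer recording temporal contiguity

def ingest(stream):
    """Link successive input symbols by arrival order."""
    records = [Record(s) for s in stream]
    for r1, r2 in zip(records, records[1:]):
        r1.f2 = r2         # the "piece of string" tying adjacent symbols
    return records

records = ingest(["s1", "s2"])
print(records[0].f1, "->", records[0].f2.f1)  # s1 -> s2
```

Here the pointer in `f2` plays the role of the piece of string glued between two cards: a permanent record of the temporal relation, distinct from either symbol's shape.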

0 Answers