Syntax by itself is not constitutive of semantics nor by itself sufficient to guarantee the presence of semantics.
Is syntax not sufficient to guarantee the presence of semantics, or does syntax alone mean that semantics can never be present? "Not guaranteed" is very different from "definitely not present". You are saying that computers can never have semantics, but here your phrasing says only that they can't guarantee semantics.
In order to have language you have to specify the syntax and the semantics for that language. "blorg," "et," and "dring" are words in a certain language. Here is a valid sentence: "Blorg et." Now tell me what it means. You *can't* know, you can *never* know, what it means unless and until its meaning is assigned.
You haven't shown that "knowing" is not a syntactic process, nor that meaning can't be assigned through a syntactic process. I think the main problem here is that unless we have a very well-defined understanding of consciousness, and of how "knowing" relates to consciousness, we are just moving the problem into the gaps.
I can describe the process of "knowing" syntactically without too much trouble:
- Take a set of inputs and assign them a random symbol (anything, e.g. 0xFFA8B).
- Assign a random output to associate with the symbol.
- Execute the output each time the symbol is present, and adjust the output based on feedback.
This is a perfectly valid description based on the idea that we don't need to specify the semantics: we can instead agree on semantics through a syntactically defined trial-and-error process. We can see this in babies. They don't automatically understand language or know the semantics that we as adults take for granted; they come to it through a process of trial and error with random noises.
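To make the point concrete, the three steps above can be sketched in a few lines of Python. This is a minimal illustration, not a claim about how brains work: the class name, the symbol format, and the boolean feedback interface are all my own illustrative choices.

```python
import random

class SyntacticLearner:
    """Associates inputs with symbols and outputs by pure symbol
    manipulation plus feedback -- no semantics specified up front."""

    def __init__(self, outputs):
        self.outputs = outputs      # possible responses, e.g. random noises
        self.symbol_for = {}        # input pattern -> random symbol
        self.output_for = {}        # symbol -> currently associated output

    def respond(self, pattern):
        # Step 1: assign a random symbol to any unseen input.
        if pattern not in self.symbol_for:
            self.symbol_for[pattern] = f"0x{random.getrandbits(24):06X}"
        symbol = self.symbol_for[pattern]
        # Step 2: assign a random output to associate with the symbol.
        if symbol not in self.output_for:
            self.output_for[symbol] = random.choice(self.outputs)
        # Step 3 (first half): execute the output whenever the symbol appears.
        return symbol, self.output_for[symbol]

    def feedback(self, pattern, success):
        # Step 3 (second half): adjust the output based on feedback.
        if not success:
            symbol = self.symbol_for[pattern]
            self.output_for[symbol] = random.choice(self.outputs)
```

Run against an environment that simply reports success or failure, the learner's output for "Blorg et" converges on whatever the environment rewards, without any meaning ever being specified; the "semantics" emerges from the agreement between the learner's outputs and the feedback.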
Now, if I were to say that this process "knows" the meaning of the inputs, that claim is impossible to contradict unless we have a very well-defined understanding of what it means to "know". Is this syntactic description meaningfully different from the way knowledge works in a conscious mind? We don't know enough about consciousness to be able to say definitively that "the critical part of consciousness is missing from this description".
So all you've done is shift the question "what is consciousness?" to needing first to answer "what is knowledge?".
Computers *only* have syntax. That's the whole point of building one. Symbolic abstractions are very powerful.
This is just assuming your premise. "Computers only have syntax" is true only if we already grant your conclusion that computers cannot have semantics.
I also disagree that only having syntax is the point of building one. The point of building a computer is to use syntax, but that doesn't preclude semantics; it just means semantics aren't necessary to meet the purpose. In other words, the point of a computer is to have syntax; the point of a computer is not to have only syntax.
Abstract symbolic manipulations have no causal powers. 2 + 2 does not *cause* 4, it simply *is* 4. I can simulate water on my PC but I cannot cause it to get wet by simulating water. The brain is above all a causal mechanism and anything that thinks must be able to duplicate and not merely simulate the causal powers of the causal mechanism. The mere manipulation of formal symbols is not sufficient for this. We are machines and the brain "does" consciousness just as the stomach does digestion so it should be possible to build a brain someday. But it won't be a PC.
As I pointed out in the previous thread, there is a fair amount of research showing that many processes in the brain happen prior to our conscious awareness of them. For example, decisions can be predicted before the subject is aware of having made a decision. Therefore, if we are talking about the conscious part of the brain (where semantics exists), it is far from clear that it is a causal mechanism. And conversely, since semantics is a property of consciousness, any causal properties of the unconscious part of the brain cannot be relevant.