@ufology @Constance @Pharoah
Re Searle's Chinese Room thought experiment, I've been reading a little about the so-called Symbol Grounding Problem (SGP). From Wiki:
"The symbol grounding problem is related to the problem of how words (symbols) get their meanings, and hence to the problem of what meaning itself really is (
link )."
It's an open debate whether the SGP has been theoretically solved; some think it has, others think it hasn't. I'm in the camp that thinks it either has been solved or can/will be solved in the near future.
The second question is whether solving the SGP solves the mind-body problem (MBP). In other words, if meaning is synonymous with consciousness, and we have a theory of how meaning arises in physical systems, then we have a theory of how consciousness arises in physical systems.
For example, if a physical system interacts with a rose and the system knows that the rose is a rose, that is, if the object means rose-ness to the system, then just maybe the system is having a conscious experience of the rose. Of course, this would be conceptual consciousness, and not necessarily phenomenal consciousness. We've talked here about conceptual consciousness being grounded in phenomenal consciousness. That is, in order to have conceptual consciousness of a rose, one would first need to experience phenomenal consciousness of the rose's smell, texture, colors, etc.
I've argued that phenomenal qualities (colors, smells, sounds, etc.) are non-conceptual meanings that have arisen between organisms and environmental stimuli.
I feel that @Pharoah 's HCT offers a theory for the SGP: it explains how neurophysiological processes come to acquire meaning, that meaning being derived from the "qualitative relevancy" of those neurophysiological processes.
However, it's not clear that solving the SGP will solve the MBP. It's conceptually possible that a physical system could give real meaning to symbols while not being a conscious system. (I say it's conceptually possible because we really don't know; it's possible that any system capable of creating/manipulating grounded symbols would be conscious.)
This is my beef with HCT; I think it is a model for the SGP but not necessarily a model for the MBP. And here's the main reason why: the difference between non-conscious brain states and conscious brain states, and their related, complex behaviors. Put simply, there appear to be human brain states that utilize meaning to guide behavior but that are not associated with conscious experience. This is why I have repeatedly asked Pharoah whether HCT can explain why some brain states are correlated with conscious experience while others are not.
It thus seems to me that meaning (symbol grounding) is a necessary but not sufficient condition for (human-like) consciousness. Brain states involved in symbol grounding are sometimes conscious and sometimes not conscious. Imo HCT provides a model of how/why certain neurophysiological processes have meaning (grounded symbols), but not why certain neurophysiological processes are conscious.
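To make the "grounded but not necessarily conscious" possibility concrete, here's a toy sketch in Python. It's purely illustrative and my own invention, not drawn from HCT or Harnad: a system that autonomously connects sensory input to an internal symbol and uses that symbol to guide behavior. Everything in it is an input/output function; nothing about it implies that it feels like anything to be that system.

```python
# Toy illustration only: a "grounded" symbol in the purely functional sense.
# The symbol ROSE is connected to sensory input and to behavior without any
# external interpreter, yet nothing here implies phenomenal experience.

def sense(stimulus_features):
    """Map raw sensory features to an internal symbol (the 'grounding' step)."""
    # Hypothetical feature check: red colour + sweet odour -> ROSE
    if stimulus_features.get("colour") == "red" and stimulus_features.get("odour") == "sweet":
        return "ROSE"
    return "UNKNOWN"

def act(symbol):
    """Use the grounded symbol to guide output behavior."""
    return "smell it" if symbol == "ROSE" else "ignore it"

stimulus = {"colour": "red", "odour": "sweet"}
symbol = sense(stimulus)          # sensory input -> internal symbol
print(symbol, "->", act(symbol))  # symbol -> behavior: ROSE -> smell it
```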
Here's more from the Wiki entry to clarify (or further confuse) the issue:
"No, the problem of intentionality is not the symbol grounding problem; nor is grounding symbols the solution to the problem of intentionality. The symbols inside an autonomous dynamical symbol system that is able to pass the robotic Turing test are grounded, in that, unlike in the case of an ungrounded symbol system, they do not depend on the mediation of the mind of an external interpreter to connect them to the external objects that they are interpretable (by the interpreter) as being "about"; the connection is autonomous, direct, and unmediated. But grounding is not meaning. Grounding is an input/output performance function.
Grounding connects the sensory inputs from external objects to internal symbols and states occurring within an autonomous sensorimotor system, guiding the system's resulting processing and output.
Meaning, in contrast, is something mental. But to try to put a halt to the name-game of proliferating nonexplanatory synonyms for the mind/body problem without solving it (or, worse, implying that there is more than one mind/body problem), let us cite just one more thing that requires no further explication: feeling.
The only thing that distinguishes an internal state that merely has grounding from one that has meaning is that it feels like something to be in the meaning state, whereas it does not feel like anything to be in the merely grounded functional state. Grounding is a functional matter; feeling is a felt matter. And that is the real source of Brentano's vexed peekaboo relation between "intentionality" and its internal "intentional object":
All mental states, in addition to being the functional states of an autonomous dynamical system, are also feeling states: Feelings are not merely "functed," as all other physical states are; feelings are also felt.
Hence feeling is the real mark of the mental. But the symbol grounding problem is not the same as the mind/body problem, let alone a solution to it. The mind/body problem is actually the feeling/function problem: Symbol-grounding touches only its functional component. This does not detract from the importance of the symbol grounding problem, but just reflects that it is a keystone piece to the bigger puzzle called the mind (
link )."
The author above distinguishes symbol grounding from meaning, which I find confusing. To me, a grounded symbol is a symbol that has meaning. However, I agree that a grounded symbol is not necessarily a "conscious" symbol. In my own words, I think the above argument is better stated as:
The only thing that distinguishes an internal state that is grounded from one that is conscious is that it feels like something to be in the conscious state, whereas it does not feel like anything to be in the merely grounded functional state.
Conceptually, I am tempted to question the above argument. Intuitively, it makes sense that if a brain state "generates" meaning, then that same brain state may be "generating" conscious experience. That is, it seems conceptually right to me that meaning and consciousness should be synonymous.
At the same time, it seems evident that there are brain states associated with meaning (symbol grounding) that are not associated with conscious experience. Therefore, symbol grounding/meaning seems to be distinct from conscious experience.
This has at least one very interesting ramification, which I've advocated here but which others, namely Pharoah, are completely against.