

Consciousness and the Paranormal — Part 9

This just popped in my Philosophical Overdose feed


Structural Realism (Stanford Encyclopedia of Philosophy)

"Scientific realism is the view that we ought to believe in the unobservable entities posited by our most successful scientific theories. It is widely held that the most powerful argument in favour of scientific realism is the no-miracles argument, according to which the success of science would be miraculous if scientific theories were not at least approximately true descriptions of the world."

Naïve realism - Wikipedia

"It is not uncommon to think of naïve realism as distinct from scientific realism, which states that the universe contains just those properties that feature in a scientific description of it, not properties like colour per se but merely objects that reflect certain wavelengths owing to their microscopic surface texture. This lack of supervenience of experience on the physical world has influenced many thinkers to reject naïve realism as a physical theory.[12]

One should add, however, that naïve realism does not claim that reality is only what we see, hear etc.. Likewise, scientific realism does not claim that reality is only what can be described by fundamental physics. It follows that the relevant distinction to make is not between naïve and scientific realism but between direct and indirect realism."
 
Bile - Wikipedia

Bile is produced by the liver.

"That's because kidneys and bile are both objective, physical processes while brains and minds are not."

To what non-objective, non-physical processes would you compare them?
Other than my mind and by inference the minds of other humans, there are no other non-objective, non-physical things to compare minds to.

So for now, "mind is to body" has no analog: one, because of the subjective nature of mind; two, because we don't know how mind is to body.

The only analog with potential, imho, is "mind is to body as information is to information processor."

But as we noted, even if consciousness just is information, we still have the hard problem.

Do you feel that an upload of your brain would be you? And would it be conscious?
The problem is that we don't even know at what scale consciousness is realized.

Is consciousness fundamental?

Is it realized at the quantum level?

Is it realized at the cellular level?

Is it realized at the multicellular level?

Is it realized at the neural network level?

We don't know.

But then we have a related but distinct question of at which scale is mind realized?

An electron may be fundamentally conscious, but does it have a mind? How about an autopoietic cell? It may be conscious and have a rudimentary mind.

It's hard to answer questions like the above when we can't even confidently answer those two questions.

Then there is the issue of kinds of minds, for example, human minds, squid minds, dog minds, etc.

Can a human body have a dog mind? Can a dog body have a human mind? Can a machine body have a human mind?

Unless we know at which scale consciousness is realized and at which scale minds are realized, we can't answer the above.

The EC position argues against the view that the brain is a computer and that (1) the mind is a static pattern (a program) running on this computer, and/or (2) the brain-computer processes information via static algorithms inherent in its architecture.

ECers seem to favor a dynamic systems approach in which the body-brain is not a static computer executing static programs or utilizing static algorithms, but rather an open-ended, dynamic system interacting with an open-ended, dynamic environment. Moreover, the boundary between the organism and the environment is weak rather than strong.

So you ask: would an upload of your brain be you; and would it be conscious?

If it turns out that consciousness is fundamental or is realized at the quantum scale, then everything made out of electrons (say) would be conscious and have the potential for having a conscious mind.

Thus, a brain made out of carbon and a brain made out of silicon would both be capable of realizing a conscious mind.

But would they have the same kind of mind?

That would depend on the scale at which mind is realized. If mind is realized at the neural network level, and structurally and functionally similar neural networks can be made out of carbon and silicon, then a carbon human brain and a silicon human brain should both have the same kind of mind: a human mind.

I could go through multiple scenarios with consciousness and mind being realized at various scales.

Until we solve the MBP and ascertain at which scales consciousness and minds are realized, we can't answer such questions with confidence in the affirmative or negative.
 
Other than my mind and by inference the minds of other humans, there are no other non-objective, non-physical things to compare minds to.

That's because kidneys and bile are both objective, physical processes while brains and minds are not.


Do you mean that brains and mind are not objective, physical processes or just that minds are not?

And again, your position is still dual-aspect monism?


 
Kidneys and bile are both objective, but brains and minds are not both objective.

That's why the analogy doesn't work.

My position is perhaps better described as neutral monist. However, I don't think of consciousness and matter as being ontologically equal.
 
I believe that cognitive scientists use the term "mind" to refer to information processes in the body/brain. Therefore the term mind is not always meant to imply consciousness.

All 'cognitive scientists'? What about Damasio? My impression is that at least several cognitive neuroscientists have become capable of recognizing the multiple lived aspects of embodied consciousness -- both prereflective and reflective -- and to distinguish between consciousness and 'mind'. It does not seem reasonable to me for anyone to claim that the affectivity and awareness evidently present in cells and primordial multicelled organisms means that these organisms possess 'mind'. So there is, as I see it, an unacceptable level of ambiguity in the ways in which the terms 'brain' and 'mind', and also 'protoconsciousness' and 'consciousness', are used by various specialists seeking to contribute to the field of Consciousness Studies.

Between the "mind" and "world?" Sigh. That muddles things right from the start. Honestly I hate to even wade into that.

Not for all philosophers of mind, including both phenomenological and analytical ones. Once consciousness as we experience it is recognized as embodied consciousness, both embedded and enactive in the 'world' we live in, it is not only reasonable but necessary to discuss the relations among mind and world as we experience them. And once we are situated within that understanding, we are on the road to overcoming the MBP in terms of a strictly defined dualism, either epistemologically or ontologically.
 
All analogies are "bad" ... but are often an effective way to get the point across. It's exactly because your position is that minds and brains are not both objective that I asked (rhetorically) to what other non-physical, non-objective process would you make a comparison.

Here is the analogy made by Searle that I had in mind:

Wikipedia article on John Searle:

"Stevan Harnad argues that Searle's "Strong AI" is really just another name for functionalism and computationalism, and that these positions are the real targets of his critique.[53]Functionalists argue that consciousness can be defined as a set of informational processes inside the brain. It follows that anything that carries out the same informational processes as a human is also conscious. Thus, if we wrote a computer program that was conscious, we could run that computer program on, say, a system of ping-pong balls and beer cups and the system would be equally conscious, because it was running the same information processes.

Searle argues that this is impossible, since consciousness is a physical property, like digestion or fire. No matter how good a simulation of digestion you build on the computer, it will not digest anything; no matter how well you simulate fire, nothing will get burnt. By contrast, informational processes are observer-relative: observers pick out certain patterns in the world and consider them information processes, but information processes are not things-in-the-world themselves. Since they do not exist at a physical level, Searle argues, they cannot have causal efficacy and thus cannot cause consciousness. There is no physical law, Searle insists, that can see the equivalence between a personal computer, a series of ping-pong balls and beer cans, and a pipe-and-water system all implementing the same program.[54]"
 
Wow ... How did you come across this? Floridi is very interesting. YouTube has a lot of his talks ....


This one is a discussion with Searle on AI.

Searle "...computation is observer relative so it's not a natural science. There are no Turing machines in nature, something's a Turing machine only relative to an interpretation."
 
Not for all of all philosophers of mind, including both phenomenological and analytical ones. Once consciousness as we experience it is recognized as embodied consciousness, both embedded and enactive in the 'world' we live in, it is not only reasonable but necessary to discuss the relations among mind and world as we experience them.
What world? The noumenal world or the "lived" world?

I argue that the lived world just is the mind. The mind just is the lived world. The experienced world. The phenomenal world.

The mind just is green.

So to say the mind interacts with the lived world is redundant. The mind interacts with the mind.

On the other hand, to say that the mind interacts with the external, noumenal world is quite different.

But once we say that, we have introduced the MBP, the HP. Just how does the phenomenal world interact with the noumenal world?
 
Searle "...computation is observer relative so it's not a natural science. There are no Turing machines in nature, something's a Turing machine only relative to an interpretation."

Isn't computational neuroscience just a level of description like chemical science, biological science, and psychological science?
 

Isn't computational neuroscience just a level of description like chemical science, biological science, and psychological science?

What Searle is arguing is that computation is observer relative - computers are syntactic not semantic machines. See links above for more information.

First, the notion of information is systematically ambiguous. There is the observer-independent sense of information that I have in my conscious mind-brain and the observer-relative derivative information that exists in books, computers, temperature gauges, etc. In either form, all information is dependent on conscious minds. It either exists in the form of conscious thought processes, or it exists in derivative forms of books, computers, etc. Information is not primary in the structure of reality; rather it is dependent on consciousness, just as consciousness itself is a biological phenomenon dependent on brain processes that are themselves dependent on more basic features of physics and chemistry.
 
From Bringsjord's response to Searle, linked above:

That’s the story; now the observation: There are plenty of people, right now, at this very moment, as I type this sentence, who are working to build robots that work on the basis of formulae of this type, but which of course don’t do anything like what R did. I’m one of these people. This state-of-affairs is obvious because, with help from researchers in my laboratory, I’ve already engineered a malicious robot: (Bringsjord et al. 2014). [Of course, the robot we engineered wasn’t super-intelligent. Notice that I said in my story that R was only “highly intelligent.” (Searle doesn’t dispute the Floridi-chronicled fact that artificial agents are becoming increasingly intelligent.)] To those who might complain that the robot in question doesn’t have phenomenal consciousness, I respond: “Of course. It’s a mere machine. As such it can’t have subjective awareness (e.g., see Bringsjord 2007). Yet it does have what Block (1995) has called access consciousness. That is, it has the formal structures, and associated reasoning and decision-making capacities, that qualify it as access-conscious. A creature can be access conscious in the complete and utter absence of consciousness in the sense that Searle appeals to.”

 
What Searle is arguing is that computation is observer relative - computers are syntactic not semantic machines. See links above for more information.

First, the notion of information is systematically ambiguous. There is the observer-independent sense of information that I have in my conscious mind-brain and the observer-relative derivative information that exists in books, computers, temperature gauges, etc. In either form, all information is dependent on conscious minds. It either exists in the form of conscious thought processes, or it exists in derivative forms of books, computers, etc. Information is not primary in the structure of reality; rather it is dependent on consciousness, just as consciousness itself is a biological phenomenon dependent on brain processes that are themselves dependent on more basic features of physics and chemistry.
(1) Correct. By this logic all sciences are observer dependent.

(2) Of course, that consciousness is a biological process dependent on brain processes is a thesis not without problems.
 

Tyler Burge, Perception: Where Mind Begins - PhilPapers

ABSTRACT

What are the earliest beings that have minds in evolutionary order? Two marks of mind are consciousness and representation. I focus on representation. I distinguish a psychologically distinctive notion of representation from a family of notions, often called ‘representation’, that invoke information, causation, and/or function. The psychologically distinctive notion implies that a representational state has veridicality conditions as an aspect of its nature. Perception is the most primitive type of representational state. It is a natural psychological kind, recognized in a mature science: perceptual psychology. This kind involves a type of objectification, and is marked by perceptual constancies. The simplest animals known to exhibit perceptual constancies, perception, and representation in a distinctively psychological sense, are certain arthropods. Representational mind, or representational psychology, begins in the arthropods. We lack scientific knowledge about the beginnings of consciousness. Consciousness is neither necessary nor sufficient for perception. I conclude by reflecting on the kinds mind and psychology.
 
(1) Correct. By this logic all sciences are observer dependent.

(2) Of course, that consciousness is a biological process dependent on brain processes is a thesis not without problems.

SIGH ... you need to read the material or watch the video ... he is talking about computation ... not "computational neuroscience" ... watch the bit where he throws the computer at his audience ...
 
Re the multiple realizability of minds

If we conceive of minds as being tightly coupled to the body, or even identical to the body, then it follows that minds are just as diverse as human bodies. A fascinating concept, and one that we may intuitively accept but not intellectually digest.

Think for a moment how different other humans' minds are, or might be, if we allow that the mind is identical to the body, as some suggest.

Contrary to intuition, if we were to switch minds with another human—supposing we could retain a memory of our previous mind—I think the differences would be profound.

The new feelings, perceptions, affects, concepts, etc., would be profoundly different from our old ones.

Yes, there would be core similarities, just as most people have two arms, two ears, two legs, etc. But if our minds—lived experiences—are as diverse as our bodies, then the intuition that our minds are pretty much like the minds of other humans is mistaken.
 
SIGH ... you need to read the material or watch the video ... he is talking about computation ... not "computational neuroscience" ... watch the bit where he throws the computer at his audience ...
He says it right here:

Information is not primary in the structure of reality; rather it is dependent on consciousness, just as consciousness itself is a biological phenomenon dependent on brain processes that are themselves dependent on more basic features of physics and chemistry.
 