Indeed, as I outlined in previous posts, neuroscience has identified a number of regions that appear to be intimately associated with consciousness, not the least of which is the
thalamocortical loop. The other issue is the nature of the term "transducer": a transducer is something that converts one form of energy into another. If we consider sensory stimuli to be one form of energy and consciousness another, the evidence strongly suggests that the brain is instrumental in converting the one into the other, and therefore the label of transducer appears rather apt.
As noted above, Dennett is not arguing that consciousness—subjective experience—does not originate with brain activity.
He is arguing that there is no double transduction; he is arguing against dualism via emergence.
He argues that there is a transduction that takes place in the brain: namely, that environmental stimuli are 'transduced' into neural representations. He is claiming (if I read him correctly [@smcder?]) that there is an ontological identity between physical neural activity and the representations it constitutes (such as the phenomenon of the color green).
This is in direct contradiction to the view I understand you to hold that there is a double transduction:
Environmental stimuli transduced to (1) neural signals, which are then transduced to (2) consciousness.
Dennett doesn't argue against a double transduction on mechanical grounds—there is plenty of literature out there presenting the problems with weak and strong emergence.
Rather, he argues against it on the grounds that positing a double transduction is a mistake: it isn't necessary.
Rather than suggesting that brain activity produces an ontologically new substance (which we call consciousness) he argues that brain activity constitutes representations, and it is these representations we refer to as consciousness.
I think this makes a lot of sense, especially if we consider that double transduction just kicks the can down the road: if consciousness is representational, a second transduction merely delays the explanation.
For example:
Environmental stimuli X (<FROG>) is transduced by the nervous system into neural activity X1 (a representation of a frog).
The double transducers (may) say brain activity X1 can't be consciousness, because neural activity doesn't
look like the representation as experienced from the first-person perspective: the green frog.
So, the double transducers say there must be another transduction: from brain activity X1, something that
looks like a green frog (and the rest of the phenomenal field) must spring forth, something we call consciousness.
It's not enough, they say, to say that neural activity constitutes a representation of a green frog; rather, a new substance that is iconically like a green frog must spring from the brain.
One of the many problems with this view that I like to point out is this: even if there were such a double transduction—a substance springing from brain activity that just is the perception of a frog—any attempt to observe it objectively would yield only a representation of that substance, and therefore would
not look like the consciousness substance either. We'd be right back in the same boat that leads some to posit double transduction in the first place.
In other words, whatever consciousness is, it won't appear the way that we subjectively experience it.
Therefore, the fact that brain activity looks nothing like the experience of a green frog is no grounds to argue that brain activity doesn't constitute the experience of a green frog.
Having said all this, and having now finished Dennett's most recent article, I find he still doesn't account for the existence of
phenomenal consciousness.
I largely agree with his argument against double transduction. I agree that brain activity can constitute first-person representations.
I agree with Dennett that it's a mistake to expect consciousness to perceptually resemble its content.
Thus, the process that constitutes the experience of a green frog needn't—itself—look anything like the experience of a green frog.
Ergo, the fact that brain activity doesn't look like the representational experiences it grounds is no reason to posit another transduction into a medium that does look like the representation.
There is a problem, though: Dennett seems to be arguing that a non-phenomenal medium—physical brain activity—can ground phenomenal representations.
So I think Dennett is correct to say that brain activity needn't resemble the representations it constitutes, but I think it's problematic to suggest that a non-phenomenal system (the brain) can ground phenomenal representations.
My approach to this concern is along the lines of Strawson's: namely, that it's a mistaken presupposition to suggest that brains are non-phenomenal.
I suppose Dennett would say that perceptual representations are not truly phenomenal, and thus there is no reason to require their ground (the perceptual system) to be phenomenal.
The fact that conscious representations seem phenomenal is itself a product of the perceptual system (the brain).