smcder
Paranormal Adept
Reddit yes... nicely put: first contribution last para.
Klaassen not yet... v bad internet (I blame either London or Apple) and not accessed v easily. What did you think of K?
Klaassen
"It seems then that the sense of contingency that accompanies the fact that I am Tim Klaassen is really illusory. Wherever there exists the self conscious human being that is Tim Klaassen, I am necessarily there, present to his point of view. And this gives my existence a very real and robust quality. No matter what, as long as Tim Klaassen is alive, I am here and no one else. It could not be otherwise."
1. OK, but is the fact that there is a TK not itself contingent? Does the feel of this problem (what I call a cognitive "funny bone" problem, what others call aporia) influence our thinking on it? It seems that here, as on a lot of questions - ethical questions come to mind, and I think ethical considerations are inseparable from questions of individual identity - the feel of the problem (notably absent for some people) is a real factor.
EXAMPLE - I am leery of hypotheticals, much less super-duper ones like the one below, but in my defense, some people believe the following may be possible some day (soon). AND I think it's quite easy to see what each variation gets at, and it's possible it may change your intuitions - or at least it's possible it is changing mine:
"would you allow the contents/configuration of your brain to be uploaded to a computer?"
i.e. a digital representation is made of the synaptic connections - or, if you prefer, an analog representation. Would your answer change if we say the connections are instantiated by the newly invented/discovered memristor, which functions a lot like a real neuron? Would it change if the contents were transferred to a new, tabula rasa human brain/body? Would it change if this were done with artificial neurons arbitrarily physically similar to your own - i.e. if, by some miracle of materials science, you could specify how physically similar the substrate is to your own brain, for any degree of similarity up to and including bio-identicality?
Would it change if a digital copy were made and only later the physical instantiation? Would it change if the digital copy were destroyed but a printout had been made, and someone hand-codes it back into a computer and then instantiates it into any kind of substrate you desire? (Is time a factor? Is the substrate a factor? Does it matter if at some point all that's left of you is an inkjet printout? Is any transmission allowable?)
Would you consider it if it were transferred to any of the above and then back down into your own mind and body (the synapses having been randomly re-connected to one another in the meantime)?
A. Your brain is not destroyed in the process.
B. It is.
C. Same question but for a loved one.
D. Under what, if any, circumstances - would you allow your brain to be transferred?
E. What is the legal identity of the above if A? If B?
F. Similarly for what is your relationship to the loved one?
I find that I have a different gut response to being transferred to anything, even a "bio-identical" or arbitrarily similar brain, than to being transferred to another human brain - there is still a twinge when I think about being transferred to another brain. Does that mean I harbor some non-physical sense of who I am? Or can that be explained indexically - that whatever is transferred just isn't me, and that's important (to me)? Under no circumstances do I find myself comfortable with B - the brain is destroyed - and that seems to me telling for every other circumstance: while I am more comfortable with being transferred to another human body, B makes me realize that this is no more me than any other variant - although it's arguably human, whereas the humanity of the others is questionable. But that's only true if there is something essential to being in a body/brain.
Of course we can dismiss the whole thing as impossible - whatever data is carried over isn't human, wouldn't be conscious, etc. - but then we have to explain what the extra something is. Let's modify the Star Trek transporter problem. We have a device that records no data, but simply pulls you apart, elementary particle by elementary particle - "One Quark at a Time," to misquote the great philosopher of mind Johnny Cash - and whips each particle one hundred miles away into the same relative physical configuration, and does so arbitrarily rapidly, instantaneously if you wish. Do you buy it? If the most precious thing in the world were one hundred miles away and going to be gone in an instant unless you were there to save it - would you buy it?
So all of that, I think, explores how our gut feelings interact with our "rational mind" - to me, all the scenarios are the same and the answer is no. In that last process, I would be violently, if instantly (or instantly but violently), torn apart, and I would not be put together again, horses and men be damned. Something would be lost - but what?
On the one hand, I seem to be arguing that the physical, as it is, is very important; on the other, I seem to feel that the "I" is not just physical.