I know that makes sense to computationalists, but there are many other specialists (psychologists, philosophers, cognitive neuroscientists, biologists, ethologists, and more) for whom it does not.
My point is that you can't have it both ways: you can't interpret 'information' in the computational/physical sense and then try to extend or reinterpret it in the philosophical or epistemological sense.
As an example, one can show a fundamental limit on lossless compression (Shannon's source-coding theorem: no lossless scheme can, on average, compress below the entropy of the source). I once wrote an algorithm that achieved something like 11% lossless compression on plain text. The compressed data set is mathematically equivalent in information to the original, just smaller: the uncompressed text sits in a low-entropy (highly redundant) state, while the compressed data sits in a high-entropy state.
One cannot apply such quantitative descriptions to philosophical notions of "information" or "knowledge," or to psychology, etc. Cognitive neuroscience, perhaps, but only when dealing with information in the formal sense, not the informal one.
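The entropy claim above is easy to demonstrate. A minimal sketch (not the original 11% algorithm; it uses Python's standard `zlib` as a stand-in compressor): empirical Shannon entropy of redundant plain text is well below 8 bits per byte, while the compressed stream, carrying the same information, approaches the maximum.

```python
import math
import zlib
from collections import Counter

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Highly redundant plain text: low entropy per byte.
text = b"the quick brown fox jumps over the lazy dog " * 200

# Lossless compression: the result decodes back to the exact original.
compressed = zlib.compress(text, level=9)

h_plain = shannon_entropy_bits_per_byte(text)
h_comp = shannon_entropy_bits_per_byte(compressed)

print(f"original:   {len(text)} bytes, {h_plain:.2f} bits/byte")
print(f"compressed: {len(compressed)} bytes, {h_comp:.2f} bits/byte")
```

The two byte strings are informationally equivalent (decompression recovers the original exactly), yet the compressed one is smaller and its bytes look nearly random, i.e., its per-byte entropy is higher. That is the formal sense of "information" in play here.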
Major category error in my opinion.
Yes, there is a distinction to be made between memory and the sense of self. Memory has long been considered necessary for the construction of a sense of 'self.' The marvel Panksepp points out is that in the loss, the absence, of memory, through stroke or other brain damage, an individual consciousness can still maintain its sense of self. We see this also in cases of amnesia, when an individual wakes from an accident with no memory at all of his or her name, identity, origin, or past life, yet still functions in the world to which he or she had formerly been accustomed. What does this signify about the nature of consciousness and selfhood, personhood, personality?
My thinking here is that the sense of self could easily be explained as an illusion we tell ourselves, one that evolved to give us a sense of continuity, future planning, and learning.
I.e., I don't have to re-learn something I learned at age 5, because that was still me; and I want to gather food for the winter, because it will still be me then, too.
It's like pointing at a river and calling it a river.
It's not the same river the next second; we just call it the same because it's convenient and we are trying to rationalize the world. "River" is not the actual river.
An accretion is not an integrated system. An embodied consciousness is a complex integrated system, integrated within itself and also integrated with its environment. The difference is elaborated in Varela and Thompson's research into self-organizing dissipative systems from the single cell to the human being. This paper is an overview of their approach, referred to as neurophenomenology:
http://brainimaging.waisman.wisc.edu/~lutz/ET&AL&DC.Neuropheno_intro_2004.pdf
It's late and I'm going to stop here, but will continue tomorrow.
My thinking is that we are not an integrated system; we just have a big, shiny, new neocortex that acts as a general-purpose intelligence, making sense of all the underlying subsystems and abstracting enough of them away that we think we're just one thing.
As an example: if I blew away my hindbrain and replaced it with a machine that kept my autonomic systems functioning, would I still be me? Probably. If I blew away my limbic system? Maybe... but emotionally I'd be different. My neocortex? No way. The overarching complex of meat that made sense of the hardware would be gone, and that's probably most of what "me" is when I say "me."