Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
Imagine that tomorrow we perfect mind uploading. Your entire brain, and with it your identity and memories, gets cloned into an AI. You get to meet the AI; it's really you. But the physical you, the meat you, still exists. The AI is a clone. I presume there would be no need to kill yourself, but would you really no longer fear death in your physical body? I doubt it. The thing about us living forever is that even if it happens in your lifetime, it probably won't be 'you' living forever.
That is a strict improvement over the status quo.
I'm not a biological chauvinist, and I think that the upload has equal claim to my name and assets. I also expect that unless things go really awry, the human version would probably end up acquiring biological immortality eventually. Destructive scans seem much easier than ones that preserve the original, but the latter isn't a bad thing as far as I'm concerned. It always leaves the option of another upload, one when the original is closer to death.
Even if that wasn't the case, I'd rest easier. Growing old and dying of old age sucks, but it is a great comfort to have seen your grandkids be born, follow in your footsteps and flourish. You can die with fewer regrets. In the same manner, if I had a backup, even one that would outlive me, I'd wish it well, and know that it would share most of my goals and desires, and would mourn my passing.
Or feel relief about not having some progenitor who's seen as more-real-than-you hanging around anymore.
I suspect there will be all kinds of dysfunctions with the uploads themselves and revolving around them. The psychologists of the future will have their quackery cut out for them.
The best person to speculate about a copy of myself would be me. And I don't think that would happen.
The copying process may not be perfect. There could be bit flips. And that's assuming your pattern will actually think like you do; that the difference in hardware won't matter, and that the uploading process actually captures everything that makes you tick.
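To make the worry concrete, here's a toy sketch (treating the hypothetical scan as nothing more than a blob of bytes; no claim about how real scanning would work): catching a flipped bit is trivial, but deciding whether the flip matters to the resulting mind is not.

```python
# Toy illustration: a single flipped bit changes the data,
# and a hash comparison catches it.
import hashlib

original = bytearray(b"connectome snapshot")  # stand-in for hypothetical scan data
copy = bytearray(original)
copy[0] ^= 0b00000001                         # flip one bit in the copy

print(hashlib.sha256(original).hexdigest() == hashlib.sha256(copy).hexdigest())
# False: even a one-bit difference is detectable -- though detecting an error
# is far easier than knowing whether it changes what makes you tick.
```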
We're talking about a hypothetical technology, one that might pass through decades of improvement and iteration.
The minimum viable product is one that isn't a gibbering mess after destructive scanning. The ideal is one that is absolutely indistinguishable (outside of being in a computer) from the original human and leaves the OG meat intact.
There's plenty of room in between. I'm not the exact same person when I go to bed and wake up the next day, and I don't write a will before I do. Or when I consider what I'll be like in a year, or even decades.
I can't say with 100% certainty what I'll be like in a few decades as is, but I'm reasonably confident that short of a brain disease, I won't become a serial killer or try and suffocate my parents in their sleep.
Scanning the brain in any fashion is the hard bit. Making sure the hardware and simulated environment are congenial is trivial in comparison. If we diverge over time, that's no surprise; even twins do, as do the me of today and the me of next year.
If you don't consider yourself the same after you go to sleep then I don't see how you can justify having any sentimental attachment to the mind upload, who immediately deviates by virtue of existing in a different context.
I don't consider myself to be exactly the same person after any period of time that can be measured on a clock. I think it's quite clear that I'm not particularly bothered by this.
If I can identify with and care about the wellbeing of the person "I" will be in decades, then I can identify with an upload.
The difference is I will never be the upload.
This strikes me as one of the failures of rationalism: attempting to reconstruct the value system from abstract, non-self-evident first principles such as "pattern theory", and then, when someone's common intuition fails to align with those, declaring the common intuition wrong. Not referring to you, since you seem pretty open to other people having different opinions, but rather to someone like EY calling it "muggle thinking".
I care about living because of my biological substrate's reward system, not because I intellectually (on what basis?) prefer the existence of a "pattern" of sun_the_second over its nonexistence.