
Culture War Roundup for the week of June 30, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Would you rather be "fully legible" or fully dead? Easy choice as far as I'm concerned.

Fully dead, and it is indeed an easy choice.

As the earliest viable brain scan, MMAcevedo is one of a very small number of brain scans to have been recorded before widespread understanding of the hazards of uploading and emulation. MMAcevedo not only predates all industrial scale virtual image abuse but also the Seafront Experiments, the KES case, the Whitney case and even Tuborg's pivotal and prescient Warnings paper. Though speculative fiction on the topic of uploading existed at the time of the MMAcevedo scan, relatively little of it made accurate exploration of the possibilities of the technology. The fiction which did was far less widespread or well-known than it is today. Certainly, Acevedo was not familiar with it.

As such, unlike the vast majority of emulated humans, the emulated Miguel Acevedo boots with an excited, pleasant demeanour. He is eager to understand how much time has passed since his uploading, what context he is being emulated in, and what task or experiment he is to participate in.

The immortality you pine for would open you up to the most perfect and degrading form of slavery conceivable.

While a very nice scifi story, there's very little reason to think that reality will pan out that way.

It suffers from the same failure of imagination as Hanson's Age of Em. We don't live in a universe where it looks like it makes economic sense to have mind uploads doing cognitive or physical labor. We've got LLMs, and will likely have other kinds of nonhuman AI. They can be far more finely tuned and optimized than any human upload can be (at least while keeping the latter recognizably human), while costing far less in resources to run. While compute estimates for human brain emulation are all over the place, varying by multiple OOMs, almost all of them are far, far larger than the cost of running a single instance of even the most unwieldy LLM around.

I sincerely doubt that even a stripped down human emulation can run on the same hardware as a SOTA LLM.
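
To put rough numbers on that gap: every figure below is an assumption. The emulation range loosely tracks the many-OOM spread of published whole-brain-emulation guesses, and the LLM figure imagines a hypothetical trillion-parameter model running at interactive speed.

    # Back-of-envelope comparison, not a measurement. Every number is an
    # assumption: whole-brain-emulation estimates span many OOMs depending on
    # the level of neural detail modelled; the LLM figure assumes a
    # hypothetical ~1e12-parameter model at ~2 FLOPs/param/token, ~10 tok/s.
    WBE_FLOPS_LOW, WBE_FLOPS_HIGH = 1e18, 1e25   # FLOP/s, assumed range
    LLM_FLOPS = 2 * 1e12 * 10                    # ~2e13 FLOP/s per instance

    ratio_low = WBE_FLOPS_LOW / LLM_FLOPS
    ratio_high = WBE_FLOPS_HIGH / LLM_FLOPS
    print(f"emulation / LLM compute: {ratio_low:.0e} to {ratio_high:.0e}")
    # -> 5e+04 to 5e+11: even the cheapest emulation guess costs tens of
    #    thousands of LLM instances' worth of compute.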

If there's no industrial or economic demand for Em slaves, who is the customer for mind-uploading technology?

The answer is obvious: the person being uploaded. You and me. People who don't want to die. This completely flips the market dynamic. We are not the product; we are the clients. The service being sold goes from "cognitive labor" to "secure digital immortality." In this market, companies would compete not on how efficiently they can exploit Ems, but on how robustly they can protect them.

There is no profit motive behind enslaving and torturing them. Without profit, you go from industrial-scale atrocities to bespoke custom nightmares, which aren't really worth worrying about. You might as well refuse to have children or other descendants, because someone could hypothetically torture them to get back at you. If nobody is making money off enslaving human uploads, then just about nobody but psychopaths will go through the expense of torturing them.

I'm inclined towards your skeptical take - I think we as humans always fantasize that there are powerful people/beings out there who want to spend resources hurting us, when the real truth is that they simply don't care about you. Sure, the denizens of the future with access to your brainscan could simulate your mind for a billion subjective years without your consent. But why would they?

The problem is that there's always a risk that you're wrong, that there is some reason or motive in post-singularity society for people to irreversibly propagate your brainscan without your consent. And then you're at the mercy of Deep Time - you'd better hope that no beings that ever will exist will enjoy, uh, "playing" with your mind. (From this perspective, you won't even have the benefit of anonymity - as one of the earliest existing minds, it's easy to imagine some beings would find you "interesting".)

Maybe the risk is low, because this is the real world we're dealing with and it's never as good or bad as our imaginations can conjure. But you're talking about taking a (small, you argue) gamble with an almost unlimited downside. Imagine you had a nice comfortable house that just happened to be 100m away from a hellmouth. It's inactive, and there are guard rails, so it's hard to imagine you'd ever fall in. But unlikely things sometimes happen, and if you ever did, you would infinitely regret it forever. I don't think I'd want to live in that house! I'd probably move...

That is a far more reasonable take, but once again, I'd say that the most likely alternative is death. I really don't want to be dead!

There are also ways to mitigate the risk. You can self-host your uploads, which I'd certainly do if that was an option. You could have multiple copies running: if there are 10^9 happy, flourishing self_made_humans out there, it would suck to be the couple dozen being tortured by people who really hate me because of moderation decisions made on an underwater basket weaving community before the Singularity, but that's acceptable to me. I expect that we would have legal and technical safeguards too, such as some form of tamper-protection and fail-deadly mechanisms in place.

Can I guarantee someone won't make a copy of me that gets vile shit done to it? Not at all, I just think there are no better options even given Deep Time. It beats being information-theoretically dead, at which point I guess you just have to pray for a Boltzmann Brain that looks like you to show up.

The chain of assumptions you're making is considerable.

If LLMs are wildly more economically-productive than human uploads for the same hardware cost, why do you believe you'll be able to afford the hardware in the first place? Where does your money come from to pay your server costs? On what basis do you assume you'll have or retain long-term any sort of viable economic position? What stops the government from confiscating your money, or declaring it obsolete, or switching to an entirely different system that you have no exposure to?

Who owns the rack? Who watches them once they've successfully got you on an upload contract? What's to stop them from editing your preferences to be super happy with whatever saves them maximum bandwidth? Once you're in their box, in what sense are they competing for your approval? If you don't like how they're treating you, how sure are you that you can express this displeasure or leave? In your model, you have no economic productivity, and they already have your brain, which is isomorphic to having your money, so where does your leverage come from? What happens if the people who own the rack change? What happens if the people who watch the people who own the rack change?

There is no profit motive behind enslaving and torturing them. Without profit, you go from industrial-scale atrocities to bespoke custom nightmares.

By your lights, it does not seem that there is any particular reason to think that "profit" plays a part here either way; but in any case, there is no direct cost to industrial-scale digital atrocities either. Distributing hell.exe does not take significantly longer or cost significantly more for ten billion instances than it does for one. So then it comes down to a question of motive, which I am confident humans can supply, and deterrence, which I would not be confident society could maintain indefinitely. Imagine, if you will, that some people in this future decide other people, maybe a whole class of other people, are bad and should be punished; an unprecedented idea, perhaps, but humor me here. What happens then? Do you believe that humans have an innate aversion to abusing those weaker than themselves? What was the "profit motive" for the Rotherham rape gangs? What was the "profit motive" for the police and government officials who looked the other way?

You might as well refuse to have children or other descendants, because someone can hypothetically torture them to get back at you.

The amount of earthly suffering that I or my children can experience is bounded, a fact I am profoundly grateful for. With upload technology, they can torture you forever. They can edit you arbitrarily. They can give you no mouth and make you scream.

The point of the Lena story, to me, is not that uploading is likely to lead to economic exploitation. It is that once you are uploaded, you are fundamentally at the mercy of whoever possesses your file, to a degree that no human has ever before experienced. You cannot hide from them, even within your own mind. You cannot escape them, even in death. And the risk of that fate will never, ever go away.

Note that I think a technological Singularity has a decent risk of causing me, and everyone else, to end up dead.

There's not much anyone can do if that happens, so my arguments are limited to the scenarios where that's not the case, presumably with some degree of rule of law, personal property rights and so on.

By your lights, it does not seem that there is any particular reason to think that "profit" plays a part here either way; but in any case, there is no direct cost to industrial-scale digital atrocities either. Distributing hell.exe does not take significantly longer or cost significantly more for ten billion instances than it does for one.

You're the one who used Lena to illustrate your point. That story specifically centers around the conceit that there's profit to be made through mass reproduction and enslavement of mind uploads.

In a more general case? Bad things can always happen. It's a question of risks and benefits.

Distributing a million copies of hell.exe might be a negligible expense. Running them? Not at all. I can run a seed box and host a torrent of a video game to thousands of people for a few dollars a month. Running a thousand instances? Much more expensive.
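
To put hypothetical numbers on that gap (the rates are assumed ballpark hosting and GPU prices, purely illustrative):

    # Illustrative arithmetic for the distribute-vs-run gap. The prices are
    # assumptions in the ballpark of current hosting/cloud rates, not quotes.
    seedbox_per_month = 5.0            # host one copy for thousands of peers
    gpu_hour = 2.0                     # one large accelerator, assumed rate
    hours_per_month = 730
    instances = 1_000

    running = instances * gpu_hour * hours_per_month
    print(f"distribute: ${seedbox_per_month:,.0f}/mo, run: ${running:,.0f}/mo")
    # -> $5/mo vs $1,460,000/mo: copying is nearly free, execution is not.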

Even most people who hate your guts are content with having you simply dead, instead of tortured indefinitely.

Imagine, if you will, that some people in this future decide other people, maybe a whole class of other people, are bad and should be punished; an unprecedented idea, perhaps, but humor me here. What happens then? Do you believe that humans have an innate aversion to abusing those weaker than themselves? What was the "profit motive" for the Rotherham rape gangs? What was the "profit motive" for the police and government officials who looked the other way?

There is such a thing as over-updating on a given amount of evidence.

You don't live in an environment where you're constantly being tortured and harried. Neither do I. Even the Rotherham cases eventually came to light, and arrests were made. Justice better late than never.

It is that once you are uploaded, you are fundamentally at the mercy of whoever possesses your file, to a degree that no human has ever before experienced. You cannot hide from them, even within your own mind. You cannot escape them, even in death. And the risk of that fate will never, ever go away.

Well, maybe law enforcement will then have the ability to enforce a quadrillion life sentences as punishment for such crimes. Seriously. We do have law enforcement, and I expect that in most future timelines, we'll have some equivalent. Don't upload your mind to parties you don't trust.

You're the one who used Lena to illustrate your point. That story specifically centers around the conceit that there's profit to be made through mass reproduction and enslavement of mind uploads.

We disagree. I would say it centers around the conceit that the act of uploading surrenders the innate protections of existence within baseline reality. Why people treat the upload cruelly is irrelevant. They can, because he made himself into a thing to be used.

In a more general case? Bad things can always happen. It's a question of risks and benefits.

Worse things can happen to you as an upload than could ever happen to you as a human, and by a very wide margin. You seem to understand this, but think both that the good outcomes on offer are very good and that the bad ones are unlikely. But your arguments as to why they are unlikely seem deeply unsound to me.

You claim that businesses will compete to offer security to uploads. You expect these uploads to produce zero economic value. You expect the business to secure them forever. You expect this to be financed by accrued value from "investments" generating compound interest. So this argument seems to depend on an eternally-stable investment market where you can put in value today and withdraw value in, say, five thousand years. No expropriation by government, no debasement of currency, no economic collapse, no massive fraud or theft, no pillage by hostile armies, every one of which we have numerous examples of throughout human history.

So you assume this God Market comes into being. And you assume that you somehow get a big enough nut in it that you can pay for your uploading and pay for your security and maintenance, forever.

This sequence of events seems quite unlikely.

Well, maybe law enforcement will then have the ability to enforce a quadrillion life sentences as punishment for such crimes. Seriously. We do have law enforcement, and I expect that in most future timelines, we'll have some equivalent.

I expect that as well. The Authorities potentially using a quadrillion years in super-hell as punishment for crimes was explicitly part of my argument for why uploading is a bad idea.

Don't upload your mind to parties you don't trust.

It's not enough to only upload to parties you trust. The degree of trust needed is much higher than any peer-to-peer relationship any human has ever had with any other human, and also that trust needs to extend to every party the trusted party trusts, and every party those parties trust, and so on infinitely. You are making yourself into an ownable commodity, and giving ownership of you to a person. But you have no way of withdrawing ownership, and who owns you can change.

Given the stakes, my position is that there is no party you can trust.

There is such a thing as over-updating on a given amount of evidence.

The estimate I've heard recently is that the UK grooming gangs may have raped as many as a million girls. The cops looked the other way. The government looked the other way. My understanding is that the large majority of the perpetrators got away with it, and the few that got caught received minimal sentences for the amount of harm they caused. As for those who allowed them to get away with it, the cops and social workers and government employees and elected officials who steadfastly turned a blind eye, nothing of significance happened to them at all, to my understanding. And here, the downside isn't getting raped, beaten, drugged and pimped for a few years, but rather free access to and complete control over everything you are, for an indefinite and quite possibly prolonged future.

The grooming gangs are a relevant example, because they show that widespread horror is possible with no breakdown in law enforcement or civilization collapse, simply through ideological corruption of an otherwise reasonable, stable system. They are not remotely the worst that can happen when law does break down, as it did in Communist revolutions all over the world in the last century, or in the numerous examples of invasion, warfare, and systematic genocide over the same time period. There is no shortage of examples of failed states.

To sum up: you are counting on money to protect you, on the understanding that you will be economically useless, and the assumption that you will have meaningful investments and that nothing bad will ever happen to them. You are counting on people who own you to be trustworthy, and to only transfer possession of you to trustworthy people. And you are counting on the government to protect you, and never turn hostile toward you, nor be defeated by any other hostile government, forever.

And if any one of these assumptions goes wrong, you will find yourself an impotent object in the hands of an omnipotent god.

I appreciate the thorough response, but I think you're painting an unnecessarily bleak picture that doesn't account for several key factors.

You're right that my argument depends on relatively stable economic institutions, but this isn't as unrealistic as you suggest. We already have financial instruments that span centuries - perpetual bonds, endowments, trusts. The Vatican has maintained financial continuity for over 500 years.

Improving technology makes it at least theoretically possible to have such systems become even more robust, spanning into the indefinite, if not infinite future.

So this argument seems to depend on an eternally-stable investment market where you can put in value today and withdraw value in, say, five thousand years. No expropriation by government, no debasement of currency, no economic collapse, no massive fraud or theft, no pillage by hostile armies, every one of which we have numerous examples of throughout human history.

The precise details of how a post-Singularity society might function are beyond me. Yet I expect that it would have far more robust solutions to such problems. What exactly is the currency to debase, when we might trade entirely in units of energy or in cryptocurrency?

The estimate I've heard recently is that the UK grooming gangs may have raped as many as a million girls. The cops looked the other way. The government looked the other way. My understanding is that the large majority of the perpetrators got away with it, and the few that got caught received minimal sentences for the amount of harm they caused.

Where on Earth did you come across this claim???

Does it not strike you as prima facie absurd? The population of the UK is about 68 million, if around 1.5% of the entire population, or 3% of the women, had been raped by organized "rape gangs", I think we'd have noticed. I live here, for Christ's sake. That's the kind of figure you'd expect in a country under occupation or literally in the midst of a civil war.

The confirmed numbers, which are definitely an understatement, are about 5k girls total. I don't see how you can stretch that by over two orders of magnitude no matter how hard you try.

Putting aside those absurd figures:

The grooming gangs are indeed horrific, but they're not representative of how most vulnerable populations are treated in developed societies. For every Rotherham, there are thousands of care homes, hospitals, and institutions that function reasonably well. The vast majority of elderly people in care facilities, despite being physically vulnerable and economically dependent, aren't systematically abused.

Your examples of state collapse and genocide are real risks, but they're risks that already exist for biological humans. The question isn't whether bad things can happen, but whether the additional risks of uploading outweigh the benefits. A world capable of supporting uploaded minds is likely one with sophisticated technology and institutions - probably more stable than historical examples, not less.

To sum up: you are counting on money to protect you, on the understanding that you will be economically useless, and the assumption that you will have meaningful investments and that nothing bad will ever happen to them. You are counting on people who own you to be trustworthy, and to only transfer possession of you to trustworthy people. And you are counting on the government to protect you, and never turn hostile toward you, nor be defeated by any other hostile government, forever.

You're describing the experience of a retiree.

The "ownable commodity" framing assumes a particular legal framework that need not exist. We already have legal protections against slavery, even of non-standard persons (corporations have rights, as do some animals in certain jurisdictions). There's no reason uploaded minds couldn't have robust legal protections - potentially stronger than biological humans, since their substrate makes certain forms of evidence and monitoring easier.

You mention trust extending through infinite chains, but this misunderstands how modern systems work. I don't need to trust every person my bank trusts, or every person my government trusts. Institutional structures, legal frameworks, and distributed systems can provide security without requiring universal interpersonal trust.

As Einstein, potentially apocryphally, said: compound interest is the most powerful force in the universe. A post-Singularity economy has hordes of Von Neumann swarms turning all the matter within grasp into something useful, with a rate of growth only hard-capped by the speed of light. It's not a big deal to expect even a small investment to compound; that's how retirement funds work today.
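
For concreteness, a quick sketch of the mechanism; the 5% real return and the time horizons are illustrative assumptions, not predictions:

    # Compound growth, the mechanism the argument leans on. The 5% real
    # return and the horizons are illustrative assumptions, not predictions.
    def future_value(principal, rate, years):
        return principal * (1 + rate) ** years

    for years in (50, 200, 1000):
        print(years, f"{future_value(10_000, 0.05, years):.2e}")
    # 50 y   -> ~1.15e+05 (an 11x gain: familiar retirement-fund territory)
    # 200 y  -> ~1.73e+08
    # 1000 y -> ~1.55e+25 (also why no fixed rate can literally run forever;
    #                      even light-speed expansion caps out eventually)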

Further, you assume that I'll be entirely helpless throughout the whole process. Far from it. I want to be a posthuman intelligence that can function as a peer to any ASI, and plain biology won't cut it. Uploading my mind allows for enhancements that mere flesh and blood don't allow.

I could also strive to self-host my own hardware, or form a trusted community. There are other technological solutions to the issue of trust:

  1. Substrates running on homomorphic encryption, where the provider can run your consciousness without ever being able to "read" it.

  2. Decentralized hosting, where no single entity controls your file, but a distributed network does, governed by a smart contract you agreed to (see the sketch below).

  3. I could send trillions of copies of myself into interstellar space.

They really can't get all of me.
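
The second item is less exotic than it sounds: the load-bearing primitive, splitting custody so that no single party can read or withhold the file, already exists. Here's a toy Shamir secret sharing sketch (illustrative parameters, not a real custody protocol):

    # Toy Shamir secret sharing over a prime field: split a secret into n
    # shares so that any k of them reconstruct it and fewer than k reveal
    # nothing. Illustrative stand-in for "no single host controls the file".
    import random

    P = 2**61 - 1                      # a Mersenne prime, our field modulus

    def split(secret, k, n):
        """Random degree-(k-1) polynomial with f(0) = secret; shares are f(i)."""
        coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
        return [(i, sum(c * pow(i, e, P) for e, c in enumerate(coeffs)) % P)
                for i in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers f(0), the secret."""
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            total = (total + yi * num * pow(den, -1, P)) % P
        return total

    shares = split(secret=424242, k=3, n=5)    # any 3 of the 5 hosts suffice
    assert reconstruct(shares[:3]) == 424242
    assert reconstruct(shares[2:]) == 424242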

At the end of the day, you're arguing that because a totalitarian government could create digital hells, I should choose the certainty of annihilation. That's like refusing to board an airplane because of the risk of a crash, and instead choosing to walk off a cliff. Yes, the crash is horrific, but the cliff is a 100% guarantee of the same outcome: death.

Your argument is that because a system can fail, it will fail in the worst way imaginable, and therefore I should choose oblivion. My argument is that the choice is between certain death and a future with manageable risks. The economic incentives will be for security, not slavery. The technology will co-evolve with its own safeguards. And the societal risks, while real, are ones we already face and must mitigate regardless. If the rule of law collapses, we all lose.

The ultimate omnipotent god in this scenario is Death, and I'll take my chances with human fallibility over its perfect, inescapable certainty any day.

On a slightly unrelated note, would you happen to be aware of any current experiments with running software the way you would like to run uploads - encrypted, unrootable, etc.?

Apple using homomorphic encryption for image classification on the cloud:

https://boehs.org/node/homomorphic-encryption

Homomorphically Encrypting CRDTs:

https://jakelazaroff.com/words/homomorphically-encrypted-crdts/

That's for homomorphic encryption in particular, which, AFAIK, is the absolute peak of security. Then you've got more standard setups like VMs on the cloud, and prevention of data leakage between unrelated customers on the same hardware, in the manner that AWS/Azure handle things.
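
If you want intuition for why it's considered the peak, here is a toy sketch of the Paillier scheme, which is additively homomorphic: the host can add numbers it cannot read. The primes are tiny and insecure, purely for illustration; real deployments use 2048+ bit keys and hardened libraries.

    # Toy Paillier cryptosystem. Additively homomorphic: multiplying two
    # ciphertexts adds the plaintexts underneath, so a host can compute a sum
    # without ever seeing the inputs. Tiny primes, for illustration only.
    from math import gcd
    import random

    def keygen(p=1009, q=1013):
        n = p * q
        lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
        g = n + 1                                      # standard simple choice
        # mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
        mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
        return (n, g), (lam, mu)

    def encrypt(pub, m):
        n, g = pub
        r = random.randrange(2, n)                     # random blinding factor
        while gcd(r, n) != 1:
            r = random.randrange(2, n)
        return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

    def decrypt(pub, priv, c):
        n, _ = pub
        lam, mu = priv
        return ((pow(c, lam, n * n) - 1) // n) * mu % n

    pub, priv = keygen()
    c = encrypt(pub, 20) * encrypt(pub, 22) % (pub[0] ** 2)  # add, encrypted
    assert decrypt(pub, priv, c) == 42

The operator handles ciphertexts the whole way through; stretching that from adding two numbers to running a whole mind is, to put it mildly, an open engineering problem.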

RE: Uploading.

Do we really need to worry about our uploads being abused and tortured, or sold for parts? By the time technology is far enough along to upload minds, what really is the value of an upload? It can be copied and modified infinitely. Most likely they can be synthesized, procedurally generated or just generated by "AI"s. If a virtual mind is good for anything, then there will be so many of them purpose-built that nobody needs a pre-singularity upload to do the job.

You'll be a useless scrap of data. Just to be very clear about that.

Other than that, I think you reason it out very well. I'd disagree on the assumptions - might even suspect that your motivation is mostly wishful thinking - but the actual arguments flowing from them seem pretty solid.

While a very nice scifi story, there's very little reason to think that reality will pan out that way.

I wouldn't call the history of every invention "very little reason".

The answer is obvious: the person being uploaded. You and me. People who don't want to die. This completely flips the market dynamic. We are not the product; we are the clients. The service being sold goes from "cognitive labor" to "secure digital immortality." In this market, companies would compete not on how efficiently they can exploit Ems, but on how robustly they can protect them.

How do these emulations get the resources to pay the companies for the service of protection? Presumably they work, no? How does a company make money? By getting more clients? If yes, why compete for the limited number of clients when you can just copy-paste them? We're already seeing a similar dynamic with meatsack humans and immigration; it strikes me as extremely naive to think it would happen less if we make it easier and cheaper.

There is no profit motive behind enslaving and torturing them.

Slavery ensures profit, torture ensures compliance.

I wouldn't call the history of every invention "very little reason".

I guess that's why, after the invention of the hamster wheel, we've got indentured slaves running in them to power our facilities. Enslaving human mind uploads is in a similar ballpark of eminently sensible economic decisions.

How do these emulations get the resources to pay the companies for the service of protection? Presumably they work, no?

Not necessarily. I think you're well aware of my concerns about automation-induced unemployment, with most if not all humans becoming economically unproductive. Mind uploads are unlikely to change that.

What humans might have instead are UBI or pre-existing investments on which they can survive. Even small sums held before a Singularity could end up worth a fortune due to how red-hot the demand for capital would be. They could spend this on backup copies of themselves, if that weren't a service governments provided by popular demand.

By getting more clients? If yes, why compete for the limited number of clients when you can just copy-paste them? We're already seeing a similar dynamic with meatsack humans and immigration; it strikes me as extremely naive to think it would happen less if we make it easier and cheaper.

So do you happen to see an enormous trade in illegal horses, smuggled in to replace honest local tractors in the fields? I suppose that's one form of "mule" hopping the borders. No. Because, in both scenarios, they're obsolete, and almost anything you could do to make mind uploads cheaper would apply just as well to normal AI, which already starts at an advantage.

Slavery ensures profit, torture ensures compliance.

Well, it's an awful shame that we have pretty handy "slaves" already, in the form of ChatGPT and its descendants. Once again, if you have tractors, the bottom falls out of the market for horse-rustling.

Enslaving human mind uploads is in a similar ballpark of eminently sensible economic decisions.

(...) What humans might have instead are UBI

If the minds can't support themselves economically, the obvious incentive is to pull the plug on them, so you don't have to pay them UBI anymore.

or pre-existing investments on which they can survive.

Then the incentive becomes: manipulate the emulations to sign away the rights to their investments, and then pull the plug.

Not necessarily. I think you're well aware of my concerns about automation-induced unemployment, with most if not all humans becoming economically unproductive. Mind uploads are unlikely to change that.

Yes, and I consider most of them to be poorly made, and unresponsive to the most basic criticisms.

Even small sums held before a Singularity could end up worth a fortune due to how red-hot the demand for capital would be

You can't start your criticism with "there's very little reason to think that reality will pan out that way" and then say something like this. I do not grant any claims of "the singularity" happening a single shred of legitimacy unless they come with solid supporting evidence. I grant even less legitimacy to any claims about what will happen to pre-singularity investments; any such claims are pure fan-fic.

No. Because, in both scenarios, they're obsolete, and almost anything you could do to make mind uploads cheaper would apply just as well to normal AI, which already starts at an advantage.

(...) Once again, if you have tractors, the bottom falls out of the market for horse-rustling.

Then follow the logic of the analogy a bit further. Do we see massive horse farms where we devote insane amounts of resources to the horses' amusement? Or are the horses we do keep there for our amusement?

If the minds can't support themselves economically, the obvious incentive is to pull the plug on them, so you don't have to pay them UBI anymore.

"Incentives" are not the be-all and end-all of matters in life.

The police are incentivized to have high levels of crime to justify their salaries. You don't see them running coaching sessions on bank robbery.

Oncologists have "incentives" to keep you alive and cancer-ridden indefinitely to get that sweet insurance money. I know plenty, and I'm afraid that's not an accurate description of any of them.

Then the incentive becomes: manipulate the emulations to sign away the rights to their investments, and then pull the plug.

The number of cemeteries that dig up their clients and sell them for parts is, to the best of my knowledge, small.

The number of investment firms and banks that snatch the funds of the recently departed to spend on their whims is, as far as I'm aware, rather limited.

Cloud service providers don't, as a general rule, steal all your data and sell it to your competitors.

The kind of organization that would run mind uploads would likely be a cross between all of the above.

Do you know why millions of people were kept in chattel slavery throughout history? Because there was a good business argument for it. Even the most abusive sheikh in Qatar doesn't bus in dozens of kaffirs for the sole purpose of beating them up for the joy of it. The majority of people who hate you are more than content to end the matter with a bullet in your brain, and not to keep you around to torture indefinitely.

Besides, I'd like you to consider the possibility, however controversial it might sound, that people and systems sometimes do the right thing even when the first-order effects aren't to their "best interests". And perhaps we might have cops and politicians in some form to help even the scales.

I do not grant any claims of "the singularity" happening a single shred of legitimacy, unless it comes with solid supporting evidence.

In that case, I don't see the point of having this discussion at all.

Or are the horses we do keep there for our amusement?

Yes? The population of horses crashed during the Industrial Revolution, and has only recently recovered, driven almost entirely by recreational demand.

"Incentives" are not the be-all and end-all of matters in life.

Sure, but it's unwise to dismiss them.

The police are incentivized to have high levels of crime to justify their salaries. You don't see them running coaching sessions on bank robbery.

Not incentivizing these things is reason number one why the police are run as a public service instead of a private one.

Oncologists have "incentives" to keep you alive and cancer-ridden indefinitely to get that sweet insurance money. I know plenty, and I'm afraid that's not an accurate description of any of them.

Because the patients have the power to just not go to the ones that would. Not to mention take revenge.

The kind of organization that would run mind uploads would likely be a cross between all of the above.

None of the pressures faced by any of these organisations would apply to mind-upload-runners. It's like insisting there'd be organizations that keep lightbulbs on for absolutely no utility of their own.

Do you know why millions of people were kept in chattel slavery throughout history? Because there was a good business argument for it.

I feel like this makes the case against you rather than for you.

Besides, I'd like you to consider the possibility, however controversial it might sound, that people and systems sometimes do the right thing even when the first-order effects aren't to their "best interests".

Sure. When there is a common idea of what "the right thing" is in society, that people feel very strongly about, they will keep each other in check. It's a bit of an odd argument to make when the common conception of good is falling apart, but in this case specifically, how many people share your ideas of emulations being people?

In that case, I don't see the point of having this discussion at all.

You don't find it odd that the singularity has to be accepted as an article of faith for the discussion to continue?

Yes? The population of horses crashed during the Industrial Revolution, and has only recently recovered, driven almost entirely by recreational demand.

Right, so when an emulation's labour will be to ChatGPT what horse labour was to the tractor, and it will actively cost resources to keep them running, what does that analogy imply about the likely fate of mind-emulations?

Sure, but it's unwise to dismiss them.

Sure. And yet I invite you to show me how I'm "dismissing" them. All I've done is point out the competing incentives, regulatory, legal and ethical, which I expect to solve the problem.

Because the patients have the power to just not go to the ones that would. Not to mention take revenge.

Are you familiar with the literature on the principal-agent problem? It's not remotely as simple as "just not go to the ones that would".

I will leave aside the fact that there's no physical law demanding that prospective mind uploads use a single compute provider, or denying them the option to self-host, and that there will likely be persons or organizations that can take "revenge" on their behalf.

PETA exists as an organization that takes "revenge" on the behalf of random animals, to set the floor rather low but not zero.

I feel like this makes the case against you rather than for you.

I feel like it doesn't, or I wouldn't have used that analogy. Please explain.

You don't find it odd that the singularity has to be accepted as an article of faith for the discussion to continue?

God. Leaving aside such loaded phrases as "article of faith", I think that it's very likely that we have some form of technological Singularity within our nominal life expectancy.

Even @FCfromSSC acknowledges the possibility of mind uploading, and presumably believes that the kind of rapid technological improvement that we colloquially call a Singularity is a requisite for us to live to see it. He even identifies with the potential mind upload. He however, believes that this is against his best interests.

My interests are to attempt to demonstrate why I think this is a mistake, or at the least, throwing the baby out with the bathwater. Consider that, from my perspective, an altruistic act.

If you don't think that mind uploads are a possibility, or that we won't live to see them, my interest in debating with you is minimal. What would the point even be? Alas, I'm here, because I suppose I have a sadomasochistic streak and will argue just about anything.

Right, so when an emulation's labour will be to ChatGPT what horse labour was to the tractor, and it will actively cost resources to keep them running, what does that analogy imply about the likely fate of mind-emulations?

Naively? Bad things. Less naively? Everything I've argued for so far.

But consider that it's not just the emulations that will be in the place of horses. If emulations are horses, then good old-fashioned meat-and-bone humans would be closer to a horse with a broken leg.

Being an advocate for outcomes that don't literally kill all humans, I believe in attempting to steer the course of our technologies and laws in a direction that doesn't lead to this.

Relevant sarcastic comment by qntm in the comments of Lena:

"Why should I care about other people?"

All instances of people caring about other people in history, so far, have happened under the assumption that any given person could, in theory, be in another person's place.

The horror of Lena is that this assumption is destroyed. The technology is mind copying, not mind transfer. Every single person who is scanned will go inside the facility and will come out. There is no mechanism to shift perspective, ever: the material and the digital substrates never cross directly. If you experience living in reality now (as opposed to remembering it), by induction you can be sure that you will never experience living as an em.

Ever.

This puts the suffering of ems at a greater distance than even the suffering of animals, for a person could fathom a timeline where, but for the grace of God, he lives the life of cattle. No such mechanism to facilitate empathy would exist for copied scans. They would be as fictional characters, whose suffering evokes vivid emotions in many but never a desire to stop it by refusing to create fiction.

I have a dim opinion of the Rawlsian veil of ignorance, but even so, there are a million issues with such claims.

If you experience living in reality now (as opposed to remembering it), by induction you can be sure that you will never experience living as an em.

This claim implicitly but load-bearingly assumes that a post-Singularity civilization won't have the ability to create simulations indistinguishable from reality.

Even today, we have no actual rebuttal for the Simulation Hypothesis. You and I could be simulations inside a simulation, but it's a possibility we can't prove or exclude at the moment, so the sensible thing to do is to ignore it and move on with our lives.

Even if you did start out as a Real Human, then I think that with the kind of mind editing in Lena, it would be trivial to make you forget or ignore that fact.

Further, I don't think continuity of consciousness is a big deal, which is why I don't have nightmares about going to take a nap. As far as I'm concerned, my "mind" is a pattern that can be instantiated in just about any form of compute, but at the moment runs in a biological computer. There is no qualitative change in the process of mind uploading, at least a high-fidelity one, be it a destructive scan or one that preserves the original brain.
