This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
That is why he wrote it that way. He's describing a character, a type of character even, not just a caricature.
I'm all for building artificial worlds. I'm skeptical "better" plants and animals are possible; we've altered plants and animals before, and we can doubtless alter them far more radically in the future, but what makes those alterations "better"? "Living as long as one wants, regardless of age" used to be something I was very excited for, less so after contemplating the downsides. All the pathways to serious immortality I'm aware of involve making the sum of me fully legible, and the risks of that very likely outweigh any possible benefit, assuming it's even possible.
The alternative is thinking that our mastery is not ever-increasing in the way you seem to mean. Technology can and has greatly increased, and maybe it will greatly increase even more, but technology is not the same thing as mastery. If you want a highly reductive example of the difference between the two, compare the original Snow White film to the remake. The people who made the remake had vastly more technology, vastly more resources, vastly more experience in filmmaking to draw on; more "mastery", right? So why was the original a masterpiece, and the remake a trash disaster? Again, that's a highly reductive example, but it seems to me that the principle generalizes quite widely.
I don't think we are moving toward ever-increasing mastery. I don't think we have to stop tech advancement either. I think what will happen next is pretty similar to what has happened before: we'll build something wondrous, and then the contradictions will assert themselves and it will all fall apart.
Technology is the concentration of power. Concentrated power is individual power. There is almost certainly a level of individual power that society, as we understand the term, can't contain or channel, and once that level is achieved society will simply fail. Society maintains technology; when society fails, likely the technology will fail as well, and then it's back down the curve for the survivors.
Maybe this time will be different. I wouldn't bet on it, though.
Would you rather be "fully legible" or fully dead? Easy choice as far as I'm concerned.
Fully dead, and it is indeed an easy choice.
The immortality you pine for would open you up to the most perfect and degrading form of slavery conceivable.
While it makes a very nice scifi story, there's very little reason to think that reality will pan out that way.
It suffers from the same failure of imagination as Hanson's Age of Em. We don't live in a universe where it looks like it makes economic sense to have mind uploads doing cognitive or physical labor. We've got LLMs, and will likely have other kinds of nonhuman AI. They can be far more finely tuned and optimized than any human upload could be while remaining recognizably human, and they cost far less in terms of resources to run. While compute estimates for human brain emulation are all over the place, varying by multiple OOMs, almost all such guesses are far, far larger than a single instance of even the most unwieldy LLM around.
I sincerely doubt that even a stripped down human emulation can run on the same hardware as a SOTA LLM.
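To put rough numbers on that gap, here is a minimal back-of-envelope sketch; every figure in it is an assumption drawn from commonly cited ranges, not something claimed in this thread:

```python
# Back-of-envelope comparison of whole-brain-emulation (WBE) compute estimates
# against a single large LLM instance. All figures are illustrative assumptions:
# WBE estimates span many orders of magnitude depending on how much neural detail
# is simulated, and the LLM figure assumes ~2 FLOPs per parameter per token.

WBE_FLOPS_LOW = 1e18     # assumed low-end estimate for real-time emulation (FLOP/s)
WBE_FLOPS_HIGH = 1e25    # assumed high-end estimate (FLOP/s)

LLM_PARAMS = 1e12        # assumed trillion-parameter model
TOKENS_PER_SECOND = 50   # assumed serving speed for a single instance
LLM_FLOPS = 2 * LLM_PARAMS * TOKENS_PER_SECOND  # ~1e14 FLOP/s

print(f"Low-end WBE vs. one LLM instance:  ~{WBE_FLOPS_LOW / LLM_FLOPS:.0e}x more compute")
print(f"High-end WBE vs. one LLM instance: ~{WBE_FLOPS_HIGH / LLM_FLOPS:.0e}x more compute")
```

Under those assumptions the gap runs from roughly four to eleven orders of magnitude, which is the point being made above: even the low end dwarfs a SOTA LLM.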
If there's no industrial or economic demand for Em slaves, who is the customer for mind-uploading technology?
The answer is obvious: the person being uploaded. You and me. People who don't want to die. This completely flips the market dynamic. We are not the product; we are the clients. The service being sold goes from "cognitive labor" to "secure digital immortality." In this market, companies would compete not on how efficiently they can exploit Ems, but on how robustly they can protect them.
There is no profit motive behind enslaving and torturing them. Without profit, you go from industrial-scale atrocities to bespoke custom nightmares. Which aren't really worth worrying about. You might as well refuse to have children or other descendants, because someone can hypothetically torture them to get back at you. If nobody is making money off enslaving human uploads, then just about nobody but psychopaths will seek to go through the expense of torturing them.
I wouldn't call the history of every invention "very little reason".
How do these emulations get the resources to pay the companies for the service of protection? Presumably they work, no? How does a company make money? By getting more clients? If yes, why compete for a limited number of clients when you can just copy-paste them? We're already seeing a similar dynamic with meatsack humans and immigration; it strikes me as extremely naive to think it would happen less if we make it easier and cheaper.
Slavery ensures profit, torture ensures compliance.
I guess that's why, after the invention of the hamster wheel, we've got indentured slaves running in them to power our facilities. Enslaving human mind uploads is in a similar ballpark of eminently sensible economic decisions.
Not necessarily. I think you're well aware of my concerns about automation-induced unemployment, with most if not all humans becoming economically unproductive. Mind uploads are unlikely to change that.
What humans might have instead is UBI or pre-existing investments on which they can survive. Even small sums held before a Singularity could end up worth a fortune due to how red-hot the demand for capital would be. They could spend this on backup copies of themselves, if that wasn't a service governments provided out of popular demand.
So do you happen to see an enormous trade in illegal horses to replace honest local tractors in the fields? I suppose that's one form of "mule" hopping the borders. No. Because, in both scenarios, they're obsolete, and there's little you could do to make mind uploads cheaper that wouldn't also apply to normal AI, which already starts at an advantage.
Well, it's an awful shame that we have pretty handy "slaves" already, in the form of ChatGPT and its descendants. Once again, if you have tractors, the market for horse-rustling falls through the floor.
If the minds can't support themselves economically, the obvious incentive is to pull the plug on them, so you don't have to pay them UBI anymore.
Then the incentive becomes: manipulate the emulations to sign away the rights to their investments, and then pull the plug.
Yes, and I consider most of them to be poorly made, and unresponsive to the most basic criticisms.
You can't start your criticism with "there's very little reason to think that reality will pan out that way" and then say something like this. I do not grant any claims of "the singularity" happening a single shred of legitimacy unless they come with solid supporting evidence. I grant even less legitimacy to any claims about what will happen to pre-singularity investments; any such claims are pure fan-fic.
Then follow the logic of the analogy a bit further. Do we see massive horse farms where we devote insane amounts of resources to the horses' amusement? Or are the horses we do keep there for our amusement?
"Incentives" are not the be-all and end-all of matters in life.
The police are incentivized to have high levels of crime to justify their salaries. You don't see them running coaching sessions on bank robbery.
Oncologists have "incentives" to keep you alive and cancer-ridden indefinitely to get that sweet insurance money. I know plenty, and I'm afraid that's not an accurate description of any of them.
The number of cemeteries that dig up their clients and sell them for parts is, to the best of my knowledge, small.
The number of investment firms and banks that snatch the funds of the recently departed to spend on their whims is, as far as I'm aware, rather limited.
Cloud service providers don't, as a general rule, steal all your data and sell them to your competitors.
The kind of organization that would run mind uploads would likely be a cross between all of the above.
Do you know why millions of people were kept in chattel slavery throughout history? Because there was a good business argument for it. Even the most abusive sheikh in Qatar doesn't bus in dozens of kaffirs for the sole purpose of beating them up for the joy of it. The majority of people who hate you are more than content to end the matter with a bullet in your brain, and not to keep you around to torture indefinitely.
Besides, I'd like you to consider the possibility, however controversial it might sound, that people and systems sometimes do the right thing even when the first-order effects aren't to their "best interests". And perhaps we might have cops and politicians in some form to help even the scales.
In that case, I don't see the point of having this discussion at all.
Yes? The population of horses crashed during the Industrial Revolution, and has only recently recovered, driven almost entirely by recreational demand.
Sure, but it's unwise to dismiss them.
Not incentivizing these things is reason number one why the police are run as a public service instead of a private one.
Because the patients have the power to just not go to the ones that would. Not to mention take revenge.
None of the pressures faced by any of these organisations would apply to mind-upload-runners. It's like insisting there'd be organizations that will keep lightbulbs on for absolutely no utility of their own.
I feel like this makes the case against you rather than for you.
Sure. When there is a common idea of what "the right thing" is in society, that people feel very strongly about, they will keep each other in check. It's a bit of an odd argument to make when the common conception of good is falling apart, but in this case specifically, how many people share your ideas of emulations being people?
You don't find it odd that the singularity has to be accepted as an article of faith for the discussion to continue?
Right, so when emulations' labour is like horse labour relative to ChatGPT, and it actively costs resources to keep them running, what does that analogy imply about the likely fate of mind-emulations?