This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

Consider the following thought experiment, courtesy of Scott Sumner.
I think this is a good counterpart to the AGI questions below. There is a massive conceptual gap in defining welfare across vastly different levels of technological mastery.
It also highlights that some of the analysis misses the largest factor here: that AGI (if it happens, which sadly it may not) will greatly increase the quality and personalization of a large set of goods & services. If that does happen, it will dwarf the distributional aspects.
But if AGI happens, isn't ASI right around the corner? If AI can produce excellent personalized media, surely it can also make better AI. And if ASI is in reach, every available resource will be tapped to reach it first.
Distribution is of utmost importance! The distribution of power will be wildly upset. AGI cannot be considered like any other technology in history; it's an actor rather than a tool or a method. AGI, by definition, means a mass-producible, high-quality person-in-a-box equivalent. That alone is an unprecedented achievement. ASI is a mass-producible superhuman being. Better to think in terms of summoning forth demons or djinns or faeries; it's vital to cleanse away all economic preconceptions.
Economics assumes peaceful competition and the rule of law, unchanging and clear distinctions between capital and labour, a world where 'labour' can add value to the economy... It's not the right tool for the job.
I will note that some people (e.g. Eliezer) predict that ASI will indeed drastically reduce socioeconomic inequality between humans. Just not in a good way.
If alignment turns out to be as simple as the billionaires believe, I'd guess the median AI billionaire will probably be willing to yield half a steradian of the light cone to the rest of us, which would still imply an enormous quality-of-life increase over our present state.
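For scale, a rough worked calculation (mine, not the commenter's): a full sphere subtends $4\pi$ steradians, so half a steradian is only about 4% of all directions, yet at cosmic scale that is still an unimaginably large endowment.

\[
\frac{0.5\ \mathrm{sr}}{4\pi\ \mathrm{sr}} \approx \frac{0.5}{12.57} \approx 0.04 = 4\%
\]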
Even if things go well and the plebs do get scraps off the plate, how do we subdivide them amongst ourselves? X was an armed burglar, Y kicked puppies, Z posted mean things on the internet, ZA was a really nice guy, ZB is from a historically marginalized group... I think we're being too starry-eyed about resource distribution. If we're carving up the lightcone, we can also reproduce fast, clone ourselves, and use up superhuman amounts of resources. We might drop down to subsistence quickly (by immortal standards). What if people born in simulation demand that fleshbags share the wealth they're hogging through inefficient resource use?
We might get extremely toxic discourse like 'Alice worked 10 hours a day as a nurse and is poor as fuck; Bob bought bitcoin in the hopes of buying some child pornography and is a millionaire,' except it's NVIDIA shares and scaled up immeasurably, such that Alice's whole bloodline is born into poverty for the rest of time because she took on student debt and never accumulated capital.
There are all kinds of resource-distribution problems that deserve consideration, and I think we need to be thinking about them beforehand. Most important, of course, is not letting psychopathic men or machines exterminate the rest of us, but even the lesser problems of wealth distribution have already seen tens of millions butchered!
I think we need a strong consensus on distribution of power, to prevent a singleton.
If you want to avoid a singleton, the best advice I can give you is to go full Butlerian Jihad. Singletons are a strong attractor state for competition between AIs in a way they aren't for competition between humans.
Trying to wage war against computing is like waging war against guns: you are sure to lose. Personally, I don't see any way out of this mess besides a miracle. Our civilization really struggled with baby's first game-theory problem of 'avoid mutually assured destruction.' We still haven't cracked down on gain-of-function virus research. What chance is there that we can manage superintelligence properly? Maybe actions taken right now will turn out to be vaguely helpful by some unknown mechanism, but my expectations are very low.