Culture War Roundup for the week of May 8, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

UBI is a transition strategy; in the long term it seems less likely that people will accept a state of affairs in which those who happened to be rich in 2025 get to be rich forever, while people who weren't have no way of bettering themselves beyond the basic income.

Who says people need to accept reality for it to be imposed upon them?

Mass unemployment means the balance of military power shifting decisively towards machines over people. How hard is it to put a machine-gun on a Boston Dynamics dog? How hard is it to put a gun or a bomb on a drone? Not very hard; we're seeing it in Ukraine, and it has considerable battlefield impact, whether it's dropping grenades on trenches or more sophisticated Lancets.

Machines are better than people in terms of combat power. They're compact, they don't fall asleep, and they're tough, fast and easily replaced. The only advantage people have is intelligence: the knowledge of how to traverse rugged terrain, climb over a fence, recognize that hearing explosions means you should duck for cover... On an operational level, people are better at commanding armies too, but only because they're more intelligent, not because they're faster or capable of processing more information.

All these advantages in intelligence are diminishing. Palantir, for instance, is working on AI that organizes tactics semi-autonomously and calls in drone strikes: https://twitter.com/thealexbanks/status/1651560588362002433

If we have mass unemployment from AI, then it's fairly easy to mechanize the war industry as well. It isn't technically difficult to produce artillery shells; this is technology over a century old! If truck drivers are gone, then AI can get a robotic howitzer to drive to the front and organize resupply trucks and drone surveillance craft for spotting. It's the same for SAMs. A war run by AI would be very quick, given how fast units could communicate. Artillery would be coming in within 30 seconds or so; targets wouldn't have time to move before they get obliterated. Infantry can be replaced by small tanks or dog-like robots, making up in numbers and coordination for what they might lack in the ability to dig trenches. Artillery/nerve gas > trenches anyway.

Air warfare favours machines even more than ground combat. Machines don't care about G-force, they don't need oxygen or parachutes. Aircraft can be slimmer and much cheaper. Even assuming technical parity, AI can pull off inhumanly precise bursts with its cannons. https://www.businessinsider.com/ai-just-beat-a-human-pilot-in-a-simulated-dogfight-2020-8

Modern navies are also effectively made up of people who sit in front of computers, supervising guns and missiles that fire automatically. I'll grant that it's extremely hard to build decent ships quickly; they can't really be mass-produced. Human-controlled navies would dominate in numbers and tonnage for a long time. But maybe spamming anti-ship missiles until one gets through would be the dominant strategy.

The main weakness of automated war machines would be decapitation strikes against server infrastructure and communications. Ideally these would be underground and well-defended. But perhaps they'd be legacy infrastructure on the surface, something that might be easily nuked. Alternatively, if an opposing force achieved superiority in space, it could deny satellite communications and surveillance. This would be pretty hard, though, since machines dominate in space even more than in the skies.

I conclude that in worlds with mass unemployment from automation, we will be in the realm of military solutions, not political ones. If the 1% can win a war against the 99%, why would they sacrifice their wealth and power?

What binds the 1% together as a military force beyond wealth? Who says that the forces of a Musk would align with a Bezos would align with a Gates would align with a Soros?

Well, none of them want their money and power taken away!

I think I had in mind a 1% consisting of a small class of officials, billionaires, engineers and generals. They're the people who already run the world (engineers excepted); they're vaguely on the same side. They will probably also fight each other for final dominance, but it makes sense to pick off the weakest parties first.

My core idea is that if people can't do any productive work (mass unemployment), that includes soldiering too.

I can imagine a future where enough robots/drones/whatever have been produced that any human forces become irrelevant, but it would be only one possible future. Other futures, where arming a mass of human auxiliaries can still give an edge in a conflict, seem more likely to me.

I also feel that this framing of "the 1%" simply exterminating everyone else assumes a lot more sociopathy than actually exists. Leaders throughout history have certainly spent lives freely, but we've hardly ever seen them completely disregard their "lessers". Even genocidal maniacs like Hitler and Stalin typically targeted certain groups, not all of humanity.

And speaking of Hitler and Stalin, the current crop of dictators would almost certainly count among the 1%, and many of them seem to possess very strong nationalistic streaks. Clearly the likes of Xi and Putin would go to great lengths to protect their wealth and status, but I can't imagine they would completely abandon their countrymen - what's the point of elevating to godhood if there is no one around to worship you? Plus, I'd say an elites-vs-masses conflict is much less likely than good old-fashioned wars between nations and races, if anything is going to lead to near-extinction.

we've hardly ever seen them completely disregard their "lessers"

Well, if their lessers had absolutely no recourse, if they couldn't even rebel... then we're dealing with a fundamentally new kind of situation, outside of recorded history.

The leadership class doesn't need to exterminate humanity, only to seize whatever it requires for immediate warfighting. This would be stuff like power plants, fuel, industrial facilities, computing equipment, mines... We need those things too! They're unlikely to democratize their power - power wants to be centralized. No state holds a referendum before starting a war; that opens up all kinds of vulnerabilities and complications. I think that states that hold back from autonomous weapons, states that try to be ethical, and sections of elites that don't fully commit to the competition will all fall behind.

I believe that power corrupts, and that absolute power corrupts very greatly, if not absolutely. Maybe there are some people trustworthy enough to hold enormous power - but do you trust that our current systems will put them in charge, and that they can defeat more ruthless opponents?

what's the point of elevating to godhood if there is no one around to worship you?

Cloning yourself? Subjugating the rest of humanity and ruling like a god? My prediction encompasses extermination, replacement, subjugation, mass brainwashing, selective genocide, getting tossed into some rinky-dink upload-simulation that's the equivalent of UBI for a penny-pinching posthuman... everything except true decentralization of power: a world where ordinary people have real agency in their transhuman/posthuman lives, where people get to ask for more than UBI, as OP posits. I think the game was rigged from the start; we're completely fucked.