
Culture War Roundup for the week of May 8, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


UBI is a transition strategy; in the long term it seems less likely that people will accept a state of affairs in which those who happened to be rich in 2025 get to be rich forever, while those who weren't have no way of bettering themselves beyond the basic income.

Who says people need to accept reality for it to be imposed upon them?

Mass unemployment means the balance of military power shifting decisively towards machines over people. How hard is it to put a machine gun on a Boston Dynamics dog? How hard is it to put a gun or a bomb on a drone? Not very hard; we're seeing it in Ukraine, with considerable battlefield impact, whether it's grenades dropped on trenches or more sophisticated Lancets.

Machines are better than people in terms of combat power. They're compact, they don't fall asleep, and they're tough, fast and easily replaced. The only advantage people have is intelligence: the knowledge of how to traverse rugged terrain, climb over a fence, or recognise that hearing explosions means you should duck for cover... On an operational level, people are better at commanding armies too, but only because they're more intelligent, not because they're faster or capable of processing more information.

All these advantages in intelligence are diminishing. Palantir, for instance, is working on AI that organizes tactics semi-autonomously and calls in drone strikes: https://twitter.com/thealexbanks/status/1651560588362002433

If we have mass unemployment from AI, then it's fairly easy to mechanize the war industry as well. It isn't technically difficult to produce artillery shells; the technology is over a century old! If truck drivers are gone, then an AI can get a robotic howitzer to drive to the front and organize resupply trucks and drone surveillance craft for spotting. It's the same for SAMs. A war run by AI would be very quick, given how fast units could communicate. Artillery would be coming in within 30 seconds or so; targets wouldn't have time to move before they get obliterated. Infantry can be replaced by small tanks or dog-like robots, making up in numbers and coordination for what they might lack in the ability to dig trenches. Artillery/nerve gas > trenches anyway.

Air warfare favours machines even more than ground combat. Machines don't care about G-forces and don't need oxygen or parachutes. Aircraft can be slimmer and much cheaper. Even assuming technical parity, AI can pull off inhumanly precise bursts with its cannons. https://www.businessinsider.com/ai-just-beat-a-human-pilot-in-a-simulated-dogfight-2020-8

Modern navies are also effectively made up of people who sit in front of computers, supervising guns and missiles that fire automatically. I'll grant that it's extremely hard to build decent ships quickly; they can't really be mass-produced. Human-controlled navies would dominate in numbers and tonnage for a long time. But maybe spamming anti-ship missiles until one gets through would be the dominant strategy.

The main weakness of automated war machines would be decapitation strikes against server infrastructure and communications. Ideally these would be underground and well defended. But perhaps they'd be legacy infrastructure on the surface, something that might be easily nuked. Alternatively, if an opposing force achieved superiority in space, it could deny satellite communications and surveillance. This would be pretty hard, though, since machines dominate in space even more than in the skies.

I conclude that in worlds with mass unemployment from automation, we will be in the realm of military solutions - not political ones. If the 1% can win a war against the 99%, why would they sacrifice their wealth and power?

Coming in late, but no.

Fully automated warfare is as of right now science fiction.

A human with a gun is worth WAY more than a robo-dog with a gun; the best autonomous platforms in the world are worse than a rando with 3 days' training, a shovel, and a WW1-relic bolt action.

The only place where this might change soon is in the sky, where meat has fewer obvious advantages.

And even in the air, anything a drone can do, that same drone managed by a human can do better.

Fully automated warfare is as of right now science fiction.

Let me emphasise that I was not talking about now, but about a world where machines have caused mass unemployment! Conditional on them replacing enormous numbers of human truck drivers, miners, factory workers, oil workers... it's reasonable that they replace humans on the battlefield.

Furthermore, nobody seems to have tested things like a Ghost Robotics dog with a gun on it against people in real combat. I imagine there are various tactics human soldiers could use to overcome them, using camouflage or whatever. But robot dogs also have various advantages, including price and accuracy (I'm willing to bet on the dog's marksmanship over your 3-day conscript) - this one can be moved around by drone: https://www.businessinsider.com/china-machine-gun-mounted-robot-dog-black-mirror-2022-10

anything a drone can do, that same drone managed by a human can do better.

No. Flying in a contested environment means drone aircraft can't rely on long-range (medium-to-high latency) communications with a human controller, due to ECM. Beaming long-range signals around also has negative stealth implications. Humans don't have the speed or skill of machines in air combat, as I pointed out with the simulated duel between a human ace and a robot pilot, where the robot's inhuman gun accuracy got it the W. Furthermore, autonomous aircraft do better together: they can swarm, communicating with each other over short ranges and coordinating faster and more effectively than humans can. What does the managing human bring to the table that the successors of Palantir's tactics bots can't?

Let me emphasise that I was not talking about now, but about a world where machines have caused mass unemployment! Conditional on them replacing enormous numbers of human truck drivers, miners, factory workers, oil workers... it's reasonable that they replace humans on the battlefield.

In that context, I'll give it to you. Once you have a single robot that can dice an onion AND mop a floor AND determine if someone has dined afore dashing, we are fucked as a species.

I just think it is not going to happen any time soon, and will never happen under the current paradigm.

Furthermore, autonomous aircraft do better together: they can swarm, communicating with each other over short ranges and coordinating faster and more effectively than humans can. What does the managing human bring to the table that the successors of Palantir's tactics bots can't?

Actually make tactical decisions that pan out, predict other agents consistently, handle complex situations, handle any change in surfaces, and basically do anything at all outside of the renderite test environment.

If a drone fighter gets into a turning fight with a human, of course it will win, because it doesn't have any blood; just like I wouldn't win a wrestling match with an industrial press if I stood underneath it. The solution is for the ape to simply not get into the shitty situation; and current-paradigm drones don't have the capacity to consistently force opponents into losing positions.

Big caveat: I don't know what the drones that CAN do these things look like, other than "not like this"; they could arrive anytime in a 5-500 year timespan, which doesn't get me any internet points for predicting.

handle any change in surfaces

I don't know what you mean by this - are you talking about the control surfaces of the plane getting damaged? Modern planes are fly-by-wire anyway; the computer handles the control surfaces. Everything else can be handled by throwing billions of simulated engagements at the machine to train it. AlphaZero can predict other players and make tactical decisions just fine, better than any human.
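For what it's worth, "billions of simulated engagements" is basically self-play-style training: pit a candidate policy against the current best in a simulator and keep whichever wins more often. A deliberately toy sketch of that loop - the one-number "skill" policy, the noisy duel simulator, and the hill-climbing update are all made up for illustration, not a model of any real air-combat trainer:

```python
# Toy sketch of training by simulated engagements (self-play hill-climbing).
# Every name and number here is illustrative only.
import random

def simulate_engagement(policy_a, policy_b):
    """Run one noisy simulated duel; return True if policy_a wins."""
    return policy_a + random.gauss(0, 1.0) > policy_b + random.gauss(0, 1.0)

def train_by_self_play(rounds=1000, duels_per_round=100):
    champion = 0.0  # current best policy, reduced to a single parameter
    for _ in range(rounds):
        challenger = champion + random.gauss(0, 0.1)  # small random variation
        wins = sum(simulate_engagement(challenger, champion)
                   for _ in range(duels_per_round))
        if wins > duels_per_round // 2:
            champion = challenger  # keep whichever policy wins more duels
    return champion

if __name__ == "__main__":
    print(f"final champion 'skill': {train_by_self_play():.2f}")
```

The point is only that the training signal comes entirely from the simulator, so the quality of the resulting policy is bounded by how well the simulation matches reality - which is where the "surfaces" objection above bites.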

What complex situation is there where machines can't cope? Airspace violations, I guess, where you're not supposed to engage and humans will be snarling at each other on the radio. Is that really relevant to a war situation? Say there is some trick human pilots can pull - it surely can't consistently overcome the advantages of autonomous weapons being cheaper, faster to react, stealthier, longer-ranged and so on. All it takes is an OTA patch to remove the exploit; no patch is going to make humans like machines.

Current-paradigm drones don't have the capacity to consistently force opponents into losing positions because there's a giant pilot mafia running every air force worldwide that suppresses the development of autonomous fighters. The X-47 is decade-old technology and worked fine; they just decided to convert it from recon-strike to refuelling manned aircraft, and the program was eventually scrapped for no clear reason. Of course air forces don't want to give up their prestigious, expensive manned jets; they don't want to say goodbye to the romance of air warfare (which is pretty dumb seeing how they just fling BVR missiles at each other these days). Cavalry didn't want to give up the lance - why would pilots want to give their money and culture away to limp-wristed programmers?

I just saw this lol

If you are still interested: I use the word "surfaces" here to mean the interface between the world model of a given system and the sense data it is receiving, which is kinda specific and maybe only means anything to me.

It is a hard problem to create a system that can handle a sudden change in surfaces in this sense without resorting to a million zillion special cases, which is what we do now.

I am continuously confounded by the ability of humans to refuse to think one step ahead.

Do you think it's a knockdown argument to dismiss advances in drone technology as "science fiction" given the pace of progress?

Do you need the robot dog to piss on your lawn and point a gun in your face before you worry about how they'll exceed human capabilities in astonishingly short time frames?

If you think science fiction isn't real, take a look around; you're already living in it.

As I actually have qualifications and experience somewhere near the field: Hell yes.

None of the current glorified algorithms people are doomsaying about will ever manifest that doom vis-à-vis piloting a robot dog.

The real benefit they offer is as a force multiplier for a human with training and expertise, not as fully independent platforms.

And here is my strong claim: they never will. The age of fully autonomous drone war will inevitably come about; it could be in 5 years or 500, but not under the current paradigm.

...what specifically about robot dogs makes them such a dead end?

The amount of energy, computation, and engineering you need to make a thing walk on the ground with legs, as compared to conscripting a wagie, simply makes no sense.

You can do it, but you could also do counterbattery fire by dropping big rocks from a C-5 Galaxy.

Drone platforms become more viable as they get larger and further from areas that are cluttered with bullshit, which is why aerial drone platforms were flying around the stratosphere in the '70s and are currently zooming around at sea level + 30 m, while self-driving cars still can't make a reliable 45-degree left turn.

What binds the 1% together as a military force beyond wealth? Who says that the forces of a Musk would align with a Bezos would align with a Gates would align with a Soros?

Well none of them want their money and power taken away!

I think I had in mind a 1% consisting of a small class of officials, billionaires, engineers and generals. They're the people who already run the world (engineers excepted), and they're vaguely on the same side. They will probably also fight each other for final dominance - but it makes sense to pick off the weakest parties first.

My core idea is that if people can't do any productive work (mass unemployment), that includes soldiering too.

I can imagine a future where there has been enough production of robots/drones/whatever that any human forces become irrelevant, but it would only be one possible future. Other futures, where arming a mass of human auxiliaries can give an edge in a conflict, seem more likely to me.

I also feel that this framing of "the 1%" simply exterminating everyone else assumes a lot more sociopathy than actually exists. Leaders throughout history have certainly spent lives freely, but we've hardly ever seen them completely disregard their "lessers". Even genocidal maniacs like Hitler and Stalin typically just targeted certain groups, not all of humanity.

And speaking of Hitler and Stalin, the current crop of dictators would almost certainly count among the 1%, and many of them seem to possess very strong nationalistic streaks. Clearly the likes of Xi and Putin would go to great lengths to protect their wealth and status, but I can't imagine they would completely abandon their countrymen - what's the point of elevating to godhood if there is no one around to worship you? Plus, I'd say an elite-vs-masses conflict is much less likely than good old-fashioned wars between nations and races, if anything is going to lead to near-extinction.

we've hardly ever seen them completely disregard their "lessers"

Well if their lessers had absolutely no recourse, if they couldn't even rebel... then we're dealing with a fundamentally new kind of situation outside of recorded history.

The leadership class doesn't need to exterminate humanity, only to seize whatever it needs for its immediate warfighting: stuff like power plants, fuel, industrial facilities, computing equipment, mines... We need those things too! They're unlikely to democratize their power - power wants to be centralized. No state holds a referendum before starting a war; that opens up all kinds of vulnerabilities and complications. I think that states that hold back from autonomous weapons, states that try to be ethical, and sections of elites that don't fully commit to the competition will all fall behind.

I believe that power corrupts, and that absolute power corrupts very greatly, if not absolutely. Maybe there are some people trustworthy enough to hold enormous power - but do you trust that our current systems will put them in charge, and that they can defeat more ruthless opponents?

what's the point of elevating to godhood if there is no one around to worship you?

Cloning yourself? Subjugating the rest of humanity and ruling like a god? My prediction encompasses extermination, replacement, subjugation, mass brainwashing, selective genocide, getting tossed into some rinky-dink upload simulation that's the equivalent of UBI for a penny-pinching posthuman... everything except a true decentralization of power: a world where ordinary people have real agency in their transhuman/posthuman lives, where people get to ask for more than UBI, as OP posits. I think the game was rigged from the start; we're completely fucked.

I think to make this work, you also need automated construction and maintenance, including maintenance of maintenance drones, recursively.

At which point you're not in a 1%-rulership scenario, but in an ALife/digital-ecosystem takeoff scenario.

Maybe so. But we're assuming that AI is already 30-50% of world GDP and that there's 30-50% unemployment. Isn't that a takeoff scenario? OP said 'in the long term' too, so we might even be talking about a post-takeoff scenario where the balance of power is ridiculously lopsided.