
Culture War Roundup for the week of January 27, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


One of these cultists, described as an "Oxford trained computer scientist", recently (on January 17) killed the 82-year-old landlord who had previously fought off an attack by Zizians and survived a sword through his chest. He had a gun, of course.

https://openvallejo.org/2025/01/28/man-accused-of-killing-witness-in-vallejo-could-face-death-penalty-da-says/

An Oxford-trained computer scientist could face the death penalty for allegedly killing an 82-year-old Vallejo landlord to prevent him from testifying in a murder case against his former tenants, according to the Solano County District Attorney’s Office.

Prosecutors allege in a complaint filed Monday that 22-year-old Maximilian Snyder was “lying in wait” for Curtis Lind before stabbing him outside his property around 2:18 p.m. Jan. 17 near Lemon and Third streets in Vallejo. The complaint charges Snyder, who made his first court appearance Tuesday afternoon, with first-degree murder with special circumstances. He is being held without bail at the Solano County Jail in Fairfield.


It's a seemingly pretty irrational cult, as these guys first attacked the man with knives and then later stabbed him to death. Quietly poisoning him with something that would look like a natural death seems far more rational and well within the abilities of computer scientists, and yet...

One of their beliefs is that unrelenting violence against oppressors such as BBQers, landlords, and elderly parents who keep calling you their son is justified even if it's not effective, because it can improve your ideology's bargaining position averaged across every other reality in the multiverse.

It's sort of like dying in a game of chicken and saying, "yeah, but committing to that head-on collision helped me win in other worldlines."

Wait, is this actually what they believe, or are you exaggerating? This is Scientology levels of completely delusional.

They called it timeless-decision-theoretic-blackmail-absolute-morality theory on LessWrong.

I'm pretty sure that's not how it works, since almost anything to do with timeless decision theory is basically incomprehensible and could never be dumbed down into something as concrete as stabbing your landlord with a sword. If you're killing someone in the name of Wittgenstein or Derrida, you're doing something wrong (on several levels). Maoism, on the other hand, smiles upon executing landlords.

could never be dumbed down into something as concrete as stabbing your landlord with a sword.

As the meme goes, you are like a little baby. Watch this.

The government is something that can be compromised by bad people. And so, giving it tools to “attack bad people” is dangerous, they might use them. Thus, pacts like “free speech” are good. But so is individuals who aren’t Nazis breaking those rules where they can get away with it and punching Nazis.

<...>

If you want to create something like a byzantine agreement algorithm for a collection of agents some of whom may be replaced with adversaries, you do not bother trying to write a code path, “what if I am an adversary”. The adversaries know who they are. You might as well know who you are too.

Alternatively, an extended Undertale reference that feels so on the nose it almost hurts (yes, fucking Chara is definitely the best person to mentally consult while trying to rationalize your actions).

Once you make "no-selling social reality" your professed superpower, I imagine the difference between performing Olympic-level mental gymnastics to justify eating cheese sandwiches and coming up with legitimate reasons to stab your landlord is negligible. (I know the actual killer is a different person, but I take the patient zero as representative of the "movement".)

Alternatively, an extended Undertale reference that feels so on the nose it almost hurts (yes, fucking Chara is definitely the best person to mentally consult while trying to rationalize your actions).

I'm not very well versed in Undertale lore, so can you point out how this is an extended Undertale reference?

[cw: spoilers for a 10-year-old game]

In brief, Chara is the most straightforwardly evil entity in all of Undertale and the literal embodiment of soulless "number go up" utilitarian metagaming. One of the endings (in which your vile actions quite literally corporealize it) involves Chara directly taking over the player avatar, remarking that you-the-player have no say in the matter because "you made your choice long ago" - hypocrite that you are, wanting to save the world after having pretty much destroyed it in pursuit of numbers.

Hence the post's name and general thrust, with Ziz struggling over having to do evil acts (catching sentient crabs) to fund a noble goal (something about Bay Area housing?):

In deciding to do it, I was worried that my S1 did not resist this more than it did. I was hoping it would demand a thorough and desperate-for-accuracy calculation to see if it was really right. I didn’t want things to be possible like for me to be dropped into Hitler’s body with Hitler’s memories and not divert that body from its course immediately.

After making the best estimates I could, incorporating probability crabs were sentient, and probability the world was a simulation to be terminated before space colonization and there was no future to fight for, this failed to make me feel resolved. And possibly from hoping the thing would fail. So I imagined a conversation with a character called Chara, who I was using as a placeholder for override by true self. And got something like,

You made your choice long ago. You’re a consequentialist whether you like it or not. I can’t magically do Fermi calculations better and recompute every cached thought that builds up to this conclusion in a tree with a mindset fueled by proper desperation. There just isn’t time for that. You have also made your choice about how to act in such VOI / time tradeoffs long ago.

So having set out originally to save lives, I attempted to end them by the thousands for not actually much money.

I do not feel guilt over this.

It really can't be more explicit. I took it as an edgy metaphor (like most of his writing) at first reading, but it really is a pitch-perfect parallel: a guy has a seemingly genuine crisis of principles, consciously picks the most evil, self-serving path imaginable out of it, fully conscious of each individual step, directly acknowledging the Chara influence (he fucking spells out "override by true self"!), and manages to reason himself out of what he just did anyway. Now this is Rationalism.

I knew about Undertale's general outline but couldn't piece it together, so thanks for doing that. So, in essence, Ziz identifies one-to-one with Chara, an avatar of utilitarianism. He excuses his actions by simply asserting that his "true self" is a soulless consequentialist; he bypasses moral deliberation, or any crisis of principles, by saying that whatever actions put him into conflict with himself are expressions of his true self. And because they are expressions of his true self, and therefore out of his control, he should not feel guilt over them. Determinism taken to its logical conclusions. Rationalism is just its beast.