
Culture War Roundup for the week of September 4, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I am a very weird human being. When I first read this stuff on LessWrong as a teenager I remember being very annoyed by how smug they seemed about "hey, breaking news, you should believe true things and not false things."

I have a similar peeve, but for exactly the opposite reason: this whole credo is obvious bullshit. Even rationalists like Yudkowsky do not really practice it - take, for example, his annoyance at the Roko's Basilisk idea leaking, or his secrecy around how he manages to get out of the box when pretending to be the AI. Why do that? Just set the truth free. If it destroys countries or even the whole of humanity, then it should be destroyed, right? The cold truth is defined as the highest value, so what is the problem?

Anyway, there are many ways one can save "belief in untruth". One way is to defer to an authority: I cannot evaluate whether the Many Worlds or the Spontaneous Collapse interpretation of quantum mechanics is true, but I think expert A is trustworthy, so I take his word for it. In a way this is a belief about the expert and not a belief about the thing itself, so it is cheating a bit. Another way is epistemic humility: Scott Alexander himself once remarked on how he was able to argue untrue points very effectively to people with less knowledge, and his takeaway was to be much more skeptical of his own views, since he could also have been misled. Ironically, rationalists themselves accept this premise - their whole shtick is how an AI can lie to reach its goals. A similar idea is expressed by yet another glib rationalist line, "it all adds up to normality", which basically urges you to be skeptical of conclusions that are too "weird" and to stick to your intuition a bit, even if the evidence seems strong.

Now, given the utilitarianism of rationalists, I do not trust them at all: there is nothing preventing them from lying to me to reach their goals of maximizing utils or whatever. In fact, they are quite upfront about this. The third way to save a belief is right there in the rationalist idea of Bayesian thinking. All it takes for me to defend any belief is to set my prior to a very low value so it is incredibly hard for it to be flipped in my lifetime. And I can still signal my sophistication: my credence in idea X being true shifted a bit in light of new circumstances and I recalculated my posterior, but I still find it unlikely that X is true. That is, unless Scott Alexander or Yudkowsky or other gurus of the rationalist faith say otherwise - then my posterior will shift dramatically.
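
To put purely illustrative numbers on that move (my own toy example, not anything the rationalists prescribe): with a prior of one in a million, even evidence a hundred times more likely if the idea is true only gets me to about one in ten thousand - technically "updated", yet safely unconvinced for a lifetime.

```python
# Toy illustration with made-up numbers: Bayes' rule in odds form,
# posterior odds = prior odds * likelihood ratio.
def update(prior, likelihood_ratio):
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 1e-6                    # "set my prior to a very low value"
posterior = update(prior, 100)  # strong evidence: 100x more likely if the idea is true
print(posterior)                # ~1e-4: shifted "a bit", still very unlikely
```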

Now, maybe this all sounds too harsh; I do not really mind it all that much. But one really has to treat rationalism as yet another pretentious internet fad, as infotainment. There are very useful things I learned from it, and for that I am very thankful. But I think dropping the guru-sounding shit or weird stuff - like defending the value of insect life, or polyamory, or any of the awfully convenient overlap of supposedly cold rationalists with hippie/techbro Silicon Valley culture and ethos - is advised. But sometimes I think I am not harsh enough: listening to Yudkowsky lately, I would not be surprised if he founded some Unabomber-style cult set out to bomb datacenters to prevent the AI apocalypse, which would be the logical step if they really believed in the Truth of the apocalypse so firmly and unshakably. So there is that.

Good lord, it’s been a long time since I’ve read such uncharitable interpretations of single sentences. With analysis like this you can make anyone look bad.

The "analysis" does not really depend on that single "sentence" - although I also think calling it just a sentence is uncharitable. It is not some random sentence from the Sequences; the OP called it a credo, and it is a one-liner tied to rationalism, and to Yudkowsky especially.

What I was getting at was the overall tone of some rationalist writing, which I think "the credo" shows very well: edgy-sounding guru one-liners that are sometimes literally used in normal conversation - the credo in particular was, I think, said unironically by Aella in her interview with Lex Fridman.

I also admit that I am maybe too harsh, that maybe I am taking it all too uncharitably. It is just internet infotainment, there is not that much going on, and rationalists also have one-liners like "it all adds up to normality". But then one remembers that this normality includes saving ants, or AI apocalyptic doomerism, and then I am not so sure what a charitable take on rationalist utilitarianism should look like when it is taken up as an actual moral philosophy by the unwashed masses.

Also, as a closing point, I responded in this manner because the OP described how normal people, including Marxists, do not adhere to the credo. I found that paradoxical, since I do not find rationalists strictly adhering to the credo either; in that sense they have much more in common with Marxists than they might think: they have a materialist, utilitarian moral-philosophical system (one can almost dare say a theology) built from the ground up, from first principles, together with a transhumanist transformative project. It is a philosophy created outside the mainstream, a system created by an outsider "basement-dwelling" philosopher with prolific writing and slight ties to rich donors. I wanted to point out this myopia to the OP.

I suppose he doesn't get an upvote?

If it destroys countries or even the whole of humanity, then it should be destroyed, right? The cold truth is defined as the highest value, so what is the problem?

I think you either misunderstand or are deliberately misrepresenting the point to dunk on the nerds here. Obviously you shouldn't post nuclear codes on Twitter just because they're true - we're talking about the nature of beliefs. "Dangerous information exists" isn't incompatible with the idea that you should try to believe true things, and not random shit that would be convenient. That's just common sense!

Anyway, there are many ways one can save "belief in untruth"

Nothing you said here is even remotely like belief in untruth. Trust authority figures? Also a means of determining the truth, because the whole reason you're trusting them is that you think they're right! Be sceptical of clever-sounding arguments? Don't rush to believe weird things just because you think you have evidence? That's literally just Yudkowskian rationality stated informally! He would probably put it in Bayesian terms - that your odds of hearing a good argument for X are not that much higher given that X is true, and that prior probabilities exist - but it's the same damn thing.
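
To put toy numbers on that Bayesian phrasing (my own illustration, not a quote from the Sequences): suppose you would hear this persuasive argument with probability 0.6 if X is true and 0.3 if it is false. That is a likelihood ratio of only 2, so a sensible prior barely moves.

```python
# Toy numbers, purely illustrative: a clever argument for X that is only
# twice as likely to reach you when X is true moves a 10% prior to ~18%.
prior = 0.10
posterior = (prior * 0.6) / (prior * 0.6 + (1 - prior) * 0.3)
print(posterior)  # ~0.18: the argument alone shouldn't make you a believer
```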

All it takes for me to defend any belief is to set my prior to a very low value so it is incredibly hard for it to be flipped in my lifetime.

I can't find it on a cursory Google, but one of the Sequence posts on this is about how confident it's reasonable to be in your priors, and "so low no reasonable evidence could ever make a difference" is, obviously, too low. Again, common sense.

Now, given the utilitarianism of rationalists, I do not trust them at all: there is nothing preventing them from lying to me to reach their goals of maximizing utils or whatever.

Gosh, it sounds like being so willing to lie could have bad consequences that a consequentialist might want to avoid. Seriously though, there's whole reams of decision theory stuff about how you shouldn't lie! Partly in fact for exactly this reason. A good basic sketch from the Olden Days of why in rat terms you should not in fact abandon all ethics to be "utilitarian" is here.

which would be the logical step if they really believed in the Truth of the apocalypse so firmly and unshakably.

See above. It's not actually logical.

I think dropping the guru-sounding shit or weird stuff

Yeah, agreed, it bothers me a lot too. Yudkowsky in particular seems to just not have much sense of... PR, image, not seeming weird, and it's very annoying. The only thing that annoys me more than LessWrong rationality is how terrible the criticisms of it are. I'd take a hundred weird mystical descriptions of common-sense reasoning over one "these people are weird and cringey which is of course equal to 'wrong' because 'wrong' is just another word for 'bad'" dunking.

As @georgioz says, rationalists have many blind spots. The main one I've found is the 'Not Invented Here' problem, or whatever it's called, where everything has to be invented from first principles. This means rationalists actually miss the Truth quite a bit.

For instance, observe how, instead of accepting that emotions are real and digging into the science of trauma that has been building for decades, rationalists prefer to throw it all out and come up with their own toy model that reduces human beings to automatons.

Rationalists are foolish in many ways, and the most tragic thing is that they think their belief in Truth and Reason means they aren't taking anything on faith or holding any untestable beliefs. Unfortunately for them, there is no way to objectively measure reality outside of human perception, so Truth and Reason are just another God they believe in, albeit one with very sophisticated and labyrinthine scriptures.

"Dangerous information exists" isn't incompatible with the idea that you should try to believe true things

The credo is much stronger than that: it puts the Truth as the ultimate value, not as just something aspirational, or something one "tries" to adhere to but abandons for something else in the presence of "dangerous" information. The credo is not "that which can be destroyed by the truth should be, unless it is dangerous to do so". Of course you can argue for the weaker reading, but then there is no need for edgy-sounding guru lines like the credo. You would then just have an ordinary maxim like "try to tell the truth whenever you can" - it almost sounds like something people like Peterson could say, actually.

Nothing you said here is even remotely like belief in untruth. Trust authority figures? Also a means of determining the truth, because the whole reason you're trusting them is that you think they're right!

Of course it does. I can say that I believe the New York Times or Eliezer Yudkowsky or the Pope, or that I trust the Science. If you pick up a bundle of beliefs, some of them are for sure going to be untrue. This is a common way people come to believe untrue things. And this is also the way rationalists pick up their beliefs, as opposed to some scientific-sounding first-principles reasoning. So again, there is not that much of a difference between rationalists and just regular informed people; in fact, from what I have noticed, rationalists put too much faith in their own thought leaders.

That's just common sense!

Slow down, we are talking about rationalists here; I am not so sure how far an appeal to common sense can carry you. Again, maybe I am too harsh, as most rationalists are just normal people who actually have some common sense - except that the whole rationalist ethos is about overcoming commonsensical reasoning on many things, and there really are some people over there who take these things maybe too literally. That's my whole point.

Seriously though, there's whole reams of decision theory stuff about how you shouldn't lie!

Except when it is dangerous to tell the truth - we already covered that, right?

Anyway, there are many ways one can save "belief in untruth". One way is to defer to an authority: I cannot evaluate whether the Many Worlds or the Spontaneous Collapse interpretation of quantum mechanics is true, but I think expert A is trustworthy, so I take his word for it. In a way this is a belief about the expert and not a belief about the thing itself, so it is cheating a bit. Another way is epistemic humility: Scott Alexander himself once remarked on how he was able to argue untrue points very effectively to people with less knowledge, and his takeaway was to be much more skeptical of his own views, since he could also have been misled. Ironically, rationalists themselves accept this premise - their whole shtick is how an AI can lie to reach its goals. A similar idea is expressed by yet another glib rationalist line, "it all adds up to normality", which basically urges you to be skeptical of conclusions that are too "weird" and to stick to your intuition a bit, even if the evidence seems strong.

I think the much more rational approach is to be clear with yourself about where your beliefs come from, especially if they didn't come from your own investigations. If I'm convinced that Ukraine is doing well in the counteroffensive, that's fine, as long as I'm clear, with myself and with anyone I try to convince, that I'm basing this on news reports and not on defense analysis, talking to people on the ground, or even talking to people in the military. Personally, I think this is almost as important as the idea of putting some sort of certainty quantification on statements.

I believe in “many worlds” at about 60% certainty, but I’ll be honest that my understanding comes from listening to science communicators explain it on TV. Putting it that way at least gives you (and me if I’m willing to be honest about the implications) a baseline of knowing just how seriously to take my claims based on this belief.

I think all this language - about how one is a rationalist, but one should also put numerical credences (ideally down to decimals) on one's beliefs, and be careful about the context and source of one's views, and in the end it should all kind of feel "normal" - is just the usual way rationalists say a simple thing everybody knows in a complicated way. Man, practice some source hygiene, work on your thinking, and trust your intuition a bit. On most beliefs you would end up in line with the majority of informed people.

Paradoxically, it is always on the weird shit that rationalists tout their supposedly revolutionary first-principles approach, espousing mantras like "that which can be destroyed by the truth should be". I am talking about things like saving ants, or taking drugs, or defending some sexual deviancy, or some other defense of weird stuff that nerds really want to rationalize. Guess what, my intuition screams "red alert".

Your information in areas where you are not a specialist will never be any better than that of any other well-informed person. That's sort of the point. A lot of rationalists seem to take information from blogs and videos and so on. That is fine. For most purposes, cosmology as explained by popular science communicators is just fine. Where it becomes a problem is when you take that geeky layman's understanding and pretend it's more than it is. It leads to a kind of arrogance where you assume you know all the relevant details without doing a deep enough dive to really know what's going on.

Likewise, while I don't think it's necessary to go down to decimals of certainty, I do think it's important, especially when reasoning about things, to have some idea of just how sure you are about a given conjecture. If you're not pretty darn sure, then it shouldn't be the lynchpin of an argument or prediction. If you're pretty sure, fine, give it weight, but I'd never advise making a major decision based on something you're less than 80% confident in.