This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Notes - Schelling points of the online right and occasionally irrational rationalists.
Go online today and you will see quite a bit of absurd garbage supported by the online right, even as they call out their left-leaning counterparts. The catalyst for this post was the backlash against finasteride I saw yesterday after reactivating my Twitter account.
Finasteride is usually taken as a 1 mg oral pill. It is a 5-alpha reductase inhibitor that successfully treats male pattern baldness by lowering the body's DHT levels, which is fine if you have already gone through puberty. The drug was originally used for men with prostate issues and accidentally ended up being the single most effective intervention for male pattern baldness, even more so than its more potent cousin, dutasteride. The side effects can be quite strong: lower libido, extreme cases of ED, and mood swings; some men report lasting problems described as post-finasteride syndrome. Right-wing faux-masculine bros call for the companies making it to be charged with crimes against humanity. The funny thing is that the proportion of people who get side effects is around 2 percent or less, depending on which study you choose. In fact, it is safer and has fewer reported side effects than many medications people take daily. So why are people lining up against a drug that is not just safe but a damn near modern-day miracle? Nothing can stop male pattern baldness the way it can, so much so that minoxidil, a medication used to promote regrowth, is useless without it, as you will keep losing more hair than you gain. Hair transplants, by the way, require you to stay on the same two drugs, if not more, to preserve your remaining hair.
Seed oils are oils extracted from the seeds of various plants. They are very cheap, and the fast-food industry uses them heavily because of that. Anyone not living under a rock has heard the reasons not to use them: butter, ghee, lard, and olive oil (fats with higher amounts of saturated fat) are supposedly much better by all accounts; even "stats bro" and "IQ-denying" online bully Nassim Taleb swears by them. Yet the data is pretty unfavorable to that view. Now, I am a twig compared to what I wish to be, so I will defer to the people over at Barbell Medicine, doctors who also post very high totals in drug-tested powerlifting. They state that every single paper they came across showed that replacing these "better" fats with seed oils produced much better health outcomes.
I use both. I hopped on finasteride three years ago, and my family has been cooking with seed oils since my grandfather's heart attack. I am willing to ditch both if that is the right call. Yet if you press someone on the online right who swears by the benefits of "sun and steel," they will probably concede that both might be fine, but that some people have had terrible experiences with them. Hence the crusade against them: at minimum it lets people feel less alone when they question the validity of what "science" has to say. Plenty of studies, papers, and people are simply incorrect. You will never see a large-scale study that gets public eyeballs and presents group differences as innate. Hell, the good folks over at ScienceBasedMedicine, a popular skeptic blog that did its best to stay neutral and was fairly rigorous, go out of their way to lie about "science" when it comes to any leftist value. Their contributor Harriet Hall, another person who is no rabid reactionary, faced scorn for a milquetoast review of a milquetoast book that states very obvious things about transgenderism. The entire blog went into a lefty purity spiral and has since pushed out the kind of stuff you would expect from Jezebel on the issue.
So the authorities are wrong about a lot of things. The world is indeed run mostly by leftists, and "science" is largely a tool for justifying their holy cows and explaining why they must not be questioned. People here already know this part, but I am trying to provide context for newer "mottizens." It goes deeper, though, which is why I brought up faux masculinity. War is the ultimate masculine experience, with the ability to exert power a near equivalent, or perhaps something that surpasses it. The online right (including me) lacks both. Man wants people with whom he can cooperate, to feel he is part of a clan, and memes like seed-oil hatred and finasteride fear-mongering are no different from the conservative ones (living on a ranch with a podcast setup where you talk about guns and Black Rifle Coffee whilst shilling for Israel) or the lefty ones where you deny basic human nature to varying degrees.
There is also a rationalist counterpart to this: AI hype, wherein people write literal sci-fi pieces and hold a view of AI that most people who actually worked on it did not share, and many still don't. Scott Alexander is a great guy; his work is the reason this place exists. Still, reading his 2027 piece made me feel a bit odd; the man who posted the most well-thought-out takes on medicine, and who personally helped many people, wrote something that is flimsy at best. Gary Marcus wrote a decent critique of it (he can be an asshat, but he is right here), and 4chan's /g/ largely agreed with it. After all, LLM progress has in fact slowed down. Anthropic's CEO has been warning us, since at least 2023, that AI will take away all jobs within two years, much like the perpetually imminent self-driving car. The progress has been remarkable, yet the hype around it has not paid off so far. Jeremy Howard, who wrote ULMFiT (one of the most important papers in NLP according to many, so much so that the transfer learning used for ChatGPT was inspired by it), simply laughs at statements about AI taking away all jobs, publicly claiming that we are as far from ASI or AGI now as we were 20 years ago. I am a novice coder; my friends who do write code usually come away angry when they use LLMs for their coding work, despite being proficient with DSPy and prompting in general. The average person here, or on ACX, or on LessWrong, has an IQ in the high 130s, with people who write code making up a big part of the reader base, yet many seem unwilling to change their beliefs about any of what I just listed.
Schelling points are easy to see from the outside; the weirder they are, the more visible they are to outsiders. But once you are inside a group, your worldview quietly shifts a little to match your clan's. In five to ten years, many hackers will probably admit that the people running AI firms who go on podcasts wanted more hype, because hype sells their product. People want to be part of something; humanity is not an island. I bought plenty of stupid, outright lies during my time working with a co-founder who is clearly in need of psychiatric intervention. I bought them fully, just as I had bought the lies of a religious sect before that. Neither the rationalists nor the online right are bad people, and these Schelling points are mostly benign. For the rationalists it is not even as central a point as seed-oil hatred is for the "bronze age warriors," yet as a person on the fringes of both groups, it is funny to watch both go to great lengths to keep their holy cows alive. 4chan's /g/ is a toxic place full of bitterness, but its dismissal of the 2027 AI predictions, and of the faith many on LessWrong place in our ability to produce synthetic intelligence, was not off the mark. I really do like LessWrong's stuff; their pieces on things beyond AI, and many on AI, are worth reading, and SSC inspired the one place on the internet I like visiting and have benefited a lot from. Yet I am willing to eat downvotes and get blocked for pointing out things that I believe are likely false. LLMs may take away all jobs, finasteride and seed oils might make me a beta soyboy, and we may need to accept that the singularity is upon us, yet I will bet against all of that for now, not because I am a contrarian but because I don't want to blindly accept memes that are probably wrong.
edit - typos
I have two recent data points about AI and programming.
I asked it about an unknown PRNG function I had reverse engineered, one I had previously tried googling to see whether it was based on a standard generator. It was able to find similar functions that I had not been able to turn up by googling. I then asked it to come up with a known-plaintext attack for the case where part of the seed is known, and it spat out something that looked correct (a sketch of that style of attack follows).
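To make that concrete, here is a minimal sketch of a known-plaintext seed-recovery attack of that general shape. The actual PRNG in the binary is unidentified, so everything specific below is an assumption: a toy LCG-style keystream generator, XOR encryption, and a 32-bit seed whose high 16 bits are already known.

```python
# Toy known-plaintext attack: brute-force the unknown low 16 bits of a seed.
# The generator, constants, and bit split are hypothetical stand-ins for the
# real (unidentified) PRNG described above.

def keystream(seed, length):
    """Toy LCG keystream, one byte per step (stand-in for the real PRNG)."""
    state = seed & 0xFFFFFFFF
    out = []
    for _ in range(length):
        state = (1103515245 * state + 12345) & 0xFFFFFFFF
        out.append((state >> 16) & 0xFF)
    return bytes(out)

def recover_seed(known_plain, ciphertext, known_high_bits):
    """Brute-force the unknown low 16 seed bits against known plaintext."""
    target = bytes(p ^ c for p, c in zip(known_plain, ciphertext))
    for low in range(1 << 16):
        seed = (known_high_bits << 16) | low
        if keystream(seed, len(target)) == target:
            return seed
    return None

if __name__ == "__main__":
    secret_seed = (0xBEEF << 16) | 0x1234
    plaintext = b"ATTACK AT DAWN"
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(secret_seed, len(plaintext))))
    print(hex(recover_seed(plaintext, ct, 0xBEEF)))  # -> 0xbeef1234
```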
Another developer was reverse engineering a function protected with a weak form of control flow obfuscation. The obfuscation just replaced function call instructions with calls to a shared global dispatch function that would eventually call the real target; the dispatcher executed roughly 200 instructions. There is an obvious attack against this obfuscation, and it can be stripped off with ~100 lines of Python in Ghidra (see the sketch below). They were using LLMs to try to investigate the function but didn't make much progress, though with better prompting and more access to tools the LLM might have gotten further.
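For what it's worth, here is roughly the shape that ~100-line deobfuscation script could take, as a Ghidra Jython sketch. The dispatcher address, the dispatch-table address, the assumption that each call site loads a table index as an immediate right before the call, and the 8-byte table entries are all hypothetical stand-ins; the post doesn't say how the real dispatcher resolves its targets.

```python
# Ghidra Jython sketch: strip the "global dispatcher" call obfuscation by
# rewriting each obfuscated call site as a direct call to its real target.
# All concrete details are assumptions: dispatcher address, table address,
# the "MOV reg, index" pattern before each call, and 8-byte table entries.
from ghidra.app.plugin.assembler import Assemblers

DISPATCHER = toAddr(0x140001000)      # assumed address of the dispatch function
DISPATCH_TABLE = toAddr(0x140100000)  # assumed table of real call targets

assembler = Assemblers.getAssembler(currentProgram)
listing = currentProgram.getListing()

patched = 0
for instr in listing.getInstructions(True):
    # Only consider calls whose single flow target is the dispatcher.
    if not instr.getFlowType().isCall():
        continue
    flows = instr.getFlows()
    if len(flows) != 1 or flows[0] != DISPATCHER:
        continue

    # Assumed pattern: the previous instruction loads the table index as an
    # immediate operand, e.g. MOV EAX, 0x2a.
    prev = instr.getPrevious()
    if prev is None or prev.getNumOperands() < 2:
        continue
    scalar = prev.getScalar(1)
    if scalar is None:
        continue
    index = scalar.getValue()

    # Resolve the real target from the table and patch in a direct call.
    target = getLong(DISPATCH_TABLE.add(index * 8))
    assembler.assemble(instr.getAddress(), "CALL 0x%x" % target)
    patched += 1

print("patched %d call sites" % patched)
```

The idea is simply to find every call into the dispatcher, recover the real target statically, and reassemble a direct call over it, after which normal decompilation works again.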