This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
Enough with the election. Let's talk about memes, sort of, in relation to message discipline, consensus building, and partisanship... for elections. Sort of.
I've been meaning to tap the motte-trust on this topic. Spurred by this comment by @Goodguy below. The following ramblings make me feel a great deal of shame. Forgive me, senseis.
I've been having similar thoughts to Mr. GoodGuy's. Not just because it's campaign season, although that's part of it, but because of a general noooticing. I have always assumed astroturfing has had an impact on what people say online, but post-2020 it became more visible, or perhaps less bearable personally. 2016 set the stage, and probably perfected some systems, and now it does sometimes feel like Dead Internet Theory is real. But, instead of bots, these are performers.
Since most of you are credentialed internetters familiar with web surfing, the following few paragraphs may not be necessary:
A recent case study that has spurred my curiosity is /r/npr/. I have been subscribed to the NPR subreddit for a long time. I don't engage there, but I would visit it a few times a year. Historically, it has been a relatively low-activity link aggregator for NPR stories and podcasts. The most common post to receive comments would be an NPR story with a few dozen replies: a specific program was good or bad, and a few users would come talk about it. Between 2018 and 2022 there was also a recurring "what has happened to NPR?" themed post.
That held until around last fall, when I started checking it more frequently, because news was hot, I was weak, and the reddit-fication can be interesting in a guilty-pleasure way. First October 7th, then the college protest stuff, then January 1st rolled around (it became an election year), Claudine Gay was fired, more college protest stuff, then finally Uri Berliner's story came out in the spring.
Which is a rough, anecdotally polluted timeline-- a relatively quiet link aggregator transformed into an /r/politics blob with an /r/politics type of consensus. My recollection of the sub as a light user could be wrong. Maybe it was part of the /r/politics blob already and I just missed the switch. It saw a ton of growth during the happening years, but a couple of examples below fit my concept of the subreddit:
Despite its astronomical growth following 2020, I didn't notice a full-on reddit political consensus until this year. And if I had been visiting between then and now, I'm fairly sure I would have noticed. I am no n00b nor naive traveler. I know what to expect from a Popular Reddit Sub, but the comments in those places are still rather unbelievable.*
The sub now sees an insane amount of increased comment activity in the vein of /r/politics. Seriously, just go read the comment sections. Almost like a switch was flipped once it was decided this place was an important canvas to paint.
"Well, duh, @wemptronics, of course reddit is astroturfed," you say. But, my curiosity isn't limited to reddit-leftist types of blob. I see this many places in any popular English speaking onlineville. That's the basis for some general follow up questions and thoughts-- poorly formatted and ill-considered.
Is the social-media-net made up of a bunch of actors with too much spare time, playing roles manipulated by just a little bit of astroturf and narrative control? How much weight do astroturf campaigns and organizations carry on social media? How much of what people say on large social media platforms is authentic groupthink and reinforcement?
Has anyone begun studying this stuff yet? Has the sociology and history of the internet been ideologically captured yet? It's too much for my small brain to systematize, and I don't want to spend time doing so for free.
Besides getting out of the screen, here are some ways I reason myself out of "wtf these people can't exist" Kookville:
Was this all just a roundabout way for me to scream "Wake Up Sheeple" as I tip my fedora violently? Perhaps. Eternal September is not a new topic to this forum. But, geez, when I venture a little too far out into genpop, when I dive into a Twitter chain I shouldn't, when I click the "comments" section at WaPo, NYT, or the NY Post, I am reminded exactly what never was or will be.
I think it's even worse than that. It'd be one thing if this was just social media getting to a mechanic who thinks a 'clever' Dem-politician pencil-holder is funny, or a moron with a podcast who can't read.
This is the official GovTrack mastodon account, a site that people here use, myself included. Axios revised a three-year-old story today to remove 'border czar' from Harris' list of accomplishments. Elon Musk put tens of billions of dollars into twitter to shitpost, and does it badly. I've worked on open-source code with someone who was really proud of having physically attacked Brendan Eich, and that's far from the worst I've seen there; my boss and a coworker have a conspiracy theory about the FBI and the Trump assassination attempt that would be fascinating if they weren't airing it in a business meeting; a forum that once was a mainstay for me blocked a thread discussing the Trump assassination attempt on the day it happened (literally as the second post) and never discovered one made the day after. KelseyTUoC spent the better part of a decade as part of the EA community, earned Scott Alexander's respect, and then got to work at Vox... except it was a problem before then, too.
These aren't astroturf, or rando nutjobs who have nothing to their name but politics, or AI, or rats following the Pied Piper, or nineteen-year-olds fresh-faced to the internet, or whatever. This is what they are under the mask.
Beneath that, Trace and Its_Not_Real have been having a twitter debate over The Machine and its output, and I think it's bad enough that Trace's defense is literally pointing to "Hanania/Karlin", but the more critical problem is that even were it true (which I'm far from sold on), The Machine has lost any capability to credibly present the truth, and very few people care.
I wrote, three and a half years ago, about how I didn't see a path back to trust in academia. But why would they care? In many ways, things have gotten worse for the academics, but academia itself has been doing fine. Even individual schools and journals with massive scandals have quite happily shaken them off and gone right back to it. Sometimes bad actors manage to get fired, but sometimes they get a TV show. In some cases, and I'll point to Wansink again, the policy proposals and even individual papers don't suffer much even after everyone discovers they were always made up from whole cloth.
Why would anyone expect that to stay to one poorly-demarcated field?
I've been tapping the sign so vigorously lately that it's starting to hurt my finger.
Literally I just want accountability from those who are nominally in charge of various important functions.
Accountability is coup-complete. The entire system as it exists is designed to launder and diffuse responsibility. You'd have to change how things work in a fundamental sense that requires a circulation of elites. The current ones can't be accountable because they rule through unaccountability.
I think what tends to happen is that lack of accountability makes it almost impossible for the system to correct course even as the need for such course-correction becomes absolutely obvious. There's no mechanism for filtering out incompetents and no feedback for the leaders to judge which decisions are actually improving matters, so we get the iron star catastrophe.
Covid kinda showed many of the seams. It really seems like the elites are running very low on effective tactics for reining in discontent. I don't see how they'll effectively resolve the Israel-Palestine divide without alienating some large portion of the population. It seems unlikely that they'll achieve true 'victory' in Ukraine. They can't even solve the problem of drug overdose deaths in the heart of the capital, let alone the outer reaches of the empire.
You can only run away from consequences for so long. I'd wager most of them are gambling that they'll be dead before the chickens actually come home to roost.
Why wouldn't "brute force" be an effective enough tactic?
By the time you're resorting to pure brute force you've probably lost so much legitimacy that you're asking for revolution or coup.
Of course this doesn't mean it'll actually happen.
From where? I know I keep harping on the German Peasants' War, but I think it's a good analogy for the relative positions of ordinary citizens and professional militaries. Modern governments like that of the USA are effectively "rebellion-proof." It wouldn't matter if tens of millions of gun-owning Americans decided to rise up in revolution, because it would only take ten thousand or so regime-loyal troops to crush them utterly.
As for coups, the upper ranks have all been too politically captured to want to carry out a coup, and the lower ranks don't have the capacity to organize one.