
Culture War Roundup for the week of December 5, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


So I finally installed TikTok. While registering, I indicated I was male. I was immediately shown what I can only describe as "anti-feminist" videos: women winning arguments against feminists, Jordan Peterson interview clips, etc. I generally scroll past these videos quickly, but they got more and more frequent. I probably made it worse by liking a few Bill Burr clips early on, but it certainly started very early on.

My wife is a frequent TikTok user, and she likes the videos you'd expect of women: crafting, recipes, etc. She also gets a ton of overtly political feminist videos. Neither of us has strong feelings about feminism. If anything, she's to my right on gender issues.

I hear a lot of anti-TikTok rhetoric along the lines of "China is invading our privacy." I'm much more concerned about TikTok dividing the younger generations and pitting groups against each other. This is probably more algorithmic than intentional, but the effect is almost certainly worse than the privacy concerns. I know this isn't anything new, and other social media apps have similar effects, but I think the effect is much stronger with TikTok. With Facebook, you inherit the political environment of your friends. With Reddit and Twitter, you can choose your own echo chambers. With TikTok, the decision is made for you, against your will and almost instantly.

Same for YouTube... they always recommend Jordan Peterson clips even though I seldom watch his videos. I think this puts a dent in the narrative that social media is always pro-left. I suspect the reason is that Jordan Peterson and non-PC political videos have high retention and follow-through rates: people who watch one video are highly inclined to watch more, probably more so than in other niches. A toy sketch of why that would matter is below.
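For intuition, here is a minimal, hypothetical sketch (plain Python, with made-up numbers and titles, not anything from YouTube's actual system) of how ranking candidates by expected watch time instead of raw click probability would surface a high-retention niche like this:

```python
# Hypothetical toy ranker: every number and title below is invented for illustration.
# Each candidate has an estimated click probability and an estimated average
# watch time (in minutes) if clicked.
candidates = [
    {"title": "clickbait compilation", "p_click": 0.30, "avg_watch_min": 1.5},
    {"title": "long political interview", "p_click": 0.10, "avg_watch_min": 35.0},
    {"title": "cooking short", "p_click": 0.20, "avg_watch_min": 3.0},
]

# Ranking purely by click probability favors the clickbait...
by_clicks = sorted(candidates, key=lambda v: v["p_click"], reverse=True)

# ...but ranking by expected minutes watched (p_click * avg_watch_min)
# favors the long, high-retention video despite its lower click rate.
by_watch_time = sorted(
    candidates, key=lambda v: v["p_click"] * v["avg_watch_min"], reverse=True
)

print([v["title"] for v in by_clicks])      # clickbait first
print([v["title"] for v in by_watch_time])  # the long interview first
```

If a niche reliably keeps people watching, any watch-time-weighted objective will keep feeding it back to you, whatever your stated interests are.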

This is probably true.

It’s also the origin of complaints about an alt-right “pipeline.” You watch one Peterson meme and your recommended videos will be destroyed by FACTS and LOGIC. Never mind whether or not you actually cared about Shapiro or whoever, YouTube knows that some people are really into both, so that’s where you’re sorted, bucko.

I’m of the impression that the effect waxes and wanes based on YouTube’s current algorithm. Though it could just be trends in how much vocal notice it receives? Either way, this phenomenon applies to the left as well. “Breadtube” was explicitly created to harness similar network effects. It’s best known for Contrapoints and I think hbomberguy. People make long-form talky videos and try for algorithmic cross-pollination.

The YouTube algorithm is notoriously opaque from the end-user perspective, and it truly hasn't been transparent since YouTube made the first major adjustment to it, deprioritizing clicks in favor of watch time and site retention. Back when they were on their meteoric rise in popularity, the Paul brothers were famous for directing their viewers to each other's videos and using each other's channels to 'trick' the algorithm, and they both got really famous really quickly as a result. The algorithm correctly identified that viewers who watched Jake Paul very often watched Logan Paul videos, so as soon as you clicked on one of their videos in your 'Recommended' tab, YouTube had already started tailoring your user profile toward videos from the other brother.
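To make the "people who watched X also watched Y" mechanic concrete, here's a rough sketch of item-to-item co-occurrence counting. This is my guess at the general technique, with made-up watch histories, not YouTube's actual implementation:

```python
from collections import defaultdict
from itertools import combinations

# Invented watch histories; each inner list is one viewer's watched channels.
histories = [
    ["jake_paul", "logan_paul", "minecraft"],
    ["jake_paul", "logan_paul"],
    ["cooking", "minecraft"],
    ["jake_paul", "logan_paul", "cooking"],
]

# Count how often each pair of channels is watched by the same viewer.
co_counts = defaultdict(int)
for history in histories:
    for a, b in combinations(sorted(set(history)), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(channel, k=2):
    """Return the channels most often co-watched with `channel`."""
    scores = {b: n for (a, b), n in co_counts.items() if a == channel}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("jake_paul"))  # 'logan_paul' dominates the co-watch counts
```

Under a scheme like this, every click on one brother strengthens the co-watch signal for the other, which is roughly the 'trick' described above.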

Information about how and why the YouTube algorithm changed from 2015 to 2020 would be incredibly valuable for contextualizing a lot of the social movements we've seen recently, but it's information we will probably never get publicly.

Many successful YouTubers emulate this by having multiple channels, even if all of them have basically the same type of content. They just advertise the other channel's content during the video, and maybe with a link in the description. Sometimes the secondary channel gets even more popular than the main one, so this cross-pollination definitely has some effect.