
Culture War Roundup for the week of December 5, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


So I finally installed TikTok. While registering, I indicated I was male. I was immediately shown what I can only describe as "anti-feminist" videos: women winning arguments against feminists, Jordan Peterson interview clips, etc. I generally scroll past these videos quickly, but they got more and more frequent. I probably made it worse by liking a few Bill Burr clips early on, but it certainly started very early on.

My wife is a frequent TikTok user, and she likes the videos you'd expect of women: crafting, recipes, etc. She also gets a ton of overtly political feminist videos. Neither of us has strong feelings about feminism; if anything, she's to my right on gender issues.

I hear a lot of anti-TikTok rhetoric along the lines of China invading our privacy. I'm much more concerned about TikTok dividing the younger generations and pitting groups against each other. This is probably more algorithmic than intentional, but this effect is almost certainly worse than the privacy concerns. I know this isn't anything new, and other social media apps have similar effects, but I think the effect is much stronger with TikTok. With Facebook, you inherit the political environment of your friends. With Reddit and Twitter, you choose your own echo chambers. With TikTok, the decision is made against your will and almost instantly.

I'm much more concerned about TikTok dividing the younger generations and pitting groups against each other. This is probably more algorithmic than intentional, but this effect is almost certainly worse than the privacy concerns.

Yeah, this seems like the obvious problem with social media controlled by your geopolitical enemies. Maybe they don't want to call attention to it because that leads to the question of why it's okay for any company to manipulate our youth that way.

These algorithms are mysterious.

Geographical location is a factor. Most of the videos I get on TikTok and Instagram Reels come from whatever cluster of videos I liked in a previous slice of time, with some random videos local to my area scattered in.

Also, for a deeper look at how these things work: it's not an apples-to-apples comparison, but if you have Spotify, you can download some of your metadata. They have a dizzying number of what I'd suppose are advertisement categories. I was categorized as everything from "Midwest Truck Driver" to "Hot sauce enthusiast" (the former is way off; the latter is somewhat correct, in that I like hot sauce but am not sure I'd call myself an enthusiast) and a whole host of other things that make no fucking sense. There were so many categories!

I would assume TikTok et al. use similar user-clustering techniques for their content recommendations as Spotify does for targeted advertising. WAG, though.
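To illustrate the kind of clustering I mean, here's a minimal sketch. Purely illustrative: the topics, counts, and cluster count are all made up, and I'm not claiming Spotify or TikTok actually run anything like this.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy engagement matrix: rows are users, columns are content topics
# (hot sauce, trucking, gym, crafts). Values are watch/like counts.
rng = np.random.default_rng(0)
engagement = rng.poisson(lam=2.0, size=(1000, 4)).astype(float)

# Normalize each user's row so heavy users don't dominate the geometry.
profiles = engagement / engagement.sum(axis=1, keepdims=True).clip(min=1)

# Cluster users into coarse "archetypes": the Spotify-style buckets.
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0)
archetype = kmeans.fit_predict(profiles)

def cluster_top_topics(user_idx, topic_names):
    # What does this user's cluster over-index on? Serve more of that.
    center = kmeans.cluster_centers_[archetype[user_idx]]
    return [topic_names[i] for i in np.argsort(center)[::-1]]

print(cluster_top_topics(0, ["hot sauce", "trucking", "gym", "crafts"]))
```

Once you have the buckets, "recommendation" and "ad targeting" are the same lookup: find the user's cluster and serve whatever it over-indexes on.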

As for "training" the algorithm, the longer you use a platform, the better it gets. I have been using my YouTube account for over a decade, and the recommendations are more or less precisely what I would have wanted to watch.

Geographical location is a factor.

It's a significant one. My wife has been served the inane thottery published by our teenage next-door neighbors. There's no reason for a woman into golden retrievers and 'mom life' videos to also be interested in thirst traps, other than location.

I just downloaded it and it showed me mostly dating advice videos, plus a mix of videos about Canada and animals. No political content at all.

I just downloaded it and it showed me mostly dating advice videos

Depending upon what kind of dating advice videos they were giving you, I might consider that political content. I'm honestly not sure what apolitical dating advice TikToks would even look like.

With TikTok, the decision is made against your will and almost instantly.

I've seen a claim on Twitter that it takes hours of effort to train TikTok to stop showing you crap you don't want.

TikTok falls somewhere between addictive and cringey. It shamelessly refines media bubbles, and it's certainly mining as much of your data as it can manage.

So...why are you making an account?

I vote cringey. I've fortunately not used TikTok much, but man, from what I've seen of it, it really seems like it's lowering the intelligence waterline. The only thing I ever hear about it is stupid, stupid content, like challenges which result in people hurting themselves (like chicken a la NyQuil), or else just plain old dumb shit I can't believe anyone cares about (like fake life hacks and all the controversy about pink ranch dressing). Granted, I probably hear about these things more because of Moloch. But TikTok is basically Twitter for videos, and it shows. I refuse to have anything more to do with it.

People fit into archetypes. In particular, for social media, there are archetypes for the type of content people engage with. Trying to build a more socially beneficial public square means de-emphasizing or ignoring that fact, and so you will be competed out of existence. That's why TikTok is dominating: you get an echo chamber telling you things you want to hear. You can imagine fixes, but they end up costing you real money and provide a path for someone to usurp you.

I don't think there's a way out of this, except by refusing to use social media. That can work for an individual but doesn't do much for the broader social damage.

Chinese TikTok:

This is probably more algorithmic than intentional

Well. The choice of algorithm is probably intentional.

https://youtube.com/watch?v=0j0xzuh-6rY

TikTok the company does that (limiting young children's access) because the Chinese government mandates it; they wouldn't do it on their own! And for adults (which OP is), the algorithm isn't obviously different. From using Douyin briefly, I also got half-naked women, sports videos, animal videos, dumb social situations, all of which, like US TikTok, are stupid.

TikTok the company does that (limiting young children's access) because the Chinese government mandates it; they wouldn't do it on their own!

TikTok is owned by ByteDance.

From ByteDance's Wikipedia page:

In April 2021, a state-owned enterprise owned by the Cyberspace Administration of China and China Media Group, the China Internet Investment Fund, purchased a 1% stake in ByteDance's main Chinese entity and placed a government official, Wu Shugang, on its board of directors. The Economist and Reuters have described the Chinese government's stake in ByteDance as a golden share investment.

https://www.scmp.com/tech/big-tech/article/3145362/chinese-government-takes-minority-stake-board-seat-tiktok-owner

You may be missing how "the company" and "the Chinese government" are pretty aligned on the issue regardless. They wouldn't do it 'on their own', except they are doing it, and one should probably not ignore the incentives at play here.

Was that meant to be surprising? The Chinese government is deeply and personally tied to, and exerts direct guiding power over, its economy in a way the US government isn't.

That doesn't mean Chinese tech companies aren't trying to make a profit, grow, and capture user attention. These limits exist, and their recent creation is newsworthy because before them, TikTok and other apps were showing 'unwholesome' content to underage users and letting them use the app as much as they wanted to. Not out of any corrupting intent, but because that is what got the most views.

You may be missing how "the company" and "the Chinese government" are pretty aligned on the issue regardless

Aligned in the sense of taking action, yes. To a lesser extent but similarly, tech companies in Europe are aligned with the GDPR, and Apple is aligned with the Chinese government in China. But absent the guiding hand of the government, they'd be happy to let people use their products more.

OP claimed, "Well. The choice of algorithm is probably intentional," as if TikTok being attention-grabby in the US were intentional on China's part to harm the US somehow, compared to some wholesome algorithm TikTok would otherwise run here. This is very incoherent, and very common; I've had IRL friends and internet friends tell me versions of it dozens of times. Aside from the facts that US companies use the same algorithms and surface almost identical content for adults, that the OP is about content presented to adults in the US, and that Douyin produces similarly cute-girl/stupid content for adults in China, Douyin was less filtered for Chinese youth years ago, before those changes were implemented! The desire to claim China Is Hurting Us without much justification seems prevalent. A similar one is "TikTok is a Chinese privacy-violation operation, we must ban it": much of US social media is already public, and China can get most of what it gets via owning TikTok just by scraping Twitter, Facebook, Instagram, etc., just as you yourself can. (The only place I've seen that point actually argued is, funnily, a CSET report.)

Same for YouTube... they always recommend Jordan Peterson clips even though I seldom watch his videos. I think this puts a dent in the narrative that social media is always pro-left. I suspect the reason is that Jordan Peterson and non-PC political videos have high retention and follow-through rates: people who watch one video are highly inclined to watch more, probably more so than in other niches.

How does one explain how incredibly bad YouTube recommendations are? If I go to YouTube and just look for something interesting to watch, I have to scroll past dozens and dozens of videos to find something interesting. The most baffling thing is that it can't figure out that if it shows me a video every time I go to YouTube and I always scroll past it and don't click on anything until it has shown me about 40 videos, I probably don't want to watch that video and it shouldn't show it to me 20 times before giving up.

It also seems to have a long memory and a short memory for the wrong things. If I watch one video from a particular channel, it will suggest videos from that channel every day for a few weeks, even if I never watch any more. But if I watch a lot of videos on a particular subject and then stop for a few weeks, it will completely forget I was interested and never show me one of those videos again. It should be the exact opposite: if I watch just one video about something and don't keep watching the ones it suggests on the same subject, it should assume I'm not that interested, or was only interested in that particular video. But if I watch a lot of videos on a subject, it should assume I will be somewhat interested in that subject for the long term, even if I go a week without watching one.
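In toy code, the behavior I want is something like this (all decay rates and thresholds invented for illustration; obviously not YouTube's actual system): topics backed by many watches should decay slowly, one-off watches should decay fast.

```python
from collections import defaultdict

class InterestTracker:
    def __init__(self, fast_decay=0.5, slow_decay=0.98, commit_threshold=5):
        self.watch_counts = defaultdict(int)   # lifetime videos watched per topic
        self.scores = defaultdict(float)       # current recommendation weight
        self.fast_decay = fast_decay           # per-day decay for casual topics
        self.slow_decay = slow_decay           # per-day decay for committed topics
        self.commit_threshold = commit_threshold

    def watched(self, topic):
        self.watch_counts[topic] += 1
        self.scores[topic] += 1.0

    def end_of_day(self):
        # Topics with lots of accumulated evidence fade slowly;
        # one-off curiosities fade fast.
        for topic in self.scores:
            committed = self.watch_counts[topic] >= self.commit_threshold
            decay = self.slow_decay if committed else self.fast_decay
            self.scores[topic] *= decay

t = InterestTracker()
t.watched("woodworking")                 # a single stray click...
for _ in range(10):
    t.watched("machining")               # ...versus a sustained interest
for _ in range(14):                      # two weeks pass with no watching
    t.end_of_day()
print(t.scores["woodworking"])           # ~0.00006: effectively forgotten
print(t.scores["machining"])             # ~7.5: still worth recommending
```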

Another thing it struggles with is it seems to put too much weight on the channel the video is hosted on and not enough on the actual content. If I watch a video because I am interested in the interviewee, it will keep recommending videos with the same interviewer, but it won't recommend other videos with the interviewee.

The other thing is that it's clearly grouping certain channels into clusters and cannot figure out that I actually really dislike certain types of videos in a cluster. So if it clusters videos about Alice and Bob together because people who like one tend to like the other, and I watch a video about Alice, it will immediately start suggesting videos about Bob, even if I never watch a Bob video. It cannot figure out that I am in some way different from other people who like this cluster of videos.

How does one explain how incredibly bad YouTube recommendations are? If I go to YouTube and just look for something interesting to watch, I have to scroll past dozens and dozens of videos to find something interesting. The most baffling thing is that it can't figure out that if it shows me a video every time I go to YouTube and I always scroll past it and don't click on anything until it has shown me about 40 videos, I probably don't want to watch that video and it shouldn't show it to me 20 times before giving up.

I actually like this feature. I watch different categories of videos, ranging from various debates and podcasts to gaming videos, historical videos and documentaries, and videos about technology from solar power to new weapons. I also follow some channels for being entertaining or just cute, to cleanse the palate with something wholesome, such as animal stuff. I often think something may be interesting, but not right now, as I'm interested in a different topic at the moment. So I like it if it remains in the feed for some time, even a day or two, so I can return to it.

On the other hand, I do use the "not interested" feature for content I don't find interesting, which helps. I also block/unsubscribe from whole channels a lot if the algorithm thinks for some reason that I should watch their content even though I find it uninteresting. One example: I watch League of Legends content from one youtuber/streamer when I'm chilling in the evening or drinking my coffee, as he has a soothing voice and it helps me vent stress. Of course, as soon as I subscribed to his content and watched a few of his videos, I was slammed with other League content that I'm not at all interested in. It took weeks of religiously blocking random League stuff before the algorithm realized that no, I do not want to watch anything else from that category.

At a guess, shoving in 400k/4M-view videos increases watch time overall, because most people do want them, but people here tend not to?

I think youtube wants to drastically reduce the amount of niche content they host and broadcast, so they're always recommending stuff from only the channels they want to keep, even if it's not what people want to watch.

Sure, it's the same idea, but TikTok's version is much more powerful. You don't have to follow, like, or even watch an entire video for the algorithm to respond; being even slightly slower to dismiss a video will boost similar videos. The YouTube equivalent would be something like tracking your eye movements to see which thumbnails you're looking at. I know because I'm now getting porn-ish content after being slightly slower to dismiss videos with pretty girls.
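To make the dwell-time point concrete, here's a hedged guess at how such a signal could be scored. Every threshold and weight here is invented, not TikTok's actual formula; the point is just that swipe timing alone can count, with no explicit action at all.

```python
# Illustrative only: turning dwell time into an implicit feedback score.
def implicit_score(dwell_seconds, video_length, liked=False, followed=False):
    watch_fraction = min(dwell_seconds / max(video_length, 1e-9), 1.0)
    score = watch_fraction                 # base signal: how long you lingered
    if dwell_seconds > 2.0:                # "hesitated before swiping" (made up)
        score += 0.2
    if liked:
        score += 1.0
    if followed:
        score += 2.0
    return score

print(implicit_score(0.8, 30))                 # quick swipe       -> ~0.03
print(implicit_score(3.0, 30))                 # slight hesitation -> ~0.30
print(implicit_score(30.0, 30, liked=True))    # full watch + like -> ~2.2
```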

I would be willing to bet YouTube does timing stuff too, if only because ad revenue is definitely based on watch time.

There's a possibility that TikTok's algorithm is more aggressive, or maybe it just gets more independent data points due to the short form. I wonder how fast Vine adapted to users.

YouTube created an entire Shorts product that's a direct clone of TikTok, so they're definitely paying attention. And even before TikTok, YouTube creators believed watch time was part of the recommendation algorithm and tried to keep it high.

I suspect the reason is that Jordan Peterson and non-PC political videos have high retention and follow-through rates: people who watch one video are highly inclined to watch more, probably more so than in other niches.

100% this. YouTube's metrics include the following (see the sketch after the list):

  1. How much of the video did you watch?

  2. Did you leave the site after watching it?
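A minimal sketch of how those two metrics might combine into a single ranking reward; the 70/30 weights are invented, not YouTube's real formula.

```python
# Toy combination of the two metrics above; weights are made up.
def video_reward(seconds_watched, video_length, watched_another_after):
    watch_fraction = seconds_watched / video_length        # metric 1
    session_bonus = 1.0 if watched_another_after else 0.0  # metric 2
    return 0.7 * watch_fraction + 0.3 * session_bonus

# A mostly-watched video that keeps you on the site can outrank a fully
# watched one that ends your session.
print(video_reward(540, 600, True))    # -> ~0.93
print(video_reward(600, 600, False))   # -> 0.7
```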

This is probably true.

It’s also the origin of complaints about an alt-right “pipeline.” You watch one Peterson meme and your recommended videos will be destroyed by FACTS and LOGIC. Never mind whether or not you actually cared about Shapiro or whoever, YouTube knows that some people are really into both, so that’s where you’re sorted, bucko.

I’m of the impression that the effect waxes and wanes based on YouTube’s current algorithm. Though it could just be trends in how much vocal notice it receives? Either way, this phenomenon applies to the left as well. “Breadtube” was explicitly created to harness similar network effects. It’s best known for Contrapoints and I think hbomberguy. People make long-form talky videos and try for algorithmic cross-pollination.

It does seem to create weird trends. I like to watch videos of people building things. One day, I watched a video of somebody building a bizarre contraption called a turbo burn barrel, which involves connecting an automotive turbocharger to a sealed metal barrel full of burning wood and spinning the turbo up with a leaf blower. Pretty amusing, I thought. Well, it seems either this was a YouTube creator trend or the algorithm decided I loved it (or both), since I kept seeing a constant stream of turbo burn barrel videos for the next few weeks, most of which I did watch. Then all of a sudden it just stopped and I hardly ever saw them anymore. I saw a few hints that videos in the genre were still getting made, but it seemed like the algorithm just decided to stop showing them to me for some reason.

I saw that one in action! Thing was terrifying. I have no idea how much thrust a regular turbo is supposed to produce, but it certainly looked like a lot.

However, it didn’t mess with my recommendations. No idea why.

Depends which one - at least 5 or so channels that I saw built them, probably more.

Those automotive turbos aren't supposed to produce thrust at all; the exhaust is supposed to be run through the usual mufflers. With bare, exposed turbine exits, it'd be just a little. You'd probably need to build a proper nozzle to generate much thrust.

I got the turbo barrels too! Does youtube push them to people in a broad demographic or do they have an implicit characterization of people wherein we both fall into the same narrow bucket of turbo-woodstove-video susceptibility? I honestly find this question rather disturbing, in that unless I choose to buy print media it's getting harder and harder to tell what's going on in the general populace. As in, how am I to tell whether something gets to me because it's in general circulation, as opposed to having been precisely targeted?

The YouTube algorithm is notoriously opaque from the end-user perspective, and it truly hasn't been transparent since YouTube made its first major adjustment: deprioritizing clicks in favor of watch time and site retention. Back when they were on their meteoric rise in popularity, the Paul brothers were famous for directing viewers to each other's videos and using each other's channels to 'trick' the algorithm, and they both got really famous really quickly as a result. The algorithm correctly identified that viewers who watched Jake Paul very often watched Logan Paul videos, so as soon as you clicked on one of their videos in your Recommended tab, YouTube had already started tailoring your user profile toward the other brother.
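Mechanically, that "watched Jake, recommend Logan" effect doesn't require anything fancy; a bare co-watch count already produces it. A toy sketch with made-up sessions, not YouTube's actual recommender:

```python
from collections import Counter
from itertools import permutations

# Made-up watch sessions: the set of channels each user watched.
sessions = [
    {"JakePaul", "LoganPaul"},
    {"JakePaul", "LoganPaul", "KSI"},
    {"JakePaul", "LoganPaul"},
    {"Veritasium", "3Blue1Brown"},
]

# Count how often each ordered pair of channels co-occurs in a session.
co_watch = Counter()
for s in sessions:
    for a, b in permutations(s, 2):
        co_watch[(a, b)] += 1

def recommend_after(channel, k=2):
    # Rank channels by how often they co-occur with the one just watched.
    scores = {b: n for (a, b), n in co_watch.items() if a == channel}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# One click on a Jake Paul video immediately surfaces his brother.
print(recommend_after("JakePaul"))  # ['LoganPaul', 'KSI']
```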

Information about how and why the YouTube algorithm was changing from 2015 to 2020 would be incredibly valuable for contextualizing a lot of the social movements we've seen recently, but that's information we will probably never get publicly.

Many successful youtubers emulate this by having multiple channels, even if all of them carry basically the same type of content. They just plug the other channel during the video, maybe with a link in the description. Sometimes the secondary channel ends up even more popular than the main one, so this cross-pollination definitely has some effect.

While registering, I indicated I was male. I was immediately shown what I can only describe as "anti-feminist" videos

I've had the exact same experience across all platforms (TikTok, YouTube, & Instagram, mainly). I used to be right wing but have been solidly on the left for several years now. Whenever I start a new account on any social media, I'm always bombarded with classic man-oriented 'right-wing-ish' content (Peterson debunking feminists, gym bros complaining about girls, Shapiro clips, etc.). Even now, after having these accounts for years, I'll still get random suggestions for this content.

What's even funnier is that if I watch a man-adjacent video (non-political workout vids or a video about guns), my algorithms get fucked up for weeks. I really do have to wonder if the right-wing influencers have a crazy high budget in comparison to the left. At this point they have to know my stats well enough to know that I'm not interested in those videos but they keep pushing. I admire the effort.

Tech companies were literally funding and pushing antifa felons' videos because conservative content was doing better organically. https://twitter.com/fingerishatroc1/status/1601729418640371712

I really do have to wonder if the right-wing influencers have a crazy high budget in comparison to the left.

I don't think they have a greater budget, but it has to be similar. The right-coded videos (IME) are more directly monetizable: Jordan Peterson is selling self-help videos; Workout Bro has supplements; and Gun Guy has training and $100 anodized-aluminum pistol accessories made for $2 in China with his logo plastered on. These guys are truly able to use a marketing budget to push algorithmic weight. Of course there's plenty of leftist grift out there, but I've found it's a lot simpler to see where the right-wing money would come from.

At this point they have to know my stats well enough to know that I'm not interested in those videos but they keep pushing. I admire the effort.

I've found that "bad for you" stuff is for some reason just so sticky. The sexualized content on Instagram is something where if you so much as hover over a reel or a post (much fucking less click on it or like it), you're going to be sent down that rabbit hole and have to spend quite a bit of time manually de-weighting it.

There's also the element of the algo being too aggressive in responding to your tastes. I think they've gotten this wrong as a rule across platforms. I search for science fiction images on Instagram once? My feed is inundated for at least 3 days. Let it ramp up in proportion to my history.

As someone who became more right wing over the years, I’m curious what caused you to have the opposite trajectory?

It's quite a long story (and I can go into more detail if you want). Overall, I realized that certain beliefs I held weren't as well supported as conservatives had led me to believe. I attended a conservative Christian college, but two left-leaning professors in particular, and a few new left-leaning friends, broke me out of my conservative bubble and challenged my previous beliefs. Growing up, I was taught that issues like racism & sexism were issues of the past (and totally blown out of proportion by virtue-signaling woke libtards). This new group of people gave me a different story via personal anecdotes (from my new friends) & substantial research (from my professors). Over the course of four years and lots of debating, I came to the conclusion that the biggest pillar of my right-wing beliefs (namely the idea of merit, aka 'you can do anything as long as you work hard' & the inverse 'if you're struggling, it's primarily your fault') wasn't as absolute as I thought it was.

Trust me, I was not looking to turn to the left in the slightest. I mean, who wants to be associated with woke feminists (or worse, liberals)? Of course, I don't agree with everything the woke crowd believes nor do I have many positive things to say about Biden & other libs. But overall I'm now firmly planted somewhere on the left because the right just didn't have sufficient support for the biggest issues. Feel free to ask any other questions but that's the basic story for me.

Fantastic question - I'm admittedly still undecided on exactly how to approach this issue. But here's my basic moral justification:

All people should have equality of opportunity

Discriminated groups have less opportunity

Privileged groups have more opportunity

Therefore, discriminated groups ought to have more opportunity so all people have equality of opportunity.

('Equality' in this case simply means 'as equal as is possible to realistically achieve')

As a simple hypothetical, I would support increased government funding for schools with predominantly black student bodies. This would privilege the discriminated-against but is justified by my value of equality of opportunity. (Edit: by the same logic, I would also support increased government funding for schools in poorer areas.)

I'm curious, what is your opinion/justification on the same issue?

Interesting. I’m curious if you have read a lot of Thomas Sowell? Also I’m assuming you aren’t an HBD enthusiast?

Also, it's interesting that meritocracy created such a big change in you. Did you believe in meritocracy solely on deontological grounds (i.e., just deserts) or on utilitarian grounds (i.e., meritocracy leads to generally the best outcome in a Kaldor-Hicks sense, even if it doesn't reflect just deserts)?

If the latter, what was the basis that led you to change your mind re the outcome?

If the former, do you reject utilitarianism as an appropriate framework or think the calculation comes out to reject utilitarianism?

Yes, Sowell used to be my guy, haha. If you want to talk more about Sowell, or why I no longer find him persuasive, I'd be down.

I'm not an HBD enthusiast, on the grounds that I don't find the evidence for it convincing, nor its utility to be of much benefit.

I'm not familiar with the two types of meritocracy that you provided. Meritocracy might be too strong a word in this sense too. Here's what I mean in simple terms: I believed that (in general) people could achieve positive outcomes if they worked hard enough. On the flip side, people who weren't achieving positive outcomes (or people who were experiencing negative outcomes) were primarily at fault for their own situation. The solution to most problems was individual in nature: If you wanted to improve your life, work harder and be better. And especially don't rely on government handouts or assistance in the meantime. We can take my previous beliefs on homelessness for example: In 99% of cases, being homeless was the fault of the individual (drugs, behavior, work ethic, etc). Therefore, the solution to homelessness was focused on the individual as well: Pick yourself up, get clean, apply for jobs, and get back on your feet (and don't mooch off other people while doing so). You can copy/paste that reasoning to just about every political issue (racism, sexism, immigration, income inequality, welfare, etc).

So what changed? The primary factor was one of my economics classes called "the economics of race, class, & gender" (trust me, my past self was NOT happy to see this liberal bullshit on the schedule lol). Growing up well-off (and in a well-off area), I was hilariously naive when it came to the economics of class in particular. Life's trajectory was simple: do well in school, do well in college, do well in your career, and you'll never have to worry about being poor. This class quickly showed me why my simple plan was highly dependent on where you grew up. I'd heard a similar story before but this was the first time that I saw real statistics & research to back it up. I saw similar evidence for things like racism & sexism.

I finally came to the conclusion that some people were much worse off than others due to no fault of their own.

I know, not exactly a mind-blowing conclusion (and it really speaks to my ignorance and naivety more than anything). But this had a domino effect on almost all of my other beliefs. Take welfare: since some people are 'poor' through no fault of their own, I could no longer justify my disdain for government handouts. Morally, I don't want people to suffer due to something that is likely out of their control. Even if becoming poor was a personal choice, escaping poverty is a vicious cycle. Economically, I found strong arguments for buffing up welfare systems in order to turn poor people into economically productive, tax-positive citizens. Homelessness is another example: the moral reasons are the same as before. Economically, in addition to making the homeless productive citizens, I saw decent evidence that aggressive left-wing solutions were more cost-effective in both the short and long term. I came to similar conclusions on other issues regarding race & gender. Morally, I find it wrong for someone to suffer because of something they were born with. Economically, I concluded that protecting these groups leads to positive economic outcomes for everyone involved.

I want to stress that I still value hard work and individual responsibility highly. I do believe that hard work can and will solve certain individual problems and that bad decisions/lack of effort can cause certain individual problems as well. But overall, I think that external, uncontrollable factors are the root causes of many of the issues we see today. Since being on the left I've also become more sympathetic to the idea that we should assist others even if they are entirely to blame for their situation (given that said assistance is effective, addresses root causes, and comes at a reasonable economic cost).