
Culture War Roundup for the week of March 13, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Alright, folks, I'm out.

Consider me the first Motte casualty to AI. No, I'm not planning on any self-harm. Quite the opposite, in fact. But reading about this shit is so depressing and anxiety-inducing that it's giving me a premature mid-life crisis, so I'm going to do the reasonable thing and take the grill pill.

As part of that reevaluation of how I've been spending my time, I've taken a look at the 15-ish years I've been "Very Online And Politically Aware" - since the financial crisis in 2008, really. And I have to conclude that the investment of tens of thousands of hours of my life has been basically a wash. I don't think I can point to a single tangible benefit to my life; it's all just sunk opportunity cost. Maybe a few memes pushed out, but thinking I inspired them is probably just arrogance. My lazy, slacktivist involvement almost certainly didn't matter for anything at all. Being Politically Aware is arguably the worst thing I've done with my life; I would rather have played more video games.

Even the insight porn is getting stale.

So, combining these two epiphanies, I'm going to block TheMotte and SSC, and ACX and Instapundit, unsub from any remotely political or AI-interested subreddit, and block /r/all. Anything else I can think of or that comes up, I'm just gonna dip. (I would like some advice on how to block websites in Chrome on Android mobile. The recommended apps don't seem to work the way they're presented.)

Because if this is the beginning of the end - of us becoming either obsolete or all dying - I would rather go out in a eudaemonic frenzy than waste my time whining and worrying. I would rather double down on being an amazing dad, son, and brother. I'll double down at the gym, and run the best D&D campaign my friends have ever seen. I've had fantasy novels fermenting in my head for decades; I'd like to write a couple before GPT-X makes human creativity obsolete (and then I can have GPT-X churn out countless sequels!). Maybe I'll even try dating again.

And if and when the nanite disassembler swarms come for me, I'll go down on my feet knowing that I was a pretty kick-ass human, back when that mattered.

If anyone has suggestions for other things worth doing or being - things that satisfy that "check my phone while waiting in line to pick up the kids" nudge while avoiding my new no-nos - I'm all ears.

I've enjoyed this community a great deal, and think fondly of many of you. Thank you for contributing to making this a place where I felt at home. Maybe we'll have a grand meetup if humanity wins, and spend a subjective eternity having AI-moderated arguments about who was right.

And if any of you are in a position to do anything about the future - godspeed. We're all counting on you.

Thank you again. <3

Now if you'll excuse me, I need to take this empty propane tank to ACE for a refill.

Knowing about things that might transform your life or society as a whole is important, though? Even if there's nothing you can do about the trend of AI as a whole, whatever smaller-scale goals you have are surely affected, so knowing about it matters. And more generally, the fact that the forum could demonstrate to you, viscerally, that AI will transform the world suggests it might inform you of similarly important things in the future.

I just don't think that's very true.

Because the effects it will have are going to be relatively unpredictable, and your choices in trying to respond to every little development have the potential to make things worse for you.

It's like trying to "time the market" by day trading versus sticking with a long-term investment strategy.

There are almost certainly diminishing returns to becoming deeply informed about [current thing].

For instance, if you're a woman who makes her living spinning fabric and selling it, knowing about the 'industrial revolution' or 'factory production of cloth' is incredibly relevant. Knowing about it three years earlier seems very useful. That "long term investment strategy" of continuing to spin fabric to feed your kids doesn't work.

You say there are diminishing returns to being deeply informed, but without being deeply informed, you might just stay confused. It's difficult to know beforehand what the "big things" will be. A lot of people were not, three years ago, sure that AI would be a "big thing" in five years, even though they probably saw something about "neural networks identify cats in YouTube videos" in a news headline. And today, most people still don't really care. So if you just "read the headlines once every few months", maybe you'll hear about ChatGPT as a cool thing your young friend plays with and write it off as something that doesn't matter. Maybe ten years ago, you could've trained to be an ML engineer or a LessWrong alignment person or something.

Yeah, it'll make you sad to think about too much, or something, but... humans being obviated in all aspects of life is, at least potentially, sad, right? Being sad isn't an unconditional bad! It's being aware that something not-good is happening. Consider: we could easily, by tweaking a few dozen or hundred genes, not feel any sadness after a family member dies - and, yeah, at that point it's too late to do anything, so why feel sad? Would that be good?

if you're a woman who makes her living spinning fabric and selling it, knowing about the 'industrial revolution' or 'factory production of cloth' is incredibly relevant. Knowing about it three years earlier seems very useful. That "long term investment strategy" of continuing to spin fabric to feed your kids doesn't work.

What does she do with the information?

Develop another skillset... which is ALSO going to be disrupted in short order?

How does she act when, knowing that the change is coming, she still can't tell what the second order impacts might be?

That's my point. Knowing about the coming change is perhaps useful, but how much information must one obsessively seek out in order to make a good decision with that information? And how much time should one spend before it is counterproductive?

For instance, I'm pretty sure AI is coming for my job inside of ̶1̶0̶ ̶5̶ 2 years. But how in the hell can I predict which jobs are going to be 'safe' with any precision?

So basically, I've done the best I can by buying stocks in companies that might take off due to AI development, and I'm preparing myself to jump when the inflection point arises.

But I am not obsessively churning through AI news to try and predict the outcomes.

Develop another skillset... which is ALSO going to be disrupted in short order?

... In that non-hypothetical historical situation, yes, you develop another skillset. And that skillset won't 'also be disrupted in short order', given we're hundreds of years later and plenty of people hold occupations for decades. But, given the primary occupations aren't 'farm / household laborer' anymore, every single person eventually retrained, whether because they saw the way the wind was blowing or because the price of their labor went to zero.

But how in the hell can I predict which jobs are going to be 'safe' with any precision?

It's tough! But "plumber" or "doctor" are better jobs than "copyeditor" or "commodity artist", I think.

I'm not so sure about "doctor":

https://www.cnbc.com/2023/03/14/googles-medical-ai-might-soon-answer-questions-about-health.html

Surgeon, maybe.

And while I agree with "plumber", I'm no longer very confident in my own predictions, so I wouldn't be too surprised if we get "PipeGPT" sometime this year.