Culture War Roundup for the week of April 17, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I’m hoping something like ChatGPT will be able to act as a super personal assistant and analyst.

At what level of 'smarts,' however, is an AI that is already training on how you do your job going to stop needing you around to do it?

I mean, you're basically happily accepting an apprentice who will lighten your workload whilst learning your job, except this thing is known to learn 100x faster than your standard human. The assumption that you'll have moved on to bigger and better things (or retired) before the apprentice steps up to take over your job may not hold here.

At what level of 'smarts,' however, is an AI that is already training on how you do your job going to stop needing you around to do it?

At some point soon we will increase productivity by at least 1.5-2x per person. At that point, why don't we collectively demand a 3- or 4-day workweek?

First ask yourself this: why do you not already have a 3-day workweek?

Because I'm too poor.

As usual, WTF Happened in 1971 is a fitting reference. Productivity and compensation stopped correlating in 1971, and we haven't (effectively) collectively demanded a reduced work week yet.

We could have transitioned to three-day work weeks well before 1971. The flaw in Keynes's famous prediction is that, past the point of basic subsistence, economic utility is relative. No level of inflation-adjusted household income, whether $20,000 or $50,000 or $100,000 or $200,000, makes people happy in itself. They want more than their peers. They want class markers that low-status people lack, not the luxuries those class markers manifest themselves in. It's why the canard about modern trailer trash having it better than kings in 1900 is so ridiculous.

If whatever happened in 1971 never happened, people would still be working as much as ever. The hedonic treadmill would just be moving faster.

Humans don't 'collectively' demand things because generally there's a massive divergence in values at scale. Coordination problems abound.

And put simply, if you can make $4,000 for a 4-day work week and $5,500 for a 5-day work week, then there are plenty of rational reasons to just work the extra day.
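To make that incentive explicit, here's a minimal sketch of the marginal-pay arithmetic, using the hypothetical $4,000/$5,500 figures from the comment above (the helper name is illustrative, not from any real payroll library):

```python
# Minimal sketch of the marginal-pay arithmetic; the $4,000/$5,500
# figures are the hypothetical ones from the comment, not real wage data.

def marginal_day_pay(pay_4_day: float, pay_5_day: float) -> float:
    """Pay earned by the fifth day of work alone."""
    return pay_5_day - pay_4_day

four_day, five_day = 4000.0, 5500.0
avg_first_four = four_day / 4                     # $1,000 per day
fifth_day = marginal_day_pay(four_day, five_day)  # $1,500

# The marginal day pays 1.5x the average of the first four, so skipping
# it gives up more than 'one day in five' of income.
print(f"average day: ${avg_first_four:,.0f}, fifth day: ${fifth_day:,.0f}")
```

On a schedule like that, the pay structure itself tilts the leisure/income trade-off toward working the fifth day.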

The choice to do or not do so comes down to, I'd say, values, as above. If you have a high time preference and thus value leisure and 'fun' things, you'll minimize the time you spend working.

The markets will balance supply of labor and demand for labor, as they always do, unless we actually do achieve fully automated gay luxury space communism.

I think at least in the short/medium term this technology could lead to large productivity gains without corresponding cuts in total headcount.

When I started my career, finance teams used to forecast in Excel using relatively simple formulas. Now they use coding languages and forecast more frequently, in greater detail, and with greater accuracy while working with massive data sets. This hasn't led to a huge cut in overall headcount, but it has changed the skill-set mix on the teams.
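As a rough illustration, here's a minimal sketch of the kind of scripted forecast that replaces a simple Excel formula, assuming pandas and a purely hypothetical monthly revenue series (real finance teams would use far richer models and data):

```python
# Minimal sketch of a scripted trend forecast; the revenue figures and
# the naive growth model are hypothetical illustrations only.
import pandas as pd

# Hypothetical monthly revenue history (in $k).
history = pd.Series(
    [410, 425, 433, 452, 470, 488],
    index=pd.period_range("2022-07", periods=6, freq="M"),
)

# Naive trend: extend the average month-over-month growth rate.
growth = history.pct_change().mean()
future = pd.period_range(history.index[-1] + 1, periods=12, freq="M")
forecast = pd.Series(
    [history.iloc[-1] * (1 + growth) ** (i + 1) for i in range(12)],
    index=future,
)
print(forecast.round(1))
```

The point isn't the model, which an Excel formula could match, but that a script like this reruns in seconds against arbitrarily large data sets.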

I think at least in the short/medium term this technology could lead to large productivity gains without corresponding cuts in total headcount.

Agreed. It's psychologically painful to fire people, and if companies are making a ton of money from these models, I don't think there will be a giant firing spree. As the recent layoffs at big tech showed, when times are good, companies are more than willing to keep a bunch of low-impact employees on the payroll, especially in tech.

when times are good, companies are more than willing to keep a bunch of low-impact employees on the payroll

Also, it helps crowd out competition. Why fire a bunch of people when the interest rate is zero?

Sure, you'll save money in the short term, but those workers don't just disappear from the labor market; enterprising competitors will snap them up, and you'll end up having to offer a billion-dollar acqui-hire to shut them down before their product starts taking your market share.

Better to just keep them at Bigco. Sure, they won't really develop anything for you, but why drive the state of the art forward when you can just ignore all your customers, keep your competitors down, and rake in the cash from your ad business?

Right, but it's presumably cheaper to spin up more GPT instances or build more datacenters than it is to train more 'experts' in fields that are susceptible to ML automation.

Hence the question:

At what level of 'smarts,' however, is an AI that is already training on how you do your job going to stop needing you around to do it?

I'm not really doubting that humans will be 'in the loop' for quite a bit longer, but I suspect their role will be more about sanity-checking AI outputs and/or serving as a backup in case of outages, and there'll be strong downward pressure on wages. Which is fine if productivity gains make things cheaper.

You're talking about AI as a complement to human skills, but I'm very specifically asking how smart it needs to get to replace given skill sets.