
Culture War Roundup for the week of March 20, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


A new research paper attempts to quantify which professions have the most to lose from the introduction of GPTs into the wider world. From the abstract:

Our findings indicate that approximately 80% of the U.S. workforce could have at least 10% of their work tasks affected by the introduction of GPTs while around 19% of workers may see at least 50% of their tasks impacted.

The results vary by model, but mathematics and math-related industries like accounting carry the highest risk. Overall, the researchers found that "information processing industries (4-digits NAICS) exhibit high exposure, while manufacturing, agriculture, and mining demonstrate low exposure" (p. 15) and that "programming and writing skills...are more susceptible to being influenced by language models."

I find myself wondering if "learn to code" from however long back will shortly become "learn to farm" or some such.

I take these sorts of studies and projections with a large grain of salt. Any job loss is easily offset by the creation of new, unforeseen jobs, as well as more total jobs as the economy grows. As far back as 15 years ago, AI was projected to displace lawyers, doctors, and journalists; that hasn't come close to happening. At best, AI replaces part of a job, not the whole thing. AI can help doctors diagnose conditions, but it cannot treat patients, nor can it perform invasive diagnostics like biopsies.

I think there are issues this time that may make this round of automation different.

First, there are hard limits to what humans can actually do, before we even get into what will happen to anyone with low IQ or learning disabilities. If the “new jobs” are things that you need to be a genius to do, really maybe only 10% of the population could even be trained to do them. So where does this leave those displaced? All the easy tasks are done by a machine.

Second, there’s the issue of the pace of change. Computing power has long grown exponentially. This would seem to mean that any task created by the AI revolution could itself be done by AI within 5-10 years of the job's creation; you’d barely be able to train humans to do the work before that job, too, is taken by automated systems. And if this goes on indefinitely, then there’s effectively very little job creation for most people.

  • If the “new jobs” are things that you need to be a genius to do, really maybe only 10% of the population could even be trained to do them. So where does this leave those displaced? All the easy tasks are done by a machine.

I got a taste of that when I went to Mexico. Since there are many fewer good office jobs in Mexico, most of the population has to compete for the rest of the jobs, and the wages are low. For example, my Uber driver was willing to wait around for me for 2 hours so he could get my next $15 fare for a half hour ride.

Plumbers might be a "safe" profession, but there are going to be a lot more people trying to do that kind of stuff in the near future and it will drive wages down.

The difference might be, of course, that the United States is extraordinarily rich. If AI increases productivity the government / non-profit sector will be able to create ever more elaborate do-nothing office jobs for the newly useless.

I got a taste of that when I went to Mexico.

The angle that advocates of "overpopulation" rhetoric never state (or that gets mangled on purpose by their class enemies; using it for environmentalist soapboxing was common in the early 2010s, but you don't hear that these days because the narrative was updated to "social justice") is that overpopulation is specifically relative to the amount of general economic opportunity per capita. The amount of economic opportunity affects the price of having children, so in free countries you can look to TFR as a rough guide to how much opportunity exists relative to population.

For example, on one end of the spectrum you have New World countries, where you're still relatively employable even as a high school dropout. Yeah, you won't get that far in life, but minimum wage is still a livable existence (this door has closed significantly since its peak in 1960, but it's still technically doable). The high-water mark for economic opportunity in recent history was, well, the US in the 1960s.

On the other end of the spectrum, you have Asian countries and India, which have so many people that parents will willingly sacrifice the totality of their children's lives from ages 6 to 18, drilling them to produce GPT-4 output and cramming them full of worthless memorization, just so they have a chance at outcompeting the other kids for those 80-hour workweeks at a middling salary (the real play here is emigrating to a New World nation, but that's uniquely difficult for people from those countries).

If AI allows wide swaths of the economy to be enclosed, the last free nation on the planet will have it just as bad, though it remains to be seen what will happen to India. If economic opportunity in the US dries up because of this, the drip-feed Indians get from it vanishes; perhaps un/fortunately for them, the country to their immediate west is more than happy to solve their population crisis with a few missiles in the right places should they try to get their hands on the larger Middle East.