Culture War Roundup for the week of November 28, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Consider the field of law

Consider how much less efficient the practice of corporate law was before the advent of the word processor. Because drafting was so laborious, merger agreements used to be just a few pages long. With a modern word processor and a database of electronic precedents, a law partner can bang one of these out in no time. The legal profession's response to this efficiency windfall was not to slash law firm staff, but to increase the length and complexity of merger agreements. Now they're like 100 pages long, plus hundreds of pages of other crap, and they are expected to be drafted with full knowledge of a much vaster corpus of law and jurisprudence.

So I suspect that further efficiency gains will simply raise the complexity ceiling rather than reduce the size of the industry. We could see thousand-page merger agreements of increasing variety, vastly more intricate negotiated horse-trading over terms previously accepted as boilerplate, and increasing rigor in sourcing each provision to ever more obscure and fact-specific legal doctrines.

I think the law students' jobs are safe, or at least as safe as they ever were.

Interesting, so your position is that large law firms would continue hiring law grads into junior positions at $235,000/year even if they could achieve significant cost savings (possibly losing some efficacy) by utilizing an AI?

My position is that they wouldn't achieve significant cost savings, because as they become more efficient in producing high quality legal documents, the quality expectations of the industry would increase by approximately the same percentage.

The legal profession is predominantly a Red Queen's race. It produces some objective benefit in promoting clarity and discipline within and between organizations, but everything beyond that is zero-sum -- trying to get better terms or a better dispute settlement at a counterparty's expense, trying to cover off an edge case where the default rule might favor your counterparty marginally more than your proposed customized solution would -- and my suspicion is that the latter category is the bulk of the profession. Through that lens, the amount that a corporation spends on legal fees is more akin to the amount that a country spends on its military than the amount it spends on agriculture. When technology makes militaries more efficient, the result isn't less military spending on either side of a border, it's scarier militaries staring each other down across that border.

I don't see how your position supports the conclusion that "law students' jobs are safe," only that "law firms will continue to be profitable and entrenched in the corporate ecosystem."

Which I agree with. I just expect that law firm partners will exploit the ability to produce more billable time whilst paying less to their associates.

And this will likely trigger even harsher competition amongst small firms/solo practitioners since an AI that can produce most basic legal documents after a brief discussion with a potential client can be used to corporatize this aspect of the practice.

How does a firm justify billing $300/hour to a non-corporate client when the AI-based firm up the street can produce similar-quality work for <$100 total?

People are still force multipliers. What the GP is saying is that companies that employ lots of drones and lots of AI will produce better results than firms with only AI or only drones. So eventually Big Law will employ lots of drones and AI -- in an arms race, no advantage is permanent.

People are still force multipliers.

The problem is that the IQ waterline above which a given person is reliably better than an AI at a given task will probably rise by a couple of points a year.

There will certainly still be a place for the humans thus displaced, it just won't be in any of the fields where skill is the determining factor and the AIs are higher skilled.

I mean, people still like to watch Magnus Carlsen play chess, but he could be beaten by a chess program running on a four-year-old smartphone.

There will certainly still be a place for the humans thus displaced, it just won't be in any of the fields where skill is the determining factor and the AIs are higher skilled.

As an amusing thought experiment, consider trying to explain modern economics to someone from a society just coming upon the division of labor:

"You mean to tell me that only 1% of your population has to work to feed everyone? That sounds great! Imagine how much everyone must enjoy all of that free time!"

Needless to say, that isn't how it actually went, and I expect AI to be similar: we'll find something else on which to spend our time, and raise our expected standards of living to match.

The two questions we could break it down to are:

Is this the equivalent of the invention of the car in terms of its impact on the horse-drawn carriage?

and

Are we the HORSE in this scenario?

Once automobiles became strictly better than horses for the majority of tasks horses were used for, what happened to horse employment?

I'd agree those are the questions, but I'm not certain the answer to the second question is yes. There seems to be space for different outcomes there. While there are fewer horses in the US today than a century ago (a quick search suggests around half as many), I suspect the modal American horse lives a better life than its working counterpart of a century ago, largely because it's much more likely to exist as a pampered pet or show animal.

In some ways "yes, and humans retreat to doing only the things we enjoyed all along" seems like one of the best possible outcomes. I see art (see trends toward "handmade" and "bespoke"), governance (does GPT-3 demonstrate executive function?), and high-level resource allocation (what should we build/research?) as fundamentally human tasks. In the largely blank slate of oft-disagreed-upon human endeavor (admittedly, AI risk seems to focus on other possible endeavors), I don't foresee people voluntarily ceding control of what we decide to build and how it's paid for, at least with the existing technology: people like bikeshedding too darn much.
