Culture War Roundup for the week of November 28, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

To be honest, I don't know what to make of your comment.

Could I ask you to explain first why the law student disemployment your theory predicts did not result from previous increases in lawyer efficiency, such as the advent of the word processor or electronic case law databases? That is, what is it specifically about this new technology that produces a different economic equilibrium than those past improvements did? I think that would help me better understand your claim.

Could I ask you to explain first why the law student disemployment your theory predicts did not result from previous increases in lawyer efficiency,

Because there was no 'overproduction' of law grads due to the relatively stringent limits on how many lawyers we can produce in a given year. There's always been an artificial 'floor' on legal salaries and employment in this way.

You can model the entire legal industry as a cartel cooperating to gatekeep access to jobs, thereby keeping salaries acceptably high and warding off any major forces that might disrupt the industry's stability. Universities, massive law firms and corporations, judges, politicians: the cartel has representation at virtually every level of society to enforce its control.

And AI is threatening to do to this legal cartel what Uber and Lyft did to the taxi cartels. Except worse, since any model capable of replicating a decent attorney's work product can be copied and deployed endlessly as long as there is sufficient compute.

The cap is way higher.

We have a similar bottleneck for doctors. But if there was an AI program that could perform 90% of the tasks of a doctor (in terms of examination, diagnosis, treatment recommendations, and prescriptions, but excluding surgeries) and do it better than the median doctor, what do you think that would do for salaries and employment rate of doctors?

In essence, every step of becoming a lawyer has steep costs, both in effort/time AND money. Costs that newly minted lawyers expect to recoup over the course of their careers.

And then let us introduce a class of lawyers that can be trained in a course of days, can be reproduced nigh-instantly, and can work literally around the clock without sleeping.

How do 'normal' lawyers compete against that on salary, assuming each produces work of similar quality? And if lawyers can't compete on salary, how can they expect to recoup all the costs that went into their license?

And if they can't recoup the cost of their license while working in the legal industry, how can they stay in the legal industry?

And AI is threatening to do to this legal cartel what Uber and Lyft did to the taxi cartels. Except worse, since any model capable of replicating a decent attorney's work product can be copied and deployed endlessly as long as there is sufficient compute.

But... it can't. Not yet. It still needs a person to guide it. It will make those people a lot more efficient, potentially, possibly 10x more efficient, but it can't fully close the loop and do away with the person. If company A wants to acquire company B, it is still going to need a bunch of lawyers, even if large language models make those lawyers much more efficient. And my contention is that, if corporate lawyers become 10x more efficient, then the legal industry will resettle into a new equilibrium where mergers take 10x more work. Everyone works just as hard, deal teams have just as many people, deals take just as long, the clients pay just as much, but the merger agreements are fantastically more detailed and longer, the negotiations are fantastically more sophisticated, and the SEC requires fantastically more elaborate disclosure materials, etc. From the horse's perspective, this is more like the invention of the horseshoe than the invention of the automobile.

I don't think we'll replace the horse until we have full AGI -- as in a system that can literally do every cognitive task that people can do, better than the best people that can do it. At that point, all knowledge workers will be in the same boat -- everyone, at minimum, whose job consists of typing on a computer and speaking to other people, and robots can't be far behind for the blue collar workers too. And it's closer than people think. Honestly, maybe it is three years from now, when incoming law students are graduating -- not my modal prediction but IMO certainly not impossible. But even if that's the case, the advice is less "don't go to law school" and more "get ready for everything to change radically in a way that is impossible to hedge."

We have a similar bottleneck for doctors. But if there was an AI program that could perform 90% of the tasks of a doctor (in terms of examination, diagnosis, treatment recommendations, and prescriptions, but excluding surgeries) and do it better than the median doctor, what do you think that would do for salaries and employment rate of doctors?

I don't know. Medicine is less zero-sum than law. We'd reach some new equilibrium, but you could make a case for it being lower (because it's more efficient to achieve our current level of medical outcomes) or higher (because better medical outcomes become possible and society will be willing to pay more in aggregate to achieve them), or somewhere in the middle.

If you have a machine that can do 90% of what a doctor does today, then a doctor with that machine can see 10x more patients than she does today, or see the same number of patients but provide each patient with 10x more personal attention than they get today, or some other tradeoff. Maybe everyone will see the doctor once per month to do a full-body cancer screen and a customized senolytic treatment or whatever, because better medical technology will allow that more intensive schedule to translate into radically better health outcomes -- which would mean the medical system would grow by 12x compared to what it is today, and we'd all be better off for it.

You keep going back to the corporate merger example, which I may even grant is on point. AI will increase the size (in productivity terms, if not headcount) and complexity of those firms in weird ways, I'm sure.

But by most counts, fewer than 40% of all lawyers are employed in those huge firms and corporate environments.

More data here:

https://www-media.floridabar.org/uploads/2019/03/2018-Economics-Survey-Report-Final.pdf

It seems like you expect that the larger corporate merger firms will just keep growing in size to absorb the rest of the lawyers practicing elsewhere?

Because most lawyers aren't working on complex corporate law.

The average person's will won't get more complex. A home purchase agreement won't get more complex, and small-business contracts won't get much more complex.

Likewise, most civil suits involving two private citizens or small corporations won't get more complex.

I sure hope criminal defense and prosecution won't get more complex.

So going with your model, this is implying a future where almost all legal services are provided by a relatively small handful of huge and growing firms having to handle increasingly complex transactional law, with complexity increasing with the power of the AIs in use, ad infinitum.

But by most counts, fewer than 40% of all lawyers are employed in those huge firms and corporate environments.

It seems like you expect that the larger corporate merger firms will just keep growing in size to absorb the rest of the lawyers practicing elsewhere?

Because most lawyers aren't working on complex corporate law.

...

A home purchase agreement won't get more complex, and small-business contracts won't get much more complex.

Oh, I see what you mean. Again -- could go any direction. Home sales are handled with less complexity than a corporate acquisition basically because there are fewer resources to spend on advisors. What if lawyers become 10x more efficient? Maybe every home sale starts to resemble what a corporate merger looked like twenty years ago. It could happen. Same with small business contracts.

Likewise, most civil suits involving two private citizens or small corporations won't get more complex.

They absolutely will! Why wouldn't they? Here there is a direct relationship between the amount people are willing to spend and the marginal advantage it gives them over their counterparty. Why would that ratio change? You'd get more detailed briefs, more comprehensive discovery and document review, etc., all for the same price that you pay today.

I sure hope criminal defense and prosecution won't get more complex.

They absolutely would, for the same reason as civil suits, except more so, because so much more is on the line! Wealthy people who have been indicted pay through the nose for criminal defense, which suggests that ability to pay is the only thing constraining less wealthy defendants. If legal services become 10x more efficient, you should expect defendants to consume 10x as much.

It seems like you expect that the larger corporate merger firms will just keep growing in size to absorb the rest of the lawyers practicing elsewhere?

No, you could still have smaller firms and solo practitioners, and each of them would be 10x more efficient than they are today too. Their work product would just become a lot more sophisticated by today's standards.

You'd get more detailed briefs, more comprehensive discovery and document review, etc., all for the same price that you pay today.

Or you get a simple interface that allows both parties to upload all the evidence they believe supports their case and the arguments they wish to put forward, in plain English, the AI churns through it for a couple minutes then renders (literally, renders) a verdict that the parties can either accept or appeal to a higher-resolution appellate judge AI.

So, SO much of the cost of civil litigation is tied up in accessing the judicial resources needed to hold hearings on motions, waiting on decisions to be rendered, and arguing over tiny points of contention for literal hours.

And it can all be avoided if people prefer the simplicity of a provably neutral robojudge that responds to motions instantly rather than scheduling a hearing 60 days out.

Similar to how arbitration clauses are a common way to avoid the costs of litigation because people DON'T want to pay for litigation when they can avoid it!

They absolutely would, for the same reason as civil suits, except more so, because so much more is on the line!

Imma strongly disagree here, if only because AI tech will almost certainly make it trivial to solve the vast majority of crimes in a way that makes prosecution extremely easy. The sheer amount of evidence that could be brought to bear in our increasingly pervasive surveillance state would clear the 'reasonable doubt' standard pretty easily.

Because the vast, vast majority of accused criminals aren't wealthy people.

So more people will accept plea offers, which are also vastly simplified because AI assists judges in determining appropriate sentences.

No, you could still have smaller firms and solo practitioners, and each of them would be 10x more efficient than they are today too. Their work product would just become a lot more sophisticated by today's standards.

But would there ACTUALLY be 10x as much work to be done? Where is all this pent-up demand currently located?