This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
New research paper attempts to quantify which professions have the most to lose from the introduction of GPTs into the larger world. From the abstract:
The results vary by model, but mathematics and math-related industries like accounting have the highest risk. Overall, the researchers found that "information processing industries (4-digits NAICS) exhibit high exposure, while manufacturing, agriculture, and mining demonstrate low exposure" (pg 15) and that "programming and writing skills...are more susceptible to being influenced by language models."
I find myself wondering if "learn to code" from however long back will shortly become "learn to farm" or some such.
I have never put much stock in these sorts of studies or projections. Any job loss is easily negated by the creation of new, unforeseen jobs, as well as more total jobs as the economy grows. As far back as 15 years ago, AI was projected to displace lawyers, doctors, and journalists...and it's not even close to happening. At best, AI replaces only part of the job, not the whole thing. AI can help doctors diagnose conditions but cannot treat patients, nor can it perform invasive diagnostics like biopsies.
There's no reason that destroying a job must always create one or more new jobs. Take it to the limit: suppose we invent a benevolent God AI that is capable of all the information processing that humans are and more, for cheaper; there'd be no need for jobs, at least once we get embodied agents of the God AI. And we don't need that extreme a limit: employment stops as soon as the marginal productivity of an additional worker is less than the additional cost of hiring them (not just direct salary and benefits but also additional organizational complexity and informational bottlenecks).
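A rough sketch of that hiring condition, with made-up numbers (none of the figures or names below come from the discussion; they're placeholders for illustration):

```python
# Made-up numbers. A firm keeps hiring only while the extra value a worker
# adds exceeds their full cost: pay plus the coordination/complexity they
# impose on the organization.
def worth_hiring(marginal_product, pay, coordination_cost):
    return marginal_product > pay + coordination_cost

print(worth_hiring(marginal_product=120_000, pay=80_000, coordination_cost=20_000))  # True
# If cheap AI drives the human's marginal product down, hiring stops even
# though the human is still "productive" in absolute terms.
print(worth_hiring(marginal_product=90_000, pay=80_000, coordination_cost=20_000))   # False
```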
Bullshit jobs (gotta get five human reviewers on this TPS report, even if they don't add any value) will exist for a while, but that's just our social institutions taking a while to catch up with reality.
The "economic theory" is of comparative advantage of isolated actors who interact voluntarily - even if a GodAI can star trek 3d-print print billions of wood planks and billions of CPUs per second, a human can (on average) only produce hundreds of planks per hour and maybe a dozen CPUs per year - so, if the GodAI exchanges its CPUs for human planks, the total number of logs and computers each can produce with trade is greater than each alone! The godAI will have 1e9 + .01 logs per second!
This is just the 'we don't trade with ants' thing; this was an unconvincing rebuttal to that on LessWrong.
When and where does comparative advantage break down? It's not some mathematical certainty that all interacting entities must inevitably abide by; horses used to exchange their labor for food and shelter, and as technology progressed and humans improved their capital edge, horses didn't increasingly specialize in producing horsepower. Instead they (mostly) got turned into glue, though a minority of the luckier ones shifted into entertainment for the rich.
Keeping some assets around has costs, and arranging them in productive ways also has costs, which can conceivably outrun any potential profits from arranging them even in an optimal way.
Horses used to be produced for their labour. As this became unprofitable, horses stopped being produced. Humans are not produced for their labour, so your analogy has problems. If lots of horses were still around and there was no way of just taking them to the glue factory, we'd use horses a lot more.
When people behave irrationally.
However, it is possible for humans to stop getting employed as a result of technological change. For example, if the value of unemployment benefits rises as a result of automation to the point where it exceeds the wages that humans can get, then people will stop selling their labour. Another possibility is that the expected marginal profit from hiring more humans falls below the minimum wage, and the minimum wage is not reduced, nor is the marginal profit increased by subsidies, etc.
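Those two conditions, written out directly (the dollar figures below are placeholders, not claims about any actual wage or benefit level):

```python
# Placeholder figures. Two ways humans can drop out of the labour market even
# though a comparative advantage still exists on paper.
def sells_labour(best_wage_offer, unemployment_benefit):
    # Workers stop selling labour once benefits beat the best wage on offer.
    return best_wage_offer > unemployment_benefit

def gets_hired(expected_marginal_profit, minimum_wage):
    # Firms stop hiring once the marginal profit from another worker falls
    # below the wage floor (and nothing like a subsidy closes the gap).
    return expected_marginal_profit >= minimum_wage

print(sells_labour(best_wage_offer=9.0, unemployment_benefit=12.0))   # False
print(gets_hired(expected_marginal_profit=6.0, minimum_wage=7.25))    # False
```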
What doesn't happen is that comparative advantage breaks down because absolute advantage becomes really, REALLY, REALLY big!! Think of them as different scales: absolute advantage is a ranking according to outcomes, whereas comparative advantage is a ranking (inversely) according to opportunity cost. The opportunity cost of using automation for many tasks increases as the range of tasks that automatons can do increases.
Maybe this will help: imagine that a film studio can make at least $50 million by casting Eddie Murphy in any role. People love him so much, they'd rather see him play all the human roles in every film, including all the female parts. (And you know that Eddie Murphy would be game for that.) Would Eddie Murphy play the role of Henchman #9 in a straight-to-video action film? No, because even Eddie Murphy can't play all the human roles in every movie. Now modify the imaginary scenario: imagine that Eddie Murphy can also play all the animal roles as well, and the film studio can make at least $50 billion from casting him in any role. Does Eddie Murphy start playing Henchman #9?
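One way to see the 'different scales' point in code; the role values are invented purely to illustrate the scenario above:

```python
# Invented figures. Eddie's screen time is scarce, so taking the bit part
# means forgoing a lead role. As his absolute advantage grows, so does the
# opportunity cost of playing Henchman #9 -- which is why he still says no.
def takes_bit_part(bit_part_value, lead_role_value):
    return bit_part_value > lead_role_value

print(takes_bit_part(bit_part_value=200_000, lead_role_value=50_000_000))      # False
print(takes_bit_part(bit_part_value=200_000, lead_role_value=50_000_000_000))  # still False
```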
Hypothetical scenario: the San Francisco Homeless Union approaches the Motte with a unique offer. We have the opportunity to trade with the homeless of SF; moreover, we've gotten a special dispensation from the government to allow us to trade with them without any regulations around wages etc. They've also been cut off from any direct government services. Although the homeless are far less effective than us at creating both widgets and symbols, this is our chance to use the principle of comparative advantage to benefit all the involved parties. We appoint you CEO: what do you do and how do we make a profit?
Answer: you run, and we don't make a profit. Economic organization works by embedding information into the structure of the organization so that humans don't need to worry about it, but that requires abstraction. The leakier the abstraction, the less effective the organization, to the point where it becomes unprofitable as the costs of managing the leakiness outrun any possible economic value created. Actual existing homeless people can't provide a reliable enough labor abstraction to create any economic value.
Comparative advantage might always exist mathematically, but whether it results in trade depends on the costs of the trade. In my homeless example, there are management costs; the reason I don't hire a maid living in Manila to do my housework is (mostly) travel costs; and a hypothetical GAI wouldn't want to trade with humans if including us in its economic organization created more costs than economic value. (Granted, it would be better at designing systems to minimize those costs than humans currently are.)
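The condition in the previous paragraph, stated plainly; the quantities are hypothetical stand-ins for the management, travel, or integration costs described above:

```python
# Hypothetical magnitudes. Comparative advantage guarantees the gains from a
# trade are positive; whether the trade actually happens depends on whether
# those gains survive the cost of arranging it.
def trade_happens(gains_from_trade, cost_of_arranging_it):
    return gains_from_trade > cost_of_arranging_it

print(trade_happens(gains_from_trade=500.0, cost_of_arranging_it=50.0))    # local maid: yes
print(trade_happens(gains_from_trade=500.0, cost_of_arranging_it=4000.0))  # maid in Manila: no
print(trade_happens(gains_from_trade=0.08, cost_of_arranging_it=1000.0))   # GodAI and humans: no
```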
This isn't a counterexample to comparative advantage. It's true that the opportunity cost of using resources to employ the homeless people can exceed the expected benefits, but that's still ranking according to opportunity cost: the opportunity cost of their employment relative to other uses of those resources.
I actually mentioned an example of such non-trades in the post you replied to:
Notice that this can happen for all sorts of reasons other than GAI, and the huge absolute advantage of the GAI does not create the non-trades.
To repeat, the law of comparative advantage doesn't mean that everyone gets employed. It means that rational people use their resources on the basis of an opportunity cost ranking, rather than an absolute advantage basis.