This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
When will the AI penny drop?
I returned from lunch to find that a gray morning had given way to a beautiful spring afternoon in the City, the sun shining on courtyard flowers and through the pints of the insurance men standing outside the pub, who still start drinking at midday. I walked into the office, past the receptionists and security staff, then went up to our floor, passed the back office, the HR team who sit near us, our friendly sysadmin, my analysts, associate, my own boss. I sent some emails to a client, to our lawyers, to theirs, called our small graphics team who design graphics for pitchbooks and prospectuses for roadshows in Adobe whatever. I spoke to our team secretary about some flights and a hotel meeting room in a few weeks. I reviewed a bad model and fired off some pls fixes. I called our health insurance provider and spoke to a surprisingly nice woman about some extra information they need for a claim.
And I thought to myself can it really be that all this is about to end, not in the steady process envisioned by a prescient few a decade ago but in an all-encompassing crescendo that will soon overwhelm us all? I walk around now like a tourist in the world I have lived in my whole life, appreciating every strange interaction with another worker, the hum of commerce, the flow of labor. Even the commute has taken on a strange new meaning to me, because I know it might be over so soon.
All of these jobs, including my own, can be automated with current-generation AI agents and some relatively minor additional work (much of which can itself be done by AI). Next-generation agents (already in testing at leading labs) will be able to take screen and keystroke recordings (plus audio from calls, where applicable) of, say, 20 people performing a niche white-collar role over a few weeks and learn almost immediately how to do it as well or better. This job destruction is only part of the puzzle, though, because as these roles go, so do tens of millions of other middlemen, from recruiters and consultants and HR and accountants to the millions employed at SaaS providers that build tools - like Salesforce, Trello, even Microsoft with Office - that will soon be largely or entirely redundant because whole workflows will be replaced by AI. The friction facilitators of technical modernity, from CRMs to emails to dashboards to spreadsheets to cloud document storage, will be mostly valueless. Adobe alone, which those coworkers use to photoshop cute little cover images for M&A pitchbooks, is worth $173bn and yet has surely been rendered worthless, in the last couple of weeks alone, by new multimodal LLMs that allow for precise image generation and editing by prompt¹. With them will come an almighty economic crash that will affect every business from residential property management to plumbing, automobiles to restaurants. Like the old cartoon trope, it feels like we have run off a cliff but have yet to speak gravity into existence.
It was announced yesterday that employment in the securities industry on Wall Street hit a 30-year high (I suspect that that is ‘since records began’, but if not I suppose it coincides with the final end of open outcry trading). I wonder what that figure will be just a few years from now. This was a great bonus season (albeit mostly in trading), perhaps the last great one. My coworker spent the evening speaking to students at his old high school about careers in finance; students are being prepared for jobs that will not exist, a world that will not exist, by the time they graduate.
Walking through the city I feel a strange sense of foreboding, of a liminal time. Perhaps it is self-induced; I have spent much of the past six months obsessed with 1911 to 1914, the final years of the long 19th century, with Mann and Zweig and Proust. The German writer Florian Illies wrote a work of pop-history about 1913 subtitled "The Year Before the Storm". Most of it has nothing to do with the coming war or the arms race; it is a portrait (in many ways) of peace and mundanity, of quiet progress, of sports tournaments and scientific advancement and banal artistic introspection, of what felt like a rational and evolutionary march toward modernity tempered by a faint dread, the kind you feel when you see flowers on their last good day. You know what will happen and yet are no less able to stop it than those who are comfortably oblivious.
In recent months I have spoken to almost all the smartest people I know about the coming crisis. Most are still largely oblivious: "new jobs will be created", "this will just make humans more productive", "people said the same thing about the internet in the 90s", and - of course - "it's not real creativity". A few - some quants, the smarter portfolio managers, a couple of VCs who realize that every pitch is from a company that wants to automate one business while relying for revenue on every other industry that will supposedly have just the same need for people, and therefore for middleman SaaS contracts, as it does today - realize what is coming and can talk about little else.
Many who never before expressed any fear or doubts about the future of capitalism have begun what can only be described as prepping: buying land in remote corners of Europe and North America where they have family connections (or sometimes none at all), buying crypto as a hedge rather than an investment, investigating residency in Switzerland, and researching the countries likely to adapt most quickly to an automated age in which service-industry exports are liable to collapse (wealthy, with domestic manufacturing, energy resources or nuclear power, reasonably low population density, mostly domestic food production, some natural resources, and a political system capable of quick adaptation). America is blessed with many of these, but its size, political divisions and regional, ethnic and cultural tensions, plus an ingrained, highly individualistic culture mean it will struggle, at least for a time. A gay Japanese friend who previously swore he would never return to his homeland on account of the homophobia he had experienced there has started pouring huge sums into his family's ancestral village, and told me directly that he expects some kind of large-scale economic and social collapse caused by AI to force him to return home soon.
Unfortunately Britain - where manufacturing has been largely outsourced, most food and much fuel has to be imported, and which is heavily reliant on exactly the professional services that will be automated first - seems likely to face one of the harshest transitions. A Scottish portfolio manager, probably in his 40s, told me of the compound he is building on one of the remote islands off Scotland's west coast. He grew up in Edinburgh, but was considering contributing a large amount of money towards some church repairs and the renovation of a beloved local store or pub of some kind to endear himself to the community in case he needed it. I presume that in big tech money, where I know far fewer people than others here, similar preparations are being made. I have made a few smaller preparations of my own, although what started as 'just in case' now occupies an ever greater place in my imagination.
For almost ten years we have discussed politics and society on this forum. Now events, at last, seem about to overwhelm us. It is unclear whether AGI will entrench, reshape or collapse existing power structures, or whether it will freeze or accelerate the culture war. Much depends on who exactly is in power when things happen, and on whether the tools that create chaos (like those causing mass unemployment) arrive much before those that create order (mass autonomous police drone fleets, ubiquitous VR dopamine at negligible cost). It is also a twist of fate that so many involved in AI research were themselves loosely involved in the Silicon Valley circles that spawned the rationalist movement, and eventually, through that and Scott, this place. For a long time there was truth in the old internet adage that "nothing ever happens". I think it will be hard to say the same five years from now.
¹ Some part of me wants to resign and short the big SaaS firms that are going to crash first, but I've always been a bad gambler (and am lucky enough, mostly, to know it).
Ned Ludd led weavers to smash looms. It didn't save the weavers' jobs, but their great-granddaughters were far wealthier.
Just based on history, large productivity increases will raise wages. I'm looking into a cushy union sinecure that will never be automated, but AI is a minor factor compared to the money. Yes, some fintech roles will be curtailed (and the remainder will become more client- and customer-heavy), but meh. These people's high salaries are not fundamental to our social model.
The whole point is that it would be a grave error to naively extrapolate from history. The increase in productivity came from humans being freed from physical labor (mostly), and their cognitive labor augmented and multiplied.
Now we're approaching replacement rather than augmentation. The economy might boom, but that doesn't mean the humans in it will see the benefit. It would take intentional action to prevent ~everyone who isn't independently wealthy from being laid off and left without any revenue stream other than welfare, since the free-market value of their work would be lower than the minimum required to keep them housed and fed.
There have been previous large increases in intellectual productivity due to computers. Job prospects for nerds have gotten better, not worse.
We don’t live in a free market. We live in a regulated society. Do you think doctors, lawyers, teachers will get replaced by machines just because those machines will do a better job?
Yes? It's going to be harder than someone working for a faceless corporation with at-will employment, but eventually, people are going to wonder: "Hey, those AI thingies seem super smart, they're giving me the same advice (or better) as the doctor I'm paying all that money for, why can't they prescribe too?".
If not individuals, then governments and politicians. That's where the incentives lie for hospitals, for the owners of law firms who haven't had to handle an actual case in years, for bureaucrats looking at how expensive the NHS is and wondering if they really need that many doctors.
Even if licensed professionals continue to play a token role, it might just be a polite fiction that they're necessary. You could have one bored, indifferent doctor signing off on AI recommendations, assuming the liability with ease because he knows the AI is almost never wrong. He's now doing the work of ten doctors, and the hospital, happy to save costs, fires the rest. Even if he's not happy about it, it beats being unemployed.
Controversial statement, but from my perspective, a 90% unemployment rate for doctors is almost as bad as 100% unemployment.
It could be easy to instigate. An AI company, or its lobbyists, publishes a few papers that (truthfully) claim that AIs outperform human physicians. This is used as ammunition by lobbyists and governments to begin gradual replacement, boiling the frog slowly and saving a lot of money.
The average Joe, who once trusted human doctors, is collecting unemployment. He thinks, hey, the AI took my job, why should I believe that doctors are any better? It saves him money and time, leaving aside the scope for resentment.
We don't seem to live in a world where the average Joe is protected very much, and that would have been the place to try to stem the tide. How many people support UBI for artists and journalists? All it takes is a single nation or smaller polity to try this experiment and see that it works well (which I expect), and then it's easy to bring others on board. They'll be left in the dust otherwise.
Artists and journalists aren’t the average Joe. They’re poor members of the upper classes.
Most people have a lot of sympathy for laid off coal miners and factory workers, and one of the terminal values of western regimes is raising the LFPR. The jobs must flow, and flow they shall. There may not be universal six figures for nerds, but that isn’t a necessity.
In any case, AI isn’t taking everyone’s job. There will be fewer software engineers, sure, but we don’t need so many of them. They should learn to fix toilets or dig coal or something. Previous increases in the productivity of white collar work have not led to the elimination of white collar employment.
I think calling artists and journalists "poor members of the upper classes", while not entirely wrong, isn't my preferred framing. They're semi-prestigious, certainly, but my definition of upper class would be someone like 2rafa. They're often members of the intelligentsia, and have a somewhat disproportionate impact on public affairs, but they're not upper class by most definitions. Poor but upper class is close to a contradiction in terms.
I've already explained my stance in this thread that the previous expectation about the state of affairs for automation doesn't hold. Cognitive automation that replaces all human thought is a qualitatively different beast when compared to the industrial revolution or computers.
A tool that does 99% of my work for me? Great, I'm a hundred times as productive! There might even be a hundred times more work to do, but I'll probably see some wage growth. There might be some turmoil in the employment market.
A tool that does 100% of the labor? What are you paying me for?
The whole point is that AI is approaching 100%, might even be there, or is so close that employers won't care and will fire you.
Perhaps a more accurate description would be members of an upper class in the same way that samurai were in Edo society, literati were in China since essentially the Warring States, or Brahmins in India?
To be honest, the pessimistic case of the AI "only" being able to do 99% or even 90% of human cognitive work scares me in terms of social upheaval. We might be better off in the long run, but it sure looks like it'll be a bumpy ride...