I like the government to fund some welfare services, and I generally dislike bureaucracy. I am also fine with some immigration.
Anyone who turns a blind eye to welfare fraud is effectively steering us towards an equilibrium with less welfare spending and more red tape. (And yes, red tape can very much prevent welfare from reaching the needy, because the needy often are not great at jumping through the hoops of bureaucracy.) If the perps are immigrants, it will also foster anti-immigrant sentiment as surely as thunder follows lightning.
I wish I could blame some Ayn Rand fans working as moles to achieve that outcome, but in all likelihood the officials who turned a blind eye were SJ people who failed to think through the consequences. After all, Uncle Sam has plenty of money, and if the Somalis skim a bit to keep their relatives from starvation, what is the harm?
Except that the taxpayers and voters feel very differently (I imagine). And sending money to a failed state through opaque channels is not necessarily net positive.
In short, lawfulness is (at least) instrumentally useful. Even if you feel your cause is good, breaking laws to further it will generally provoke a backlash. I imagine SBF did not donate a lot of money to EA in 2025.
…Of course, we can still entertain the hypothesis that all of the above is interesting ephemera, and that this final dash of the Chosen Nation towards AGI-powered Rapture and the completion of history is the real story of the times. I won't completely discount it; we shall see.
IMO, this is still the operating assumption of the AI race. It is also the only thing which justifies the intensity of the efforts expended by various AI companies.
Normally, when a new field of tech opens up, there is no decisive first-mover advantage where a technological lead of a year translates into long-term dominance. History is full of cases where the forerunners became footnotes: early home computers (e.g. the Apple Macintosh), the dot-com boom (e.g. MySpace), photovoltaics (e.g. the German companies), the browser wars.
But what OpenAI investors buy is not so much future profits if OpenAI replaces most of the work force, but an investment-proportional solid angle of the light cone if Altman achieves aligned superintelligence (and remains aligned to his investors himself). The presumption is clearly that whoever finishes climbing the rope first will then cut the rope to prevent anyone else from following.

The most important thing is to hurry up, because 'This Year' is going to last only a few more hours. Luckily, most of the advice works for 2026 too!
(Sorry, could not resist.)