Achieving post-scarcity in a world of finite resources

The most common response to "AI took my job" is "don't worry, soon the AI will take everyone's jobs, and then we'll all have UBI and won't have to work anymore." The basic thesis is that after the advent of AGI, we will enter a post-scarcity era. But we still live on a planet with a finite amount of space and a finite number of physical resources, so it's hard to see how we could ever reach true post-scarcity. Why don't more people bring this up? Has anyone written about this before?

Let's say we're living in the post-scarcity era and I want a PlayStation 5. Machines do all the work now, so it should be a simple matter of going to the nearest AI terminal and asking it to whip me up a PlayStation 5, right? But what if I ask for BB(15) PlayStation 5s? (BB is the Busy Beaver function, so this is a number that dwarfs any physically meaningful quantity.) That's going to be a problem, because the machine could work until the heat death of the universe and still not complete the request. I don't even have to ask for an impossibly large number - I could ask for a smaller but still very large number, one that is in principle achievable but would tie up most of the earth's manufacturing capacity for several decades. Obviously, if there are no limits on what a person can ask for, the system will be highly vulnerable to abuse by bad actors who just want to watch the world burn. And even disregarding malicious attacks, an abundance of free goods will encourage people to reproduce more, putting more and more strain on the planet's ability to provide.
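To put a number on the "merely very large" case, here's a back-of-envelope sketch in Python. Both figures are illustrative assumptions, not real data (actual PS5 output has run on the order of twenty million units a year; I'm granting the automated future a generous ~50x upgrade):

```python
# Assumed capacity of a fully automated global economy. Real-world PS5
# production has been on the order of 2e7/year, so this is generous.
CONSOLES_PER_YEAR = 1e9

request = 5 * 10**10  # fifty billion consoles: finite and "achievable"

years_tied_up = request / CONSOLES_PER_YEAR
print(f"{years_tied_up:.0f} years")  # 50 years of total global capacity
```

One patient griefer with a finite request monopolizes everything for half a century.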

This leads to the idea that a sort of command economy will be required - post-scarcity with an asterisk. Yes, you don't have to work anymore, but in exchange there will have to be a centralized authority that sets rules on what you can get, in what amounts, and when. Historically, command economies haven't worked out too well. They're ripe for political abuse and tend to serve the interests of the people who actually get to issue the commands.

I suppose the response to this is that the AI will decide how to allocate resources to everyone. Its decisions will be final and non-negotiable, and we will have to trust that it is wise and ethical. I'm not sure such a thing is even possible, though. Global resource distribution may simply remain a computationally intractable problem into the far future, in which case we would end up with a hybrid system: humans still at the top, distributing the spoils of AI labor to the unwashed masses. I'm not sure whether that is better or worse than a system where the AI is the sole arbiter of all decisions. I would prefer not to live in either world.
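As a gesture at why "computationally intractable" isn't just hand-waving: even the toy problem of splitting indivisible goods fairly between two people is NP-complete (it's the PARTITION problem). A minimal brute-force sketch, with made-up item values:

```python
from itertools import combinations

def fairest_split(values):
    """Brute-force the fairest two-way split of indivisible goods.

    This is the NP-complete PARTITION problem: the search space
    doubles with each added item, so exact fairness stops being
    computable long before you reach planetary scale.
    """
    total = sum(values)
    best_gap = total
    # Enumerate all 2^n subsets that could go to the first person.
    for r in range(len(values) + 1):
        for subset in combinations(range(len(values)), r):
            share = sum(values[i] for i in subset)
            best_gap = min(best_gap, abs(total - 2 * share))
    return best_gap

# 5 goods -> 32 subsets; 60 goods -> ~1.15e18 subsets. Real allocation,
# with billions of agents and interdependent preferences, is far harder.
print(fairest_split([7, 3, 2, 5, 8]))  # -> 1, e.g. {7, 5} vs {3, 2, 8}
```

Heuristics give approximate answers, but "approximately fair, as judged by a machine" is the command-economy-with-an-asterisk problem all over again.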

I don't put much stock in the idea that a superhuman AI will figure out how to permanently solve all problems of resource scarcity. No matter how smart it is, there are still physical limitations that can't be ignored.

TL;DR: the singularity is more likely to produce the WEF vision of living in ze pod and eating ze bugs than whatever pastoral Garden of Eden you're imagining.


I'm not sure exactly what you mean by "singularity", but for me a post-scarcity world involving the singularity would really be more post-human (or post-singularity, which is a more apt term for the period after the singularity). I see a post-singularity world as one where humans are all uploaded, the Earth is a single system made of computronium-like material, and the whole thing is administered by a sort of "mother" ASI singleton.

Obviously resources will always be finite, unless the AI manages some godlike breakthrough that only an entity like it would be capable of; solving the heat death of the universe should be one of its priorities, and creating matter out of nothing with some sort of mini big bang (i.e. magic) would simultaneously solve the finite-resource problem.

Your example does touch on something I have thought about a lot, which is the idea of true freedom in a post-singularity system. With humans there is always a disconnect between what one "wants" to do and what one actually wants to do. If someone asks the mother AI to create a simulated universe a great deal larger than our own and fill it with PlayStation 5s (or better yet, an assortment of random objects in which no two are alike), the request may be denied even if the finite-resource problem has already been solved, because the person asking is only trying to challenge the AI with a stubborn thought experiment and doesn't actually want to interact with anything they're creating. There's also the problem of creating new people - especially when it comes to morally questionable requests. I'd assume the AI would grant these requests, but would hand out p-zombies depending on their nature.

Though I don't think I can argue too well for it, I do believe that a post-singularity system would have access to at least the collective matter and energy of our whole solar system at first, and would be able to siphon more from other stars. And I believe no reasonable request would strain the system, probably at any point in the post-singularity period.
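For what it's worth, the energy side of that claim survives a back-of-envelope check. Solar luminosity is well established; the ~1 GJ embodied-energy figure per console is my rough guess, so treat the result as order-of-magnitude only:

```python
SOLAR_LUMINOSITY_W = 3.8e26   # total power output of the Sun, in watts
EMBODIED_ENERGY_J = 1e9       # assumed ~1 GJ to manufacture one console

consoles_per_second = SOLAR_LUMINOSITY_W / EMBODIED_ENERGY_J
print(f"{consoles_per_second:.1e} consoles/s")  # ~3.8e17 every second
```

A system capturing the Sun's full output could fund any humanly comprehensible request almost instantly; only deliberately pathological requests like the OP's would strain it.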