Small-Scale Question Sunday for April 26, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

I haven't read the novels... but your comment reminded me of this discussion. I agree with both it and this reply.

I think a life of only simple pleasures (eating, sleeping, etc.) would get boring, because I desire achievement, and I believe most people do too. I also think such a life isn't realistically human: it's what animals do, while most humans make long-term plans. Achievement also requires adversity, because one needs to at least be able to imagine failing.

However, if the Minds were really intent on "preserving humanity", they could also give humans fake achievement and adversity, up to recreating life as it is now.

If you believe The Culture is a dystopia, what would make it a utopia?

You mean other than changing the core premise?

If forced to work within the framework as it stands, I'd probably need to see a mix of restored human agency (possibly by resurrecting the early and largely abandoned concept of Referrers) and the introduction of more AI entities that are less... smarmily dickish. The only AI character who seems to genuinely care about biologicals on a personal and moral level is Falling Outside The Normal Moral Constraints, whom the other AIs consider psychotic.

The AIs do care; the humans are beloved pets. High-functioning pets that need a lot of room to play.

Edit: An analogy I once heard is that humans are to AI Minds as dogs are to humans. A dog wouldn't understand why you brought him to the vet for a painful treatment, but it was for his own good. Likewise, a human can't understand an AI's motives, because the human mind operates on a lower level of sentience.