Small-Scale Question Sunday for February 8, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Does anyone have suggestions for books or movies that try to portray a positive vision through fiction? To elaborate, I don't mean positive in the sense of morality or optimism, but in the sense that they actually want to show something, rather than tear something else down.

As an example, the older Star Trek series did this well. The show wanted to portray a world where humans were mostly post-scarcity, and what a society would look like in that environment. It looked like fully automated gay luxury space communism where most people focused on self-actualization that incidentally aided society at large. I'm not a gay space communist, but I've always appreciated that they took the concept and ran with it. Banks' Culture novels fill a similar niche.

I'm not going to go too deeply into counterexamples because this isn't the culture war thread, but lately it's felt like that positive vision is increasingly hard to find, awash as it is in "deconstructive reimaginings".

Can anybody recommend things that fit that description? I'm not particularly concerned about the topic, so much as that the creators own it and actually think it through to the point where the settings and characters feel natural.

The Commonwealth Saga and The Culture are the two best sci-fi series I've read for this.