
Small-Scale Question Sunday for January 18, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

To be a bit more charitable, the novel could be read as an attempt to demonstrate that consciousness is not a prerequisite for advanced intelligence. I must admit I've never really struggled to decouple the one from the other, but a lot of people seem to find this idea absurd on its face: it's remarkable how many anti-AI arguments boil down to "people say that artificial intelligence is possible, but computers can't be conscious, QED AI is impossible".

Doesn't the opposite error also exist? There are plenty of people out there arguing, "LLMs are instrumentally intelligent, therefore they are conscious", which is the same mistake in the other direction. 'Intelligence' in the sense of the capacity to perform complex tasks is a different thing from consciousness.

Unfortunately the word 'intelligence' in natural language tends to bundle together a number of concepts. When I say 'intelligent' in a casual context I usually mean some nexus of "has internal conscious experience", "has real thoughts", "is able to solve complex problems", "can engage with abstract concepts", "is possessed of a rational soul", and so on. When faced with a machine that's capable of solving complex problems but possessed of none of the other qualities, it's understandable that a lot of people's reactions are pretty wonky.