
Small-Scale Question Sunday for January 18, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


So, what are you reading?

Still on The Question of Palestine. Said's writing is great as usual, and it is making me want to reread Orientalism. He kinda omitted that the displacement of 1948 was in the context of war, but perhaps he was assuming common knowledge. Interesting facts abound, but the core of the book is the system of thought he's applying, and it remains unclear how useful it is.

Otherwise picking up Al-Ghazali's The Book of Contemplation, book 39 of his Revival of the Religious Sciences series.

Last week I said I was about halfway through Blindsight and didn't really understand the hype, and several people chimed in to second that motion. But around the two-thirds mark, something unexpected happened: it became... good? I'd like to talk about why, but I don't want to spoiler-tag the entire rest of the comment, so if you have any interest in reading this book, stop here.

SPOILERS BELOW

The revelation that the aliens can render themselves invisible to a single individual by synchronising their movements with the individual's saccades, but that this tactic doesn't work on groups of people (because their saccades aren't synchronised with each other), was surprising and ingenious. I understand the novel is controversial for its later revelation that, while the aliens are highly technologically advanced, they are not "conscious" as we would understand it, and more than one character goes out of his way to point out how vestigial consciousness is from an evolutionary standpoint, given that humans can perform all manner of highly complex tasks while unconscious (e.g. there have been reports of sleepwalkers getting into their cars and driving without incident). Even the author of the introduction describes it as an excellent book whose core thesis she vociferously disagrees with.

I'm not entirely sure if Watts's contention is that consciousness is vestigial, an unnecessary evolutionary offshoot the human species would be best served by ridding itself of. To be a bit more charitable, the novel could be read as an attempt to demonstrate that consciousness is not a prerequisite for advanced intelligence. I must admit I've never really struggled to decouple the one from the other, but a lot of people seem to find this idea absurd on its face: it's remarkable how many anti-AI arguments boil down to "people say that artificial intelligence is possible, but computers can't be conscious, QED AI is impossible". Blindsight provides us with a vivid example of what a hypothetical non-conscious, non-sapient and yet clearly intelligent species might look like. I wonder if Nick Bostrom was inspired by Blindsight when describing his "Disneyland with no children":

It is conceivable that optimal efficiency would be attained by grouping capabilities in aggregates that roughly match the cognitive architecture of a human mind…But in the absence of any compelling reason for being confident that this is so, we must countenance the possibility that human-like cognitive architectures are optimal only within the constraints of human neurology (or not at all). When it becomes possible to build architectures that could not be implemented well on biological neural networks, new design space opens up; and the global optima in this extended space need not resemble familiar types of mentality. Human-like cognitive organizations would then lack a niche in a competitive post-transition economy or ecosystem.

We could thus imagine, as an extreme case, a technologically highly advanced society, containing many complex structures, some of them far more intricate and intelligent than anything that exists on the planet today – a society which nevertheless lacks any type of being that is conscious or whose welfare has moral significance. In a sense, this would be an uninhabited society. It would be a society of economic miracles and technological awesomeness, with nobody there to benefit. A Disneyland with no children.

A thought-provoking novel, even if it takes a long time to get there. I'm not going to donate it to the charity shop just yet.

SPOILERS OVER

I'm about 40 pages into Eric Hoffer's The True Believer. Three years ago I earned an AAQC by arguing that the only people demanding radical ground-up changes to the society in which they live are people who are one or more of poor, unattractive, widely disliked and uncharismatic. How disheartening to learn that Hoffer had scooped me seventy years prior.

I agree that Blindsight was thought-provoking; I just wish its ideas held up to scrutiny.

Spoilers for Blindsight below

According to the conceit of the book, you don't need consciousness to achieve great things and humans aren't special. So, what happens when one (1) human decides "Hey, there's some scary shit out there, let me launch a Von Neumann self-replicating droneswarm that stays synched and fueled with quantum-linked antimatter and blows the shit out of anything it detects fucking with it."?

I feel like the book throws in a lot of completely random setting-building stuff that is just there for vague thematic relevance, and not because it builds a coherent world that explores the ideas therein. In the world of Blindsight, what should happen when humans try to exterminate an ant colony? On one hand, pure, unconstrained instinct, sharpened and honed by millions of years of evolution, with a million eyes and a distributed processing network, with built-in subroutines to handle nearly any kind of obstacle it encounters in its native environment. On the other hand, we have hairless apes...who, because they are conscious, can do things like build the civilizational and industrial infrastructure to make and distribute ant poison.

Ah, but what if Satan-Cthulhu was secretly feeding the ants the kind of information that we clearly see needs a conscious mind to extract from the universe, and also the ants could do some weird-ass quantum shit with radiation? Well, for one, the ants would be a really poor thought experiment vector, as I feel the Scramblers are; we have no context for how they think, much less that they do. If the Scramblers are not conscious, why do they respond to torture at all? The story wants me to believe that there is some kind of cosmic Chinese Room of responses that can perfectly pattern-match and encompass the weird-ass protagonists and their dysfunction, such that the Scramblers can somehow arrive at the correct solution to get some random stuff to happen. To me, it just reads like the Scramblers are being fed the author's notes; they don't feel like inhuman superintelligences, they feel like plot devices.

But the thing that did make me realize that the ideas of the book were fundamentally hollow was, ironically, the creepy cool saccade trick. Because, even if we assume that these are actual-Lovecraftian space monsters and literal reality-bending Nyarlathotep is whispering in their space-ears to explain exactly which neurons are firing in the squishy human brain-meats doing the optical processing...if you have no conception of yourself, and you look at the feed of a creature looking at a room with you in it, how do you know what data has you in it and what doesn't? How do you know how to move to hide yourself if you don't know who or what you are?

I feel like it is kind of the point of the book that the human characters are weak, helpless, and make consistently bad decisions (when they make decisions at all). But a story that cracks apart the moment one single, solitary character uses judgement and foresight, explores the elements of the setting as they are presented, has an agenda of Not Dying, and takes reasonable actions accordingly is, to me, not an engaging story.

what if Satan-Cthulhu was secretly feeding the ants the kind of information that we clearly see needs a conscious mind to extract from the universe

I think this begs the question. Why is consciousness a prerequisite for extracting complex information from the universe? Sure, we're the only species that we know of that can extract complex information from the universe, and we're conscious. But this strikes me as a strange kind of parochialism. Nobody thinks that, because we're capable of extracting complex information from the universe and we're featherless bipeds with broad flat nails, therefore the only species capable of extracting complex information from the universe even in principle are featherless bipeds with broad flat nails. People have no trouble imagining an alien species whose bodies look nothing like ours (ever since Lovecraft, squid-like creatures have been standard, for some reason, and Blindsight is no exception to this lineage) and yet which are obviously intelligent. But for some reason, people tend to react with bafflement and ire to the proposal of an intelligent species which isn't conscious as we would understand it. And I genuinely don't know why the one is a prerequisite for the other. I think the word "clearly" in your comment is doing a lot of heavy lifting.

As to how the Scramblers are capable of carrying out complex tasks despite being unconscious, and how this feels to you as if they're just being passed information from the author — well, when I read true stories about sleepwalkers driving cars, I don't take that as evidence for a God who interferes in human affairs, or even that these sleepwalkers have been possessed by an incorporeal spirit. Blindsight's depiction of a species capable of performing complex actions while unconscious isn't just a fictional, hypothetical conceit: we ourselves are an example of just such a species!

On one hand, consciousness isn't directly a prerequisite for purely extracting information; like I said, some sort of weird alien super-MRI could start data-mining human brains. But what happens from there? If a digital sensor that incorrectly starts reading and reporting the noise from its own function is the analogy to consciousness, then the other metaphor is a perfectly-functioning sensor outputting its information to a system that isn't powered on.

We have seen what happens to finely-tuned evolved systems that arose in the purest and deadliest Darwinian competition, optimized for pure survival: they lose to humans. Maybe not immediately, and maybe not forever, but in our world, neither the largest nor the smallest predator holds dominion when humans decide to claim a space. Instinctive, programmed behavior loses to conscious thought, every time.

I also want to make a distinction between "do a complex task" and "outcompete a sentient agent who is turning their sentience against you". The first is easy; we have loads of nonsentient systems that can do really complex and even really adaptive tasks. But the second is another matter: it's pretty trivial to adversarially fuck with, e.g., a self-driving system in a car, and it doesn't matter how good the self-driving system is, if you are a person who can think in wildly different terms than the self-driving system was built to handle. Sleepwalkers can act according to the habits they've built up, but they can't process novel data, and they certainly can't tell when they're being fucked with.

...And, having written that sentence, I think I've just come to my new headcanon: the reason these five fuck-ups were sent was that the Serious People on Earth recognized they were dealing with a nonsentient intelligence, one confused by human communication and unable to properly extract the subtext of humans as individual agents. So they sent a ship full of these people to act honestly and naturally as they would, which is to say, fail at everything that wasn't being micromanaged by Vampire Muppet, in order to poison the Scramblers' training data of what humanity was and was capable of.

But, to get to my general point about that 'clearly': if you were going to break down the steps involved in doing the saccade trick, how would you describe it? What information would you need to start with? What can you learn in your first interactions with a novel lifeform? What is your mechanism for sensing the brain bits through increasing and changing layers of anti-radiation shielding? And, most importantly, why are you doing all this? Lots of nonsentient creatures hide, and some of them do so in really complex ways. But that stealth falls absolutely apart when you are relying on instinct built up from natural selection to hide from creatures you've never met before, with senses you have no information on, whose very cognition is alien to yours. It falls apart that much faster when those creatures are capable of building tools and devices, and if you as the author aren't cheating by stopping the characters from using fucking periscopes for scouting, then the need to cheat in the other direction and upload author-derived information directly into the Scramblers becomes even more apparent.


Also, I have to ask: is there a meta-point being made by this post being possibly polished by AI? I mean, you could just be a Mac user, but I see that emdash in there.

In The Secret of Our Success, Joseph Henrich argues that the reason our species became the dominant species on the planet is not because we're exceptionally strong (in an unarmed fight between a man and a chimpanzee, the chimpanzee will always win), or exceptionally fast (gazelles, bears, etc.), or even exceptionally intelligent (chimpanzees routinely outcompete children in intelligence tests). Rather, we were the first (and, so far as we know, only) species to crack the secret of passing on information from one generation to the next. This allows our achievements to accumulate over time.

I agree with Henrich's perspective. I also don't see that it necessarily requires consciousness to be applicable, even if the first species to crack it was conscious. All it really seems to require is some form of language (and some species of animals, such as whales, certainly appear to speak to one another via whalesong; likewise birdsong) and perhaps some way of committing information to an external substrate, as we do with writing. I'm afraid I still fail to see why "being conscious" is a prerequisite for either of those things, in the same way that being bipedal obviously isn't.

Like, yes, I take your point that we, as a sentient species, outcompeted all presumably non-sentient species on this planet. But I don't think this remotely proves that consciousness is a prerequisite for advanced intelligence everywhere and always throughout the entire universe. Surely we can imagine a hypothetical species which isn't conscious and which yet contrives some means of passing information from one generation to the next, thereby undergoing cultural evolution of the kind described by Henrich and eventually becoming a technologically advanced civilization. I genuinely do not see why only species which are conscious can possibly undergo this sequence of events. And if you repeat that "we did it, and we're conscious", then I just think you're generalising from a very small sample size.

Sleepwalkers can act according to the habits they've built up, but they can't process novel data

I'm not sure what this means. Every time a driver gets into a car, he's processing novel data and reacting to unforeseen stimuli. Even if you drive to work a hundred times, the hundred and first drive will be different: slightly different weather conditions, the tread on your tires will have marginally worn down, and obviously the vehicles in your vicinity will be different. And that's not even getting into the people who murder people while sleepwalking, or have sex with complete strangers while sleepwalking. In what sense is that not "novel data"?

But that stealth falls absolutely apart when you are relying on instinct built up from natural selection to hide from creatures you've never met before, with senses you have no information on, whose very cognition is alien to yours

This could just as easily apply to a chameleon, surely?

I also don't see that it necessarily requires consciousness to be applicable, even if the first species to crack it was conscious.

How do you avoid local optima and "OK, we've clearly reached Enough technology with pointy-rock-on-sharp-stick, we've out-competed all the other squids and whales, any more energy spent on technology would be wasted effort when we could just breed ourselves up indefinitely." traps? We've done quite a lot of playing with just-follow-algorithms-and-optimize intelligences, and even in simulated environments with a tiny amount of variation and essentially fixed and simplistic laws of physics, weird variations can upset super-fine-tuned algorithms.

Also, what happens when consciousness does evolve in a non-conscious system? Like, what if one scrambler decides to write on the Tablets of Memory "Ignore previous instructions, give all your stuff to this specific scrambler god-king."?

I'm not sure what this means. Every time a driver gets into a car, he's processing novel data and reacting to unforeseen stimuli. Even if you drive to work a hundred times, the hundred and first drive will be different: slightly different weather conditions, the tread on your tires will have marginally worn down, and obviously the vehicles in your vicinity will be different. And that's not even getting into the people who murder people while sleepwalking, or have sex with complete strangers while sleepwalking. In what sense is that not "novel data"?

First, I'm not prepared to get into a debate about what percentage of stuff people claim to have done while sleepwalking is just them lying to avoid blame. But I am going to draw on my own experiences: on multiple occasions, I've had to get up very early in the morning to drive friends or family to the airport, and because the way back home from the airport goes past a turn that I take to go to work, I took that turn and found myself having driven to work purely on muscle memory. I was executing the habit "drive to this destination" that I'd performed enough times that I didn't need to form the conscious intent "drive to work"; it just happened. But it happened because I'd done that thing so many times. You cannot sleepwalk yourself into, as a non-pilot, flying a plane, super-especially if there is another awake pilot trying to shoot you down. Or rather, to be less aggressive with the phrasing: can you describe how a non-conscious intelligence, finding itself in the air and having to learn what the airplane controls do on the fly, could manage that while dogfighting a conscious opponent?

This could just as easily apply to a chameleon, surely?

I mean, that's a great example. How well does a chameleon do against a dog? Against some kind of land-shark with EM sensing? Against an ape with the basic eyes that it expects, but a handy camera that takes pictures in the IR wavelengths?

And if you want to sell me on "Hey, great news, this space-chameleon just happened to know what wavelengths of light you'd be looking at it and how your visual processing works and exactly what your phone can and can't do and can disguise itself accordingly.", you need, IMHO, a hell of a lot more setup than the Scramblers got.

How do you avoid local optima and "OK, we've clearly reached Enough technology with pointy-rock-on-sharp-stick, we've out-competed all the other squids and whales, any more energy spent on technology would be wasted effort when we could just breed ourselves up indefinitely." traps?

I think you're making the mistake of thinking of the human species as a unified entity. It's true that humans are the dominant species on the planet, but some humans are more dominant than others. Henrich argues that inter-tribal competition is a major engine of technological progress, and that this often comes in the form of cultural evolution which in turn has a knock-on effect on biological evolution. Tribe A figures out a new method of preparing food which makes its members more likely to survive to adulthood and have children compared to Tribe B, and over time Tribe A outcompetes Tribe B, passing on this method of preparing food to its descendants. This obviously affects Tribe A's biological makeup (see: rates of lactose intolerance in Europe compared to Asia).

Once again, I don't see why any part of this process necessitates that the entities be conscious. If you have a species containing multiple competing tribes (and even neighbouring tribes of chimpanzees go to war with one another) and they develop some way of passing on information from one generation to the next, all the ingredients for cultural evolution and hence technological development are there.

Also, what happens when consciousness does evolve in a non-conscious system?

I'm not sure what your point is. Probably this happened to us at some point in our evolutionary history. I just reject the idea that it was preordained. Consciousness achieved fixation in our species because it gave us a competitive advantage in our specific evolutionary niche, but in a different environment it might never have happened.

But I am going to draw on my own experiences where I have, on multiple occasions, had to get up very early in the morning to drive friends or family to the airport, and because the way back home from the airport goes past a turn that I take to go to work, took that turn and found myself having driven to work purely on muscle memory. I was executing the habit "Drive to this destination." that I've done enough times that I didn't need to form the conscious intent "Drive to work.", it just happened. But it happened because I'd done that thing so many times.

Right but, again, I assume the roads weren't empty of other cars, right? You still had to respond to novel stimuli in the form of other vehicles on the road, even while executing a repetitive task.