Small-Scale Question Sunday for July 2, 2023

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Time to Orbit: Unknown, a recommendation from /r/rational I've been enjoying.

It's set on a colony ship bound for a distant system, where the protagonist gets woken up early in the middle of things going to shit and has to scramble to fix them. It's really well written, with good characters and interesting worldbuilding, except for one part that makes me pull my hair out:

It always severely exasperates me when scifi stories written as late as 2022 depict AI as being dumber than GPT-4 (or GPT-3, going by what was SOTA at the time of writing).

Or the utter lack of robotics, so far. Seriously, a few remote manipulators and some things barely more advanced than the products of modern Boston Dynamics would be nice to see!

And the technology seems quaint given how far in the future it's set. You'd think they'd have full immersion VR or something like that.

I can forgive this in a novel written before, say, 2015, but when the characters bloviate about how AI can never replicate human intelligence and apparently neural networks can't do the work of a brain, I'm going to piss and cry and shit myself in frustration.

Seriously, I'm reading chapters written in Feb 2023 and the issue persists; it takes my suspension of disbelief and hangs it by the neck till it's dead.

Oh, apparently the AI isn't good enough to reliably identify things from video footage, when that was a thing 2 years ago. Ah, it can't do more than basic logic, and is apparently incapable of lying because it's too dumb. Fuck me sideways. It's 2023, even the Somalians have likely heard about ChatGPT.

Of course, I'm unusually sensitive to authors being pussies about advancing technology in their nominally scifi novels, which is why I wrote one myself and threw everything and the kitchen sink into it. Nobody can say I don't put my money where my mouth is in that regard!

Ahem, diatribe out of the way, it's a good story, I'd rate it a 9/10 if not for this point which forces me to penalize it to an 8/10. I know, I know, I'm sorry for being so harsh.

My favorite series of novels set on a sort-of colony ship still remains the Sunflower series by Peter Watts, where the AI in charge is intentionally made dumber than it could be so it can't diverge from its directives over literal millions of years and needs to rely on its human crew as a form of checks and balances.

Of course I'm still reading Reverend Insanity, at chapter 1186. Yes, four digit chapter numbers are run of the mill in the genre. At this rate I'm still going to be reading it till the sun swells up and boils the oceans.

No sci fi really captures the possibilities inherent in generative AI because they’re so significant. Even far-out stuff like Culture or Revelation Space doesn’t really.

The problem is that there’s limited room for human protagonists’ agency once you have AI (or AI plus decent robots, which science fiction generally assumes), and agency is kind of the core of storytelling. It’s the same reason sci fi struggles to move away from human pilots and captains and soldiers and so on. I think moving forward a lot of science fiction will be retro-future stuff that imagines we went to space with something like 1960s to 1990s technology and that AI wasn’t invented, or at least not in the way it was here. Starfield seems to be taking this approach.

I'm so, so tired of stories that follow human narrative sensibilities. Are there any books that ask the reader to fall in love with a well crafted structure that completely defies human narrative convention? That aims to map the reader to the alien rather than mapping the alien to the reader?

Stanislaw Lem's Solaris might be worth a read if you haven't already. It doesn't really defy narrative convention, but it does convey a sense of something truly alien.

I'd say there are, but the more you do this the more avant-garde and surreal things become, and the more skilled a writer you have to be to make it work. I guess Flatland is probably one of the most famous examples.

Flatland was good. I was also a fan of the aliens in Slaughterhouse-Five, though they weren't central. I like nature documentaries, but I don't think they go far enough. Ant YouTubers who get really passionate about morphology and behavioral analysis are OK. Sometimes I get my jollies just by reading ML whitepapers. Animorphs had a lot going for it, though I read it all as a kid and don't know if I would again.

No sci fi really captures the possibilities inherent in generative AI because they’re so significant. Even far-out stuff like Culture or Revelation Space doesn’t really.

I'm really not asking for much, just that an author writing in X AD account for something that was clearly in existence in X-1 AD.

Or at the very least, it would be trivial for the author to make up some excuse for their absence. Say, the AI was originally a standard AGI, but it was intentionally sabotaged during the flight and left crippled. Or the section carrying all the heavy robots in storage was hit by debris that made it past the shielding.

At least some sign that the author is aware of the issue and is attempting to placate my suspension of disbelief.

As per Yudkowsky's take on Vinge's law, it's pretty much impossible for a human to write a compelling superhuman intelligence in the first place, so I am willing to look the other way most of the time.

Still, I was so ticked off at this point that I went to the Author's discord, and boy did I end up on the wrong side of the tracks.

They/Thems for miles (even the author, which I sort of suspected from the emphasis on neo-pronouns and weird additional genders, but I actually don't mind that because it's set hundreds of years in the future and it would be weird if there weren't any running around).

I was confused to see half a dozen bot accounts replying to me, before someone informed me that this was people using "alters", some kind of DID bullshit I presume, since the bot's description explained it was a way for people to talk to themselves as a different persona (???).

I more or less copy pasted my complaints, and was immediately inundated by more They/Thems spouting the absolute worst takes on AI, to the point my eyes bled. At least they were mostly polite about it, but I'm sure they're already well acquainted with accommodating people with weird personal quirks, if you count my intolerance for gaping plot holes as one.

Then the author themselves showed up and informed me that they were aware of modern AI, yet apparently disagreed about its capabilities and future.

This pretty much killed me outright, so I politely agreed to disagree and left. I am unsure what mechanism they're using to extrapolate the future that requires AI to be worse than it is today after hundreds of years, and I'd rather not even ask.

I guess if all science fiction written after 2024 includes something suspiciously like a Butlerian Jihad, then be careful what you wish(ed) for?

Well, in my setting, I chose the route of having the initial singularity be aborted by humanity panicking the fuck out regardless of whether the AI was actually aligned or not.

And the justification for limiting the spread of generalized ASI is to prevent that from happening again, with the operational solution being either to lock AGIs to the last human-level capability empirically proven safe, or to allow only narrowly superhuman AGI.

It's a world where Yudkowskian fighter jets dropping bombs on data centers isn't a joke, but they usually go for nukes and antimatter explosives.

I'll leave it to the reader to decide whether that's a bad thing, but at the very least I don't commit the sin of writing about technology worse than today without an explanation of any kind. Butlerian Jihad it isn't though.