FeepingCreature

0 followers   follows 0 users   joined 2022 September 05 00:42:25 UTC
Verified Email
User ID: 311

Out of interest, do you think that a Mars base is sci-fi? It's been discussed in science fiction for a long time.

I think any prediction about the future that assumes new technology is "science fiction" pretty much by definition of the genre, and will resemble it for the same reason: it's the same occupation. Sci-fi that isn't just space opera, i.e. "fantasy in space", is inherently prognostication with a plot attached. Note stuff like Star Trek predicting mobile phones, or Snow Crash predicting Google Earth: "if you could do it, you would; we just can't yet."

"At this rate of growth, the entire lake will be this algae in a few days". "Ludicrous silliness!"

The point is we don't have a clue where the sigmoid will level off, and there doesn't seem to be a strong reason to think it'll level at the human norm, considering how different AI as a technology is from brains. To be clear, I can see reasons why it might level below the human norm: lithography is a very different technology from brains, and it sure does look like the easily Moore-reachable performance for a desktop or even datacenter deployment will sigmoid out well below human-brain scale. But note that this explanation has nothing to do with human brains as a reference point. If things go a bit differently, if Moore's law keeps grinding for a few more turns, or we find some way to sidestep the limits of lithography (say, a much cheaper fabrication process leading to very different kinds of deployment), or OpenAI decides to go all in on a dedicated mega-training run with a new continuous-learning approach that happens to work on the first, second, or third try (their deployed capacity is already around a human brain's), then there's nothing stopping it from capping out well above human level.
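To make the shape of that argument concrete, here's a minimal numeric sketch (all parameters are made up for illustration, nothing is calibrated to actual AI progress): an exponential and a logistic curve are practically indistinguishable early on, and where the logistic levels off depends entirely on a carrying capacity K that the early data tells you nothing about.

```python
import math

def exponential(t, x0=1e-6, r=0.7):
    """Unbounded exponential growth: roughly doubles every day at r ~= 0.7."""
    return x0 * math.exp(r * t)

def logistic(t, x0=1e-6, r=0.7, K=1.0):
    """Logistic (sigmoid) growth toward carrying capacity K. Early on it
    tracks the exponential almost exactly; later it levels off near K."""
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

# Fraction of the lake covered by algae, sampled every few days.
for day in range(0, 25, 4):
    e = exponential(day)
    s = logistic(day)
    print(f"day {day:2d}: exponential {e:12.6f}   logistic {s:.6f}")

# The two columns agree to several decimal places until the curve is already
# most of the way to its ceiling, which is the sense in which the early data
# can't tell you where the sigmoid will level.
```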

I genuinely don't understand how you can call it plausible that it happens at all, but sci-fi nonsense to consider it likely. By and large, probability is in the mind, and "sci-fi" is usually a judgment about what a belief claims about reality rather than about how much credence it deserves. It'd be like saying "it's possible that it happens soon, but it's raving sci-fi nonsense for you to be worried about it."