
Friday Fun Thread for March 24, 2023

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), nor is it for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.


OP's question is about what you consider AGI. I consider it general intelligence: it can do a very wide variety of basic tasks and easily learn how to do new things. A human child, once they're 3-5 years old, is a general intelligence in my opinion. But yeah, the exact definition is in the eye of the beholder.

If AGI signals the singularity, and the singularity is the moment when AI starts improving itself recursively, then the definition of AGI surely involves a self-improvement mechanism capable of exceeding human potential.

I see your point, but I think @non_radical_centrist has one, too. Let's say we develop an AI, call it LLM-BIFF, that perfectly emulates a 70-IQ human. That's general intelligence. Set every supercomputer on Earth to running LLM-BIFF. Does LLM-BIFF recursively self-improve into LLM-SHODAN?

There must be a window of AI sophistication in which we have a generally intelligent program that is nevertheless not intelligent enough to bootstrap itself and trigger a singularity. Whether this window lasts one iteration of AI development or much longer is the question.