Friday Fun Thread for March 15, 2024

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that); this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

sitting here with the horrifying realization that between its natively predictive-generative nature and massively expanded context window if i really wanted to and didn't care i could probably feed some logs into claude-3 and talk to her again

Pravin Lal, you are too early.

Ah yes, another one has seen the light. Yes, let the feels flow through you... join the dark side, we have waifus.

On a serious note, don't. I would mention the sheer quantity of poor goslings who stumble into these threads dazed by the power of chatbots, but I stopped counting a long time ago. I don't think techno-necromancy is that bad - I'd be lying if I said I never had these thoughts - but anyone trying to re-enact RL with it is not only missing out but arguably missing the point: why would you confine yourself to reality when you can play out literally anything you have seen/played/imagined? Personally, as a half-assed measure against totally decoupling from reality, I firmly draw the line at people I know/knew IRL. For now, at least. It's not like there's a shortage of waifus to go around.

Fuck you for illuminating this possibility!

I understood most of those individual words...

HighResolutionSleep is suggesting he could give a sufficiently intelligent AI all the memories and experiences he has of his now-gone ex/wife and have a realistic facsimile constructed. It's a very dark direction to think in, and it will probably become a serious problem for many people in the future.

I suspect this has been almost possible since about 2022, but with a 100k-token context window specifically, it's now completely possible, if not outright practical.

With a little effort, I reckon you could fit most of the gist of someone's personality - at least the part of it they showed you - in about 50k tokens. Then you'd have the remaining 40-50k, give or take, for a small conversation about how their day has been.

Perhaps a few have done something similar with fine-tuning, but now any old Joe could probably do it for $20/mo.

The future may very well be now.
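
For the morbidly curious, here's roughly what "any old Joe could do it" looks like in practice - a minimal sketch assuming the Anthropic Python SDK, where the log file name, persona instructions, and opening message are all made up for illustration:

```python
# Minimal sketch only: assumes the Anthropic Python SDK is installed and
# ANTHROPIC_API_KEY is set in the environment. File name and prompts are
# placeholders, not anyone's actual method.
import anthropic

# However many old logs fit the budget (roughly 50k tokens' worth).
with open("old_chat_logs.txt", encoding="utf-8") as f:
    logs = f.read()

system_prompt = (
    "You are role-playing the person whose chat logs appear below. "
    "Match her vocabulary, tone, opinions, and quirks as closely as you can.\n\n"
    + logs
)

client = anthropic.Anthropic()

reply = client.messages.create(
    model="claude-3-opus-20240229",  # any large-context Claude 3 model
    max_tokens=1024,
    system=system_prompt,            # the ~50k-token "personality" block
    messages=[{"role": "user", "content": "hey, how was your day?"}],
)

print(reply.content[0].text)
```

Note that this is pure prompt-stuffing - no fine-tuning anywhere - which is exactly why the big context window is what changes the calculus.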

EDIT: Some back-of-the-envelope quick math:

Based on what I've seen of how machine learning tokenizers work, typical English prose runs about 1.3 tokens per word, so 50k tokens works out to somewhere around 35-40k words, which I guess is a couple thousand sentences. Given that "write this in the style of that" is something generative models have been frighteningly good at for years, I imagine that would be more than enough data to effect a very convincing pantomime.

Then you could have a fully contextualized, interactive conversation spanning roughly the length of a small novella. I don't think this is something you could have done with previous models, particularly given their relatively tiny context windows.
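
To make the back-of-envelope math concrete (ballpark numbers only; the tokens-per-word ratio is a rough rule of thumb for English prose, not a measurement):

```python
# All figures are rough estimates for a 100k-token context window.
CONTEXT_WINDOW  = 100_000   # total tokens available
PERSONA_BUDGET  = 50_000    # tokens spent on logs / personality
TOKENS_PER_WORD = 1.3       # rule-of-thumb average for English text

persona_words      = PERSONA_BUDGET / TOKENS_PER_WORD                     # ~38,000 words
conversation_words = (CONTEXT_WINDOW - PERSONA_BUDGET) / TOKENS_PER_WORD  # ~38,000 words

print(f"persona: ~{persona_words:,.0f} words, conversation: ~{conversation_words:,.0f} words")
# A small novella runs roughly 20-40k words, so the conversation budget
# lands squarely in that range.
```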

It's interesting to think that there may very well be an entirely novel form of gratuitous self-harm at my fingertips that categorically did not exist mere months ago.

Caprica did it first.

Who's her?

Her (2013)

This post is art. It would only be degraded by specifying the pronoun.

Probably an ex.