Small-Scale Question Sunday for August 10, 2025

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


How do you all interact with LLMs?

I’ve seen a few articles recently noting the rise of AI as a buddy / therapist or whatever. It’s usually beside the point of the article, but an implicit notion is that a lot of folks regularly ‘chat’ with AI as if it were a person.

Which I find baffling. Once the very early novelty wore off, I found this way of interacting extremely boring and tedious, and the fact that AI wants to get conversational with me is a constant source of frustration.

If I’m not using AI as a utility (‘write X, troubleshoot Y, give me steps for Z’), and I’m instead using it recreationally / casually, the experience is more akin to web surfing or browsing Wikipedia than chatting on a forum. I use it as an open-format encyclopedia and explicitly not as a conversational sounding board. And I genuinely find negative value in the fact that the former is constantly interrupted by attempts at the latter.

So my question, again: how far against the grain am I here?

I do not use LLMs as therapists or "buddies". There was one specific instance where I was genuinely depressed and anxious about my future finances, and Gemini 2.5 Pro did an excellent job and demonstrated great emotional intelligence while reassuring me. But that was mostly because it gave me concrete reasons not to worry, operating closer to a financial counselor than a standard therapist. Most therapists I know, while perfectly normal and decent people, do not give good investment advice.

(I was able to read its reasoning trace/CoT, and to the extent that represents its internal cogitation, it seemed to be making almost precisely the same emotional and logical considerations that I, as a human psychiatrist, would make in a similar situation.)

At the same time, I think you could do worse than take your problems to an LLM, as long as you don't use GPT-4o. I'm not tempted to do so myself, but then again, I don't use human therapy either.

What I do usually use them for, on a regular basis:

  • An intelligent search engine that hasn't been SEO'd to death. Even Google has realized how shitty its search has become and begun using AI to summarize answers. Unfortunately, Google uses just about the dumbest model it feels it can get away with, in a bid to cut costs.

  • Answering tip-of-the-tongue queries at superhuman levels of proficiency.

  • Writing advice, as a perfectly usable editor or second set of eyes.

  • Honestly, it's probably easier to list the very limited subset of queries I *wouldn't* use them for. They're good at most tasks, though far from perfect.