Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.

I have had similar experiences, except the LLMs will correct their correct answer to be incorrect. I now view the whole project as useful for creative idea generation, but any claims about the real world need to be fact-checked. No lab seems to be able to get these things to stop confabulating, and I'm astonished people trust them as much as they seem to.
Just to round out the space of anecdotes a little more: when I've called out LLMs in the past, I've sometimes had them "correct" their incorrect answer to one that is still incorrect, just in a different way.
(has anyone seen an LLM correct their correct answer to be correct but in a different way? that would fill the last cell of the 2x2 possibility space)
They're still very useful in cases where checking an answer for correctness is much easier than coming up with a candidate answer in the first place. I love having a search engine where my queries can be vague descriptions and still turn up a high rate of reasonable results. You just can't skip the "checking an answer for correctness" step.
Yes, this used to be commonplace in my experience. If the stakes are high, one should at the very least triangulate results with other sources.