Friday Fun Thread for July 4, 2025

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

There are some technical aspects of how LLMs specifically work that make it a lot harder to police hallucination than to produce a compelling argument, for the same reason that they're bad at multiplication and great at symbolic reference work. A lot of LLMs can already use WestLaw and do a pretty good job of summarizing it... at the cost of trying to cite a state law I specifically didn't ask about.
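To gesture at why memorization-style learning handles reference lookup but not arithmetic, here's a back-of-the-envelope sketch. The numbers are made-up round figures, not real counts; the point is just the combinatorics:

```python
# Toy comparison: the space of "facts to look up" is small enough to
# absorb from training data, but exact arithmetic problems explode
# combinatorially, so pattern-matching can't cover them.

statutes = 50 * 1000                 # made-up round number of statutes
six_digit_pairs = (9 * 10**5) ** 2   # distinct 6-digit x 6-digit problems

# Even with generous fact counts, arithmetic dwarfs lookup by orders
# of magnitude, which is why lookup-style recall breaks down there.
ratio = six_digit_pairs // statutes
print(f"arithmetic space is ~{ratio:,}x the lookup space")
```

Obviously a model doesn't literally memorize a fact table, but the asymmetry in problem-space size is the intuition behind "great at reference work, bad at multiplication."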

It's possible that hallucination will prove impossible to fully solve, but either way I expect these machines to become better at presenting compelling arguments faster than I expect them to become good researchers, with all the good and ill that implies. Do lawyers value honesty more than persuasion?

I mean, yes, but the hallucination problem of putting in wrong cases and statutes is utterly disqualifying in advanced legal writing.

One would think! And yet.

This is my biggest problem with RLHF, aside from my free speech bullshit: due to the way LLMs work, RLHF means hallucination is impossible to solve - it is baked in.
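The "baked in" claim can be sketched as a toy incentive model. Everything here is hypothetical (made-up reward numbers, no real RLHF machinery): it just shows that if human graders can't verify facts and reward fluent-sounding answers, the expected reward favors confident guessing over honest uncertainty:

```python
# Toy reward model: an answer is either a confident guess (right with
# probability P_CORRECT) or an honest "I don't know." A grader who
# can't check citations still partly rewards a fluent wrong answer.

P_CORRECT = 0.6      # chance a confident guess happens to be right
REWARD_RIGHT = 1.0   # grader loves a confident correct answer
REWARD_WRONG = 0.4   # wrong but fluent still often passes the grader
REWARD_IDK = 0.2     # "I don't know" reads as unhelpful

def expected_reward_guess():
    # Expected reward of always guessing confidently.
    return P_CORRECT * REWARD_RIGHT + (1 - P_CORRECT) * REWARD_WRONG

def expected_reward_idk():
    # Expected reward of honestly declining to answer.
    return REWARD_IDK

# The optimizer picks whichever policy pays more, so confident
# guessing wins even though 40% of those guesses are hallucinations.
print(expected_reward_guess())  # 0.76
print(expected_reward_idk())    # 0.2
```

Under these assumed payoffs the trained policy prefers guessing; that's the incentive-level version of "hallucination is baked in," not a claim about any particular lab's reward model.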