I want to believe, but I asked Gemini 2.5 Pro to spec a computer for me, and it started hallucinating motherboards that don't exist, insisting on using them even after being told they don't exist. Maybe it's OK for brainstorming, but everything it says needs to be double-checked. We ain't there yet.
This is a form of the Gell-Mann amnesia effect. In domains with instant feedback and clear legibility of correct versus incorrect answers, like programming, we see the flaws immediately. But on softer, squishier questions, you just accept the answers, even though it's all similarly bad AI slop.
I have similar experiences, except the LLMs will also "correct" a correct answer into an incorrect one when pushed. I now view the whole project as useful for creative idea generation, but any claim about the real world needs to be fact-checked. No lab seems able to get these things to stop confabulating, and I'm astonished people trust them as much as they seem to.