Small-Scale Question Sunday for April 12, 2026

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


If you were creating an LLM, would you train on the test set? If not, does that mean that you lack benevolence? You could just clearly and directly give it the answers!

Of course I lack benevolence towards an LLM. I can be polite to it out of habit, but I wouldn't hesitate to do horrible things to it if that made it work better.

if that made it work better

It seems to me that you're saying you have goals for what you want the end product to be. As such, I think you're implicitly affirming that you would choose not to do things like train on the test set. That is, you wouldn't just clearly and directly give it the answers, even though you could, as the sketch below illustrates.
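To make that concrete, here's a minimal sketch in Python of what "training on the test set" buys you. The names (`MemorizingModel`, `accuracy`) are hypothetical, and a toy model that just memorizes answers stands in for the LLM; the point is only that fitting on the held-out test set produces a perfect score that says nothing about the model actually working better.

```python
import random

random.seed(0)

# Toy "benchmark": 100 questions whose answers can't be inferred, only memorized.
data = [(f"q{i}", random.choice("ABCD")) for i in range(100)]
train, test = data[:80], data[80:]

class MemorizingModel:
    """Stands in for the LLM: it can only answer what it has memorized."""
    def __init__(self):
        self.memory = {}

    def fit(self, examples):
        self.memory.update(examples)

    def predict(self, question):
        return self.memory.get(question, "A")  # blind guess when unseen

def accuracy(model, examples):
    return sum(model.predict(q) == a for q, a in examples) / len(examples)

honest = MemorizingModel()
honest.fit(train)                # test set stays held out
print(accuracy(honest, test))    # ~0.25: chance level, an honest measurement

leaky = MemorizingModel()
leaky.fit(train + test)          # "clearly and directly give it the answers"
print(accuracy(leaky, test))     # 1.0: a perfect score that measures nothing
```

The leaky model aces the benchmark while being no better at anything, which is exactly why you'd withhold the answers even though you could just hand them over.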

Now the question seems to me to be: what do you even mean by "benevolence"? You originally said:

Lack of benevolence: God created the world and all that is in it, and is able to interact with it, but doesn't actually care about us.

But this doesn't quite make sense on its face. You care about the LLM you're creating. You care deeply about it, at least in that you very much care to "ma[k]e it work better". It seems like you're using some other, not fully fleshed-out sense of the word. Maybe to count as benevolence, the caring has to be directed at some particular type of goal, or take some particular form, while other types of caring/goals don't count. I think we just don't have enough information to figure out whether this reasoning makes much sense.