Small-Scale Question Sunday for January 29, 2023

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Training consumes far more matmul FLOPs than inference. LLM training runs at batch sizes in the millions of tokens, so if you aren't training a new model, you have enough GPUs sitting idle to serve millions of customers.
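A rough back-of-the-envelope sketch of why this is true, using the common ~6ND training and ~2N-per-token inference FLOP approximations (the model size and token count below are illustrative assumptions, not figures from the comment):

```python
# Assumed example figures, not from the comment above:
N = 70e9    # model parameters (e.g. a 70B-parameter model)
D = 1.4e12  # training tokens (~20 tokens per parameter)

train_flops = 6 * N * D        # common ~6ND training-cost approximation
infer_flops_per_token = 2 * N  # ~2N FLOPs per generated token

# Tokens of inference the training compute budget could have served:
tokens_served = train_flops / infer_flops_per_token  # = 3 * D
print(f"{tokens_served:.1e} tokens")  # prints "4.2e+12 tokens"
```

The N cancels, so training compute alone buys you 3D tokens of inference; on top of that, inference runs at far smaller batch sizes, which is why idle training clusters can absorb enormous serving loads.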